Does anyone know how to use the Level option in IO::Compress::Zip?
I have a problem: I am trying to zip a DB backup file, but after zipping, the file is smaller than the original and can't be used to import the backup on another server. The file is corrupt or something like that...
I am using Perl to zip the file like this:
my $zipfile = zip['MYFILE'] => $zipFile, Zip64 => 1, Method => ZIP_CM_STORE
or die "Zip failed: $ZipError\n";
But no success. For example, the original file size is 13,910,216 KB, and when zipped it's only 13,909,298 KB.
I don't know why, but I think I need to set the Level option to Z_NO_COMPRESSION. How do I do that?
Thanks
As has already been mentioned, by specifying the method ZIP_CM_STORE you are telling IO::Compress::Zip not to compress the file at all.
If you don't specify a Method at all, the code will use ZIP_CM_DEFLATE (which is the standard compression method used in practically all zip files):
my $zipfile = zip['MYFILE'] => $zipFile, Zip64 => 1
or die "Zip failed: $ZipError\n";
If you want to change the compression level, use the Level option. By default it uses Z_DEFAULT_COMPRESSION. If you want the best compression, use Z_BEST_COMPRESSION:
my $zipfile = zip['MYFILE'] => $zipFile, Zip64 => 1, Level => Z_BEST_COMPRESSION
or die "Zip failed: $ZipError\n";
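And if, as you asked, you really want to set the level to "no compression" explicitly, use Level => Z_NO_COMPRESSION. This is just a sketch: Z_NO_COMPRESSION is the zlib constant for level 0, so Level => 0 is equivalent, and ZIP_CM_STORE already stores the entry uncompressed, so you rarely need this:
my $zipfile = zip ['MYFILE'] => $zipFile, Zip64 => 1, Level => Z_NO_COMPRESSION
or die "Zip failed: $ZipError\n";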
Based on the conversation thread, I tried to reproduce the use case.
First, create a file similar in size to yours:
$ truncate -s 13910K test
$ ls -lh test
-rw-rw-r-- 1 xxx yyy 14M Apr 13 12:00 test
Add it to a zip file using IO::Compress::Zip:
$ perl -MIO::Compress::Zip=:all -e 'zip "test" => "test.zip", Method => ZIP_CM_STORE'
Check the sizes & CRC
$ crc32 test
49769d91
$ ls -l test
-rw-rw-r-- 1 xxx yyy 14243840 Apr 13 12:00 test
$ unzip -lv test.zip
Archive: test.zip
 Length   Method    Size  Cmpr    Date    Time   CRC-32   Name
--------  ------  ------- ---- ---------- ----- --------  ----
14243840  Stored 14243840   0% 2020-04-13 11:48 49769d91  test
--------          -------  ---                            -------
14243840         14243840   0%                            1 file
All is as expected. The sizes & CRCs match.
Could you try this on your system please?
I am writing a script in Lua 5.1 for use with a game engine (EDGE).
I need my script to copy about 20 files into a .miz file (which is really a zipped folder with a set structure), navigating that structure and copying those files in from a non-zipped folder on the hard drive.
Because Windows 11 is the future, I need to use NanaZip rather than 7z, which isn't supported on W11.
However, all the examples I've found are for using Lua to zip up files, not to insert non-zipped files INTO a zip file without unzipping it.
Is this even possible?
Similar to #koyaanisqatsi, I tried it with 7z. You didn't answer our questions about why 7z should be avoided, or whether you are even allowed to use os.execute, but this should provide a good starting point:
os.execute("7z a yourZip.zip yourFile.png")
Where a is the flag for Add.
See the manual for other flags like compression: https://linux.die.net/man/1/7z
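If the files have to land at specific paths inside the .miz's fixed structure, one approach (a sketch; the staging folder and file names are hypothetical) is to mirror that structure in a staging directory and add from there, since 7z stores the relative paths you pass it:
os.execute('cd /d C:\\staging && 7z a C:\\missions\\mission.miz l10n\\DEFAULT\\briefing.png')
That creates (or replaces) the entry l10n/DEFAULT/briefing.png inside mission.miz without unpacking the rest of the archive.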
Windows 11 also has tar, which has the options r and u:
D:\temp>tar h
tar(bsdtar): manipulate archive files
First option must be a mode specifier:
-c Create -r Add/Replace -t List -u Update -x Extract
Common Options:
-b # Use # 512-byte records per I/O block
-f <filename> Location of archive (default \\.\tape0)
-v Verbose
-w Interactive
Create: tar -c [options] [<file> | <dir> | #<archive> | -C <dir> ]
<file>, <dir> add these items to archive
-z, -j, -J, --lzma Compress archive with gzip/bzip2/xz/lzma
--format {ustar|pax|cpio|shar} Select archive format
--exclude <pattern> Skip files that match pattern
-C <dir> Change to <dir> before processing remaining files
#<archive> Add entries from <archive> to output
List: tar -t [options] [<patterns>]
<patterns> If specified, list only entries that match
Extract: tar -x [options] [<patterns>]
<patterns> If specified, extract only entries that match
-k Keep (don't overwrite) existing files
-m Don't restore modification times
-O Write entries to stdout, don't restore to disk
-p Restore permissions (including ACLs, owner, file flags)
bsdtar 3.5.2 - libarchive 3.5.2 zlib/1.2.5.f-ipp bz2lib/1.0.6
(The cmd.exe above was opened from Lua with: os.execute('cmd').)
You can extract a ZIP with it, but not create one, as far as I know:
(tar -xf archive.zip)
But is it a problem for you to use TAR instead of ZIP?
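For example, appending a file to an existing (uncompressed) tar archive with the r mode listed above looks like this (a sketch with hypothetical names):
tar -rf mission.tar briefing.png
The u mode works the same way, but only replaces the entry if the file on disk is newer.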
I'm using SQLCMD to export data from SSMS, and I get about 50 million lines in total. All the data is exported to a CSV file. However, I found that each data row, as well as the header row, is split across 2 lines in the CSV file. For example, instead of having
id name age sex
--- ---- --- ---
1 Andy 20 M
2 Sally 25 F
I'm getting this:
id name age
sex
--- ---- ---
---
1 Andy 20
M
2 Sally 25
F
I need to import this data into another database, so this file obviously doesn't work for me.
My data contains 17 columns and the command that I used is:
sqlcmd -S addr -d database -U user -P psw -Q "an sql command" -o "data.csv" -s"," -w 700
Thanks for the help
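For what it's worth, this kind of wrapping usually means the total row width exceeds the -w screen-width setting (700 here, shared by all 17 columns), so each row spills onto a second line. A sketch of the same command with the width raised and trailing spaces trimmed via -W (treat the exact values as an assumption and check them against your sqlcmd version):
sqlcmd -S addr -d database -U user -P psw -Q "an sql command" -o "data.csv" -s"," -w 65535 -W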
I have a number of project folders that all got their date modified set to the current date & time somehow, despite not having touched anything in the folders. I'm looking for a way to use either a batch applet or some other utility that will allow me to drop a folder/folders on it and have their date modified set to the date modified of the most recently modified file in the folder. Can anyone please tell me how I can do this?
In case it matters, I'm on OS X Mavericks 10.9.5. Thanks!
If you start a Terminal and use stat, you can get the modification times of all the files and their corresponding names, separated by a colon, as follows:
stat -f "%m:%N" *
Sample Output
1476985161:1.png
1476985168:2.png
1476985178:3.png
1476985188:4.png
...
1476728459:Alpha.png
1476728459:AlphaEdges.png
You can now sort that and take the first line, and remove the timestamp so you have the name of the newest file:
stat -f "%m:%N" *png | sort -rn | head -1 | cut -f2- -d:
Sample Output
result.png
Now, you can put that in a variable, and use touch to set the modification times of all the other files to match its modification time:
newest=$(stat -f "%m:%N" *png | sort -rn | head -1 | cut -f2- -d:)
touch -r "$newest" *
So, if you wanted to be able to do that for any given directory name, you could make a little script in your HOME directory called setMod like this:
#!/bin/bash
# Check that exactly one parameter has been specified - the directory
if [ $# -eq 1 ]; then
# Go to that directory or give up and die
cd "$1" || exit 1
# Get name of newest file
newest=$(stat -f "%m:%N" * | sort -rn | head -1 | cut -f2- -d:)
# Set modification times of all other files to match
touch -r "$newest" *
fi
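Note that touching the files inside a folder does not update the folder's own modification time. If you also want the folder itself to take on the newest file's timestamp, you could add one more touch after the existing one (a sketch):
touch -r "$newest" .
Because the script has already cd'd into the target directory, . refers to the dropped folder itself.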
Then make that executable (only necessary once) with:
chmod +x $HOME/setMod
Now, you can set the modification times of all files in /tmp/freddyFrog like this:
$HOME/setMod /tmp/freddyFrog
Or, if you prefer, you can call that from AppleScript with:
do shell script "$HOME/setMod " & nameOfDirectory
The nameOfDirectory will need to look Unix-y (like /Users/mark/tmp) rather than Apple-y (like Macintosh HD:Users:mark:tmp).
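If the folder comes from a chooser or a drop handler as an alias, AppleScript can produce that Unix-y path for you (a sketch; theFolder is a hypothetical variable):
do shell script "$HOME/setMod " & quoted form of POSIX path of theFolder
quoted form also protects against spaces in the path.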
In Spring-XD the file source detects new files in an input directory and streams their content through the pipeline.
Is there an analogous sink which creates separate result files in an output directory (e.g. with the original file names), rather than a single file to which all results are appended? The docs at http://docs.spring.io/spring-xd/docs/current/reference/html/#file-sink say: "The file sink uses the stream name as the default name for the file it creates, and places the file in the /tmp/xd/output/ directory."
Scroll down to the options in that document you referenced.
Use --nameExpression=....
If you are using mode=contents, the original file name is available in the file_name header:
--nameExpression=headers[file_name]
mode=lines doesn't currently capture the file name (it will be fixed in the next release).
If you are using mode=ref, you need to set a header.
Minimal Working Example
In Spring-XD
stream create --name test --definition "file --mode=contents | b:file --binary=true --dirExpression='''/tmp/out''' --nameExpression=headers[file_name]" --deploy
then
echo "1111" > /tmp/xd/input/test/file1.txt
echo "2222" > /tmp/xd/input/test/file2.txt
results in
ll /tmp/out/
-rw-rw-r-- 1 rmv rmv 5 Jul 7 10:19 file1.txt
-rw-rw-r-- 1 rmv rmv 5 Jul 7 10:19 file2.txt
I have a list of files in my current working copy that have been modified locally. There are about 50 files that have been changed.
I am using the following command to copy files that have been modified in Subversion to a folder called /backup. Is there a way to do this while maintaining the directories they are in? It would do something similar to exporting an SVN diff of files. For example, if I changed a file called /usr/lib/SPL/RFC.php, then it would also copy the usr/lib/SPL directory to backup.
cp `svn st | ack '^M' | cut -b 8-` backup
It looks strange, but it is really easy to copy files with tar. E.g.
tar -cf - $( svn st | ack '^M' | cut -b 8- ) |
tar -C /backup -xf -
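Alternatively, if you have GNU coreutils, cp --parents recreates the source directories under the target in one step (a sketch, using the same file list):
cp --parents `svn st | ack '^M' | cut -b 8-` backup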
Why not create a patch of your changes? That way you have one file containing all of your changes which you can timestamp in the name - something like 2012-05-28-17-30-00-UnitTestChanges.patch, one per day.
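For example (a sketch, run from the root of the working copy):
svn diff > 2012-05-28-17-30-00-UnitTestChanges.patch
The patch can later be re-applied to a fresh checkout with svn patch (Subversion 1.7+) or plain patch -p0.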
Then you can roll up your changes to a fresh checkout once you're ready, and then commit them.
FYI: Subversion 1.8 should have checkpointing / shelving (which is what you seem to want to do), but that's a long way off, and might only be added in Subversion 1.9.