Exhaustive Data Compressor Comparison 305
crazyeyes writes "This is easily the best article I've seen comparing data compression software. The author tests 11 compressors: 7-zip, ARJ32, bzip2, gzip, SBC Archiver, Squeez, StuffIt, WinAce, WinRAR, WinRK, and WinZip. All are tested using 8 filesets: audio (WAV and MP3), documents, e-books, movies (DivX and MPEG), and pictures (PSD and JPEG). He tests them at different settings and includes the aggregated results. Spoilers: WinRK gives the best compression but operates slowest; ARJ32 is fastest but compresses least."
duh (Score:5, Funny)
small = slow (Score:5, Funny)
Re: (Score:3, Insightful)
Re: (Score:2, Funny)
If there's one thing that brightens my day, it's a client sending me a PDF compressed with "Hey-boss-I-fucked-your-wife-ZIP" right on deadline.
Re:duh (Score:5, Informative)
I agree with you on the importance of this article but
Yes, I know it is better than gzip, and it is also supported everywhere. But it is much worse than the "modern" compression algorithms.
I have been using LZMA for some time now for things I need to store longer, and getting good results. It is not on the list, but should give results a little bit better than RAR. Too bad it is only fast when you have a lot of memory.
For short/medium time storage, I use bzip2. Online compression, gzip (zlib), of course.
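For what it's worth, that trade-off is easy to reproduce with Python's standard-library bindings (zlib for gzip/DEFLATE, bz2, lzma). Rough sketch, and the corpus filename is just a placeholder:

import bz2, lzma, time, zlib

def benchmark(name, compress, data):
    start = time.perf_counter()
    packed = compress(data)
    print("%-6s ratio=%.3f  time=%.2fs" % (name, len(packed) / len(data), time.perf_counter() - start))

data = open("corpus.txt", "rb").read()   # placeholder: any reasonably large file

benchmark("gzip",  lambda d: zlib.compress(d, 9), data)        # fastest, weakest
benchmark("bzip2", lambda d: bz2.compress(d, 9), data)         # middle ground
benchmark("lzma",  lambda d: lzma.compress(d, preset=9), data) # slowest, smallest, memory-hungry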
Re:duh (Score:5, Informative)
Here's a scatterplot [theknack.net] of resulting file sizes and compression times from the text compression data (lower is better), and as my luck would have it, bzip2 is really the only one that's out of line - i.e. the furthest from the pareto frontier [wikipedia.org]. But then, looking at the same data with file sizes plotted in the range of [0.0, 1.0] [theknack.net], it seems like there's a major case of diminishing returns for the expensive algorithms anyways. If you care at all about compression time, good ol' gzip is still a pretty decent choice!
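If anyone wants to redo the check with their own numbers, here's a rough sketch of the Pareto test in Python. The figures below are made up for illustration, not the article's data:

def pareto_frontier(points):
    # keep points not dominated by any other point that is <= in both axes
    frontier = []
    for p in points:
        if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points):
            frontier.append(p)
    return frontier

results = {                 # (compression time in s, compressed size in MB) -- illustrative only
    "gzip":  (1.2, 310),
    "bzip2": (20.0, 290),
    "7-zip": (14.0, 262),
    "WinRK": (55.0, 250),
}
best = pareto_frontier(list(results.values()))
for name, point in results.items():
    print(name, point, "on frontier" if point in best else "dominated")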
Re:duh (Score:4, Informative)
Re: (Score:2)
Re:duh (Score:4, Informative)
Re:duh (Score:5, Funny)
You compressed the article into that statement. How long did it take to write the comment?
Re: (Score:2, Insightful)
Not really (Score:4, Insightful)
Re: (Score:3, Informative)
What's the point? Same programs compressing same data on a different computer.
I use gzip for big files (takes less time)
I use bzip2 for small files (compresses better)
I use zip to send data to Windows people
I really, really miss ARJ32. It was my favorite back in my DOS days.
Re: (Score:2, Insightful)
Nowadays it's all RAR for Usenet and torrents and such. RAR is really great, but it's piss slow compressing anything. It's just so easy to make multipart archives with it.
I really wish Stuffit would go away
Re:duh (Score:5, Interesting)
Missing part 3 of 10? No problem!
Of course, I'm a holder of a license for Rar from way back when. I like it.
Re: (Score:2)
I take it you didn't look at the "Compression Efficiency" graph at the bottom of each page.
Of course they don't seem to reveal their methodology for calculating that graph, but even a glance at the other tables will show that, for example, Stuffit is almost always much faster and saves very nearly as much space as 7-Zip (sometimes more). That's why comparisons like this are interesting.
Re:duh (Score:5, Funny)
Re: (Score:3, Funny)
Server Too Busy
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Web.HttpException: Server Too Busy
Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
And slashdotting == no compression at all (Score:2)
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Web.HttpException: Server Too Busy
Re: (Score:3, Informative)
I wouldn't be surprised if many of the other compression tools have similar options.
--Pat
Re: (Score:3, Insightful)
What are they actually measuring? (Score:3, Interesting)
Having said that, do I really care in practice whether algorithm A is 5% faster than algorithm B? I personally do not; I care whether the person receiving them can
WOW! (Score:5, Funny)
Re: (Score:2)
If you really mean quality (as opposed to compression ratio) you've got it backwards. Lossless compression algorithms are generally simpler than lossy ones, especially on the encode side. Lossy algorithms have to do a lot of additional work converting signals to the frequency domain and applying c
Screw speed, size reduction: gimme compatibility (Score:5, Insightful)
Re:Screw speed, size reduction: gimme compatibilit (Score:5, Insightful)
ZIP for cross-platform compatibility (and for simplicity for less technically-minded users).
RAR for everything else (at 3rd in their "efficiency" list, it's easy to see why it's so popular, not to mention ease of use for splitting archives, etc).
Re: (Score:3, Informative)
rar? (Score:2)
Re:rar? (Score:5, Funny)
Here on Non-Windows planet (Score:4, Funny)
Things are less blue.
(And I'm not speaking of the sky)
UHA (Score:3, Insightful)
You can keep Rar and zip and toss out the others, but the UHA extension (or a dummy extension) will probably exist on your computer at some point in time.
Re: (Score:3, Insightful)
Re: (Score:2)
How many systems don't have any form of cat?
Re: (Score:3, Informative)
Depends on the application (Score:2)
I have to agree that for most people (myself included), compatibility is all that matters. I'm so glad Macs now can natively zip. But there are valid reasons to want compression over compatibility.
You might want an interface. (Score:2)
All I want is compatibility with other OSs (i.e., fewest things that have to be installed on a base OS to use it). For that, I'd have to say Zip and/or gzip wins.
Sure, but there's also the issue of finding the files you really want to share, and there KDE has very nice front ends. There's a nice find in Konqueror, with switches for everything including click and drool regular expressions. Krename copies or links files with excellent renaming. Finally, Konqueror has an archive button. The slick interf
Re:Screw speed, size reduction: gimme compatibilit (Score:2, Interesting)
I have to admit I switched over/back to ZIP about a year ago for everything for exactly this reason. Yeah, it meant a lot of my old archives increased in size (sometimes by quite a bit), but knowing that anything anywhere can read the archive makes up for it. ZIP creation and decoding is supported natively by Mac a
Re: (Score:2)
How about (Score:5, Funny)
Agreed completely. (Score:5, Interesting)
Getting stuff out of some of those formats now is a real irritation. I haven't run into a case yet that's been totally impossible, but sometimes it's taken a while, or turned out to be a total waste of time once I've gotten the archive open.
Now, I try to always put a copy of the decompressor for whatever format I use (generally just tar + gzip) onto the archive media, in source form. The entire source for gzip is under 1MB, trivial by today's standards, and if you really wanted to cut size and only put the source for deflate on there, it's only 32KB.
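Rough Python sketch of the habit, with placeholder paths:

import shutil
import tarfile

ARCHIVE_DIR = "/mnt/archive_media"      # placeholder mount point

# create the archive itself (tar + gzip)
with tarfile.open(ARCHIVE_DIR + "/photos-2007.tar.gz", "w:gz") as tar:
    tar.add("photos-2007")

# drop the decompressor's source next to it, so the data stays recoverable
# even if the format's tooling disappears someday
shutil.copy("gzip-1.3.12.tar", ARCHIVE_DIR)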
It may sound tinfoil-hat, but you can't guarantee what the computer field is going to look like in a few decades. I had self-expanding archives, made using Compact Pro on a 68k Mac, thinking they'd make the files easy to recover later; that hasn't helped me at all now -- a modern (Intel) Mac won't touch them (although to be fair a PPC Mac will run OS 9, which will, and allegedly there's a Linux utility that will unpack Compact Pro archives, although maybe not self-expanding ones).
Given the rate at which bandwidth and storage space are expanding, I think the market for closed-source, proprietary data compression schemes should be very limited; there's really no good reason to use them for anything that you're storing for an unknown amount of time. You don't have to be a believer in the "infocalypse" to realize that operating systems and entire computing-machine architectures change over time, and what's ubiquitous today may be unheard of in a decade or more.
Re: (Score:2, Insightful)
Don't forget not to go too far (Score:4, Funny)
I keep it simple (Score:5, Funny)
Re:L-Zip (Score:2, Funny)
It was so good at what it did that I bet Microsoft bought them out and is going to incorporate the technology into Windows.
Skip the blogspam (Score:5, Informative)
as it's slashdotted
this site
http://www.maximumcompression.com/ [maximumcompression.com]
has been up for years and tests all the compressors with various input sources; it's much more comprehensive
Re: (Score:2)
http://www.techarp.com.nyud.net:8090/showarticle.
Re: (Score:2)
And of course, there are other factors that these types of comparisons rarely mention or that are harder to quantify: memory footprint, compression speed while multitasking, both foreground and background, single and dual core, OS/GUI integration, cross-platform availability, availability of source code, cost (particularly for enterprise users), backup options (how quie
Re: (Score:3, Insightful)
See this page? http://www.maximumcompression.com/data/summary_mf.php [maximumcompression.com]
What are the headers along the top? Let's see...
Pos, Program, Switches used, TAR, Compressed, Compression, Comp time, Decomp time, Efficiency
OMG! Is that a "time", as in speed, column I see there?
Re: (Score:3, Interesting)
If you have a hard limit, like a single CD or DVD, then the extra time is worth it. Otherwis
Re: (Score:2)
How quick does it compress when slashdotted? (Score:2)
What about LHA, TAR (Score:2, Insightful)
Re:What about LHA, TAR (Score:5, Informative)
Re:What about LHA, TAR (Score:4, Funny)
Interesting, needs better graphs (Score:5, Informative)
I read this earlier today through the firehose. It was interesting, but the graphs are what struck me. It seems to me all the graphs should have been XY plots instead of pairs of histograms. That way you could easily see the relationship between compression ratio and time taken. Their "metric" for showing this, basically multiplying the two numbers, is pretty bogus and isn't nearly as easy to compare. With the XY plot the four corners are all very meaningful. One is slow with no compression, one each good compression/time, and the sweet spot of good compression and good time. It's easy to tell those on two opposing corners apart (good compression vs good time), where as with the article's metric they could look very similar.
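Something like the following would do it (matplotlib; the numbers are made up, not the article's):

import matplotlib.pyplot as plt

tools = {                   # (compression time in s, compressed/original ratio) -- illustrative only
    "gzip":  (1.2, 0.42),
    "bzip2": (9.5, 0.38),
    "7-zip": (14.0, 0.33),
    "WinRK": (55.0, 0.31),
}
for name, (t, ratio) in tools.items():
    plt.scatter(t, ratio)
    plt.annotate(name, (t, ratio))
plt.xlabel("compression time (s)")
plt.ylabel("compression ratio (lower is better)")
plt.show()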
Still, interesting to see. The popular formats are VERY well established at this point (ZIP on Windows and Mac (StuffIt seems to be fading fast), and gzip and bzip2 on Linux). They are so common (especially with ZIP support built into Windows since XP and also built into OS X) I don't think we'll see them replaced any time soon. Of course, with CPU power getting cheaper and cheaper we are seeing formats that are more and more compressed (MP3, H.264, DivX, JPEG, etc.), so these utilities are becoming less and less necessary. I no longer need to stuff files onto floppies (I've got the net, DVD-Rs, and flash drives). Heck, if you look at some of the filesets they "compressed" (at like 4% max), you might as well just use TAR.
Re: (Score:2)
And I still zip up multiple files for sending over the internets.
Re: (Score:2)
Of course, with CPU power getting cheaper and cheaper we are seeing formats that are more and more compressed (MP3, H.264, DivX, JPEG, etc.), so these utilities are becoming less and less necessary.
You do realize that you're talking about two different things when you compare something like .zip with something like .mp3? The more and more compressed formats you speak of only work well because they're built for specific applications, and they're lossy to boot; the typical compression tools are lossless and work on any data set.
I don't think common compression libraries/utilities will ever fade, where there's a data set, there's always a need to get it just a little smaller....
Re: (Score:3, Informative)
Re: (Score:2)
no best compression results? (Score:2)
Poor article. (Score:5, Insightful)
Versions of the programs aren't given, nor the compile-time options (for the open source ones).
Finally, Windows Vista isn't a suitable platform for conducting the tests. Most of these tools target WinXP in their current versions and changes to Vista introduced systematic differences in very basic things like memory usage, file I/O properties, etc.
The idea of the article is fine, it's just that the analysis is half-baked.
Re:Poor article. (Score:5, Insightful)
They also focused on compression rate when I believe they should have focused on decompression rate. I'll probably only archive something once, but I may read from the archive dozens of times. What matters to me is the trade-off between space saved and extra time taken to read the data, not the one-off cost of compressing it.
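Quick way to see the asymmetry for yourself; a Python sketch (the filename is a placeholder):

import lzma
import time

data = open("corpus.txt", "rb").read()

t0 = time.perf_counter()
packed = lzma.compress(data, preset=9)
t1 = time.perf_counter()
lzma.decompress(packed)
t2 = time.perf_counter()

print("compress:   %.2fs (paid once)" % (t1 - t0))
print("decompress: %.2fs (paid on every read)" % (t2 - t1))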
Re: (Score:2)
No, they didn't. They really should have tested that. Personally, I like 7-zip's compressed filesystem better than WinZip's, but I haven't really tried any of the others.
Hold on...I've just been handed a note. Apparently you don't get to make any real choices in that area - it's zip or nothing. Further, the details of compressing and decompressing is handled whenever the filesystem feels like it, so it can't really be judged against traditional programs. So I gu
What's the point of compressing JPEG,MP3,DivX etc (Score:5, Insightful)
What's the point of compressing JPEG, MP3, DivX, etc. since they already do the compression? The streams are close to random (with maximum information density), and all you could compress would be the headers between blocks in movies or the ID3 tag in MP3s.
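Easy to verify with a quick Python sketch (filenames are placeholders):

import zlib

for path in ("song.mp3", "notes.txt"):
    data = open(path, "rb").read()
    packed = zlib.compress(data, 9)
    print("%s: %.1f%% of original size" % (path, 100.0 * len(packed) / len(data)))
# typical outcome: the text shrinks to a fraction of its size, while the MP3
# stays at roughly 99-100% -- only headers and tags compress at all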
Re:What's the point of compressing JPEG,MP3,DivX e (Score:2)
Re:What's the point of compressing JPEG,MP3,DivX e (Score:5, Interesting)
Re: (Score:2)
Re: (Score:3, Insightful)
Still, in both cases, it works; who can argue with that.
Re:What's the point of compressing JPEG,MP3,DivX e (Score:5, Interesting)
Both methods do the same thing: they statistically analyse all the data, then re-encode it so the most common values are encoded in a smaller way than the least common values.
Huffman's main limitation is that each value compressed needs to consume at least one bit. Arithmetic coding can fit several values into a single bit. Thus, arithmetic coding is always better than Huffman, as it goes beyond Huffman's self-imposed barrier.
However, Huffman is NOT patented, while most forms of arithmetic coding, including the one used in the JPEG standard, ARE patented. The authors of Stuffit did nothing special - they just paid the patent fee. Now they just unpack the Huffman-encoded JPEG data and re-encode it with arithmetic coding. If you take some JPEGs that are already compressed with arithmetic coding, Stuffit can do nothing to make them better. But 99.9% of JPEGs are Huffman coded, because it would be extortionately expensive for, say, a digital camera manufacturer, to get a JPEG arithmetic coding patent license.
So Stuffit doesn't have remarkable code, they just paid money to get better compression that 99.9% of people specifically avoid because they don't think it's worth the money.
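For the curious, here's a toy Huffman coder in Python (nothing to do with Stuffit's actual code) that shows the one-bit-per-symbol floor mentioned above:

import heapq
from collections import Counter

def huffman_code(data):
    # each heap entry is [weight, tiebreak, symbol, symbol, ...]
    heap = [[freq, i, sym] for i, (sym, freq) in enumerate(Counter(data).items())]
    heapq.heapify(heap)
    codes = {entry[2]: "" for entry in heap}
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        for sym in lo[2:]:
            codes[sym] = "0" + codes[sym]   # prepend a bit for every merge
        for sym in hi[2:]:
            codes[sym] = "1" + codes[sym]
        heapq.heappush(heap, [lo[0] + hi[0], lo[1]] + lo[2:] + hi[2:])
    return codes

print(huffman_code("aaaaaaaaab"))
# even 'a' (90% of the input) still costs a full bit per occurrence;
# an arithmetic coder could spend about 0.15 bits on each 'a'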
Re: (Score:3, Informative)
With arithmetic encoding, however, you'd encode
Re:What's the point of compressing JPEG,MP3,DivX e (Score:4, Insightful)
Default options and stuffit (Score:3, Informative)
I imagine some other codecs also have similar options for specific file types.
Hmm... (Score:2)
english language is mostly fluff (Score:4, Funny)
I wonder how other languages compare, and whether there is a way to communicate much more efficiently.
Re: (Score:2)
An anecdotal observation: my wife is better than I am at linguistic tasks, b
Pizzachish: setting a new standard in languages (Score:2, Interesting)
I have been thinking about creating a new language with about 60 or so words. The idea is that you don't need a lot of words when you can figure out the meaning by context. Strong points are that the language would be very easy to pick up, and you would get that invigorating feeling of talking like a primitive cave man.
As an example of the concept, we have the words walk and run. They are a bit too similar to be worth wasting one of our precious few 60 words. Effectively, one could be dropped with have
Re: (Score:2)
"warm warm" = "hot"
"walk walk" = "walk fast/run"
It could greatly reduce the number of adjectives and verbs(and other stuff) you need in the language.
Re: (Score:2)
Re: (Score:3, Interesting)
you might be interested in this:
http://www.tokipona.org/ [tokipona.org]
Re: (Score:2)
Re: (Score:3, Funny)
7zip (Score:5, Insightful)
weak on things it makes little sense to zip anyway, like WAV files (use FLAC), MP3s, JPEGs and DivX movies.
7-Zip does quite well on documents (2nd) and e-books (2nd), 3rd on MPEG video, 2nd on PSD.
Also, I expect 7-Zip to improve at higher-end compression settings; when possible I give it hundreds of megs, and unlike commercial apps 7-Zip can be configured well into the "insane" range.
Doesn't really matter (Score:3, Informative)
Yes, having a good compression algorithm is nice, but unless you can get it to partially supplant zip, you'll never make much money off it. Also, most things these days don't need to be compressed. Video and audio are already encoded with lossy compression, web pages are so full of crap that compressing them is pointless, and hard drives are big enough. That said, I haven't seen any research lately about whether compression is useful for entire filesystems to reduce the bottleneck from hard drives. Still, I suspect it is not worth the effort.
Backups (Score:2)
As for compressing whole fil
Coralized and Hutter Prized (Score:2)
Meanwhile, I noticed they didn't include the latest winner of the Hutter Prize [slashdot.org], which is unfortunate since its latest entry looks like it will come in at nearly a 10% improvement over all prior text compressors using novel semantic modeling techniques [google.com].
Mirrors! (Score:2)
Archive Comparison Test (Score:5, Insightful)
See also: the Archive Comparison Test [compression.ca]. Covers 162 different archivers over a bunch of different file types.
It hasn't been updated in a while (5 years), but have the algorithms in popular use changed much? I remember caring about compression algorithms when I was downloading stuff from BBSs at 2400 baud, or trading software with friends on 3.5" floppies. But in these days of broadband, cheap writable CDs, and USB storage, does anyone care about squeezing the last few bytes out of an archive? zip/gzip/bzip2 are good enough for most people for most uses.
Exhaustive?! (Score:5, Informative)
I did some comparisons last year and found 7-zip to do the best job for what I needed (great compression ratio without requiring days to complete). The article also doesn't take into account the network speed at which the file is going to be transmitted. I use 7-zip for pushing application updates and such to remote offices (most over 384k/768k WAN links). Compressing with 7-zip has saved users quite a bit of time compared to WinRAR or WinZip.
I would definitely recommend checking out maximumcompression.com (As others have, as well) over this article. It goes into a lot greater detail.
poor sample data choices (Score:2, Redundant)
It's actually pretty sensible (Score:2)
It makes a lot of sense, considering how my eyelids feel after reading what the article is about.
Didn't have Tridge's rzip... (Score:3, Interesting)
http://samba.org/junkcode/ [samba.org]
Tridge is one of the smart guys behind samba. And rzip is pretty clever for certain things. Just ask google.
how about non-windows platforms anyone? (Score:4, Insightful)
The article conveniently forgets to mention whether the compression tools are cross-platform (OS X, Linux, BSD) and/or open source or not.
That makes a lot of them utterly useless for lots of people. Yet another Windows-focused review, bah.
There Are Only A Few Really Useful Algorithms (Score:3, Informative)
With the exception of compressors that use arithmetic coding (which has patents out the wazoo covering just about every form of it), virtually all compressors use some form of Huffman compression. In addition, many use some form of LZW compression before executing the Huffman compression. That is pretty much it for general purpose compression.
Of course, if you know the nature of the data you are compressing you can come up with a much better compression scheme.
For instance, with XML, if you have a schema handy, you can do some really heavy optimization since the receiving side of the data probably already has the schema handy which means you don't need to bother sending some sort of compression table for the tags, attributes, element names, etc.
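Toy sketch of the idea in Python; the tag list here stands in for what a real schema would provide:

TAGS = ["order", "customer", "item", "quantity", "price"]   # assumed known to both sides
TAG_INDEX = {name: i for i, name in enumerate(TAGS)}

def encode_tag(name):
    # one byte on the wire instead of the full "<customer>" string
    return bytes([TAG_INDEX[name]])

def decode_tag(byte):
    return TAGS[byte[0]]

print(encode_tag("customer"))            # b'\x01'
print(decode_tag(encode_tag("price")))   # price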
Likewise, with FAX machines, run length encoding is used heavily because of all the sequential white space that is indicative of most fax documents. Run length encoding of white space can also be useful in XML documents that are pretty printed.
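A toy run-length encoder for space runs, in Python (assumes the text contains no NUL bytes):

import re

def rle_spaces(text):
    # replace each run of 4..255 spaces with a NUL marker plus the run length
    return re.sub(r" {4,255}", lambda m: "\x00" + chr(len(m.group())), text)

def unrle_spaces(text):
    return re.sub(r"\x00(.)", lambda m: " " * ord(m.group(1)), text, flags=re.S)

xml = "<order>\n        <item>\n                <price>9.99</price>\n"
packed = rle_spaces(xml)
assert unrle_spaces(packed) == xml
print(len(xml), "->", len(packed))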
Most compression algorithms that are very expensive to compress are usually pretty cheap to decompress. If you are providing a file for millions of people to download, it doesn't matter if it takes 5 days to compress the file if it still only takes 30 seconds for a user to decompress it. However, when doing peer to peer communication with rapidly generated data, you need the compression to be fast if you use any at all.
Nevertheless, most general purpose lossless compression formats are more or less clones of each other once you get down to analyzing what algorithms they use and how they are used.
Re:There Are Only A Few Really Useful Algorithms (Score:4, Informative)
Fax machines use a static Huffman encoding. They've never used run-length encoding. Run-length encoding is nothing compared to how efficiently LZ77 or LZ78 would handle pretty-printed XML.
Compression algorithms vary on both their compression and decompression speed. LZ77 is slow to compress and fast to decompress. Arithmetic coding and PPM are slow both compressing and decompressing.
SMP hardware? (Score:3, Insightful)
Of all the main compression utils I use, 7-zip, RAR and bzip2 (in the form of pbzip2) all have modes that will utilise multiple chips, often giving a pretty huge speedup in compression times. I'm not aware of any SMP branches for gzip/zlib, but seeing as it appears to be the most efficient compressor by miles, it might not even need it.
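A pbzip2-style sketch in Python (placeholder paths; the chunk size is arbitrary):

import bz2
from concurrent.futures import ProcessPoolExecutor

CHUNK = 8 * 1024 * 1024   # 8 MB per worker

def parallel_bzip2(src, dst):
    # bzip2 streams concatenate cleanly, so compressing chunks independently
    # on separate cores still yields a file that bunzip2 can read
    with open(src, "rb") as f, open(dst, "wb") as out, ProcessPoolExecutor() as pool:
        chunks = iter(lambda: f.read(CHUNK), b"")
        for packed in pool.map(bz2.compress, chunks):
            out.write(packed)

if __name__ == "__main__":
    parallel_bzip2("backup.tar", "backup.tar.bz2")   # placeholder paths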
It's mainly academic for me now though anyway, since almost all of the compression I use is inline anyway, either through rsync or SSH (or both). Not sure if any inline compressors are using LZMA yet, but the only time I find myself making an archive is for emailing someone with file size limits on their mail server. All of the stuff I have at home is stored uncompressed because a) 90% of it is already highly compressed and b) I'd rather buy slightly bigger hard drives than attempt to recover a corrupted archive a year or so down the line. Mostly I'm just concerned about decompression time these days.
Re:/. effect rears its ugly head once again! (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
That's a bit misleading. For example, PKzip may not be multi-platform, but there are good native Zip compression and decompression programs available for every major platform.
Re: (Score:2, Informative)
> Then you will have exhausted a little more of the compression programs available.
You are aware that all the tools tested are general purpose compressors, and FLAC is not, aren't you?
Otherwise, you would also have to talk about WavPack, Monkey's Audio, Shorten and others.
And those are only the lossless audio codecs. What about lossy codecs?
What about all those different formats for pictures? They compress data as well.
And what abou
Re:This is nothing new (Score:4, Funny)