
Google Publishes Zopfli, an Open-Source Compression Library

alphadogg writes "Google is open-sourcing a new general-purpose data compression library called Zopfli that can be used to speed up Web downloads. The Zopfli Compression Algorithm, which got its name from a Swiss bread recipe, is an implementation of the Deflate compression algorithm that creates a smaller output size compared to previous techniques, wrote Lode Vandevenne, a software engineer with Google's Compression Team, on the Google Open Source Blog on Thursday. 'The smaller compressed size allows for better space utilization, faster data transmission, and lower Web page load latencies. Furthermore, the smaller compressed size has additional benefits in mobile use, such as lower data transfer fees and reduced battery use,' Vandevenne wrote. The more exhaustive compression techniques achieve higher data density, but also make compression a lot slower. This does not affect decompression speed, though, Vandevenne wrote."
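
The compatibility claim above is easy to check: Zopfli emits ordinary Deflate/zlib streams, so the standard zlib decompressor reads them unchanged. A minimal sketch, assuming the third-party zopfli Python binding (pip install zopfli); the binding's module layout is an assumption, the stdlib side is not:

    import zlib
    import zopfli.zlib  # third-party binding; module path assumed

    data = open("page.html", "rb").read()  # placeholder input file

    baseline = zlib.compress(data, 9)    # zlib's best standard effort
    dense = zopfli.zlib.compress(data)   # much slower, usually smaller

    # The point of the announcement: plain zlib reads Zopfli output.
    assert zlib.decompress(dense) == data
    print(len(baseline), "bytes with zlib -9 vs", len(dense), "with Zopfli")
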
  • Overhyped (Score:4, Informative)

    by Anonymous Coward on Friday March 01, 2013 @05:00PM (#43049073)

    This team is clearly just trying to make a name for themselves. It improves over gzip by a mere 3% or so, but takes an order of magnitude longer to compress.

    Their underlying implementation might be cool research, but its practical merit is virtually nil.

    Now, cue the people who are going to do some basic arithmetic to "prove" me wrong, yet who probably don't even bother using gzip content-encoding on their website right now, anyhow.

  • Re:Overhyped (Score:5, Informative)

    by sideslash ( 1865434 ) on Friday March 01, 2013 @05:18PM (#43049245)

    It improves over gzip by a mere 3% or so, but takes an order of magnitude longer to compress [...] its practical merit is virtually nil.

    Maybe it's useless to you as a developer(?), and to most people. However, you benefit from this kind of technology all the time. Compare it to video encoding, where powerful machines spend a heck of a lot of time and CPU power to gain an extra few percent of compression, to save bandwidth and give you a smooth viewing experience.

    This tool could have many useful applications for any kind of static content that is frequently served, including web services, as well as embedded content in mobile games and other apps. Every little bit of space savings helps (as long as decompression isn't proportionally slower, and the article says decompression speed stays comparable).
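
    A minimal sketch of that compress-once, serve-many-times pattern, assuming the third-party zopfli Python binding and a server that prefers precompressed .gz files (e.g. nginx's gzip_static); both are assumptions, not details from the article:

        import pathlib
        import zopfli.gzip  # third-party binding; module path assumed

        # Precompress static assets at deploy time: the slow compressor
        # runs once per file, while every later download benefits.
        for path in pathlib.Path("static").rglob("*"):
            if path.is_file() and path.suffix in {".html", ".css", ".js", ".svg"}:
                gz = zopfli.gzip.compress(path.read_bytes())
                path.with_name(path.name + ".gz").write_bytes(gz)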

  • Re:Overhyped (Score:5, Informative)

    by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Friday March 01, 2013 @05:25PM (#43049327)

    One example that comes to mind: Android APKs use the zip format.
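
    Since zip entries are themselves Deflate streams, a Deflate-compatible recompressor can shrink an APK without breaking readers. A hedged sketch using advzip from AdvanceCOMP, whose highest level is Zopfli-based; the tool, its flags, and the file name are assumptions, not from the thread:

        import subprocess

        # -4 is advzip's slowest/smallest setting; the result is still a
        # valid zip that any stock unzip implementation can read.
        subprocess.run(["advzip", "--recompress", "-4", "app.apk"], check=True)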

  • Re:Overhyped (Score:5, Informative)

    by K. S. Kyosuke ( 729550 ) on Friday March 01, 2013 @05:57PM (#43049619)

    In addition to all the other explanations of how you missed the point, Deflate is also used in PNG. This will allow you to make smaller PNG files, too, which can be quite a significant part of your bandwidth.

    Well, if you're Google and you detect Chrome on the client side, it might be even better to serve a WebP version instead. Out of a random sample of 1,000 PNG files, a lossless WebP version was at least 20% smaller in more than 50% of cases (link [google.com]).
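
    A sketch of the comparison being drawn, assuming the zopflipng and cwebp command-line tools are installed and on PATH; file names are placeholders:

        import os
        import subprocess

        src = "logo.png"
        # Still a PNG, readable by every browser: just a denser Deflate stream.
        subprocess.run(["zopflipng", src, "logo.zopfli.png"], check=True)
        # Lossless WebP: often smaller still, but needs client-side support.
        subprocess.run(["cwebp", "-lossless", src, "-o", "logo.webp"], check=True)

        for f in (src, "logo.zopfli.png", "logo.webp"):
            print(f, os.path.getsize(f), "bytes")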

  • Re:Overhyped (Score:3, Informative)

    by Anonymous Coward on Friday March 01, 2013 @06:26PM (#43049879)

    But the decompressors for those algorithms are not available in most web browsers, making them totally unusable for the stated use case.

    But hey, why read the article when you can whine about it blindly on /.?

  • Re:Overhyped (Score:4, Informative)

    by SuricouRaven ( 1897204 ) on Friday March 01, 2013 @06:54PM (#43050231)

    There are tricks to h264 encoding that squeeze out a bit more. You can improve the motion estimation by just throwing power at it, though the gains are asymptotic. Or increase the reference-frame limit - that does great things on animation, if you don't mind losing profile compliance. Things like that. Changing the source is also often of great benefit: if it's a noisy image, a bit of noise-removal filtering before compression can not only improve subjective quality but also allow much more efficient compression. Interlaced footage can be converted to progressive, and bad frame-rate conversions undone - progressive video just compresses better. It's something of a hobby of mine (see the sketch after this comment).

    I wrote a guide on the subject: http://birds-are-nice.me/publications/Optimising%20x264%20encodes.htm [birds-are-nice.me]

    You're right about Zopfli, though: it changes nothing for h264.
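
    For illustration, the tricks described above as a single ffmpeg/libx264 invocation - deinterlace to progressive, denoise before encoding, then spend extra CPU on motion estimation and reference frames. The flag values are illustrative guesses, not tuned recommendations from the linked guide:

        import subprocess

        subprocess.run([
            "ffmpeg", "-i", "noisy_interlaced.mkv",  # placeholder input
            "-vf", "yadif,hqdn3d",            # deinterlace, then mild denoise
            "-c:v", "libx264", "-preset", "veryslow",
            "-x264-params", "ref=8:me=umh",   # more refs, stronger motion search
            "out.mkv",
        ], check=True)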
