Google Upgrades WebP To Challenge PNG Image Format

New submitter more writes with news that Google has added to its WebP image format the ability to losslessly compress images, and to do so with a substantial reduction in file size compared to the PNG format. Quoting: "Our main focus for lossless mode has been in compression density and simplicity in decoding. On average, we get a 45% reduction in size when starting with PNGs found on the web, and a 28% reduction in size compared to PNGs that are re-compressed with pngcrush and pngout. Smaller images on the page mean faster page loads."
  • NIH (Score:3, Insightful)

    by Anonymous Coward on Friday November 18, 2011 @01:19PM (#38100222)

    Why not update the png format? See subject.

    • Re:NIH (Score:5, Insightful)

      by retech ( 1228598 ) on Friday November 18, 2011 @01:26PM (#38100298)
      Because that requires a committee and would take 10x as long, if ever, to get done.
      • by sstamps ( 39313 )

It shouldn't. The PNG format was set up to be extensible in exactly this way.

        If it truly is a significant innovation, it should sail through the standards approval process as a recognized extension.

        • Re:NIH (Score:4, Insightful)

          by Anonymous Coward on Friday November 18, 2011 @01:42PM (#38100560)

          TIFF exists. The world doesn't need another file format where most clients don't implement the full standard and the user can never expect a file in that format to be reliably readable everywhere.

        • Re: (Score:3, Insightful)

          by Anonymous Coward

          If it truly is a significant innovation, it should sail through the standards approval process

          Hahahahahahahahahahahahahahahahaaaaaaa

          Wow, you've never actually dealt with a standards body before, have you?

        • Re:NIH (Score:5, Insightful)

          by Anonymous Coward on Friday November 18, 2011 @01:44PM (#38100596)

But extensions are good for adding information, not removing it. You could probably implement whatever compression enhancements Google made to WebP in PNG through extensions, but probably not in a way that makes old versions of libpng still produce usable results while still having a reduced file size. At which point it doesn't really matter whether you add it to WebP or PNG; the backward-compatibility benefit of PNG extensions can't be exploited either way.

        • Re:NIH (Score:5, Insightful)

          by Kjella ( 173770 ) on Friday November 18, 2011 @01:44PM (#38100600) Homepage

          If it truly is a significant innovation, it should sail through the standards approval process as a recognized extension.

Which is not actually that helpful, because then you have tons of PNG-capable applications that can't read PNGs. TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed. If you have a significant new non-backwards-compatible format, just releasing it as a new format may be just as easy.

          • Re:NIH (Score:5, Interesting)

            by 0123456 ( 636235 ) on Friday November 18, 2011 @02:05PM (#38100832)

            Which is not actually that helpful, because then you have tons of PNG-capable applications that can't read PNGs. TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed.

            Only ten different ways? Back in the early 90s I was creating TIFF files that I doubt anyone can display these days; we had our own TIFF tags assigned and could compress files however we wanted to.

            This is why TIFF was:

            1. Very useful for app developers.
            2. A total disaster for interoperability.

            • by TWX ( 665546 )

Was interoperability a real goal of the complete standards set, though?

I can see that for some anti-tampering features in support files for software user interfaces, using a quasi-standard format that makes development less expensive but makes it harder for users to screw things up would actually be a good thing in some situations.

          • Re:NIH (Score:4, Interesting)

            by pclminion ( 145572 ) on Friday November 18, 2011 @06:42PM (#38104228)

            TIFF used to be this way, where TIFF actually means it can be compressed like ten different ways and support was very mixed.

TIFFs still are this way; it just seems like everything is "all better" because people throw up their hands and use libtiff, which actually handles a large fraction of the weird shit that's out there. If there weren't a peculiar group of masochists who decided this was something worth tackling, things would seem quite different. If you wanted to sit down and write a TIFF library yourself that was actually capable of loading a majority of the TIFF files out there, you'd spend years doing it.

            If you don't believe me, look inside libtiff. "Initialize Thunderscan! To the turrets!"

        • Re:NIH (Score:5, Informative)

          by petermgreen ( 876956 ) <plugwash.p10link@net> on Friday November 18, 2011 @02:02PM (#38100794) Homepage

One of the key design features of PNG was that any PNG should be able to be read by any decoder. That is why PNG has relatively few options for how the core data is encoded.*

Adding optional stuff is OK (unless it's animation......), but if you want to make a key change to the core of the format, I suspect the PNG guys would tell you to go make your own format based on PNG but with its own specification, file extension and "magic number" (as was done for MNG and JNG).

* A handful of filter types, all of which are easy to implement; one compression algorithm; one byte-order standard; and 15 allowed color/bit-depth combinations (the majority of which represent very common combinations, and all of which can be easily mapped to 24-bit RGB).
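
          A quick illustration of the parent's point, since PNG's extensibility gets discussed a lot: a PNG file is a fixed 8-byte signature (the "magic number") followed by self-describing chunks, and the case of the first letter of a chunk's type tells a decoder whether it can safely skip a chunk it doesn't understand. A minimal sketch in Python (the filename is a placeholder):

              import struct

              PNG_SIGNATURE = b'\x89PNG\r\n\x1a\n'  # the fixed 8-byte "magic number"

              def walk_chunks(path):
                  """Yield (type, length, critical) for each chunk in a PNG file."""
                  with open(path, 'rb') as f:
                      if f.read(8) != PNG_SIGNATURE:
                          raise ValueError('not a PNG file')
                      while True:
                          header = f.read(8)
                          if len(header) < 8:
                              break
                          length, ctype = struct.unpack('>I4s', header)
                          f.seek(length + 4, 1)  # skip the chunk data and CRC
                          name = ctype.decode('ascii')
                          # Uppercase first letter = critical (decoder must understand it);
                          # lowercase = ancillary (decoder may safely ignore it).
                          yield name, length, name[0].isupper()

              for name, length, critical in walk_chunks('example.png'):
                  print(name, length, 'critical' if critical else 'ancillary')

          This chunk discipline is exactly why optional metadata is easy to bolt on while a new core compression method is not: the meaning of the critical IDAT chunk is fixed, and no ancillary chunk can change how an old decoder inflates it.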

    • Re: (Score:3, Insightful)

      by Trillan ( 597339 )

      WebP lossy may not catch on, but it isn't pointless. Compared to JPEG, in return for a muddier image (to my eyes, at least) you get alpha support. As Google is one of the biggest distributors of images on the Internet, I think the real purpose is to pay less for licensing JPEG.

      WebP lossless seems much less useful to me. Unless there's licensing issues I'm not aware of, it seems pretty pointless.

      • Re:NIH (Score:5, Insightful)

        by BitZtream ( 692029 ) on Friday November 18, 2011 @01:36PM (#38100452)

        You don't have to pay for a JPEG license, try again.

      • Re:NIH (Score:5, Interesting)

        by Guspaz ( 556486 ) on Friday November 18, 2011 @02:03PM (#38100806)

JPEG XR produces images of similar quality to JPEG-2000 while having complexity similar to JPEG, supports transparency, and requires decoders to support lossless compression (unlike JPEG), since lossless is just a quantizer setting. And it's already supported by IE9.

That last bit is probably the most important part. IE's market share is shrinking, but it's still big enough that any format it doesn't support is unlikely to see widespread use as the only format available on a site. I doubt IE will ever support WebP, and as such, no website will ever really be able to use WebP. Not unless they do browser detection, and most sites won't bother with multiple image compression formats; they're going to pick the best common one they can, which is currently PNG or JPEG.

        Remember PNG alpha support... Until IE supported it, nobody really used it. Once IE did, it became mainstream.

        • most sites won't bother with multiple image compression formats,

          Really? I'd think sites would enjoy a 50% reduction in bandwidth in supported browsers, even if they don't get it for IE.

          • by Guspaz ( 556486 )

Big ones might. But a lot of people aren't bandwidth-limited, or won't think it's worth the effort of converting all their images, storing them in multiple formats, and then writing code to dynamically change which images are fed back based on the browser.

The savings for most sites would likely be minimal anyhow: images get cached. If you're a site like Slashdot, for example, loading the front page will get you the vast majority of images; loading subsequent pages will likely not have many images to download.

    • by sstamps ( 39313 )

      Yeah, I was thinking that it sounds like a good candidate for the long-awaited compression method 1. :P

    • by yog ( 19073 ) *

      Google has open-sourced the WebP code and utilities, so (I think) this format will not be encumbered by patents or licensing issues. That is a great contribution in itself. I continue to be amazed by Google and its ability to make money while giving stuff away.

      • Re: (Score:2, Insightful)

        by BitZtream ( 692029 )

        As opposed to PNG and JPEG which are both open and have no patent or license issues either?

        • by yog ( 19073 ) *

          Not as opposed to PNG and JPEG, simply as a (possibly superior) alternative. JPEG did have some patent issues which fortunately have been resolved (google "jpeg patent"). Probably every graphical format that is successful will attract the attention of lawyers of patent holders and patent trolls. Hopefully Google has thoroughly vetted this technology. Pretty pathetic the hoops one has to jump through these days to create something as abstract as a computer image standard.

      • Open source != patent free.

        I'm not saying that it is patent encumbered, but just pointing out the flaw in your assumption. So long as there is a guarantee that it will be free to use forever, I see no reason why modern browsers shouldn't implement it. What's the downside?

        • So long as there is a guarantee that it will be free to use forever, I see no reason why modern browsers shouldn't implement it. What's the downside?

          Extra code that has to be written, loaded, run, tested, and maintained. This leads to application size bloat, larger memory footprints, and more work for developers.

          It may be worth it, if the format is a significant advance. But it's not cost-free, to either the developers or the users.

Open source != free, for a reason...
If they license the patents involved together with the source code in a FOSS-compatible way, good.
If they don't try to pull an Android and divide users into the group with the latest and greatest implementation (Chrome users) and the others, great.
I shouldn't criticize them ahead of time, though, so let's see.

    • Re:NIH (Score:5, Insightful)

      by timeOday ( 582209 ) on Friday November 18, 2011 @01:53PM (#38100694)

      Why not update the png format?

      Recycling a name for a new incompatible format is a terrible idea. If I have a png image and software that supports pngs, I should be able to read that image, period.

    • PNG8 is here (Score:5, Informative)

      by porneL ( 674499 ) on Friday November 18, 2011 @03:42PM (#38102060) Homepage

I'm working on it -- or rather, squeezing every last drop out of the existing format.

      http://pngmini.com/vs-webp/ [pngmini.com]

      With good PNG8+alpha quantization you can get compression in the same league as WebP (although WebP is still better) and basically 100% browser support (it degrades well in IE6).

  • Awesome (Score:3, Insightful)

    by Anonymous Coward on Friday November 18, 2011 @01:26PM (#38100304)

    Another unsupported format from Google.

It's interesting how successful they are at dominating/directing so many areas of the Internet, yet they seem so ineffectual in other areas, like this and the video format they are trying to get the world to switch to.

    • Re:Awesome (Score:4, Interesting)

      by Xanny ( 2500844 ) on Friday November 18, 2011 @02:46PM (#38101324)
      They are converting all of youtube to WebM, and it is the only royalty free web video codec. I'm pretty sure they will beat h.264 in the long run because free wins in the end. The fact the encoding is behind the scenes doesn't matter. In a decade html5 video will be defined by webm because no one wants to license h.264 for encoding products.
      • Re:Awesome (Score:4, Informative)

        by PwnzerDragoon ( 2014464 ) on Friday November 18, 2011 @03:01PM (#38101534)

        and it is the only royalty free web video codec.

        Except Theora [theora.org]. Though from what I've seen WebM has the edge in video quality.

      • by westlake ( 615356 ) on Friday November 18, 2011 @05:07PM (#38103116)

        They are converting all of youtube to WebM, and it is the only royalty free web video codec. I'm pretty sure they will beat h.264 in the long run because free wins in the end.

        The key word here is "converting."

        H.264 is a core technology in digital video with 1,081 licensees. AVC/H.264 Licensees [mpegla.com]

        Studio production.

        Broadcast, cable and satellite distribution. Industrial applications. Home video.

        You can play Google's YouTube transcode in your browser. WebM may find an anchorage in video chat.

        But that is pretty much all you can do with WebM right now.

There is no such thing as amateur- or studio-grade WebM production hardware. No such thing as a WebM security camera.

Don't forget hardware-accelerated H.264 playback. Mobile devices in particular have media playback hardware of this sort.
  • by Jackie_Chan_Fan ( 730745 ) on Friday November 18, 2011 @01:28PM (#38100344)

    ... because Chrome is STILL NOT color managed.

    • by DarkXale ( 1771414 ) on Friday November 18, 2011 @02:02PM (#38100800)
      >"Last month we announced WebP support for animation, ICC profile, XMP metadata and tiling."
      I assume thats a 'yes'.
  • by Tastecicles ( 1153671 ) on Friday November 18, 2011 @01:36PM (#38100446)

    ...doesn't anyone think it might be time to revisit fractal image compression [ucsd.edu] and maybe look at ways of improving iterated function systems and their associated algorithms (I might give Mike Barnsley a call and ask him how his IFS patents are developing if you're nice and mod me up)?

    • by Trixter ( 9555 )

      ...doesn't anyone think it might be time to revisit fractal image compression [ucsd.edu] and maybe look at ways of improving iterated function systems and their associated algorithms?

      Considering that the best results were obtained using college grads as the compression engine [dogma.net], probably not.

    • by tepples ( 727027 )
Barnsley-style IFS compression is just vector quantization using a smaller version of the image itself as the codebook. I don't see what advantage it has over the more common DCT-based methods.
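
      For the curious, here is a toy sketch of what the parent describes: each "range" block of the image is coded as an index into a codebook built from downsampled blocks of the same image, plus a least-squares contrast/brightness pair, and decoding iterates the maps until they converge. Grayscale only, dimensions assumed to be multiples of 16, and nothing is tuned for quality or speed:

          import numpy as np

          B = 8  # range block size; domain blocks are 2B x 2B

          def codebook(img):
              """Downsample every non-overlapping 2Bx2B domain block to BxB."""
              h, w = img.shape
              return [img[y:y+2*B, x:x+2*B].reshape(B, 2, B, 2).mean(axis=(1, 3))
                      for y in range(0, h, 2*B) for x in range(0, w, 2*B)]

          def encode(img):
              """Code each BxB range block as (codebook index, contrast s, brightness o)."""
              book, code = codebook(img.astype(float)), []
              h, w = img.shape
              for y in range(0, h, B):
                  for x in range(0, w, B):
                      r = img[y:y+B, x:x+B].astype(float)
                      best = (np.inf, 0, 0.0, 0.0)
                      for i, d in enumerate(book):
                          var = ((d - d.mean()) ** 2).sum()
                          s = ((d - d.mean()) * (r - r.mean())).sum() / var if var else 0.0
                          s = float(np.clip(s, -1.0, 1.0))  # keep the map contractive
                          o = r.mean() - s * d.mean()
                          err = ((s * d + o - r) ** 2).sum()
                          if err < best[0]:
                              best = (err, i, s, o)
                      code.append(best[1:])
              return code

          def decode(code, h, w, iterations=10):
              """Start from any image and iterate; contractivity pulls it to the attractor."""
              img = np.full((h, w), 128.0)
              for _ in range(iterations):
                  book, out, k = codebook(img), np.empty((h, w)), 0
                  for y in range(0, h, B):
                      for x in range(0, w, B):
                          i, s, o = code[k]; k += 1
                          out[y:y+B, x:x+B] = s * book[i] + o
                  img = out
              return np.clip(img, 0, 255).astype(np.uint8)

      The twist versus ordinary vector quantization is that the codebook is never transmitted; the decoder rebuilds it from whatever its current image is on each iteration. The encoder-side search is what made the "college grads" crack above so apt.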
  • by larry bagina ( 561269 ) on Friday November 18, 2011 @01:46PM (#38100616) Journal
    block google analytics.
    • No kidding. I started putting those at the very bottom of the pages and that seemed to help somewhat. I'll take improvements where I can get them. :p
      • I started putting those at the very bottom of the pages...

That's where everyone else was already putting the Google Analytics code. That's where Google suggests you put it...

        It's a service like any other that Google offers: If it's not useful to *YOU*, don't use it on your site.

There are alternatives, but they too affect load time.

        If you make a significant $$ off of AdSense, Google Analytics can be very useful.

Just like no one is forcing anyone to post gobs of personal stuff on Facebook, no one is forcing webmasters to use Google Anything.

How about you calm down there, buddy? I was commenting mostly on the fact that Analytics is a little slow. Also, for your information on where it goes, I can say with certainty that when I did it, all the help files said to put it at the top. They've since changed it (probably because they too realize it holds up page loading). I'm not claiming I've invented some genius process... most people who know anything about web pages would have figured out that trick in the 3 seconds it took me. You're a bit combative.
  • by art6217 ( 757847 ) on Friday November 18, 2011 @01:48PM (#38100638)

Why, with today's bright screens, does no one implement high dynamic range imaging [wikipedia.org] in both GUI environments and common image formats?

"Paper white" is still "all bits on"...

    • by Guspaz ( 556486 )

      Most modern screens can't display deep colour. My Dell U2711 can do it, but you really have to buy a high-end or professional display like it to get 10-bit colour support. Since most displays can't show it, there's not all that much demand to support it.

      • by Ichijo ( 607641 )

        Since most displays can't show it, there's not all that much demand to support it.

        It's a chicken-and-egg problem. What's the cheapest way to break out of the loop: make HDR displays, or give WebP support for HDR?

        • by Guspaz ( 556486 )

          Giving WebP support for HDR won't break out of the loop, so the answer is "make HDR displays".

          There are existing formats that support deep colour, and there are existing applications (like Photoshop) that can take advantage of it, but HDR displays are still rare. Adding support for HDR to WebP, a format that nobody uses, won't change anything.

    • by gl4ss ( 559668 )

...why wouldn't it be? That's the info that goes to the monitor.

If you want HDR on your desktop, use an HDR-composited image as a background.

    • by nomel ( 244635 )

I think you're confused about HDR. Max brightness will always be "all bits on". It's only paper white because that's the absolute maximum brightness your display can show!

You need an HDR display to view HDR images; otherwise you're just doing tone mapping. The examples shown in that wiki are not HDR images, they're tone-mapped images. Their dynamic range is exactly the dynamic range of all the other pictures you've seen today: 3 color channels, 256 levels per color channel. High dynamic range displays require br
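
      To make the parent's distinction concrete, here is a minimal tone-mapping sketch in Python (the Reinhard global operator is one common choice; 'hdr' stands in for a hypothetical array of linear luminance values):

          import numpy as np

          def tone_map(hdr, key=0.18):
              """Squash linear HDR luminances into the 8-bit range a normal display shows."""
              # normalize by the log-average (geometric mean) luminance of the scene
              log_avg = np.exp(np.mean(np.log(hdr + 1e-6)))
              scaled = key * hdr / log_avg
              # Reinhard's global operator compresses [0, infinity) into [0, 1)
              ldr = scaled / (1.0 + scaled)
              return (np.clip(ldr, 0.0, 1.0) * 255.0).astype(np.uint8)

      Whatever operator you pick, the output is still only 256 levels per channel; an actual HDR display is what would let the extra input range reach your eyes.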

  • by Twinbee ( 767046 ) on Friday November 18, 2011 @01:49PM (#38100644)
As someone who would love to use variable-transparency (translucency) pictures on my own website, this story is very cool news. For one thing, it allows pictures to have drop shadows on varied backgrounds without being forced to save as full 32-bit PNG.

I'm now somewhat disappointed PNG didn't get this far sooner. It's served its purpose well over time, but I didn't realize there was still so much room for compression.

Congrats to Google, and I hope the other browsers quickly adopt this apparently great picture format. I wonder how its animation side compares to APNG or MNG. The PC has always been gasping for decent lossless animation support, even though the Amiga 20 years ago had seemingly a dozen animation formats to choose from. Also, web browsers have (or at least had) great difficulty playing animations at higher than around 16-25 fps (apart from Flash). It's a pretty sad state of affairs all round, really.
CSS3 will soon eliminate the need for rounded corner images and gradient backgrounds, and even smartphone bandwidth is increasing to reasonable speeds. Most ads these days are displayed with Flash, and the quality of thumbnail images really isn't that important either.
    • CSS3 will soon eliminate the need for rounded corner images

If all you want is single-radius rounded corners on rectangles, yes. While this fits most design processes, it falls well short of the flexibility offered by an alpha channel. On the up side, it's independent of image resolution (unlike bitmaps), so the rounded corners are nice and smooth no matter the zoom level.

      and gradient backgrounds

Again, only for simple gradients - yes, you can stack multiple divs together to get something more complex.

    • by Hatta ( 162192 ) on Friday November 18, 2011 @03:27PM (#38101882) Journal

      CSS3 will soon eliminate the need for rounded corner images and gradient backgrounds

      There never was any need for rounded corners and gradient backgrounds.

    • by Sloppy ( 14984 )

      What is the world coming to, when the only use for image files that people can think of, is "be displayed on a web page?"

    • by Spad ( 470073 )

      That's the kind of attitude that got us lumbered with IE6 for years. There's nothing wrong with progress for the sake of progress; it might end up being more useful than you expected.

From the article they seem to be targeting both lossy and lossless:

    WebP was proposed as an alternative to JPEG, with 25–34% better compression compared to JPEG images at equivalent SSIM index.

    and

Our main focus for lossless mode has been in compression density and simplicity in decoding. On average, we get a 45% reduction in size when starting with PNGs found on the web, and a 28% reduction in size compared to PNGs that are re-compressed with pngcrush and pngout. Smaller images on the page mean faster page loads.

So their aim is to reduce bandwidth, which is admirable, yet the video side of Google is choosing to avoid H.264, which, AFAIU, has been shown to be better "bang for the bit" than VP8, and surely video is a far bigger consumer of bandwidth these days. (I'm not sure the unencumbered argument would stand up to close scrutiny.)

  • by rlwhite ( 219604 ) <rogerwh&gmail,com> on Friday November 18, 2011 @02:11PM (#38100896)

    As someone who rooted for the adoption of JPEG2000, I wonder, have we reached the point where the existing major image formats are 'good enough' and so established that new standards are unlikely to unseat them?

    • As someone who rooted for the adoption of JPEG2000, I wonder, have we reached the point where the existing major image formats are 'good enough' and so established that new standards are unlikely to unseat them?

      If I remember correctly, JPEG2000 is patent encumbered.

    • by Sloppy ( 14984 ) on Friday November 18, 2011 @05:08PM (#38103124) Homepage Journal

      It doesn't have to unseat anything. Google is in the interesting position of having some websites with a significant amount of traffic and a web browser with a significant number of users. All they have to do is have Chrome send it in the Accept header and have their sites pay attention to that header. Instant n% reduction of bandwidth used by images.

      Right there, technological progress can stop and Google still comes out ahead. (Ignoring what they've paid to people to come up with WebP.) No rival has to be unseated.

      OTOH, once your site starts receiving a significant number of image/webp (or whatever they're using) in the Accept headers from Chrome (and Opera!) users, you have incentive to reconsider taking advantage, and the network effect has started, bouncing back'n'forth between site developers and browser developers.

      JPEG2000 didn't go this way because of the patent issue; from the very get-go, everyone knew they weren't allowed to use it. With WebP, it's either a mystery (if you're cautious) or allowed (because you trust that Google did a good patent search). Unlike JPEG2000, nobody has stepped forth and shown for sure that the tech needs to be sequestered for a couple decades. The default assumption about its legality is different.
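
      The negotiation described above is easy to sketch. A toy WSGI handler in Python (filenames hypothetical; a real deployment would pre-generate both variants and let the web server or CDN do this):

          def serve_image(environ, start_response):
              accept = environ.get('HTTP_ACCEPT', '')
              if 'image/webp' in accept:
                  path, ctype = 'logo.webp', 'image/webp'  # browser advertised WebP support
              else:
                  path, ctype = 'logo.png', 'image/png'    # safe fallback for everyone else
              with open(path, 'rb') as f:
                  body = f.read()
              start_response('200 OK', [('Content-Type', ctype),
                                        ('Content-Length', str(len(body))),
                                        # caches must key on Accept, or WebP bytes
                                        # end up served to non-WebP browsers
                                        ('Vary', 'Accept')])
              return [body]

      The Vary header is the part people forget: without it, an intermediate cache will happily hand the WebP response to the next IE user.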

  • by rossdee ( 243626 ) on Friday November 18, 2011 @03:36PM (#38101984)

    The people of Papua New Guinea are not impressed

  • 4% Compression? (Score:3, Interesting)

    by Anonymous Coward on Friday November 18, 2011 @04:30PM (#38102662)

    Using 37 topographic map png images ranging from 307K to 1.6M, the best compression I got was 4.09%

    Typical compression was roughly 1.7%

    In no way was I able to get 24% or anything close to that. But maybe I'm doing it wrong...

  • by kikito ( 971480 ) on Friday November 18, 2011 @04:36PM (#38102750) Homepage

The graph showing the additional compression ratio is a PNG.

  • by j-turkey ( 187775 ) on Friday November 18, 2011 @04:41PM (#38102828) Homepage
    Please don't forget to leave in support for metadata (e.g. EXIF). If PNG had this from the start, it's very unlikely that we would still be using JPEGs.
  • TFA! Read it! (Score:4, Informative)

    by Zephiris ( 788562 ) on Friday November 18, 2011 @08:46PM (#38105092)

Linked from the original article is the study it's basing this on: http://code.google.com/speed/webp/docs/webp_lossless_alpha_study.html [google.com]
It's essentially saying that nearly the entire reason it's a fraction smaller in lossless mode is that there's no alpha support. Combining the "optional" alpha mode with the "optional" lossless mode merely makes it near-identical in size to PNG, according to them.

    The more features you take out, and the more you degrade the pictures, the smaller they are in comparison to the original. Is this somehow surprising?
