Google Rolls Out New 'Jpegli' JPEG Coding Library (infoworld.com) 81

Google has introduced a new JPEG library called Jpegli, which reduces noise and improves image quality over traditional JPEG codecs. Proponents of the technology said it has the potential to make the Internet faster and more beautiful. InfoWorld reports: Announced April 3 and accessible from GitHub, Jpegli maintains high backward compatibility while offering enhanced capabilities and a 35% compression-ratio improvement at high-quality compression settings, Google said. Jpegli works by using new techniques to reduce noise and improve image quality. New or improved features include adaptive quantization heuristics from the JPEG XL reference implementation, improved quantization matrix selection, calculation of intermediate results, and the option of using a more advanced colorspace.

The library provides an interoperable encoder and decoder complying with the original JPEG standard and its most convenient 8-bit formalism, along with API/ABI compatibility with libjpeg-turbo and MozJPEG. When images are compressed or decompressed through Jpegli, more precise and psychovisually effective computations are performed; images will look clearer and have fewer observable artifacts. While improving compression density at a given image quality, Jpegli's coding speed is comparable to traditional approaches such as MozJPEG, according to Google. Web developers can thus integrate Jpegli into existing workflows without sacrificing coding speed, performance, or memory use.

Jpegli can encode images with 10+ bits per component. The 10-bit encoding happens within the original 8-bit formalism, so the resulting images remain interoperable with 8-bit viewers. The 10-bit dynamics are available as an API extension, and applying it requires application code changes. Also, Jpegli compresses images more efficiently than traditional JPEG codecs; this can save bandwidth and storage space and make web pages load faster, Google said.
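
To make the compatibility claim concrete, below is a minimal sketch of the classic libjpeg-style encode loop, using the standard libjpeg / libjpeg-turbo calls that Jpegli says it is API/ABI compatible with. This is illustrative only and not code from the Jpegli repository; error handling and pixel-buffer setup are omitted. The idea is that a program written this way could, in principle, be relinked against Jpegli unchanged.

```c
/*
 * Minimal sketch of the classic libjpeg-style encode loop.
 * These are the standard libjpeg / libjpeg-turbo calls; per the
 * announcement, Jpegli exposes the same API/ABI, so code like this
 * could in principle be relinked against it unchanged.
 * Error handling and pixel-buffer setup are deliberately omitted.
 */
#include <stdio.h>
#include <jpeglib.h>

void write_jpeg(const char *path, unsigned char *rgb,
                int width, int height, int quality)
{
    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *out = fopen(path, "wb");

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);
    jpeg_stdio_dest(&cinfo, out);

    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = 3;        /* interleaved 8-bit RGB */
    cinfo.in_color_space = JCS_RGB;
    jpeg_set_defaults(&cinfo);
    jpeg_set_quality(&cinfo, quality, TRUE);

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = &rgb[cinfo.next_scanline * (size_t)width * 3];
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);
    jpeg_destroy_compress(&cinfo);
    fclose(out);
}
```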

This discussion has been archived. No new comments can be posted.

  • by xack ( 5304745 ) on Friday April 05, 2024 @06:24AM (#64371862)
    The code is open source, all it would take is one git commit to enable it for over 3 billion Chrome users, stop ignoring the will of the developer community and "don't be evil" just this once. If a minority browser like Pale Moon can do it, so can a 2 trillion dollar company.
    • by Barny ( 103770 ) on Friday April 05, 2024 @07:47AM (#64371972) Journal

      Hahaha!

      This isn't about improving JPEG or compatibility. This is about controlling enough standards that they can bury all other browsers by requiring them to "keep up" as Google implements them on all their sites, and can claim "Your browser is slower here because you don't support the latest standards!"

      • Re: (Score:3, Insightful)

        by markdavis ( 642305 )

        ^^^^ THIS

        BINGO +1

        This is more about control than function. Almost nobody cares about an ever-so-slightly "better" JPEG compared to something that ALWAYS WORKS EVERYWHERE. This is clearly going to break compatibility with something that has been and should remain a universal standard.

        Wake up people- you are letting this happen by using (and sometimes REQUIRING the use of) Google's products. It is far from the first example, and won't be the last. We are doomed to a web completely controlled by a single

      • by swillden ( 191260 ) <shawn-ds@willden.org> on Friday April 05, 2024 @12:32PM (#64372634) Journal

        Hahaha!

        This isn't about improving JPEG or compatibility. This is about controlling enough standards that they can bury all other browsers by requiring them to "keep up" as Google implements them on all their sites, and can claim "Your browser is slower here because you don't support the latest standards!"

        Since Jpegli produces output that is fully compliant with the existing JPEG standards, the ones that every browser already supports, this comment makes no sense to me. Please explain what I'm missing.

      • Sounds like Microsoft during the Internet Explorer 4-6 years. This is monopolistic behaviour, yet people kick up a fuss about Apple and want to give Google more free rein!

        • by sl3xd ( 111641 )

          I agree. They yanked JPEG XL out of Chrome a year ago, which will prevent adoption of the format even though Safari and Firefox support it.

          It's pretty clear Google wants to push their own VPx-derived .webp (and further derived .avif) image formats. A fair amount has been written about the quality issues when repurposing a video codec to encode still image data - while they can produce decent results, it does not produce results as good as a dedicated still image format.

          Here, they're making a 'backwards-comp

    • by AmiMoJo ( 196126 ) on Friday April 05, 2024 @07:51AM (#64371980) Homepage Journal

      I've been working with JPEG-XL via libjxl, the reference implementation. It's not ready for prime time. Look at the release notes for the last two versions:

      0.10.2: bugs in (lossless) encoding, bugs in streaming mode
      0.10.1: fixing a significant speed regression present since 0.9.0

      It's not stable, and performance even with the fixes is still not great. I can see why Google removed it. Maybe once it reaches 1.0 and is proven to be solid.

      Meanwhile, if you read the summary, you can see that they have adopted some of the new techniques invented for JPEG-XL, and applied them to JPEG. It also has 10-bit support for HDR. They get a lot of the benefit while retaining full compatibility with JPEG. That makes the benefits of JPEG-XL for the web somewhat marginal.

      • That makes the benefits of JPEG-XL for the web somewhat marginal.

        Wake me when standard JPEG files support alpha channels. The "powers that be" should have solved this 30 years ago.

    • by RedK ( 112790 )

      Since it's open source, why don't you make that commit for them and submit a pull request ?

    • by Rujiel ( 1632063 )
      Not offering browser support for this would be the most tepid example of google being "evil" that I've ever seen, but to be fair there are a lot of them
      • This is early alpha level code. Not offering browser support would be "common sense" at this point.

    • If a minority browser like Pale Moon can do it, so can a 2 trillion dollar company.

      A minority browser may be willing to ship alpha code to customers. A $2tn company doesn't normally throw release 0.10.2 early alpha stuff into production code used by close to a billion people the world over.

  • by iAmWaySmarterThanYou ( 10095012 ) on Friday April 05, 2024 @06:32AM (#64371870)

    "Google abandons library, removes from all versions of Chrome source, says working on replacement to be used in Google's next gen quatum AI blockchain dark matter library".

  • by bubblyceiling ( 7940768 ) on Friday April 05, 2024 @06:44AM (#64371892)
    If it actually does what it claims on the box, this should be pretty cool
  • Worse than Nothing (Score:5, Interesting)

    by gabebear ( 251933 ) on Friday April 05, 2024 @06:56AM (#64371902) Homepage Journal
    Compression roughly equivalent to JPEG XL, slightly worse than WebP, and much worse than heic/heif/avif? Why?

    Google can’t be trusted with numbers; it made similar claims when launching WebP and its compression comparisons were fairly misleading. https://siipo.la/blog/is-webp-... [siipo.la]

    This seems like it will get ramrodded through into Chrome, used on a handful of websites, and then support will gradually be removed as Google abandons it for the next hotness.

    heic/heif/avif files are already better on every metric and this new standard is WORSE than Google’s current WebP standard.
    • Re: (Score:3, Insightful)

      > heic/heif/avif files are already better on every metric

      How do I display them on software that only supports GIF, PNG, and JPEG?

      • Not supporting HEIC/HEIF is a silly political decision at this point. All decent phones, not just iPhones, have been able to use HEIF for some time. Google even added HEVC support to Chrome some time ago too. The world hasn't ended.

    • by AmiMoJo ( 196126 ) on Friday April 05, 2024 @07:59AM (#64371998) Homepage Journal

      JPEG-XL's reference implementation is beta quality at best. HEIC etc. are even less supported than JXL, and can't be losslessly transcoded to. WebP is better supported, but you still can't losslessly transcode to it.

      There is a reason why there are still billions of JPEGs on the internet, even though there are theoretically better formats that are supported by all major browsers.

      • by piojo ( 995934 )

        What does lossless transcoding mean in this context? Do you mean it should be able to store existing jpeg data without reencoding?

        • by AmiMoJo ( 196126 )

          I mean that the output image when decoded with the reference decoder is exactly the same.

          • by piojo ( 995934 )

            Thanks. So lossless compression. Have we figured out how to do that in a way that's useful? I mean the status quo is (or should be) that we lossily encode images only when we need them smaller or we modify them. Assuming (in the case of image modification) storing/serving the original plus layers of vector modifications isn't computationally friendly, what's the alternative to generational loss?

            • by AmiMoJo ( 196126 )

              A lot of images get uploaded as JPEG, the original lossless version is not available. Some might simply have been lost too, e.g. I wonder if the original assets for this website are still around.

              JPEG entropy coding uses run-length encoding plus Huffman coding, and can be improved by switching to arithmetic coding instead. But that would break compatibility with most existing decoders, so other options include things like losslessly rotating the image so that the run-length encoding works better. JPEG has certain properties that

              • by piojo ( 995934 )

                Thanks for explaining that. It sounds like the other direction for improved lossy compression is to make the compression less coupled to the data simplification? That way an image can be re-saved (even if cropped) and not take additional space, so long as lossy optimizations aren't reapplied. (Lossy optimization followed by lossless compression, or at least compression such that `decompress(compress(data))` is as compressible as `data` without needing the lossy step to be reapplied.)

                Forgive me if I'm far

        • Lossy transcoding means each time you save/convert it, you have to destroy data.
          You have a webpage. The webpage shows a jpeg. And it shows a second-generation jpeg in a few different resolutions to save on bandwidth. Some image service caches it, creating a third-generation jpeg. The image service can leech off other image services, creating a cycle.
          The 11th-gen jpeg is now posted on Facebook which.... converts it into the Facebook-optimized version of jpeg on top of some losses.

          Now.... the original webpage

      • What use case do you have for lossless transcoding from JPEG? I don’t get that point at all. As you pointed out, JPEG is used because it’s very compatible and easy to encode, which are the two things dropped with jpegli (it's only mostly backward compatible, and it takes many times more processing power and exponentially more memory).

        In the context of the smaller, lower-fidelity images used on the web, Jpegxl/jpegli do considerably worse than webp and way worse than HEIC/AVIF. JPEGXL and HEIC are both ONLY supporte
    • by Midnight Thunder ( 17205 ) on Friday April 05, 2024 @08:13AM (#64372020) Homepage Journal

      > heic/heif/avif files are already better on every metric and this new standard is WORSE than Google’s current WebP standard.

      They may be better technically, but they still require a license fee to use. As much as Google annoys me with pushing standards, at least they push royalty-free compression technologies.

    • by Rei ( 128717 ) on Friday April 05, 2024 @08:16AM (#64372028) Homepage

      On what planet is JPEG XL worse than WebP?

      In most cases, JPEG XL is better than AVIF (if I remember right, it's only at very low bitrates that AVIF wins out), which is better than WebP. As a general rule, JPEG XL will be better than WebP in pretty much all cases, usually dramatically so. It's also much faster than AVIF, and can do lossless JPEG transcoding.

      (And yes, I've done my own comparisons, not just trusted external benchmarks)

      JPEG XL is a great format. This sounds like a way to try to resurrect support for its tech by integrating it into JPEG itself.

      • by Rei ( 128717 )

        Oh, and if we want to talk about patent risk, JPEG XL is much *lower* than AVIF. AV1 (the video format from which AVIF is derived) is under attack from patent troll Sisvel. They're not currently going after AVIF, but they could at any time.

      • by kriston ( 7886 )

        Speaking of JPEG XL, whatever happened to JPEG 2000?

        • by sl3xd ( 111641 )

          Part of it is the same thing as what happened to h.265/HEVC and .heic: patent headaches. HEVC adoption suffered because a couple of patent holders broke off of the main patent pool and decided they weren't going to give FRAND license terms. If you couldn't be sure your license terms would be the same as the next guy, would you want to license HEVC? That did HEVC no favors. Time will tell if lessons were learned and applied for h.266/VVC.

          JPEG 2000 had 17 organizations that you had to license the patents from

          • by kriston ( 7886 )

            Reminds me of AOL's Johnson-Grace ART encoding.
            Too bad Johnson-Grace died with AOL.

      • (And yes, I've done my own comparisons, not just trusted external benchmarks)

        Well done.

    • Compression roughly equivalent to JPEG XL, slightly worse than WebP, and much worse than heic/heif/avif? Why?

      Backward compatibility. Not having to create two of every image for browsers that don't support it.

      I mean, look at browser support for WebP. HEIC is a bit too patent-encumbered to go into browsers as a primary image format.

    • by ceoyoyo ( 59147 )

      Just like every other JPEG replacement. This isn't the first time Google has tried. JPEG works well enough, and it's definitely not what's slowing down the Internet.

      10 bit JPEG is not terribly useful, at least on the web. If someone could hack in a backward compatible alpha channel they might actually have something worth using.

      • by AvitarX ( 172628 )

        Why isn't 10 bit useful on the web? Reasonably priced displays are catching up, especially on mobile.

        This is like the amd64 replacement for x86: it's backwards compatible. It seems to me it could likely take off where the "better" solutions have failed, the same way amd64 eventually became the standard for 64-bit home computers.

        There doesn't seem to be much not to like about this (except for the lack of alpha channel).

        • by ceoyoyo ( 59147 )

          10-bit displays are still much more expensive than 8-bit (or 6-bit). Many of the cheaper "10-bit" displays are actually 8-bit with dithering. Even so, if you haven't calibrated your display, which almost nobody has, the benefit of the extra couple of bits is very questionable. 12+ bit colour depth in image files is useful, but if you care about that you're probably not using JPEG.

          Yeah sure, people might use this library, if Google makes it very easy to do. But just as JPEG. I seriously doubt any of the new

    • by gweihir ( 88907 )

      Google cannot be trusted. Period.

      All they care about is their profits, nothing else. Sure, they started a lot better than, say, Microsoft (which never had good engineering, ever), but Google is now slowly degrading.

    • Compression roughly equivalent to JPEG XL, slightly worse than WebP, and much worse than heic/heif/avif? Why?

      Did you miss the whole "Backwards compatible" part? That already makes it better than literally every alternative to JPEG you can propose, because ... JPEGs are still the most widely used format.

  • Improving the quality of JPEGs by improving their compression is a good thing. I still use JPEGs a lot and I prefer them over PNG because PNG files are usually notably larger for the work I do. I hope GIMP adds this algorithm to its codebase ASAP.

    • The problem is that it claims to improve quality by applying new noise reduction techniques to the image as a pre-processing stage. This is supposedly a better version of what older paint programs did by blurring the image before saving as a JPEG. In practice, I've found that this does not work very well if you want a good compromise between quality and compression.

      Before adding a feature "ASAP", it might be best to consider whether it's a good idea or just marketing bullshit. I'm calling it the latter.

  • No (Score:4, Insightful)

    by quonset ( 4839537 ) on Friday April 05, 2024 @07:05AM (#64371916)

    it has the potential to make the Internet faster and more beautiful.

    Those are incompatible. If you make things faster but then stuff more crap through it, it will no longer be faster. This is like programs no longer having tight, clean code because processor and RAM speeds have soared and disk space has expanded just as much.

    All this will do is continue the enshittification of the web by increasing web site crap.

    • by AmiMoJo ( 196126 )

      Didn't read the summary I see. They state that for existing JPEGs, the way they decode them is based on a better psycho-visual model, i.e. a better modelling of the way humans perceive images and notice artefacts. They are saying that even if you don't do anything at all to the source images, they will look better when decoded with their library.

      That has always been the case for JPEG. You get different results depending on the library that is used to decode them. For example, the Windows one is crap so your

    • by Rei ( 128717 )

      You think the 1990s web looked just as good as today, due to bloat?

  • Worries... (Score:5, Insightful)

    by bradley13 ( 1118935 ) on Friday April 05, 2024 @07:37AM (#64371948) Homepage

    If this is a JPEG library, then it should support all existing JPEG files, and other JPEG decoders should be able to display files it creates. These words are worrying: "maintains high backward compatibility". That implies that it is *not* fully compatible. In which case, the library is not a JPEG library, but something else. We do not need yet-another-standard for images. There are plenty already.

    Also, the justification about long web-page load times is flimsy. The biggest problem there is the crazy number of JS files, often driven by ads and trackers. Of course, Google doesn't want to touch those, so...

    • by Rei ( 128717 )

      These words are worrying: "maintains high backward compatibility".

      You seem to enjoy inventing ways to frighten yourself.

      • Because anyone with a brain sees that as lawyer/weasel speak for "not 100% compatible." That means some images will break in current decoders, but they aren't telling us why or which ones.
        • by dgatwood ( 11270 )

          Because anyone with a brain sees that as lawyer/weasel speak for "not 100% compatible." That means some images will break in current decoders, but they aren't telling us why or which ones.

          The JPEG standard is pretty simple structurally (compute the DCT of a macroblock, apply the quantization matrix, read the values in a zigzag, and apply run-length encoding to the resulting stream of bits), so I have a hard time believing that it would even be possible to create something that works with some decoders and breaks with others. It's more likely that what they mean is "We haven't tested every possible image in the world, so it's possible that some image we haven't compressed yet might result in
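
          To make those steps concrete, here is a toy, self-contained sketch of the per-8x8-block JPEG encode path (forward DCT, quantization, zigzag readout). It is purely illustrative: the flat quantizer is a stand-in for the standard JPEG tables, and a real encoder would follow the zigzag readout with run-length and Huffman coding.

          ```c
          /*
           * Toy sketch of the per-8x8-block JPEG encode steps described in
           * the parent comment: forward DCT, division by a quantization
           * matrix, and zigzag readout (the zigzag stream is what then gets
           * run-length and Huffman coded).  The flat quantizer is a
           * stand-in, not a standard JPEG table.
           */
          #include <math.h>
          #include <stdio.h>

          #ifndef M_PI
          #define M_PI 3.14159265358979323846
          #endif
          #define N 8

          /* Naive 2-D DCT-II with JPEG's normalization. */
          static void dct8x8(const double in[N][N], double out[N][N])
          {
              for (int u = 0; u < N; u++)
                  for (int v = 0; v < N; v++) {
                      double cu = (u == 0) ? 1.0 / sqrt(2.0) : 1.0;
                      double cv = (v == 0) ? 1.0 / sqrt(2.0) : 1.0;
                      double sum = 0.0;
                      for (int x = 0; x < N; x++)
                          for (int y = 0; y < N; y++)
                              sum += in[x][y]
                                   * cos((2 * x + 1) * u * M_PI / (2.0 * N))
                                   * cos((2 * y + 1) * v * M_PI / (2.0 * N));
                      out[u][v] = 0.25 * cu * cv * sum;
                  }
          }

          int main(void)
          {
              double block[N][N], coeff[N][N];
              int q[N][N];
              const double quant = 16.0;      /* flat stand-in quantizer step */

              /* Synthetic gradient block, level-shifted to roughly [-128, 127]. */
              for (int x = 0; x < N; x++)
                  for (int y = 0; y < N; y++)
                      block[x][y] = (double)(x * 8 + y) - 128.0;

              dct8x8(block, coeff);
              for (int u = 0; u < N; u++)
                  for (int v = 0; v < N; v++)
                      q[u][v] = (int)lround(coeff[u][v] / quant);

              /* Zigzag readout: walk anti-diagonals, alternating direction. */
              for (int s = 0; s <= 2 * (N - 1); s++)
                  for (int i = 0; i < N; i++) {
                      int j = s - i;
                      if (j < 0 || j >= N) continue;
                      int u = (s % 2 == 0) ? j : i;  /* even diagonals run bottom-left to top-right */
                      int v = (s % 2 == 0) ? i : j;
                      printf("%d ", q[u][v]);
                  }
              printf("\n");   /* many trailing zeros: exactly what RLE exploits */
              return 0;
          }
          ```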

          • Re:Worries... (Score:4, Informative)

            by Rei ( 128717 ) on Friday April 05, 2024 @01:02PM (#64372758) Homepage

            I'm not sure why everyone is speculating, when there's a link going into detail about what they do. Which is a lot of different things, but mainly:

            * They run the encoding algorithm in a higher-bit-depth floating-point space (including support for 16-bit and 32-bit input images) instead of an 8-bit int colorspace, putting off the conversion to a low bit depth until it's absolutely required. On decoding, they immediately jump to high-bit-depth floats as well.
            * They offer some slightly smarter dequantization on decoding - instead of assuming that all AC DCT values should simply be converted into floats of the same value, they slightly skew them toward zero to account for a Laplacian distribution of how they were rounded during the encoding process (i.e. rounding down and rounding up are not equally likely) - see the toy sketch after this comment.
            * They use modern human visual similarity algorithms for maximizing perceived quality during encoding instead of old simplistic ones
            * In particular, in deciding what bits to null out for RLE encoding, instead of just nulling out the lowest energy components, they weigh how different choices would impact human ratings vs. how much theoretical compression they could provide, not just on a single-block scale, but on multi-block scales.
            * A similar psychovisual approach is used in guetzli, but it's very slow; jpegli is as fast as old, simplistic encoders.
            * They offer 10-bit support (legacy decoder libraries only see the base 8 bits).
            * JPEG supports defining ICC color profiles. They use the efficient XYB colorspace.

            No, it's not JPEG XL repackaged and called "JPEG". It can't do a lot of JPEG XL's great techniques to save space, because they would break backwards compatibility. But it does take a lot of the things from JPEG XL's encoder and decoder that you can shoehorn into a standard JPEG format.

            It's also interesting to me in that it offers a platform for steadily squeezing more new "extras" into the JPEG format, using the 10-bit encoding as an example: those with jpegli can use extra data in the file to see a higher quality image, while legacy readers still see a basic image. It seems that there could be near-limitless potential for this, that they could incrementally work in with subsequent releases - it's just a question of how to simultaneously maximize legacy-reader quality in an efficient manner. You don't want jpegli viewers to see a near-lossless masterpiece and legacy viewers to see blocky garbage because 90% of the bytes are in the supplemental data, after all.

            Lastly, it's interesting because - since it can be dropped straight into workflows and offers immediate benefits while retaining backwards compatibility - it has the potential to become widespread. But it's part of the JPEG XL package - it's literally libjxl/lib/jpegli. So it has the potential to make JPEG XL support become widespread at the same time. And JPEG XL is a great format, esp. for high-fidelity images (AVIF is slightly better at low-fidelity images, but worse on high-fidelity images, and is PAINFULLY slow... as well as having some patent risks)
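
            As a concrete (and heavily simplified) illustration of the biased-dequantization bullet above, here is a toy sketch of the idea. The bias constant and function names are invented for illustration; jpegli's actual reconstruction logic is more sophisticated.

            ```c
            /*
             * Toy illustration of the "biased dequantization" idea above:
             * instead of reconstructing an AC coefficient at the center of
             * its quantization bin (q * step), nudge it slightly toward
             * zero, because coefficient magnitudes follow a roughly
             * Laplacian distribution and are therefore more likely to lie
             * on the side of the bin nearer zero.  The bias constant here
             * is invented for illustration; it is not jpegli's tuning.
             */
            #include <stdio.h>

            static double dequant_center(int q, double step)
            {
                return q * step;                 /* classic reconstruction */
            }

            static double dequant_biased(int q, double step, double bias)
            {
                if (q == 0)
                    return 0.0;
                double mag = ((q > 0 ? q : -q) - bias) * step;
                return (q > 0) ? mag : -mag;     /* shrink magnitude toward zero */
            }

            int main(void)
            {
                const double step = 16.0;        /* example quantizer step */
                const double bias = 0.2;         /* illustrative value only */
                for (int q = -3; q <= 3; q++)
                    printf("q=%+d  center=%7.1f  biased=%7.1f\n",
                           q, dequant_center(q, step), dequant_biased(q, step, bias));
                return 0;
            }
            ```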

        • by Rei ( 128717 )

          Or, you know, instead of trying to interpret a summary through the worst possible lens, you could read the actual link?

          "All the new methods have been carefully crafted to use the traditional 8-bit JPEG formalism, so newly compressed images are compatible with existing JPEG viewers such as browsers, image processing software, and others."

    • by gweihir ( 88907 )

      Enshittification at work. Google only sees its own stuff these days and Google engineering is nothing to write home about anymore. All they are interested in is increasing their profits and to hell with anybody else and who cares if they do significant damage to others. I know a few people working at Google and they all say that it is just a regular (crappy) large enterprise these days, no magic to be found anywhere.

  • by v1 ( 525388 ) on Friday April 05, 2024 @08:14AM (#64372026) Homepage Journal

    webp has been a headache for me and a lot of other people. While I can agree that it tends to be about half the file size, these are already usually small-ish files, and I don't really need the savings. All it does for me is cause my older apps to go "what is this?" and refuse to display them as inline images or have other compatibility problems. There was a big push for webp but it seems to have gotten enough pushback to have caused a lot of places to back off and return to the old reigning king, JPEG.

    I don't see anything different happening with JPEGLI. A momentary annoyance for a lot of people with very little functional benefit.

    • I don't see anything different happening with JPEGLI. A momentary annoyance for a lot of people with very little functional benefit.

      Unless I'm misunderstanding the summary, Jpegli encodes to standard JPEG, just does it smarter. So it shouldn't create any headaches.

    • by gweihir ( 88907 )

      Google has become like Microsoft in that regard: They only see their own use-cases and anything else they just do not realize may even be there. One of the reasons why interoperation with anything Microsoft is a huge pain and often impossible to do well. Not good engineering by any measure.

  • by JamesTRexx ( 675890 ) on Friday April 05, 2024 @09:10AM (#64372092) Journal

    How about making clean and simple websites without a bunch of 3rd party connections.

    Looking very much at you, Google spyware.

    • A million times this! Not once in the past 20 years have I ever pointed the finger at an image as the reason why a page is bogged down. Since I installed script blocking, slow sites are almost always showing an attempt to access *dozens* of 3rd party domains that I restrict by default. This site? It's only using 2 other domains, fsdn and cloudfront. Slashdot loads just as fast as it always has.

  • Deadpool got jpegli'd by his girlfriend on International Women's Day.
  • Reducing noise is not something an image compression algorithm should be doing. What if the purpose of your images is to compare the amount of noise that your camera generates at different ISO settings? What if you specifically wanted noise in order for your image to appear grainy and old-fashioned? Why don't they enhance the colors while they're at it and automatically blur the background? I'm sure that would also reduce file size. Doesn't anyone care about image fidelity? We don't need the internet "beaut

    • by ceoyoyo ( 59147 )

      Don't use JPEG for the first one, probably not for the other ones either. Lossy compression has to lose something, and the first choice for that something is the noise. If you care about image fidelity, don't use lossy compression.

      Realistically, Google probably doesn't mean "noise," they mean "error."

      • by Rei ( 128717 )

        Ultimately, all compression is based on the notion that not all images are equally likely. A photograph of a cat is a lot more likely than random RGB noise, and even if you had a photo of random RGB noise, it's extremely unlikely that you cared about a specific arrangement of random RGB noise versus some other one.

        The ability to compress in turn relies on how well you have a model of the appearance of the world. Ultimately, thus, we'll be headed to latent representation for compression, since by definition

    • by gweihir ( 88907 )

      Indeed. While they should add as little noise as possible, _reducing_ noise is image processing, not image compression and a compressor that does it is broken by design.

    • The whole point of JPEG compression is that high-frequency components (grainy noise) can be discarded while keeping the image similar enough for practical purposes. In a lossy compression scheme, some things must go, and this was a sensible choice at the time. These days, decent compression schemes will recognize grainy noise and regenerate it upon decoding.
  • I can't wait to get to discover all the steganographic injections and bugs for a whole new library all over again!
  • I use PNG for all my web design. There are still people using lossy assets?
