Mozilla Doubles Down on JPEG Encoding with mozjpeg 2.0

An anonymous reader writes: Mozilla today announced the release of mozjpeg version 2.0. The JPEG encoder can now reduce the size of both baseline and progressive JPEGs by 5 percent on average, compared to those produced by libjpeg-turbo, the standard JPEG library on which mozjpeg is based. Mozilla also revealed that Facebook is testing mozjpeg 2.0 to see whether it can improve the compression of images on Facebook.com, and that Facebook has donated $60,000 toward ongoing development of the technology.
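
For readers who want to try it on their own images, here is a minimal sketch of driving mozjpeg from Python. It assumes mozjpeg's cjpeg/djpeg drop-in command-line tools are installed and on PATH, and "photo.jpg" is a placeholder file name; it simply re-encodes one image at quality 75 and reports the size change, which is not the same thing as Mozilla's benchmark.

    import os
    import subprocess

    src = "photo.jpg"  # placeholder input

    # Decode the existing JPEG to a lossless intermediate with djpeg...
    subprocess.run(["djpeg", "-outfile", "photo.ppm", src], check=True)

    # ...then re-encode it with mozjpeg's cjpeg at a fixed quality setting.
    subprocess.run(
        ["cjpeg", "-quality", "75", "-outfile", "photo.moz.jpg", "photo.ppm"],
        check=True,
    )

    before = os.path.getsize(src)
    after = os.path.getsize("photo.moz.jpg")
    print(f"{before} -> {after} bytes ({100 * (before - after) / before:.1f}% smaller)")
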
Comments:
  • Sorry, it is hard to get excited about marginal improvements in still raster image compression. Yes, rasters are really important and everything, but images are already absurdly small compared to video, and video is the vast majority of the internet. Seems like a better use of their time to focus on making HEVC or VP9 more capable.
    • by Anonymous Coward on Tuesday July 15, 2014 @01:10PM (#47459199)

      5% of image bandwidth saved for someone like Facebook is millions of dollars in operating expense. Get a clue.

      • by tysonedwards ( 969693 ) on Tuesday July 15, 2014 @01:24PM (#47459309)
        And when Facebook says that only 1.48% of their bandwidth goes to images, a 5% reduction takes that to about 1.41%, at the expense of increased CPU time to transcode all existing images, which is itself not free. It is a marginal savings, even for an organization the size of Facebook. It certainly adds up over time, which is great, but when there is really great low hanging fruit like cutting the 37% of their bandwidth used on videos by 20-30% by getting HEVC or VP9 really working well (which would bring it down to roughly 26% of the total), that is a way to save significant money not just in bandwidth but in disk space for retention as well.
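
        As a quick sanity check on those percentages (taking the figures quoted above at face value), a two-line calculation:

          images_share = 1.48   # % of bandwidth spent on images, as claimed above
          video_share = 37.0    # % spent on video, as claimed above

          print(round(images_share * (1 - 0.05), 2))  # 5% smaller JPEGs -> 1.41
          print(round(video_share * (1 - 0.30), 1))   # 30% better video codec -> 25.9 (~26)
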
        • But what fraction of videos on Facebook are actually served by Facebook?

          Most of what I see on Facebook is just links to external sites, and Facebook doesn't host it at all.

        • by omnichad ( 1198475 ) on Tuesday July 15, 2014 @01:41PM (#47459423) Homepage

          ...low hanging fruit like cutting the 37% of their bandwidth used on videos by 20-30% by getting HEVC or VP9 really working well

          If they wanted to tackle the low-hanging fruit, why not stop auto-playing video altogether?

          • If they wanted to tackle the low-hanging fruit, why not stop auto-playing video altogether?

            Ad revenue.

            • There is no ad revenue to be made from a video recorded by a friend starting to play automatically in my feed. It plays the video, not an ad.

        • by Charliemopps ( 1157495 ) on Tuesday July 15, 2014 @02:02PM (#47459665)

          And when Facebook says that only 1.48% of their bandwidth goes to images, a 5% reduction takes that to about 1.41%, at the expense of increased CPU time to transcode all existing images, which is itself not free. It is a marginal savings, even for an organization the size of Facebook. It certainly adds up over time, which is great, but when there is really great low hanging fruit like cutting the 37% of their bandwidth used on videos by 20-30% by getting HEVC or VP9 really working well (which would bring it down to roughly 26% of the total), that is a way to save significant money not just in bandwidth but in disk space for retention as well.

          I deal with this sort of thing all day at work... you're not appreciating the scale of small adjustments.
          For example: I recently got asked to approve an upgrade to Internet Explorer on an enterprise network.
          I tested, and replied that in one application there was a 3-second delay in opening records. I declined approval and said the issue would have to be fixed before I could sign off on it.

          Lots of managers had your attitude... it's only a 3 second delay!

          So I had to present my reasoning in a meeting to explain:
          We have approximately 1000 users that will be affected.
          They each open, on average, 100 records per day.
          They get paid an average of $15/hr
          1000 users times 100 records = 100,000 records per day
          Times 3 seconds = 300,000 seconds
          Divided by 60 = 5000 minutes
          Divided by 60 again = 83.33hrs
          Times $15/hr = $1250

          Not fixing that issue would cost the company roughly $1250 per day!
          It's nearly a half-million-dollar-a-year problem!
          The fix is a memory upgrade that would cost the company a one-time charge of less than $20k.

          Scale matters.
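
          The arithmetic above, written out as a small sketch (the inputs are the commenter's own estimates, and the "nearly half a million a year" figure only falls out if you multiply the daily cost by all 365 days):

            users = 1000
            records_per_day = 100
            delay_seconds = 3
            wage_per_hour = 15.0

            wasted_hours = users * records_per_day * delay_seconds / 3600  # 83.33 h/day
            cost_per_day = wasted_hours * wage_per_hour                    # $1250/day
            print(cost_per_day, cost_per_day * 365)                        # 1250.0, 456250.0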

          • by makq ( 3730933 )
            And this is a great demonstration of why people hate IT departments.
            • Luckily we don't care, because we're right.

              • Luckily we don't care, because we're right.

                No, you are wrong. This kind of analysis was debunked before I got to college 30 years ago.

                Just because something on the computer takes more or less time doesn't mean the user isn't adapting and overlapping other behaviors during those 3 seconds. Do a controlled experiment and come back when you have real data.

                If your analysis is so great, why aren't you advocating moving to technologies that take less time to bring up a record? Or pre-loading the next record, or anything to save your oh-so-precious 300 seconds?

                • This kind of thing probably does help, but, yeah, not because of the magical linear time bank.

                  Rather, something that keeps the user's attention often means fewer coffee breaks and less chit-chat. Waiting sucks.

                • by tlhIngan ( 30335 )

                  Just because something on the computer takes more or less time doesn't mean the user isn't adapting and overlapping other behaviors during those 3 seconds. Do a controlled experiment and come back when you have real data.

                  3 seconds is long enough to be annoying, and short enough that an annoyed user will end up losing more than 3 seconds to whatever they do to fill it.

                  After the update, they'll waste 3 seconds each time. Then after a week, they'll use those 3 seconds to respond to a text on their cellphone, or an IM, or browse Facebook. So now the...

                  • If the user opens a record and waits 3 seconds then spends 30 seconds staring at the record or typing in it or talking to a customer, then the UI is messed up.
                    The computer is pretty idle in the subsequent 30 seconds and can be loading the next form in the background.

                    The people who use my PoS like it not because of its neato use of Python and ncurses. They like it because it responds instantly, is very non-modal, and never requires more than two or three keystrokes to complete a transaction. If you are paying...

          • by Belial6 ( 794905 )
            Your math is a lie.
        • by radish ( 98371 )

          Don't forget storage. Bandwidth is one thing, but image storage is a big deal for sites like FB. They often store multiple copies of each one (e.g. at different sizes) and then you also have copies cached on CDNs etc, which also costs money. 5% isn't going to make or break the company, but it's worth investigating.

        • Who's to say they don't have a few coders working on better video codecs? They seem at least to be hiring for video chat: https://www.facebook.com/caree... [facebook.com] Every bit counts with a site their size. Google doesn't even close its tags.
        • so, apparently it's only worth $60k to them.

        • Why would they transcode all existing images? They could re-scale the thumbnails, but they wouldn't dare touch the originals unless they were so massive it made sense to give it a go.

          Re-encoding thumbnails for new images, ignoring existing ones, would more than pay for itself during just one more World Cup, simply by replacing the existing encoder with this one.

          So let me re-phrase. Mozilla open source people, who can work on stuff because they want to or because they can attribute some generic benefit, have teamed up with one of the...

        • One fundamental difference. Improving the JPEG encoding used by Facebook would reduce their bandwidth use without requiring users or browser developers to do anything. Moving to H.265 or VP9 video encoding requires two things: that browsers add support for those video formats (some have already done that), and that users upgrade to browser versions which include that support.

          Most likely, Facebook would not transcode existing images, unless they saved the originally uploaded files (rather than the transcoded ones...

      • Any company that can save millions from something this small is so big that millions don't mean anything.

        • Even the biggest companies care about millions. Reductions in operating expenses are always a good thing, and if Facebook can shave off a million dollars a year without significant downsides, they will take it gladly. Big companies didn't get big by being inefficient.

      • by Selur ( 2745445 )

        so donating $60k isn't really much...

    • and video is the vast majority of the internet

      And yet, you're posting that on a web site which is mostly text.

      I suspect a 5% decrease in size adds up to a very large bandwidth saving over time.

      And there will always be people with slower links who will benefit from this.

    • by Sigma 7 ( 266129 )

      video is the vast majority of the internet. Seems like a better use of their time to focus on making HEVC or VP9 more capable.

      Most videos (at least those linked to from meme-based image sites) are stored in GIF format, even though they take twenty times the bandwidth/file size of the YouTube videos they're based on.

      Thus the best way to save space/bandwidth is to find a way to optimize compression of .GIF files.

      • Most videos (at least those linked to from meme-based image sites) are stored in GIF format...

        While I don't disagree that storing videos in GIF format is incredibly inefficient (and annoying), I somehow don't think that "meme-based image sites" are actually a significant fraction of internet bandwidth use compared to websites that use more standard video formats.

        • by Mal-2 ( 675116 )

          Most videos (at least those linked to from meme-based image sites) are stored in GIF format...

          While I don't disagree that storing videos in GIF format is incredibly inefficient (and annoying), I somehow don't think that "meme-based image sites" are actually a significant fraction of internet bandwidth use compared to websites that use more standard video formats.

          Not to mention that our poster child for "meme-based image sites" now supports webm [4chan.org], and the format has become incredibly popular there.
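
          For anyone curious, converting an existing animated GIF to WebM is a one-liner with ffmpeg; a hedged sketch follows (it assumes ffmpeg built with libvpx-vp9 is installed, and the file names and CRF value are arbitrary placeholders):

            import subprocess

            # Re-encode an animated GIF as VP9/WebM; -b:v 0 with -crf selects
            # constant-quality mode, and -an drops any audio track.
            subprocess.run(
                ["ffmpeg", "-i", "input.gif",
                 "-c:v", "libvpx-vp9", "-b:v", "0", "-crf", "35", "-an",
                 "output.webm"],
                check=True,
            )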

    • Don't some video codecs use JPEG algorithms to encode the I-frames? This could translate into better video compression too.

    • by Gerald ( 9696 )

      Most of the pages on my web sites have a combination of PNG and JPEG content and almost no video. Smaller images mean faster page load times for my users.

    • Yes. I've written a program that achieves a similar saving for PNGs and other lossless formats by making them very slightly lossy (It actually works by slightly adjusting the boundary between quantization bands, so no pixel ever changes value by more than a tiny, definable amount). What happens to it? No-one wants something like that, so it joins all my other dabbling in obscurity.
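
      A minimal sketch of that general idea (not the poster's actual program): requantize each pixel to the centre of a band of width 2*delta+1, so no value moves by more than delta, then let the lossless PNG encoder take advantage of the smoother data. Pillow and NumPy are assumed; file names are placeholders.

        import numpy as np
        from PIL import Image

        DELTA = 2  # maximum allowed change per pixel value
        band = 2 * DELTA + 1

        img = np.asarray(Image.open("input.png").convert("RGB"), dtype=np.uint16)

        # Snap every value to the centre of its band; the error is at most DELTA.
        quantized = np.clip((img // band) * band + DELTA, 0, 255).astype(np.uint8)

        Image.fromarray(quantized).save("output.png", optimize=True)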

  • by chrylis ( 262281 ) on Tuesday July 15, 2014 @01:08PM (#47459183)

    and still no merge of the working WebP patch that was proposed four years ago [mozilla.org] because NIH.

    • Google has deliberately killed more technologies than Microsoft ever just let wither and die, and Mozilla has been burned by this more than once. At this point, I'd say it's quite reasonable to demand that Google provide some assurances that it's not going to flake out this time.

      • Google has deliberately killed more technologies than Microsoft ever just let wither and die, and Mozilla has been burned by this more than once.

        Really? You just throw this assertion out there without spending the extra few seconds to name a few of these technologies? I'm really curious.

      • Google has killed services before that cost them money to continue operating; that's not the same thing as an open source codec at all.

    • by roca ( 43122 ) on Tuesday July 15, 2014 @05:54PM (#47462227) Homepage

      The reason we're not merging WebP in a hurry is because it's not very good. The study results linked to in the article show that WebP isn't much better than mozjpeg. (This is especially clear in the second part of the study where mozjpeg is tuned for SSIM.) On the other hand the study shows HEVC *is* much better than WebP/mozjpeg, so we know a much better format than WebP is technically available *now*. We can't simply adopt HEVC as is due to patent licensing issues, but we should be able to create an unencumbered format with similar or better performance (e.g. using VP9 or Daala as a base). It doesn't seem like a good idea to try to move to WebP when we know a better format is coming fairly soon (probably within a couple of years).
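
      For context, SSIM is easy to compute for a single image pair; a rough sketch (scikit-image and Pillow assumed, file names are placeholders, and the study itself compares encoders across many images and bitrates rather than one pair):

        import numpy as np
        from PIL import Image
        from skimage.metrics import structural_similarity

        # Score a compressed candidate against its source on the luma channel.
        original = np.asarray(Image.open("original.png").convert("L"))
        candidate = np.asarray(Image.open("candidate.jpg").convert("L"))

        score = structural_similarity(original, candidate, data_range=255)
        print(f"SSIM: {score:.4f}  (1.0 means identical)")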

      • by rrp ( 537287 )
        One problem with your logic is that WebP isn't just a replacement for JPEG. Sure, it can be used that way, but WebP also supports alpha channels and animations. And yes, you can argue that we can just use an HTML5 video for that (except I've only heard of Chrome supporting transparent videos at the moment...), but that's much more complicated than creating a WebP with those features, which can be shown on a website with a simple img tag, IMHO. And being able to take, for example, a 10 MB animated GIF and shrink...
        • by roca ( 43122 )

          We agree that alpha channel and animations are required features for any next-gen image format, but that doesn't change the analysis. It doesn't make sense to try to get WebP support everywhere for those features or for small compression gains, when in a couple of years or less we could introduce a new format with those features and big compression gains.

          • by rrp ( 537287 )
            WebP already has big compression gains. Just not in comparison to a static JPEG. But compared to an animated GIF it's a huge savings. I know I'm not going to convince you or Mozilla to change your positions, but to outsiders, who aren't involved in the browser wars, it seems rather silly that you guys won't add support for this just because something better might come along a few years down the road, when there's nothing else available to do it now.
      • by Njovich ( 553857 )

        so we know a much better format than WebP is technically available *now*. [...] It doesn't seem like a good idea to try to move to WebP when we know a better format is coming fairly soon (probably within a couple of years).

        So what you are saying is that for the next few years (plus the past 4 years), WebP was available for you to use, better than mozjpeg is even now (and with a bunch of extra features like lossless compression, animations, and an alpha channel), but you will not use it because you are waiting...

      • by AmiMoJo ( 196126 ) *

        So what if WebP is only slightly better than mozjpeg? That isn't a reason not to include the patch. Even a slight improvement is worthwhile for site owners and people on limited speed connections. What reason is there NOT to include it?

        • by olau ( 314197 )

          I think the reason is that once it becomes widespread, all browsers will have to carry the support forever, not just distributing the code but also maintaining it (fixing security bugs etc.).

    • False equivalence. Facebook could use that $60k to fund/bribe inclusion of WebP, or maintain their own build of AssWeasel or whatever fork they want to call it.

      Or they could entice users with "if you want to see pictures, click here". That works really well, and the Facebook-using billions might convert to something other than Firefox.

      Or, they could be supportive of this new tech and not use their massive market share to clobber open source into submission.

      On the other side of the argument, NIH is a terrible...

  • Comment removed based on user account deletion
    • by mwehle ( 2491950 )
      Yes, but this is still 2014, not the future, silly!
      • by gstoddart ( 321705 ) on Tuesday July 15, 2014 @01:23PM (#47459295) Homepage

        Yes, but this is still 2014, not the future, silly!

        Correct, now is now.

        But back then, now was then, and now was in the future. Of course, now 'then' was back then, and was the past. In the future, now will also be in the past, as will then. But by then, then will be now and then will be the present. The future present, not the current present.

        So, soon, when now is later, the then now will be then, and now will still be in the past. But by then we won't have to worry, because it will be now, and I've already told you what happened.

        Every now and then you need to remember which now, which then, how long until then is now, or how long ago then then was now.

        Then you can say that you did know now what you knew then. Of course, when you say now then, it will be a different now than now, because it will be then.

        It's all very complicated now, but by then it will make perfect sense.

        • by mythosaz ( 572040 ) on Tuesday July 15, 2014 @01:35PM (#47459381)

          It's all very complicated now, but by then it will make perfect sense.

          Colonel Sandurz: Try here. Stop.
          Dark Helmet: What the hell am I looking at? When does this happen in the movie?
          Colonel Sandurz: Now. You're looking at now, sir. Everything that happens now, is happening now.
          Dark Helmet: What happened to then?
          Colonel Sandurz: We passed then.
          Dark Helmet: When?
          Colonel Sandurz: Just now. We're at now now.
          Dark Helmet: Go back to then.
          Colonel Sandurz: When?
          Dark Helmet: Now.
          Colonel Sandurz: Now?
          Dark Helmet: Now.
          Colonel Sandurz: I can't.
          Dark Helmet: Why?
          Colonel Sandurz: We missed it.
          Dark Helmet: When?
          Colonel Sandurz: Just now.
          Dark Helmet: When will then be now?
          Colonel Sandurz: Soon.
          Dark Helmet: How soon?

          • by Guspaz ( 556486 )

            Corporal: Sir.
            Dark Helmet: What?
            Corporal: We've identified their location.
            Dark Helmet: Where?
            Corporal: It's the Moon of Vega.
            Col. Sandurz: Good work. Set a course, and prepare for our arrival.
            Dark Helmet: When?
            Corporal: Nineteen-hundred hours, sir.
            Col. Sandurz: By high-noon, tomorrow, they will be our prisoners.
            Dark Helmet: WHOOOOOOO?!?!?

    • by Desler ( 1608317 )

      That's because those nerds were idiots.

    • Comment removed based on user account deletion
    • Any "nerd" who posited that bandwidth and storage concerns would be so totally irrelevant that we'd happily waste 10-20x as much of them for practically zero benefit was not so much a "nerd" as a total idiot. Having more bandwidth means you want to do more with it, not waste it for no reason.

      Real "nerds" worth the cred understand that not only does lossy compression provide great results at small fractions of the sizes of the best lossless representations, but research into lossy compression also helps us u

      • by jensend ( 71114 )

        that'd be this link [sspnet.org]

        that'll teach me to use preview esp. when I've been spending too much time on sites where the article discussions use bbcode

    • by Lehk228 ( 705449 )
      I remember back when images over 15k needed to have a damned good reason for being over 15k
  • G+ is making FF crash its ass off these days. It would be a fun paranoid speculation to imagine that Google knows of bugs in FF which can be triggered with mundane code to make my browser asplode, but even if it were true it would be irrelevant because mundane code shouldn't blow up Firefox.

    • Start Firefox up in safe mode and see if it still happens. Chrome got extensions right... the way Firefox did it, it's nearly impossible to tell whether a problem is caused by the browser or by one of its add-ons.
    • Comment removed based on user account deletion
    • by Megane ( 129182 )
      I have a non-paranoid speculation instead: the Firefox people keep adding new buggy features faster than Google can swerve to avoid the bugs. So I use SeaMonkey.
      • Or, alternately, Google introduces the suck faster than anybody else can counterbalance it.

        Just sayin'.

    • G+ is making FF crash its ass off these days.

      If only there was a simple workaround for this, but it's just not coming to me.

    • Why indeed would Mozilla waste their resources on this when stability and security on web clients ought to be their greater concern?

      If it were up to me, I would start with self-contained data formats like JPEG that browsers handle frequently, and put that code through a formal verification process. Eventually, maybe even HTML rendering and the rest of the browser could be subject to formal verification. This could strengthen computer security dramatically.

  • Take all landscape photos and crop 33% of the space on the left side and 33% on the right side. I see they already do this to widescreen videos taken on smart phones.

    I just saved 66% of their bandwidth, and made the images more appealing to hipsters and guidos!

    • Bah, you could replace 33% of all Facebook pictures with cute cat photos, and nobody would even know the difference.

      • Here is how their new compression method works. It reduces all images down to a single one or zero. If the bit read in is one, it displays a picture of a cat. If it is zero, it displays a picture of Peter Griffin farting. And as a bonus, if no bit can be read, it displays the goatsex guy.

  • AMD recently presented HSA-enabled JPEG decoding. That would also be an interesting addition. Make these shaders work a little...

    http://developer.amd.com/resou... [amd.com]

  • If Facebook really wanted to help reduce global bandwidth and was willing to play hardball, they would just switch their images to WebP overnight, and display a message to update your browser if you can't see them. Microsoft would have to fold if suddenly their browser didn't show images. Not sure if FB is large enough to survive the backlash, but if they are, we could see new codecs in IE within the month.
    • Microsoft would have to fold if suddenly their browser didn't show images.

      With Apple holding a monopoly on web browser engines on 36 percent of tablets, it arguably has room to play hardball with Facebook.

    • I'm not convinced WebP is better. I've done a quick comparison, and to my eyes JPEG beats it at like-for-like file sizes:

      40k jpeg [imgur.com] vs 40k webP [imgur.com] (images converted to png for your viewing convenience)

      Compared to the lossless original [imgur.com] (one of Google's own WebP comparison images), the WebP version has lost more chroma resolution, leading to desaturation in parts and blurring of strong colour details like the red arm band.

      It would be nice as a replacement for PNGs with alpha channels though.
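
      One rough way to put numbers on that impression, rather than eyeballing it: compare each 40k candidate against the lossless original on the Cb/Cr chroma channels. Pillow and NumPy are assumed, and the file names below are placeholders for the three linked images saved locally.

        import numpy as np
        from PIL import Image

        def mean_chroma_error(original_path, candidate_path):
            orig = np.asarray(Image.open(original_path).convert("YCbCr"), dtype=np.int16)
            cand = np.asarray(Image.open(candidate_path).convert("YCbCr"), dtype=np.int16)
            # Channels 1 and 2 are Cb and Cr; report mean absolute error per channel.
            return np.abs(orig[..., 1:] - cand[..., 1:]).mean(axis=(0, 1))

        print("jpeg:", mean_chroma_error("original.png", "jpeg_40k.png"))
        print("webp:", mean_chroma_error("original.png", "webp_40k.png"))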

  • What we really need is a new container format that combines the image data of a JPEG with the 8-bit transparency layer of a PNG image.
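
    Pending such a format, the combination is easy enough to assemble yourself; a minimal sketch with Pillow (the file names are placeholders: the photo stored as JPEG plus its alpha channel kept as a separate 8-bit grayscale PNG of the same dimensions):

      from PIL import Image

      color = Image.open("photo.jpg").convert("RGB")
      alpha = Image.open("photo_alpha.png").convert("L")  # 8-bit transparency layer

      color.putalpha(alpha)          # attach the mask to the JPEG's pixel data
      color.save("photo_rgba.png")   # or composite the two pieces client-side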

    • It exists; it's called JNG, but support for it is poor :(

      • And even if support for it was added tomorrow, we probably couldn't use it for another decade or so because of Internet Explorer (insert version numbers still in use today).

  • by Anonymous Coward

    As long as the renderers support it, and it doesn't come saddled with unFRANDly baggage, it will probably do fine.

    However, Mozilla has a questionable track record [wikipedia.org] when it comes to support for standard media.

    As a Web developer, it rankles me to have to store every video twice on my server, just so that stubborn browser developers can have their way.

  • JPEGmini [jpegmini.com] is a proprietary implementation of the same concept: a fully compatible JPEG compressor that does better than libjpeg at the cost of much higher CPU use.
