Graphics Software

JPEG2000 Coming Soon 489

Posted by michael
from the faster-pr0n dept.
Sonny writes "In a few months' time, internet users will be able to make use of the JPEG2000 standard which, its developers claim, enables web graphics to be downloaded much faster than is currently possible. This will not only make graphics-heavy web pages easier to download, it will also preserve image quality. The JPEG standard compresses image files, which are then transmitted across the web faster than uncompressed files. Now, researchers at universities around the world have developed JPEG2000, the next-generation image-compression technology, under the auspices of the International Organization for Standardization. It is the first major upgrade of the standard since JPEG first appeared in the early '90s. What is also important about the technology is its ability to send files without loss of data, which is not the case with current JPEG files. To take advantage of JPEG2000, web browsers will need a plug-in for either Internet Explorer or Netscape browsers. These free plug-ins are expected to be available later this year. The extension for the new files will be ".jp2"."
This discussion has been archived. No new comments can be posted.

  • Wow! (Score:5, Funny)

    by Starship Trooper (523907) on Sunday April 07, 2002 @02:59PM (#3300045) Homepage Journal
    And only two years late, too!
    • Re:Wow! (Score:4, Funny)

      by iamdrscience (541136) <michaelmtripp&gmail,com> on Sunday April 07, 2002 @03:01PM (#3300055) Homepage
      That was exactly what I was thinking. An alternate obvious witty comment would be "I didn't even know they made JPEG2 through JPEG1999!"
    • LuraTech has been shipping a JPEG 2000 browser plugin (and Photoshop plugin) for a while. I've had problems with their jp2 file format. It's a new and very complex standard, so expect some startup pains. http://www.luratech.de/index_e.html

      Dr. David Taubman was one of the authors of the JPEG 2000 spec. His book on JPEG 2000 is now available. He has also released a very nice SDK for compressing, decompressing, manipulating and streaming JPEG 2000. It's called Kakadu and you can read more at http://www.kakadusoftware.com/. The source code for Kakadu is packaged with the book. There are demo images and software at this site also.
  • Excellent! (Score:4, Funny)

    by Anomolous Cow Herd (457746) on Sunday April 07, 2002 @03:00PM (#3300050) Journal
    I hear that the early adopters at goatse.cx will be implementing this very soon for better compression and the deeper, richer reds supported by the standard.

    Oh, and FP, BITCHES!

  • by Saeculorum (547931) on Sunday April 07, 2002 @03:00PM (#3300054)
    More pr0n, quicker.
  • by checkitout (546879) on Sunday April 07, 2002 @03:04PM (#3300066)
    If we aren't all using PNG right now, there's no way we're gonna be using jp2

    I think we're just stuck with jpeg and gif for about the next 5-10 years, until browsers in general get reinvented.
    • by NanoGator (522640) on Sunday April 07, 2002 @03:10PM (#3300094) Homepage Journal
      "If we aren't all using PNG right now, there's no way we're gonna be using jp2 "

      You're talking about the difference between 300k and 20k. The reason that .PNG wasn't adopted in the internet world is that it didn't compress enough. Also, its alpha channel was never really utilized. There are those in the 3D art world who think .PNG is a godsend, however.

      JPEG2000 has a few things going for it:

      - Familiar Name
      - Familiar Standard
      - Smaller filesizes
      - Likely to be better supported by IE and other browsers

      • While you don't often see it "in the wild" as it were--except in some Japanese manga newsgroups where it's the standard since it's lossless--it's very often used when working with graphics. Unless you need to save with separate layers, why use bulky formats like .PSD or .PSP when doing graphics work and needing to save your work losslessly, when .PNG is usually so much smaller? I save everything as .PNG for future work, and then when it goes up on the Web I can batch output crappier-quality .JPEGs.

        So, the PNG format is a resounding success among those who work with images, and then we dumb it down to JPEG or GIF for the end users.

        All that said, I don't see the JPEG2k standard really succeeding--now that more and more users are getting faster and faster connections, there's not that much need for the smaller filesizes. Combine that with the fact that users have to install a plugin to see them--and God only knows which browsers and what versions they may be using--and I don't see webmasters clamoring to adopt it, and if they don't adopt it in large numbers, the format will never catch on. So it's very iffy at the moment, IMHO, as to whether the new standard will ever replace the old--certainly not in 5 months from now, and probably not even in 5 years from now, since the need for file size savings keeps slowly evaporating.
        • All that said, I don't see the JPEG2k standard really succeeding--now that more and more users are getting faster and faster connections, there's not that much need for the smaller filesizes.

          However, at the same time that endusers are getting faster connections, more and more webmasters are starting to feel the very real pain of bandwidth costs. I think the adoption of smaller image formats will come more from their side.
          If jpeg2000 makes enough of a difference, I can see at least a very large percentage of graphics oriented sites switching - almost forcing the users to follow suit.
          • Interesting point. (Score:5, Insightful)

            by Chasing Amy (450778) <asdfijoaisdf@askdfjpasodf.com> on Sunday April 07, 2002 @04:55PM (#3300560) Homepage
            You have an interesting point--bandwidth is getting more dear, now that the pyramid-scheme banner advertising revenue outfits have been going tits-up. However, I just don't see most website owners risking their livelihoods by implementing an image format which most of their customers, plugins or no, may not be able to read--particularly since so much of web layout is done using images these days instead of text.

            Think about it--how many users are set to automatically download plugins as needed? Almost none, because of security reasons. Therefore, some active decision is needed on behalf of the user to actually install the plugin or not. What will be the user's reaction if he goes to the site of WidgetCo, doesn't know what to do with this dialogue box about installing stuff (especially if he's been told by friends or his company that installing strange software can be dangerous, or if he's been molested by the likes of CometCursor), says "No", and gets a page of big X's where all the buttons and banners should be? Well, it might well be to go to the site of WidgetBiz instead to get his widgets there.

            This is why I really don't see JPEG2k taking off. It's a risk most companies won't take--you don't want your users not being able to use your site. Look how long it took Flash to become as common as it is today--many years, and then only because it started shipping by default with Windows.

            I have no doubt that IE7 will have JPEG2k support--poor and half-hearted support. As with most Microsoft products, it'll probably take until the second major release to get it right, so let's say IE8 will have fully implemented JPEG2k support out-of-the-box. How many years will it be until that's out? And how much further along will available bandwidth be by then?

            I could well be wrong, but I just don't see this taking off. Unlike Flash did, it doesn't bring anything spectacularly new to the table--a few people have been talking about the visual effects you can get using wavelet images, but those same effects are common (if poorly implemented) Flash effects today, in addition to the many other effects Flash does. So that leaves us with the better compression over JPEG as its big marketing point...and I just don't see that being enough to get website owners to risk alienating end users. So *at least* until great JPEG2k support ships with IE out of the box, and that version of IE is common, I don't see JPEG2k going anywhere except into some niche markets.
            • > a risk most companies won't take--you don't
              > want your users not being able to use your site.

              Judging by the state of usability of the vast majority of sites, I'm thinking that most companies couldn't care less whether people can use their sites or not.
          • by Yorrike (322502)
            As a webmaster and web designer, I see the wide spread adoption of SVG and PNG as far more important steps than a new JPEG.

            SVG is the most fantastic vector based graphics format ever created. Not only is it fully scalable to whatever size you care to scale it to, but it's all done in XML, which means scripting graphics creation is nigh on trivial.

            Since SVG is basically text, the file sizes are tiny! Add to this the fact that svgz (gzipped SVGs) are part of the SVG standard, and you end up being able to create fully interactive vector based animations weighing in at less than 1K (try this [xml.com] - it's a perfect example of how cool SVG is).

            As it stands now, SVG can only be viewed on IE and NS4 with a plug-in, but Mozilla supports it natively if you enable it ;) It's a more important standard to propagate than JP2 IMHO.

            On the PNG front: PNG is so much better than any other format for layout graphics on web pages. Its alpha transparency and colour palette are all you need (it runs circles around GIF). PNGs should be the internet standard for non-vector graphics, but alas, IE does not render them properly (the colours get twisted and changed, as far as I've experienced). If MS could stick to standards, it'd make the internet a whole lot better.

            Anyway, in conclusion, JP2 may sound nice, but there are much more important formats out there that need to be adopted before JP2, which will not only cut down the transfer sizes of graphics, but make web development just that much easier for people like me.
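The "scripting graphics creation" point above is easy to demonstrate: because SVG is plain XML, the Python standard library alone can generate it. A minimal sketch (the chart function and its parameters are hypothetical, just for illustration):

```python
# Sketch: generating an SVG document programmatically with ElementTree,
# then gzip-compressing it -- an .svgz file is just the same text, gzipped.
import gzip
import xml.etree.ElementTree as ET

SVG_NS = "http://www.w3.org/2000/svg"

def make_chart(values, width=200, height=100):
    """Build a tiny SVG bar chart from a list of numbers."""
    svg = ET.Element("svg", xmlns=SVG_NS,
                     width=str(width), height=str(height))
    bar_w = width / len(values)
    peak = max(values)
    for i, v in enumerate(values):
        h = height * v / peak
        ET.SubElement(svg, "rect",
                      x=str(i * bar_w), y=str(height - h),
                      width=str(bar_w - 2), height=str(h),
                      fill="steelblue")
    return ET.tostring(svg, encoding="unicode")

doc = make_chart([3, 7, 5, 9])
compressed = gzip.compress(doc.encode("utf-8"))  # the "svgz" trick
```

The gzip step at the end is all that the svgz part of the standard amounts to, which is why the files come out so small.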

            • by Tet (2721)
              SVG is the most fantastic vector based graphics format ever created.

              SVG is a good vector format for the arena it was designed to serve (primarily, the web). For other uses, the text based markup is a tad bloated, and the fact that it's easily scriptable isn't a factor. It's not perfect, but the web needs a good, open vector graphics format, and SVG is a well designed option, in most ways. I just wish they'd get the fonts right [levien.com]. Of course, Flash has been providing web based vector graphics for ages. It's just that it was always aimed at presentation, and didn't take into account accessibility, searching, consistency of navigation and all the other things that we should expect a vector format to provide. In that respect, SVG is a significant step forward, and I hope it starts to gain widespread acceptance soon. But with even Mozilla not supporting it in many of the standard builds, it has a way to go before that stage.

        • by blamanj (253811) on Sunday April 07, 2002 @04:14PM (#3300394)
          But web pages aren't the only things digital images are used in. Think cameras.

          This site [dpreview.com] illustrates the difference in quality between JPEG and JPEG2K. You get essentially a 5x reduction in storage space without losing quality, and the type of artifacts aren't as annoying, either.
    • Speak for yourself. I'm heavily using PNG right now. Why is it important that everyone should use new standards? As long as they are supported in browsers (*) and I am free to use them, I don't care what everyone else is using.

      Sure, shorter download time would be nice, but PNG isn't really providing that. PNG however makes the job of the web developer easier.

      (*) I'm still annoyed that IE doesn't support alpha-transparency though.
      • (*) I'm still annoyed that IE doesn't support alpha-transparency though.

        It does, you just have to use the IE-proprietary AlphaImageLoader filter (it's a CSS extension). I agree this is a pain in the ass, and why they just don't support the alpha channel with regular img tags is beyond me, but at least with a little PHP or Javascript you can make it work.

        Jason.

    • If we aren't all using PNG right now, there's no way we're gonna be using jp2

      Who are you ?
      I use .png very often, for example in my programs (tiles, textures) or when I save scanner output.
      • Look at graphics on websites that you visit. I haven't found one major site that uses pngs (not that I've looked all that often, but wherever I look, it's a .jpg or .gif). I throw my graphics around as .pngs, but that doesn't make it a standard. As an internet community, we aren't using PNG.
  • Stupid extensions (Score:5, Insightful)

    by binner1 (516856) <bdwalton.gmail@com> on Sunday April 07, 2002 @03:04PM (#3300068) Homepage
    Why .jp2??? Why not .jpeg2. This legacy DOS naming convention drives me nuts. Not even Windows is crappy enough to still require 8.3 filenames.

    I still cringe when I see default.htm. It's a frickin' html file, name it properly.

    -Ben
    • by checkitout (546879) on Sunday April 07, 2002 @03:08PM (#3300088)
      Why .jp2??? Why not .jpeg2.

      Because they're latching onto the idea (and popularity) of .mp3, and we don't have a .mpeg3 extension in active use.
    • by Snowfox (34467) <snowfox&snowfox,net> on Sunday April 07, 2002 @03:32PM (#3300210) Homepage
      Why .jp2??? Why not .jpeg2. This legacy DOS naming convention drives me nuts. Not even Windows is crappy enough to still require 8.3 filenames.

      Just because names can be made longer doesn't mean that they should.

      .jp2 is sufficiently clear, and it won't clutter directory listings. Save the longer, more descriptive extensions for more obscure things.

      • i think .jpg2 would be much clearer than .jp2. most files that start with a .j and aren't a .jpg are java or javascript files up till now, at least in webwork. for someone not in the industry it can get confusing fast. why not make use of one more character, and make it completely clear to everyone what it is? anyone that knows what a jpg is will know what a jpg2 is, but they may not know what a jp2 is. just a thought
    • Re:Stupid extensions (Score:3, Interesting)

      by dimator (71399)
      It would be interesting to see how much bandwidth is saved by not sending those two other characters to browsers. I bet it adds up quick. :)

      You're right about 8.3, though. It disgusts me. But this is just the .3. You don't have to stick with the 8 in front, which I can live with.

    • by GSloop (165220)
      How about JUST KILLING THE STUPID EXTENSION ENTIRELY!

      I really hate Windows' use of extensions. Why should extensions control the application association? Remember OS/2? It had an EA (extended attribute) that associated the file with an application.

      I have all sorts of problems with users renaming files so they associate with the wrong thing, then they wonder why Word won't open an EXE properly - or other similarly stupid things.

      The extension should either not exist, or only exist for additional naming info. It shouldn't have anything to do with application association.

      Windows, it seems, has never done it right the first time. Every time we get to be the test subjects for MS's stupid ideas. (How about Clippy! DAMN!) Sometimes they eventually get it right. (How about network file share rights and inheritance in NT! Every user that has rights to a folder farther down in the tree must have at least browse rights to every directory higher in the tree to use Explorer to navigate to the folder they have rights to... which also allows them to see all the files in the folders they shouldn't!)

      I'll quit ranting now!

      Cheers!
      • The best part is that most users cannot comprehend that the extension doesn't make the file that format. Example:

        For a project I was told I'd need to parse the Excel files that users submitted, and was given .xls files. So I looked around for something to allow me to read Excel files from Java, and found something from IBM that actually loaded Excel in the background to do it; it worked, but crashed all the time. Then I found something that was supposed to read the binary format without any MS crap, spent a few hours trying to make it work, no luck - it didn't recognize the format. Of course, only then did it occur to me to actually check what the files were, and sure enough: tab-delimited text files with .xls extensions (which Excel loaded and converted on the fly when I was using the IBM thing). I felt very stupid, for a long time.
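A cheap guard against that trap is to sniff the file's actual bytes before picking a parser. A minimal sketch (real .xls files begin with the OLE2 compound-document magic number shown; the function name is made up for the example):

```python
# Sketch: decide whether an ".xls" upload is really a binary Excel file
# or just delimited text wearing the wrong extension.
OLE2_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"  # OLE2 header used by real .xls

def sniff_xls(data: bytes) -> str:
    """Return a best guess at what the bytes actually are."""
    if data.startswith(OLE2_MAGIC):
        return "excel"
    # Fall back to a crude text heuristic on the first KB.
    head = data[:1024].decode("latin-1", errors="replace")
    if "\t" in head:
        return "tab-delimited text"
    return "unknown"
```

Running the sniffer first would have routed those "Excel" files straight to a TSV parser and saved the hours of debugging described above.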

  • A bit late... (Score:3, Informative)

    by qnonsense (12235) on Sunday April 07, 2002 @03:06PM (#3300075)
    This tech does seem to be a bit late, and I don't just mean behind schedule. I mean LATE. I remember reading about JPEG2000, and other "next-gen" compression types, wavelets, fractal compression, etc., way, way back in the dark old modem days when the big controversy was K56flex or X2. Nowadays, really, what's the point? JPEG (plain) is just fine (and PNG is even better), now that bandwidth, processing power and memory are to be had aplenty. Even freaking cell phones have 8 megs of RAM, a fast processor, and 2.4Mbit/sec connections. JP2 might be the greatest thing since sliced bread, but face it: the glory days of image compression are over.

    ciao
    • Never late. (Score:3, Insightful)

      by Electrawn (321224)
      If my digital camera can squeeze out 5 extra pictures ...great.

      Many places are STILL on dial up and will continue to be.

      Any new technology that compresses well can only be a good thing.
    • First, contrary to what you may have heard, broadband is not ubiquitous. I live in a completely developed (sub)urban area and my only choice for broadband is wireless that requires an antenna that is not allowed by my HOA. My mother lives a couple of miles away and she can't get a cable modem or DSL either.

      Beyond that, I don't know anyone who feels like he has too much bandwidth. There seems to be a principle that a lot of people (especially Windows programmers) subscribe to that says "my use of resources should expand to consume the resources available." I don't understand why this is, especially since we live in, and have been living in for some time, an age of multi-tasking.

      -Peter
    • JPEG (plain) is just fine (and PNG is even better), now that bandwidth, processing power and memory are to be had aplenty. Even freaking cell phones have 8 megs of ram, a fast processor, and 2.4mbits/sec connections.

      ... shared with all the other phones in the cell, its contiguous neighbor cells, some cells further on, and the other carriers in the area.

      Bandwidth is not free and unlimited, especially when it's wireless. Want more, use higher frequencies - and pay your piece of the cost for better equipment, plus a hefty profit. Do you stop when the cell towers are flooding the neighborhood with x-rays, or do you keep going into the gammas?

      Better compression means faster downloading at ANY bandwidth, which is significant until the downloads are faster than you can see. Then think movies. Then think movies at HDTV resolutions, in VR total surround video, at flicker-free-during-eye-motion frame rates. Then think separate movies for each of the people in your family at the same time.

      They'll keep implementing better chips. But they'll have a REALLY hard time implementing better physics.

      And even when they do come up with more bandwidth you'll still have to PAY somebody to supply it to you. Right now the CLECs (Competitive Local Exchange Carriers) are about all dead. Without the competition the ILECs (Incumbent Local Exchange Carriers - mostly the fragments of the old Bell monopoly) have no incentive to sell you a broader pipe and undercut their own pricing on the narrower ones you're already paying for. And with the death of the CLECs it will be a long time before anybody invests a few billions to start up another one of 'em.
    • I don't know where you get that from; the last number I heard, there were only something like 65% of people in the nation online, and less than 20% of those had broadband.

      I have DSL, but only at 144K/sec (the fastest available because of the phone company here), which is faster than dial-up, but large files still take time. I welcome some compression on images.

    • Better is always, well, better. I mean, if it can compress more, the better it will be for some people. Just think of movie artists (I'm saying this because I watched the audio commentary of Fight Club yesterday), who sometimes have to work with 8k x 8k pictures. That's a LOT of pixels. If they can save a few megabytes on a picture, which is one in a collection of thousands of pictures, then I guess they'll use the JPEG2000 format.
  • by DrSkwid (118965) on Sunday April 07, 2002 @03:08PM (#3300085) Homepage Journal
    The JPEG standard compresses image files which are then transmitted across the web faster than uncompressed files.

    excellent, using jpeg2000 increases my bandwidth too!

    There I was thinking they downloaded at the same speed but in less time!
    • Actually that's not strictly true:

      It downloads so quickly that it goes back in time, creating a parallel universe where jpeg2000 does not exist. This, obviously, causes a normal jpeg file (which has the same image as the jpeg2000 file you were downloading) to appear in our universe, on your computer, at about the same time as when you request the download, but spinning in the opposite direction (also, and I am not sure about this, if you are using IE, the little globe icon also spins in the opposite direction).

  • Slow to change ... (Score:5, Insightful)

    by Jobe_br (27348) <`moc.liamg' `ta' `hturdb'> on Sunday April 07, 2002 @03:09PM (#3300091)
    I don't want to be a nay-sayer in any way, but I predict that this will catch on about as quickly as PNGs replacing GIFs. Most professional quality sites still use GIFs instead of PNG, even though tools such as Adobe's Imageready and Macromedia's Fireworks have supported the PNG format alongside GIFs for a while now AND most major browsers support PNGs natively (which wasn't the case not too long ago, with IE4, I believe).

    Until the .jp2 format no longer requires a plugin for 99% of the browsers out there, it won't be widely used, IMHO. Of course, I could be wrong, and the .jp2 format might not even be meant for wide-spread adoption, but mainly for particular niche uses (such as viewing Hubble images or replacing the need for lossless TIFFs).

    Just my $0.02.
    • by Darren Winsper (136155) on Sunday April 07, 2002 @03:25PM (#3300175) Homepage
      The major blocker for PNG is the fact that IE does not support its alpha channel. I originally used PNGs with alpha channels on a web site I made, but then had to replace them when I found out IE didn't support the alpha channel. This was a pain in the arse because the end result looks a tad crappy.
      • Let's be honest, IE has a rubbish renderer, full stop. It's not just this.

        Two things scream out at me. One, I can reliably set up layouts with nested tables which, under IE, display in a way which is indisputably incorrect. Two, we have a bunch of machines at work which muck about with several sites we've produced. Essentially, IE doesn't render the images right. Occasionally it doesn't render an image at all (rare, but it has happened, and it doesn't leave a placeholder because it doesn't realise it hasn't shown it) until the image is clicked on or scrolled off the screen and back on. More commonly, it sticks a spurious transparency in place of white on some images, or even effectively makes the whole image slightly opaque. When there's an image background, that gets messy...

        Bottom line, it's sloppy.
    • I'm using IE5 on OS X. Every time I load up a page that contains PNG images, IE tries to save the files.... :-( Obviously it doesn't know what to do with PNG images. I think the browsers still have some catching up to do.
  • by angryargus (559948) on Sunday April 07, 2002 @03:11PM (#3300101)
    What is also important about the technology is its ability to send files without loss of data, which is not the case with current JPEG files.

    JPEG does support a lossless mode, it's just that no one uses it. To paraphrase, JPEG supports a lossless spatial algorithm that operates in the pixel domain. Some amount of prediction is used, resulting in about 2:1 compression, but the error terms for the predictions are included in the data stream (encoded using either Huffman or arithmetic coding), resulting in no net errors.

    What's a lot more exciting is JPEG2000's use of wavelet compression, which isn't mentioned at all.
    • You mean someone is actually going to use wavelets for something???? Egads, all I've ever seen is endless R&D on it, and it never seems to go anywhere, even though they claim it would revolutionize compression in the image world!

      Hooray for wavelets! Now if only someone would re-explain them to me. I didn't catch it the first time, and no one has said anything high-level enough since (I'm not interested in the nitty gritty at this point).
      • afaik .ogg uses wavelets, and gets better sound at lower bitrate (subjective..) than FFT -based mp3.
        • Vorbis (the sound codec; keep in mind that ogg is just the container format!) currently only uses the DCT, the same transform used in current JPEGs as well as MP3s. However, according to their FAQ, wavelets will be supported in the future (I'm too lazy to find the link; sorry!).


          Ogg Tarkin (the video codec) will be using wavelets from the beginning, though.

        • Ogg Vorbis uses MDCT and not wavelets.

          See the FAQ [xiph.org] for details.
      • by MWright (88261) on Sunday April 07, 2002 @03:50PM (#3300294)
        I'll give a really quick, basic explanation.

        The lifting algorithm (one way of computing the wavelet transform; actually, the simplest to understand and one of the fastest) works by splitting the original signal into two (in the case of a 1-D signal) subsignals. One of these is the "trend" signal. It's sort of a compact version of the original one. Using only this signal, the original signal can be reconstructed pretty well, but not perfectly. That's where the other signal, the "detail" signal, comes in. It contains the information needed to reconstruct the original signal perfectly. If the trend signal is a good prediction of the original signal, the detail signal will be very small, and can be compressed well.

        But there's no need to stop there. The whole process can be applied to the trend signal again, and even to the detail signal if necessary.

        I'll give a more concrete example, the Haar wavelet transform. In this case, the trend signal is simply the averages of consecutive pairs of the original signal. So, if we start with

        1, 3, 4, 5

        the trend signal would be

        2, 4.5.

        If we were to reconstruct the signal with only this information, we'd get

        2, 2, 4.5, 4.5,

        which is not too bad. The detail signal would contain the information needed to get from this to the original signal; differences between pairs of consecutive terms will work. (Note that these pairs shouldn't overlap; that would just be redundant. Therefore, the detail signal, as well as the trend signal, is each half as long as the original one.) So, the detail signal in this case is

        2, 1.

        It's easy to see that if the original signal is constant, the detail signal will be all zeros. It's possible to construct different ways of making the trend and detail signals such that the detail signal will be zero if the original signal is linear, for example.

        A good paper about this (that explains it better than I do!) is Building Your Own Wavelets at Home [bell-labs.com]
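The worked Haar example above is easy to check in code; this sketch reproduces exactly the numbers in the comment (pairwise averages as the trend, pairwise differences as the detail):

```python
# One level of the Haar wavelet transform, as described above.
def haar_step(signal):
    evens, odds = signal[::2], signal[1::2]
    trend = [(a + b) / 2 for a, b in zip(evens, odds)]   # pairwise averages
    detail = [b - a for a, b in zip(evens, odds)]        # pairwise differences
    return trend, detail

def haar_inverse(trend, detail):
    out = []
    for t, d in zip(trend, detail):
        out.append(t - d / 2)  # first element of the pair
        out.append(t + d / 2)  # second element of the pair
    return out

trend, detail = haar_step([1, 3, 4, 5])
# trend == [2.0, 4.5] and detail == [2, 1], matching the worked example,
# and the inverse reconstructs the original signal exactly.
assert haar_inverse(trend, detail) == [1, 3, 4, 5]
```

Dropping the detail signal and keeping only the trend gives the lossy "not too bad" reconstruction from the example; keeping both makes the transform perfectly invertible.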

      • MWright had a good technical introduction, so I'll just outline a few of the practical areas where wavelets make JPEG2000 rock.

        First, compression efficiency. Although lossless and near-lossless quality isn't hugely better, data rates for "good enough" quality (defined as where the image is understandable and artifacts aren't too distracting) are much, much lower. Unlike the old Discrete Cosine Transform (DCT) method of JPEG, which gets blocky at high compression, wavelets get softer, which is much less obvious. So, while this might not help much with pro digital cameras (their images were lightly compressed in the first place), it will help a lot with web images and such, or consumer digital cameras.

        The other nice thing about wavelets is that they are constructed in bands. First, the base image is encoded at a low resolution. Then this is used as prediction for the next resolution, and that band only has to store how the image is different from the prediction.

        This is groovy, because you can decode the individual bands as they're transmitted, giving a low-resolution proxy image once only a few percent of the file is transmitted, getting progressively higher quality over time. While progressive/interlaced JPEG/GIF gets the same effect, wavelets do it more efficiently.

        Many years ago, Intel created a video file format that used these properties, IVF. It never got any market traction. You can still get the tools from Ligos.
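The band structure described above can be sketched by applying a pairwise-average step recursively: the coarsest level goes over the wire first as a low-resolution proxy, and each detail band then doubles the resolution. A toy 1-D model, not the actual JPEG2000 codestream layout:

```python
# Toy multiresolution pyramid: repeatedly halve the signal by averaging
# pairs, keeping the pairwise differences as a detail band at each level.
def build_pyramid(signal, levels):
    bands = []
    for _ in range(levels):
        trend = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
        detail = [b - a for a, b in zip(signal[::2], signal[1::2])]
        bands.append(detail)
        signal = trend
    return signal, bands  # coarse base first, then the detail bands

def refine(base, detail):
    """Upsample the base one level using a received detail band."""
    out = []
    for t, d in zip(base, detail):
        out.extend([t - d / 2, t + d / 2])
    return out

base, bands = build_pyramid([1, 3, 4, 5, 7, 7, 2, 4], 2)
# 'base' alone is a 2-sample proxy, viewable before the rest arrives;
# applying the detail bands coarse-to-fine reconstructs the original exactly.
for detail in reversed(bands):
    base = refine(base, detail)
assert base == [1, 3, 4, 5, 7, 7, 2, 4]
```

The point of the sketch is the transmission order: a decoder can show `base` immediately and keep refining as each band lands, which is the progressive-display behaviour the comment describes.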

    • JPEG does support a lossless mode, it's just that no one uses it.

      Lossless JPEG uses patented IBM stuff (I think the rest of JPEG uses various patents as well, but everyone agreed to freely license them, IBM didn't agree for the lossless stuff). I think that is the big reason pretty much nobody uses it.

  • Porn sites get more efficient, making more money... or will they lower their fees?

    tough choice.
  • by Wonderkid (541329) on Sunday April 07, 2002 @03:12PM (#3300110) Homepage
    About time! JPEG 2000 was mentioned in Electronic Engineering Times many years ago. The next revision of Artwalker.com (where you explore the world through landscape paintings) will be completely displayed using JP2 because it has one vital characteristic: images can be scaled in real time (via the server). For example, instead of displaying a thumbnail of say 50x60 pixels, and having the user click the thumbnail to view the full size image (say, 640x480), a JP2 image can be made to display as a percentage of the total size of the display window (or browser width) in a similar fashion to a vector graphic, such as that generated by Flash. This will be excellent for mobile devices with differing screen resolutions and make for some very cool ZOOM tools on browsers and in Photoshop etc. We have been waiting for this since 1996, when we launched Artwalker! Soon it will be time to get going on converting all our high resolution images to JP2. A lot of work!
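At its simplest, the percentage-of-window behaviour described above is an aspect-ratio calculation the server performs per request before decoding the JP2 at a reduced resolution. A hypothetical sketch (the function and its parameters are illustrative; the actual Artwalker implementation isn't public):

```python
# Sketch: pick the pixel dimensions to serve an image at, given the
# client's window width and a requested percentage. One attraction of
# JPEG2000 here is that a decoder can extract a reduced resolution
# directly from the codestream instead of decoding full size first.
def target_size(full_w, full_h, window_w, percent):
    out_w = max(1, round(window_w * percent / 100))
    out_h = max(1, round(full_h * out_w / full_w))  # preserve aspect ratio
    return out_w, out_h

# e.g. a 640x480 source shown at 50% of a 1024px-wide window:
assert target_size(640, 480, 1024, 50) == (512, 384)
```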
  • Mozilla & jpeg2000 (Score:5, Informative)

    by Majix (139279) on Sunday April 07, 2002 @03:13PM (#3300111) Homepage
    See this bugzilla entry [mozilla.org] for Mozilla's jpeg2000 progress.

    Doesn't seem too promising:
    If you look at appendix L of the jpeg2000 draft, there are 22 companies who believe that implementing the spec may require use of their patents.

    PNG still hasn't taken off despite being supported in all major browsers (now if only IE did proper alpha, any year now...), how much chance does an image format that requires third party plugins have?
    • "Sorry, links to Bugzilla from Slashdot are disabled."

      Bwahaahaha!! I suggested doing this before, when Slashdot linked to Bugzilla directly from the story description; I never thought they would actually do it!

      Very nice way to prevent the slashdot effect ;-)
    • by WasterDave (20047)
      ...meaning, of course, that in order to be able to implement JPEG2000 we need a patent licensing authority - JPEG-LA, anyone? Of course, they would want to charge 0.001c/download and everyone goes running back to DCT.

      Seriously, this is exactly the position in video compression right now. Dire, in other words.

      Dave
    • The Relevant Entry (Score:5, Interesting)

      by Sentry21 (8183) on Sunday April 07, 2002 @05:41PM (#3300749) Journal
      ------- Additional Comment #14 From tor@acm.org [mailto] 2001-07-31 10:47 -------

      Here's a summary of the jpeg2000 situation that I wrote up but that never made it into bugzilla:

      You might want to ask Tom Lane, head of the Independent JPEG Group, for his opinion.


      It seems that adding jpeg2000 support would get us involved in a legal mess. If you look at appendix L of the jpeg2000 draft, there are 22 companies who believe that implementing the spec may require use of their patents. From http://www.jpeg.org/CDs15444.htm :

      Particular attention is drawn to the use within the standard of selected technology or algorithms which are claimed to be protected by national and/or international patents. In the case of technology submitted for consideration by JPEG for incorporation in Part 1 of the standard, the JPEG committee believes that the individual organisations concerned will make available licences to use this intellectual property, on a royalty- and fee-free basis, under specified conditions which may apply only to conforming implementations of the standard. These conditions are available on application to the organisations concerned, which are listed in an Annex to the document.


      It is of course still possible that other organisations or individuals may claim intellectual property rights that affect implementation of the standard, and any implementors are urged to carry out their own searches and investigations in this area. The JPEG committee requests any organisations or individuals claiming (or being aware of claims) that any of the committee drafts available for download here infringes other intellectual property rights to provide information and/or evidence to substantiate their claim to the JPEG Convener in the first instance.


      Moving on to more practical considerations, there is one open (sort of) C implementation of the jpeg2000 standard that I'm aware of, Jasper:

      http://www.ece.ubc.ca/~mdadams/jasper/ [ece.ubc.ca]

      The licensing terms are specified in this document:

      http://www.ece.ubc.ca/~mdadams/jasper/LICENSE-1.00 0 [ece.ubc.ca]

      While I'm not a lawyer, the impression I get is that once ISO officially publishes part 5 of the jpeg2000 standard we're free to use the library as we like.
  • by Gothmolly (148874) on Sunday April 07, 2002 @03:13PM (#3300115)
    In unconfirmed reports, the developers were said to be using the lzip [sourceforge.net] algorithm. As quoted on C|Net:
    We plan to integrate this into an ActivePlugin plugin for Internet Explorer 7, which will allow users to set their compression preferences, and the browser will request a given compression level. Initial testing indicates that this works, but we're experiencing some data loss. We'll address this with the developers and lzip, and probably get Microsoft involved. They have lots of experience with data corruption.

  • by Zuna (317219)
    Don't think that just because it causes the user to download a plugin that web developers will be afraid to use it. After all, just look at Flash.

    However, I think it'll really catch on whenever the next versions of the browsers are released with standard support for JPEG2000.
  • by Anonymous Coward on Sunday April 07, 2002 @03:15PM (#3300127)
    I thought this [aware.com] was a good comparison between JPEG and JPEG2000.

    • by Ungrounded Lightning (62228) on Sunday April 07, 2002 @04:17PM (#3300404) Journal
      I thought this [aware.com] was a good comparison between JPEG and JPEG2000.

      Good one. Thanks for the link.

      Looks like JPEG2000 finally got things right for the human eye:

      - Higher compression ratios just gently blur details, rather than creating artifacts. Losing the extra information leaves the part that DID get through intact.

      - The text says the compression allows for progressive downloading. This implies that the coding scheme does something like working upward in spatial frequency - encoding the basic stuff first then sending progressively finer deltas. For a given compression ratio just stop the downloading (or file saving) when you have enough.

      - The compression seems to match eye processing so well that highly compressed (100:1) images actually look BETTER than the basic image. The features important to the eye (facial stuff, especially eyes) get through or are even enhanced, while unnecessary detail - including sampling artifacts - gets selectively blurred out. Something like the soft-focus filter in portrait photography. The only thing in the samples that got noticeably worse at high compression is hair, which just gets a bit blurry. (Meanwhile, JPEG looked like it had pixelated measles.)

      Of course the images selected for the demo could have been optimized for the compression scheme. B-)
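      The parent's guess about coarse-to-fine coding can be illustrated with a toy 1-D Haar transform in plain Python. (This is only a sketch of the general wavelet idea; the actual JPEG2000 filter banks are the 5/3 and 9/7 wavelets, and the signal here is made up. Assumes a power-of-two signal length.) Reconstructing from only the coarse bands gives a blurrier but complete signal, which is exactly the "stop when you have enough" behavior described above:

      ```python
      def haar_forward(x):
          """Full 1-D Haar decomposition of a power-of-two-length signal.
          Returns bands in coarse-to-fine order: [DC] + detail bands."""
          bands = []
          cur = list(x)
          while len(cur) > 1:
              avg = [(cur[i] + cur[i + 1]) / 2 for i in range(0, len(cur), 2)]
              det = [(cur[i] - cur[i + 1]) / 2 for i in range(0, len(cur), 2)]
              bands.append(det)
              cur = avg
          bands.append(cur)          # single overall average (DC)
          return bands[::-1]         # coarse first, finest detail last

      def haar_inverse(bands):
          """Rebuild the signal; a band given as None is treated as all zeros,
          i.e. 'we stopped downloading before this level arrived'."""
          cur = list(bands[0])
          for det in bands[1:]:
              if det is None:
                  det = [0.0] * len(cur)
              nxt = []
              for a, d in zip(cur, det):
                  nxt.extend([a + d, a - d])
              cur = nxt
          return cur

      signal = [10, 12, 14, 18, 30, 32, 2, 4]
      bands = haar_forward(signal)
      exact = haar_inverse(bands)                                   # all bands: lossless
      coarse = haar_inverse([bands[0], bands[1], bands[2], None])   # finest band dropped
      ```

      With all bands, the reconstruction is bit-exact; dropping the finest band yields [11, 11, 16, 16, 31, 31, 3, 3] - each pair of neighbors blurred to its average, a perfectly usable low-detail preview.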
  • by JohnA (131062) <johnanderson@gm[ ].com ['ail' in gap]> on Sunday April 07, 2002 @03:16PM (#3300130) Homepage
    According to this EE Times article [eetimes.com], there are several patents that are licensed "royalty free" to implementers of the JPEG2000 Part 1 specification. Sound familiar?

    I remember a similar promise made about LZW compression in the GIF standard by Compuserve. What is to stop these companies from requiring license fees at some arbitrary point in the future once the technology is widely used?

    Additionally, there doesn't seem to have been very much due diligence performed with regard to other patents over the techniques utilized in the standard. Even if all of the known patents are licensed royalty-free, there exists the very real possibility that a submarine patent will be exposed - after the standard is widely utilized, of course.

    Of course, this won't matter once all of our PCs are replaced with sealed, SSSCA-compliant, government issued "convergence appliances"... :-)

    • I remember a similar promise made about LZW compression in the GIF standard by Compuserve. What is to stop these companies from requiring license fees at some arbitrary point in the future once the technology is widely used?

      I don't recall Compuserve ever promising that, in fact at the time they made GIF nobody really thought software patents were workable...except the patent office.

      Plus if CompuServe tells me I can use Unisys's patented crud, why should I believe them? I only trust what IBM says about IBM's patents. Likewise for JPEG2000, I'll believe I can get a royalty free license only if the patent holders sign for it, not 3rd parties.

      If you look at RAMBUS you will see they made a similar promise when they were at the JEDEC meetings that eventually produced SDRAM, and while they did sue, when someone finally decided not to settle, RAMBUS got spanked. Hard. So while it ain't perfect, there is some reason to believe it will work out OK.

    • And what's more... (Score:2, Interesting)

      by KarmaSafe (560887)
      There was a dead link about "watermarking" on the page. Does this mean we'll be seeing "copy protection" built into images?

      I say we just refine the .png standard. Human eyes don't see blue well; just make it lossy (one time, saving again doesn't make it worse) in the blue spectrum.
  • by adamwright (536224) on Sunday April 07, 2002 @03:20PM (#3300151) Homepage
    I've been involved in JPEG 2000 for a while now, and come to the conclusion that..

    A) It's an excellent codec, though computationally heavy
    B) The design of the codestream along with the JP2/JPX file format has a lot of potential to create a "new" type of image that isn't just a picture. Yes, you've heard this before, but this time it's built in at the codec level. In-stream ROIs, very flexible reconstruction, and compression controllable through a great number of options - and that's only the codec (at a *very* rudimentary level :).
    C) It won't succeed without a decent opensource, "IPR free" (as much as is possible) implementation.
    D) Read C again. It's important

    To this end, I've started (with support from others in the JPEG 2000 community) a JPEG 2000 Group (See http://www.j2g.org [j2g.org] - It's very sparse at the moment, but if you're interested, bookmark it and come back in about a month). Tom Lane and the IJG have expressed no interest in JPEG2000, for various reasons (which I don't entirely disagree with, but I'd rather be proactive and try to correct flaws than walk away totally).

    The aims of the JPEG 2000 Group are to create a public, open source (probably BSD license) implementation of "Part 1" (This is the codestream syntax, codec, and file format wrapper). We'll also provide a community JPEG 2000 resource. To facilitate this, we've already attained a Class C liaison with the committee. This grants all members the option of acquiring the standard free of charge. We also get a minimal channel back into the process to give opinions.

    The point of this ever-rambling post is this: We need members. The standard is large, and the support around it will be larger. We need volunteers who would be interested in assisting in the creation of the codec. Sadly, "membership" is going to require some form of contribution and commitment to acquire copies of the texts you'll need - I hate this as much as you, but it was accept it, or don't get any copies at all (without $$$). If you're interested in contributing in any way (code, documents, testing, support), please drop by the forum [j2g.org] - Even if it's only a passing interest, I'd be happy to go into more detail regarding the project (or just JPEG 2000 itself). I'd do it here, but I'd lose all my (low :) karma in offtopics.

    So, rather than bitch about the lack of a free implementation and how late it is, and how it'll never get used, come and help out! You know you (might possibly | maybe | someday) want to!
    • The comparisons pointed to in the main articles (including the PDFs one or two layers deep in) are missing a couple of things:
      • DEcompression speeds - yes, compression is usually the hard part, but it's helpful to know decompression speeds as well, since realistically most images will be decompressed far more than one time. Will it be fast enough for browsers to decompress in real-time on cable modems with 500MHz machines, or will it be dog-slow compared to download time even on 56kbps modems on 5GHz machines? Is it much faster or slower than existing JPEG?
      • Color depths beyond 8 bits - all the reviews used 8-bit color depths, which is fine for the previous generation of scanners and not too bad for monitors, but under-$200 flatbed scanners are now doing 48-bit pixels (so 16 bits/color) as well as more dots per inch. How well do the new compression algorithms support them? How much does this affect compression speed?
      • Conversion from JPEG, for digital cameras and scanners? Many input devices are smart enough to compress their images using JPEG, but especially for cameras, won't have the horsepower to do JPEG2000 compression. How well will the new algorithm support recompressing pictures originally compressed with JPEG?
      • Also, will browsers be able to do progressive decoding, so they can start by displaying rough versions of the image and gradually refining it, or will this be a compression mode nobody bothers using like it seems to be for JPEG?
  • anytime soon, that is. To take advantage of JPEG2000, web browsers will need a plug-in for either Internet Explorer or Netscape. I don't mind downloading a plug-in to get faster images, but the average user only knows "plug-in" as the air freshener Glade makes. Not to mention, will companies be willing to switch to this format when most average users won't see it? Unless IE, Netscape, Mozilla, etc. support it by default, it won't get used much.
  • libjpeg (Score:2, Insightful)

    by NotoriousQ (457789)
    web browsers will need a Plug-In for either Internet Explorer or Netscape browsers.

    For some of us that compile our own code and use dynamic and static libraries, the change would be as transparent as recompiling libjpeg.

    just another reason I like open source.
  • It's good to see the porn comments piling up. This is worse than the body count in a bad slasher flick.
  • by chrysalis (50680) on Sunday April 07, 2002 @03:25PM (#3300173) Homepage
    Everyone is still using old formats like GIF and JPEG.

    But there are other, more powerful formats.

    For non-destructive compression, the PNG format is fortunately getting more and more popular, although its late inclusion in Internet Explorer slows down its wide adoption.

    But when it comes to destructive compression, there's an excellent (and not new) format made by AT&T called DjVu. It was one of the first wavelet-based formats.

    DjVu is really better than JPEG. Images are better looking (more contrast, fewer pixels with odd colors), and files are way smaller. Plus you can smoothly zoom any DjVu image without getting big and ugly blocks.

    DjVu has been available for a while as a plugin for common browsers.

    There's a 100% free implementation of the format called DjVuLibre [sf.net] .

    However, nobody uses it. I don't understand why. Some time ago, it may have been because compression was slow, but nowadays that's no longer a valid point.

    People are enthusiastic about JPEG2000. But why would JPEG2000 be adopted when DjVu never was?


    • DjVu is actually quite an impressive file format. We've used it where I work, and it typically gets a 3 meg 200dpi PNG-compressed document image down to around 50-150k, and it is still remarkably clear, close to the original. What I understand it does is separate the background from the foreground, compress the foreground in a lossless format, and then compress the background using a lossy wavelet-based format. Then the final display decoder merges the two together quite well...

      There's a technical description here [sourceforge.net].
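      The layered approach described above can be sketched in a few lines of Python. (This toy version just thresholds dark pixels into a foreground mask; the real DjVu segmenter is far more sophisticated, and the threshold and fill value here are invented for illustration.)

      ```python
      def split_layers(img, threshold=128):
          """Toy MRC-style split of a grayscale image (list of rows of 0-255
          ints): dark pixels (text/line art) go to a foreground layer to be
          coded losslessly, the rest to a background layer to be coded lossily.
          Holes in each layer are filled with white (255)."""
          mask = [[px < threshold for px in row] for row in img]
          fg = [[px if m else 255 for px, m in zip(row, mr)]
                for row, mr in zip(img, mask)]
          bg = [[255 if m else px for px, m in zip(row, mr)]
                for row, mr in zip(img, mask)]
          return mask, fg, bg

      def recombine(mask, fg, bg):
          """What the display decoder does: pick foreground where the mask
          says so, background everywhere else."""
          return [[f if m else b for m, f, b in zip(mr, fr, br)]
                  for mr, fr, br in zip(mask, fg, bg)]
      ```

      Since text sits in the lossless layer, it stays sharp no matter how hard the background layer is compressed - which is why scanned documents survive such aggressive shrinking.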

    • by MisterBlister (539957) on Sunday April 07, 2002 @05:52PM (#3300796) Homepage
      Seems that the reference source code implementation of DjVu is GPL (full GPL, not LGPL). I'm not looking to start a debate on the merits of "Free" software, but this situation is the kiss of death for any potential file format. I'm sure if the reference implementation were released under a BSD style license (as is the case with JPG, PNG, etc), the format would be much more widely supported....

      In the real world, companies don't want to either GPL their software (required if they use this GPL library) or reinvent all the code from scratch based on the spec, unless there's huge demand for it (which there won't be, due to the chicken-and-egg scenario)... So, don't expect to see any support for DjVu anytime soon.

  • Now that everyone has broadband and can play streaming video.....
  • It's always good when the submitted story is more up-to-date than the site it links to. The current press release [jpeg.org] on the site is dated August, 2000.

    Could this story be submitted by an insider? Hmmm... I know, I know, Slashdot != "investigative journalism"
  • by big.ears (136789) on Sunday April 07, 2002 @03:29PM (#3300195) Homepage
    According to this pdf [jpeg.org],
    the report compares 4 compression codecs and, for a small sample, found:

    MEAN LOSSLESS COMPRESSION RATIOS (big is good)
    ------------------
    JPEG 2000: 2.5
    JPEG-LS: 2.98
    L-JPEG: 2.09
    PNG: 3.52

    JPEG-LS was usually the best, but PNG had a few really good samples that pushed its average up. Actually, these outliers appear important, because that is what really separates the codecs on this metric.

    Lossless Decoding Times, relative to JPEG-LS (big is bad)
    -----------------
    JPEG 2000: 4.3
    JPEG-LS: 1
    L-JPEG: .9
    PNG: 1.2

    This doesn't make JPG2K appear too impressive. What it does offer, however, is features. Like Region Of Interest (ROI) coding, good lossy compression, random access, and other goodies that some people may really care about. The report claims that png doesn't do lossy encoding, which is news to me, but it does appear to be one of their major selling points for jpeg-2000 over png.
    • This doesn't make JPG2K appear too impressive.

      JPEG-2K is really intended for lossy coding, and that is where it shines. The lossless spec is included primarily because you can use the same algorithm for both lossy and lossless coding. The only real difference is in the choice of wavelet transform, which is irreversible (floating-point) in the lossy case but reversible (integer) in the lossless case.

      A better comparison pits JPEG-2K against the original (lossy) JPEG. According to a figure given in this paper [jpeg.org], J2K provides roughly a 2dB PSNR gain over JPEG for a wide range of bitrates. At the low rate of 0.25 bits per pixel, this gain takes you from 25.5dB to 27.5dB; perceptually, that is a noticeable difference. At low rate, JPEG is also subject to blocking artifacts, so the perceptual problems can be even worse than the PSNR numbers would indicate.

      In other words, JPEG-2K is a Good Thing.
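      For anyone wondering what a 2dB PSNR gain actually buys you, here's the standard PSNR formula in plain Python (the sample values below are made up; only the formula is standard):

      ```python
      import math

      def psnr(original, reconstructed, peak=255.0):
          """Peak signal-to-noise ratio in dB between two equal-length
          sequences of 8-bit samples. Higher is better; identical inputs
          give infinity."""
          mse = sum((a - b) ** 2
                    for a, b in zip(original, reconstructed)) / len(original)
          return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)

      # Because PSNR is logarithmic in the mean squared error, a 2 dB gain
      # means the MSE dropped by a factor of 10**(2/10), roughly 1.58x.
      mse_ratio_for_2db = 10 ** (2 / 10)
      ```

      So going from 25.5dB to 27.5dB at the same bitrate means JPEG2000 produces about 1.6x less squared error than JPEG for the same file size - enough to be visible, especially since JPEG's error at low rates clumps into blocks rather than spreading smoothly.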

  • by datrus (265707) on Sunday April 07, 2002 @03:33PM (#3300214)
    Hey, I've implemented a JPEG-2000 codec under a BSD-style license.
    It's been tested at the MIT biomedical department already for compression of medical images.
    It's available at http://j2000.org/.
    It would be nice to see this work in my favourite browsers.
  • by Anonymous Coward on Sunday April 07, 2002 @03:34PM (#3300229)
    Several things, besides simply "good compression."

    JP2 uses wavelet compression such that an image is effectively compressed at various resolutions below the original, independently. Not only does this allow a high level of redundancy removal (which is why wavelets are good in the first place) and thus high compression, but jp2 tags each of these sections (subbands) separately in the compressed file.

    So what? Well, a file with all of these sections is effectively a losslessly compressed image. However, this file can be further compressed (lossily) by simply throwing out some of these tagged sections! That is, you can make a "lossless" thumbnail image by keeping only the lower-resolution subbands. Or, you can get a lower-quality (but smaller) fullsize version by throwing out some subbands at each resolution.

    Better still, this manipulation can be done without decompressing the original image, simply by using only certain tagged sections of the file.

    Consider this possible application of all this: Digital Cameras. A camera could take images at full resolution and lossless quality until the memory card starts filling up. Then, gradually as more and more room is required, it could quickly reduce the size or quality of previous pictures to make room for new pictures. Thus, you always have "enough" room for more pictures, provided you don't mind the quality reduction.

    Of course, there are numerous uses for web applications -- thumbnails and full-sized images could be the same file, provided the web server knows how to parse the image file. (Little or no computation necessary, just sending parts of the file)

    Anyways, JPEG2000 is very very cool.
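    The "no decompression needed" point can be sketched with a toy tagged container in Python. (The record layout here - a one-byte resolution level plus a four-byte length per record - is invented purely for illustration; real JP2 codestreams use marker segments and packet headers, not this format.) Truncation is pure byte copying: the payloads are never decoded.

    ```python
    import struct

    def pack(subbands):
        """Build a toy tagged stream from (level, payload_bytes) records,
        coarsest level first. Each record: uint8 level, uint32 length, payload."""
        out = bytearray()
        for level, payload in subbands:
            out += struct.pack(">BI", level, len(payload)) + payload
        return bytes(out)

    def truncate(stream, max_level):
        """Keep only the records whose resolution level <= max_level.
        Nothing is decompressed; records are copied or skipped wholesale."""
        out = bytearray()
        i = 0
        while i < len(stream):
            level, length = struct.unpack_from(">BI", stream, i)
            record = stream[i:i + 5 + length]
            if level <= max_level:
                out += record
            i += 5 + length
        return bytes(out)

    full = pack([(0, b"coarse"), (1, b"medium"), (2, b"fine")])
    thumb = truncate(full, 1)   # same bytes as a file that never had level 2
    ```

    This is the mechanism behind the web-server idea above: serving a thumbnail is just slicing records out of the full file, with little or no computation.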
  • Am I missing the joke - is this some sort of overdue April Fool's joke? Did this story get sent here by Mallett's time machine from last week?

    Or did /. just regurgitate somebody's press release?

    As far as I can tell with a quick google, nothing has been done with this standard since early 2000 (maybe that's why the standard name hasn't been updated, eh.) I wouldn't hold my breath waiting for widespread adoption any time soon...
  • by Anonymous Coward
    The extension for the new files will be ".jp2"


    I wonder what the Pope thinks about the file extension being called .jp2


  • This will not only make graphics-heavy web pages easier to download

    Oh great! Even more websites designed with the idea that Photoshop is a webdesign tool and that the best way to make a webpage is lots of massive images instead of text and styling.

    Mumble mutter grouse.

    Good thing it looks like it'll take ages to catch on.

  • Unfortunately, one effect of better compression will be more bloat -- web pages with more graphics and more advertising. This is because with better compression, there is more information transmitted per byte, thus giving more value, so the decision about whether or not to include pictures tips in favor of more and bigger images. Thus the average size of a web page and all its components will increase.

    If you think the images are valuable, as many web designers seem to for incomprehensible reasons, that is a good thing. But if you do not value lots of images, that is a bad thing. So, better compression harms those with slower links, those who detest advertising clutter, and those who seek concise information rather than flashy presentations.

    (I am not opposed to better compression, just pointing out an unintended consequence.)

  • by bons (119581) on Sunday April 07, 2002 @04:23PM (#3300432) Homepage Journal
    If there's anything NEW (ie, less than a year old) on that site, I missed it.

    If there's any indication that this will actually be out in a few months, I missed it.

    If there's anything indicating JPEG2000 support for Mozilla, The Gimp, Paint Shop Pro, or Photoshop in the near future, I missed it.

    I've yet to see anything that indicates there are no more patent issues and that people can support this format without patent issues (Read "Can the Gimp ship with this?")

    Regarding Explorer PNG support:
    AlphaImageLoader Filter [microsoft.com]:
    Displays an image within the boundaries of the object and between the object background and content, with options to clip or resize the image. When loading a Portable Network Graphics (PNG) image, transparency, from zero to 100 percent, is supported.

    One thing I do miss: I still see almost no support for the beloved fractal image format *.fif. I think it's now part of LizardTech's [altamira-group.com] line of image compression/fractal tools. If you think jpeg2000 offers compression, then you missed the fif format completely.

  • ...make graphics-heavy web pages easier to download...

    Nah, it'll probably make easy to download web pages more graphics heavy, if I know today's web designers...
  • New IE Exploit (Score:2, Insightful)

    by AX.25 (310140)
    I can't wait for the new IE exploit using the jpeg2000 ActiveX control (since IE no longer supports Netscape-style plugins).
  • by zecg (521666)
    Comparisons with png/gif history are NOT valid here, although such ARE usually a foolproof tao of the karma-gatherer ;). Those who make them are probably not aware of the difference in quality/size ratio between wavelet compression and the current (pathetic) jpeg. It is not only "beefed up" jpeg, people. It is qualitatively different, using completely different philosophy and algorithms so smart that the FBI people probably got them from aliens in exchange for not drilling their testicles. They lied, of course - but the aliens delivered.

    What thrills me even more is the possible application of wavelet compression algorithms in 3D.

    See this for a dramatic flash-based demonstration of advantages - notice that it says 4kb!:
    http://www.luratech.com/index_e.html
  • Metadata Section (Score:4, Interesting)

    by Camel Pilot (78781) on Sunday April 07, 2002 @05:32PM (#3300719) Homepage Journal
    I hope they have added a metadata section where data like author, date, etc could be attached internally to the image.

    I always thought it would be cool if your digital camera could include the settings (f-stop, exposure time, ISO, compression ratio, etc.) along with date, time and author directly in the image file.
