Graphics Software

JPEG2000 Coming Soon

Sonny writes "In a few months' time, internet users will be able to make use of the JPEG2000 standard which, its developers claim, enables web graphics to be downloaded much faster than is currently possible. This will not only make graphics-heavy web pages easier to download, it will also preserve image quality. The JPEG standard compresses image files, which are then transmitted across the web faster than uncompressed files. Now, researchers at universities around the world have developed JPEG2000, the next-generation image-compression technology, under the auspices of the International Organization for Standardization (ISO). It is the first major upgrade of the standard since it first appeared in the early '90s. Also important is the technology's ability to send files without loss of data, which is not the case with current JPEG files. To take advantage of JPEG2000 images, web browsers will need a plug-in for either Internet Explorer or Netscape; these free plug-ins are expected to be available later this year. The extension for the new files will be ".jp2"."
  • by GauteL ( 29207 ) on Sunday April 07, 2002 @04:10PM (#3300099)
    Speak for yourself. I'm heavily using PNG right now. Why is it important that everyone should use new standards? As long as they are supported in browsers (*) and I am free to use them, I don't care what everyone else is using.

    Sure, shorter download time would be nice, but PNG isn't really providing that. PNG, however, makes the job of the web developer easier.

    (*) I'm still annoyed that IE doesn't support alpha-transparency though.
  • by angryargus ( 559948 ) on Sunday April 07, 2002 @04:11PM (#3300101)
    What is also important about the technology is its ability to send files without loss of data, which is not the case with current JPEG files.

    JPEG does support a lossless mode, it's just that no one uses it. To paraphrase, JPEG supports a lossless spatial algorithm that operates in the pixel domain. Some amount of prediction is used, resulting in about 2:1 compression, but the error terms for the predictions are included in the data stream (encoded using either Huffman or arithmetic coding), resulting in no net errors.

    What's a lot more exciting is JPEG2000's use of wavelet compression, which isn't mentioned at all.
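
    To sketch the prediction-plus-residuals idea in a few lines of Python (a toy illustration only, not the actual JPEG lossless predictors or its Huffman/arithmetic entropy coding):

    # Toy lossless predictive coding: predict each pixel from its left
    # neighbour and store the prediction errors; decoding is exact.
    def encode_row(pixels):
        residuals, prev = [], 0       # assume a prediction of 0 for the first pixel
        for p in pixels:
            residuals.append(p - prev)
            prev = p
        return residuals              # these would then be entropy coded

    def decode_row(residuals):
        pixels, prev = [], 0
        for r in residuals:
            prev += r
            pixels.append(prev)
        return pixels

    row = [52, 55, 61, 66, 70, 61, 64]
    assert decode_row(encode_row(row)) == row   # lossless: no net errors

    Smooth image rows give small residuals, which is roughly where that 2:1 lossless compression comes from.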
  • by Wonderkid ( 541329 ) on Sunday April 07, 2002 @04:12PM (#3300110) Homepage
    About time! JPEG 2000 was mentioned in Electronic Engineering Times many years ago. The next revision of Artwalker.com (where you explore the world through landscape paintings) will be completely displayed using JP2 because it has one vital characteristic: images can be scaled in real time (via the server). For example, instead of displaying a thumbnail of, say, 50x60 pixels, and having the user click the thumbnail to view the full size image (say, 640x480), a JP2 image can be made to display as a percentage of the total size of the display window (or browser width), in a similar fashion to a vector graphic, such as one generated by Flash. This will be excellent for mobile devices with differing screen resolutions and make for some very cool ZOOM tools in browsers and in Photoshop etc. We have been waiting for this since 1996, when we launched Artwalker! Soon it will be time to get going on converting all our high resolution images to JP2. A lot of work!
  • by adamwright ( 536224 ) on Sunday April 07, 2002 @04:20PM (#3300151) Homepage
    I've been involved in JPEG 2000 for a while now, and have come to the conclusion that:

    A) It's an excellent codec, though computationally heavy.
    B) The design of the codestream, along with the JP2/JPX file formats, has a lot of potential to create a "new" type of image that isn't just a picture. Yes, you've heard this before, but this time it's built in at the codec level: in-stream ROIs (regions of interest), very flexible reconstruction, and compression controllable through a great number of options - and that's only the codec (at a *very* rudimentary level :).
    C) It won't succeed without a decent open-source, "IPR free" (as much as is possible) implementation.
    D) Read C again. It's important.

    To this end, I've started (with support from others in the JPEG 2000 community) a JPEG 2000 Group (see http://www.j2g.org [j2g.org] - it's very sparse at the moment, but if you're interested, bookmark it and come back in about a month). Tom Lane and the IJG have expressed no interest in JPEG 2000, for various reasons (which I don't entirely disagree with, but I'd rather be proactive and try to correct flaws than walk away totally).

    The aims of the JPEG 2000 Group are to create a public, open source (probably BSD license) implementation of "Part 1" (This is the codestream syntax, codec, and file format wrapper). We'll also provide a community JPEG 2000 resource. To facilitate this, we've already attained a Class C liaison with the committee. This grants all members the option of acquiring the standard free of charge. We also get a minimal channel back into the process to give opinions.

    The point of this ever-rambling post is this: we need members. The standard is large, and the support around it will be larger. We need volunteers who would be interested in assisting in the creation of the codec. Sadly, "membership" is going to require some form of contribution and commitment to acquire copies of the texts you'll need - I hate this as much as you do, but it was either accept that or not get any copies at all (without $$$). If you're interested in contributing in any way (code, documents, testing, support), please drop by the forum [j2g.org] - even if it's only a passing interest, I'd be happy to go into more detail regarding the project (or just JPEG 2000 itself). I'd do it here, but I'd lose all my (low :) karma to offtopic mods.

    So, rather than bitch about the lack of a free implementation and how late it is, and how it'll never get used, come and help out! You know you (might possibly | maybe | someday) want to!
  • by chrysalis ( 50680 ) on Sunday April 07, 2002 @04:25PM (#3300173) Homepage
    Everyone is still using old formats like GIF and JPEG.

    But there are other, more powerful formats.

    For non-destructive (lossless) compression, the PNG format is fortunately getting more and more popular, although its late inclusion in Internet Explorer slows down its wide adoption.

    But when it comes to destructive (lossy) compression, there's an excellent (and not new) format made by AT&T called DjVu. It was one of the first wavelet-based formats.

    DjVu is really better than JPEG. Images are better looking (more contrast, fewer pixels with odd colors), and files are way smaller. Plus you can smoothly zoom any DjVu image without getting big and ugly blocks.

    DjVu has been available for a while as a plugin for common browsers.

    There's a 100% free implementation of the format called DjVuLibre [sf.net].

    However, nobody uses it. I don't understand why. Some time ago, it may have been because compression was slow, but nowadays that's no longer a valid point.

    People are enthusiastic about JPEG2000. But why would JPEG2000 be adopted when DjVu never was?


  • by Darren Winsper ( 136155 ) on Sunday April 07, 2002 @04:25PM (#3300175)
    The major blocker for PNG is the fact that IE does not support its alpha channel. I originally used PNGs with alpha channels on a web site I made, but then had to replace them when I found out IE didn't support the alpha channel. This was a pain in the arse because the end result looks a tad crappy.
  • by FaithAndReason ( 112179 ) on Sunday April 07, 2002 @04:44PM (#3300271)
    Am I missing the joke - is this some sort of overdue April Fool's joke? Did this story get sent here by Mallett's time machine from last week?

    Or did /. just regurgitate somebody's press release?

    As far as I can tell with a quick google, nothing has been done with this standard since early 2000 (maybe that's why the standard name hasn't been updated, eh.) I wouldn't hold my breath waiting for widespread adoption any time soon...
  • by MWright ( 88261 ) on Sunday April 07, 2002 @04:50PM (#3300294)
    I'll give a really quick, basic explanation.

    The lifting algorithm (one way of computing the wavelet transform; actually, the simplest to understand and one of the fastest) works by splitting the original signal into two (in the case of a 1d signal) subsignals. One of these is the "trend" signal. It's sort of a compact version of the original one. Using only this signal, the original signal can be reconstructed pretty well, but not perfectly. That's where the other signal, the "detail" signal, comes in. It contains the information needed to reconstruct the original signal perfectly. If the trend signal is a good prediction of the original signal, the detail signal will be very small, and can be compressed well.

    But there's no need to stop there. The whole process can be applied to the trend signal again, and even to the detail signal if necessary.

    I'll give a more concrete example, the Haar wavelet transform. In this case, the trend signal is simply the averages of consecutive pairs of the original signal. So, if we start with

    1, 3, 4, 5

    the trend signal would be

    2, 4.5.

    If we were to reconstruct the signal with only this information, we'd get

    2, 2, 4.5, 4.5,

    which is not too bad. The detail signal contains the information needed to get from this to the original signal; the differences within each pair will work (note that these pairs shouldn't overlap; that would just be redundant. Therefore, the detail and trend signals are each half as long as the original one). So, the detail signal in this case is

    2, 1.

    It's easy to see that if the original signal is constant, the detail signal will be all zeros. It's possible to construct different ways of making the trend and detail signals such that the detail signal will be zero if the original signal is linear, for example.

    A good paper about this (that explains it better than I do!) is Building Your Own Wavelets at Home [bell-labs.com]
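
    Here's that Haar example as a quick Python sketch (my own toy code, a minimal illustration of the lifting idea, not the actual JPEG 2000 wavelet filters):

    # Haar (average/difference) transform sketch; illustrative only.
    def haar_forward(signal):
        """Split an even-length signal into (trend, detail) halves."""
        trend, detail = [], []
        for i in range(0, len(signal), 2):
            a, b = signal[i], signal[i + 1]
            trend.append((a + b) / 2.0)   # pairwise average -> "trend"
            detail.append(b - a)          # pairwise difference -> "detail"
        return trend, detail

    def haar_inverse(trend, detail):
        """Reconstruct the original signal exactly from trend and detail."""
        signal = []
        for avg, diff in zip(trend, detail):
            signal.append(avg - diff / 2.0)
            signal.append(avg + diff / 2.0)
        return signal

    original = [1, 3, 4, 5]
    trend, detail = haar_forward(original)
    print(trend, detail)                  # [2.0, 4.5] [2, 1]
    print(haar_inverse(trend, detail))    # [1.0, 3.0, 4.0, 5.0]

    Applying haar_forward again to the trend half is exactly the recursive step described above.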

  • Smart Extensions (Score:1, Interesting)

    by Anonymous Coward on Sunday April 07, 2002 @05:00PM (#3300333)

    Everybody should make it a practice to use 8.3 filenames when possible.

    1. Less bandwidth.

    2. Easier to type. One time I had to download a new version of Netscape, and the file name was ns_release_browser_en_foober_fiddle_3.0.1x.tar.gz or something like that. Maybe there was a way to regex this, but sheesh! I had no desire to think about that, I just wanted to download the bloody file.

    3. Backward compatibility.

    I realize that sometimes a longer file name is good, but sometimes people use them when they shouldn't. I always try to come up with something that fits in 8.3 without losing meaning, and I'm almost always successful. I cringe when I see people have named the file some big long gobbledegook like bobbys_8th_book_report.doc when bookrep8.doc would have done just fine.

  • by siliconwafer ( 446697 ) on Sunday April 07, 2002 @05:01PM (#3300343)
    I'm using IE5 on OS X. Every time I load up a page that contains PNG images, IE tries to save the files... :-( Obviously it doesn't know what to do with PNG images. I think the browsers still have some catching up to do.
  • Re:Stupid extensions (Score:3, Interesting)

    by dimator ( 71399 ) on Sunday April 07, 2002 @05:03PM (#3300352) Homepage Journal
    It would be interesting to see how much bandwidth is saved by not sending those two other characters to browsers. I bet it adds up quick. :)

    You're right about 8.3, though. It disgusts me. But this is just the .3; you don't have to stick with the 8 in front, so I can live with it.

  • by Fourier ( 60719 ) on Sunday April 07, 2002 @05:06PM (#3300365) Journal

    This doesn't make JPG2K appear too impressive.

    JPEG-2K is really intended for lossy coding, and that is where it shines. The lossless spec is included primarily because you can use the same algorithm for both lossy and lossless coding. The only real difference is in the choice of wavelet transform, which is irreversible (floating-point) in the lossy case but reversible (integer) in the lossless case.

    A better comparison pits JPEG-2K against the original (lossy) JPEG. According to a figure given in this paper [jpeg.org], J2K provides roughly a 2dB PSNR gain over JPEG for a wide range of bitrates. At the low rate of 0.25 bits per pixel, this gain takes you from 25.5dB to 27.5dB; perceptually, that is a noticeable difference. At low rate, JPEG is also subject to blocking artifacts, so the perceptual problems can be even worse than the PSNR numbers would indicate.

    In other words, JPEG-2K is a Good Thing.
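
    (For reference, the PSNR figures above are the usual 10 * log10(255^2 / MSE) for 8-bit images. A quick sketch of the calculation, my own illustration rather than code from the linked paper:)

    import math

    def psnr(original, decoded, peak=255):
        """Peak signal-to-noise ratio in dB between two equal-length pixel lists."""
        mse = sum((a - b) ** 2 for a, b in zip(original, decoded)) / len(original)
        return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

    # A 2dB gain corresponds to a mean squared error about 10**0.2 ~= 1.6x smaller.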

  • by Aanallein ( 556209 ) on Sunday April 07, 2002 @05:09PM (#3300378)
    All that said, I don't see the JPEG2k standard really succeeding--now that more and more users are getting faster and faster connections, there's not that much need for the smaller filesizes.

    However, at the same time that end users are getting faster connections, more and more webmasters are starting to feel the very real pain of bandwidth costs. I think the adoption of smaller image formats will come more from their side.
    If JPEG2000 makes enough of a difference, I can see at least a very large percentage of graphics-oriented sites switching - almost forcing the users to follow suit.
  • by Ungrounded Lightning ( 62228 ) on Sunday April 07, 2002 @05:17PM (#3300404) Journal
    I thought this [aware.com] was a good comparison between JPEG and JPEG2000.

    Good one. Thanks for the link.

    Looks like JPEG2000 finally got things right for the human eye:

    - Higher compression ratios just gently blur details, rather than creating artifacts. Losing the extra information leaves the part that DID get through intact.

    - The text says the compression allows for progressive downloading. This implies that the coding scheme does something like working upward in spatial frequency - encoding the basic stuff first then sending progressively finer deltas. For a given compression ratio just stop the downloading (or file saving) when you have enough.

    - The compression seems to match eye processing so well that highly compressed (100:1) images actually look BETTER than the basic image. The features important to the eye (facial stuff, especially the eyes) get through intact or are even enhanced, while unnecessary detail - including sampling artifacts - gets selectively blurred out. Something like the soft-focus filter in portrait photography. The only thing in the samples that got noticeably worse at high compression is hair, which just gets a bit blurry. (Meanwhile, JPEG looked like it had pixelated measles.)

    Of course the images selected for the demo could have been optimized for the compression scheme. B-)
  • And what's more... (Score:2, Interesting)

    by KarmaSafe ( 560887 ) on Sunday April 07, 2002 @05:24PM (#3300435)
    There was a dead link about "watermarking" on the page. Does this mean we'll be seeing "copy protection" built into images?

    I say we just refine the .png standard. Human eyes don't see blue well; just make it lossy in the blue channel (a one-time loss - saving again doesn't make it worse).
  • Metadata Section (Score:4, Interesting)

    by Camel Pilot ( 78781 ) on Sunday April 07, 2002 @06:32PM (#3300719) Homepage Journal
    I hope they have added a metadata section where data like author, date, etc. could be attached internally to the image.

    I always thought it would be cool if your digital camera could include the settings (f-stop, exposure time, ISO, compression ratio, etc.) along with date, time and author directly in the image file.
  • The Relevant Entry (Score:5, Interesting)

    by Sentry21 ( 8183 ) on Sunday April 07, 2002 @06:41PM (#3300749) Journal
    ------- Additional Comment #14 From tor@acm.org [mailto] 2001-07-31 10:47 -------

    Here's a summary of the jpeg2000 situation that I wrote up but that never made it into bugzilla:

    You might want to ask Tom Lane, head of the Independent JPEG Group, for his opinion.


    It seems that adding jpeg2000 support would get us involved in a legal mess. If you look at appendix L of the jpeg2000 draft, there are 22 companies who believe that implementing the spec may require use of their patents. From http://www.jpeg.org/CDs15444.htm:

    Particular attention is drawn to the use within the standard of selected technology or algorithms which are claimed to be protected by national and/or international patents. In the case of technology submitted for consideration by JPEG for incorporation in Part 1 of the standard, the JPEG committee believes that the individual organisations concerned will make available licences to use this intellectual property, on a royalty- and fee-free basis, under specified conditions which may apply only to conforming implementations of the standard. These conditions are available on application to the organisations concerned, which are listed in an Annex to the document.


    It is of course still possible that other organisations or individuals may claim intellectual property rights that affect implementation of the standard, and any implementors are urged to carry out their own searches and investigations in this area. The JPEG committee requests any organisations or individuals claiming (or being aware of claims) that any of the committee drafts available for download here infringes other intellectual property rights to provide information and/or evidence to substantiate their claim to the JPEG Convener in the first instance.


    Moving on to more practical considerations, there is one open (sort of) C implementation of the jpeg2000 standard that I'm aware of, Jasper:

    http://www.ece.ubc.ca/~mdadams/jasper/ [ece.ubc.ca]

    The licensing terms are specified in this document:

    http://www.ece.ubc.ca/~mdadams/jasper/LICENSE-1.000 [ece.ubc.ca]

    While I'm not a lawyer, the impression I get is that once ISO officially publishes part 5 of the jpeg2000 standard we're free to use the library as we like.
  • by MisterBlister ( 539957 ) on Sunday April 07, 2002 @06:52PM (#3300796) Homepage
    Seems that the reference source code implementation of DjVu is GPL (full GPL, not LGPL). I'm not looking to start a debate on the merits of "Free" software, but this situation is the kiss of death for any potential file format. I'm sure if the reference implementation were released under a BSD-style license (as is the case with JPEG, PNG, etc.), the format would be much more widely supported...

    In the real world, companies don't want to either GPL their software (required if they use this GPL library) or reinvent all the code from scratch based on the spec, unless there's huge demand for it (which there won't be, due to the chicken & egg scenario)... So, don't expect to see any support for DjVu anytime soon.

  • by Sheetrock ( 152993 ) on Sunday April 07, 2002 @07:07PM (#3300873) Homepage Journal
    For what it's worth, I recall seeing a demo of this where they mentioned that watermarking was part of the features, though it was a year or two ago and it seemed more about identifying copyright on images in the wild (web pages? porn?) than it was about preventing copies from being made in the first place.
  • by datrus ( 265707 ) on Sunday April 07, 2002 @07:25PM (#3300943)
    I'm pretty sure there are no patent problems.
    All algorithms used in the JPEG-2000 standard (like IBM's MQ-coder) can be used freely.

    About the video codec: I implemented an H.263 and MPEG-4 compliant codec. But I didn't get paid by the opencodex guy, so I aborted the project. Still, if anybody's interested, I have some pretty good MMX and altivec optimised code lying around.
  • The Joint Photographic Experts Group [jpeg.org] has a very nice applet available to demonstrate this, but they put it in a ZIP file for some reason, so you can't directly execute it in the browser. I have posted the applet here [thereifs.com], with a slight modification to use an image borrowed from CmdrTaco [cmdrtaco.net]'s really crappy movie [cmdrtaco.net].

    Enjoy!,
    Jouster
  • Smart server format? (Score:2, Interesting)

    by rwa2 ( 4391 ) on Sunday April 07, 2002 @10:37PM (#3301451) Homepage Journal
    We should have a server that sends the appropriate file type depending on the client request. So if Apache gets a request from an old version of Netscape, it'll send the picture compressed as .jpg/.gif; if it gets a newer Mozilla, it'll send .jp2/.png; if it gets a PDA, it'll send a 16-color greyscale, etc.

    The server is probably a good place to do this (maybe with some mod_rewrite hack), since it could be responsible for caching heavily-requested conversions.

    Does anything like this exist? Similarly, I've always wanted to find some nice way to keep my photo album online in the highest quality, but only send scaled-down images to casual visitors (as well as thumbnails). http://ids.sourceforge.net/ looks like it comes the closest to this kind of thing, but seems a bit too server-heavy (doesn't appear to support caching).
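
    A rough sketch of the idea in Python, keyed off the HTTP Accept header rather than User-Agent sniffing (the file names and setup are hypothetical; a real deployment would more likely be a mod_rewrite or mod_negotiation rule in front of a conversion cache):

    # Hypothetical Accept-header image negotiation; illustrative only.
    from wsgiref.simple_server import make_server

    VARIANTS = [("image/jp2", ".jp2"),    # best first
                ("image/png", ".png"),
                ("image/jpeg", ".jpg")]   # fallback for old browsers

    def app(environ, start_response):
        accept = environ.get("HTTP_ACCEPT", "")
        base = "photo"                    # would come from the request path
        mime, ext = next(((m, e) for m, e in VARIANTS if m in accept),
                         VARIANTS[-1])
        with open(base + ext, "rb") as f:
            body = f.read()
        start_response("200 OK", [("Content-Type", mime), ("Vary", "Accept")])
        return [body]

    make_server("", 8000, app).serve_forever()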
  • by Mika_Lindman ( 571372 ) on Monday April 08, 2002 @02:18AM (#3301903)
    What the hell, the Photoshop plugin costs $79?! Does this mean there'll never be a FREE jpeg2000 encoder? That would kill jpeg2000 before it even gets started. I would never pay that much for an image compressor, no matter how good it is.
  • by Anonymous Coward on Monday April 08, 2002 @02:24AM (#3301915)
    We create a toolkit which reads and writes images, etc. JPEG 2000 will give you better results on a particular type of image. It does compress the hell out of images, but typically the best you can get is 20% better. This 20% comes at a price, however: all of the implementations we've seen are terribly slow - 10 to 20 times slower than any JPEG implementation. While time will allow developers to create faster algorithms, JPEG2k will probably die before it gets off the ground. And that's not the only thing. A lot of people keep their images in JPEG format and don't know how to convert, or don't want to pay for tools to convert them. JPEG 2000 will benefit some people immensely. For most users it will not. I disagree with previous posts saying that bandwidth is becoming more precious. In a few years, bandwidth won't be the problem. Neither will storage. JPEG 2000 is a solution to a problem that doesn't exist.
  • by billstewart ( 78916 ) on Monday April 08, 2002 @04:40AM (#3302169) Journal
    The comparisons pointed to in the main articles (including the PDFs one or two layers deep in) are missing a couple of things:
    • DEcompression speeds - yes, compression is usually the hard part, but it's helpful to know decompression speeds as well, since realistically most images will be decompressed far more than one time. Will it be fast enough for browsers to decompress in real-time on cable modems with 500MHz machines, or will it be dog-slow compared to download time even on 56kbps modems on 5GHz machines? Is it much faster or slower than existing JPEG?
    • Color depths beyond 8 bits - all the reviews used 8-bit color depths, which is fine for the previous generation of scanners and not too bad for monitors, but under-$200 flatbed scanners are now doing 48-bit pixels (so 16 bits/color) as well as more dots per inch. How well do the new compression algorithms support them? How much does this affect compression speed?
    • Conversion from JPEG, for digital cameras and scanners? Many input devices are smart enough to compress their images using JPEG, but especially for cameras, they won't have the horsepower to do JPEG 2000 compression. How well will the new algorithm support recompressing pictures originally compressed with JPEG?
    • Also, will browsers be able to do progressive decoding, so they can start by displaying rough versions of the image and gradually refining it, or will this be a compression mode nobody bothers using like it seems to be for JPEG?
  • by amorsen ( 7485 ) <benny+slashdot@amorsen.dk> on Monday April 08, 2002 @06:02AM (#3302263)
    /usr was where the user files were kept, way back when. Then one day, someone ran out of space on their root disk... The solution was to create a user called "bin", give it a home directory /usr/bin, and put the less important binaries there.

    These days when disks are enormous and logical volume management ensures that you can just add whatever space is necessary where it is needed, we should get rid of /usr. The only argument that still works in favour of /usr is making it a network drive. The people who do that will have to customize a bit when the rest of us drop /usr. I am sure they will hate me, both of them.
