
Scaling Algorithm Bug In Gimp, Photoshop, Others 368

Posted by kdawson
from the black-is-black-sometimes dept.
Wescotte writes "There is an important error in most image scaling algorithms. All software tested has the problem: The Gimp, Adobe Photoshop, CinePaint, Nip2, ImageMagick, GQview, Eye of Gnome, Paint, and Krita. The problem exists across three different operating systems: Linux, Mac OS X, and Windows. (Exceptions have subsequently been reported; the following software does not suffer from the problem: the Netpbm toolkit for graphic manipulations, the developing GEGL toolkit, 32-bit encoded images in Photoshop CS3, the latest version of Image Analyzer, the image exporters in Aperture 1.5.6, the latest version of Rendera, Adobe Lightroom 1.4.1, Pixelmator for Mac OS X, Paint Shop Pro X2, and the Preview app in Mac OS X starting from version 10.6.) Photographs scaled with the affected software are degraded because the algorithms fail to account for monitor gamma. The degradation is often faint, but probably most pictures contain at least an area where it is clearly visible. I believe this has happened since the first versions of these programs, maybe 20 years ago."
This discussion has been archived. No new comments can be posted.

  • Oh calm down.. (Score:4, Insightful)

    by Anonymous Coward on Tuesday February 23, 2010 @10:18PM (#31254402)

    Photographs scaled with the affected software are degraded, because of incorrect algorithmic accounting for monitor gamma.

    Seriously!

    I have a theory on why this has gone unnoticed for so long, but I'll keep it to myself...

  • Monitor gamma? (Score:2, Insightful)

    by Yvan256 (722131)

    To display the pictures, it makes sense to use the monitor gamma. But to actually modify the data using that information which is probably flawed in 99.9999999% of cases? That's just wrong.

    • Re: (Score:3, Interesting)

      by Trepidity (597)

      In some cases at least it seems like the primary purpose of scaling is to display the images immediately, in which case it seems like the gamma should be accounted for. For example, when browsers rescale images, it's an unexpected result if they change the perceived brightness while doing so--- shouldn't the browser's scaling be done in the same brightness space as the one it intends to use to display the images?

      • by X0563511 (793323)

        Yes, excepting that this issue is talking about the bug in image editors.

        • Yes, excepting that this issue is talking about the bug in image editors.

          Keep reading the article; this problem exists in web browsers as well.

    • by theskipper (461997) on Tuesday February 23, 2010 @10:25PM (#31254486)

      Excellent point. Just to be safe though, I'm going to take another look through my porn crypt to see if that's true.

      BRB.

    • Re:Monitor gamma? (Score:4, Interesting)

      by poetmatt (793785) on Tuesday February 23, 2010 @10:33PM (#31254560) Journal

      The responses that they post are also inaccurate it seems.

      From

      meanwhile, I see a grey rectangle in firefox, and I still don't get what that signifies.

      • by tagno25 (1518033)

        The responses that they post are also inaccurate it seems.

        From

        meanwhile, I see a grey rectangle in firefox, and I still don't get what that signifies.

        The first one I have a hard time seeing, but I can.
        For the others I have to scale the page to max to see anything other than a gray rectangle (using Chrome).

      • Re: (Score:3, Insightful)

        by X0563511 (793323)

        Here:
        http://img43.imageshack.us/img43/2586/scalerbug.png [imageshack.us]

        Look at the third set of images. Had this bug not been present, they should have been nearly identical. As you can see, in my case, they are radically different.

        As the text reads:

        (dali picture)
        All four images in this page are the same one but your browser was instructed to scale it. The one below is scaled 1:2. On Opera and some versions of Internet Explorer it will show a gray rectangle. KDE Konqueror, Firefox and SeaMonkey will display it either pink

      • Re: (Score:2, Insightful)

        by DocHoncho (1198543) *

        meanwhile, I see a grey rectangle in firefox, and I still don't get what that signifies.

        It means the scaling algorithm in your web browser and other commercial software is disastrously broken. Duh, didn't you RTFA?

      • Re:Monitor gamma? (Score:5, Informative)

        by rvw (755107) on Wednesday February 24, 2010 @04:31AM (#31256792)

        meanwhile, I see a grey rectangle in firefox, and I still don't get what that signifies.

        Right-click the image, then click view image. You'll see the image full-scale, like the first image. Scaling it down 50% shouldn't make it gray.

    • Re:Monitor gamma? (Score:5, Informative)

      by evanbd (210358) on Tuesday February 23, 2010 @10:39PM (#31254584)

      The data in the pictures is not linear data. It assumes that it will be displayed on a system that introduces a gamma of 2.2. (If your display system does not do that physically, it should correct for this.) That is, a gray 127 should not display as halfway between a white 255 and a black zero, in terms of light output. (It should *appear* halfway between them visually, because your eyes aren't linear — that's (part of) why gamma is in use in the first place.) So, a checkerboard pattern of white / black squares will have half the luminosity of the white squares. When scaling down, software will turn it into a bunch of gray pixels. But they should be gray pixels of value 186, not 127.

      The page is not well written, but his example images make the issue very clear. It's not about your monitor gamma; it's about the "standard gamma" that all image files assume your monitor has.
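The 186-vs-127 arithmetic above can be sketched in a few lines of Python. (This is an illustrative sketch, not code from the article; it uses a plain 2.2 power curve rather than the exact piecewise sRGB function.)

```python
# Why a 50% black/white checkerboard should downscale to gray 186,
# not gray 127, under the standard 2.2 gamma.
GAMMA = 2.2

def to_linear(v):              # 8-bit encoded value -> linear light in [0, 1]
    return (v / 255.0) ** GAMMA

def to_encoded(lin):           # linear light -> 8-bit encoded value
    return round(255.0 * lin ** (1.0 / GAMMA))

naive = (255 + 0) // 2                                     # 127: too dark
correct = to_encoded((to_linear(255) + to_linear(0)) / 2)  # 186
```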

      • But the file formats don't specify this. The image from a camera will set gray 127 to pretty much half the number of photons hitting the sensor as white 255, won't it?
        • Re:Monitor gamma? (Score:5, Informative)

          by evanbd (210358) on Tuesday February 23, 2010 @10:59PM (#31254778)

          Actually, any well-specified file format will specify the gamma. Not all allow you to set it per-file, but they do specify it. Normally this is a line in the spec that reads something like "color values use the sRGB color space" or similar, which specifies a gamma of 2.2 (roughly). And sRGB, with its nearly 2.2 gamma, has become so standard that assuming anything else (in the absence of a clear spec) would be idiotic.
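For reference, the transfer function the sRGB spec (IEC 61966-2-1) actually defines is piecewise, not a pure power curve. A sketch of both directions:

```python
# The sRGB transfer functions: a short linear segment near black, then
# a 2.4-exponent power segment; the combined curve is close to a plain
# 2.2 gamma overall.
def srgb_to_linear(v):      # encoded [0, 1] -> linear light [0, 1]
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(lin):    # linear light [0, 1] -> encoded [0, 1]
    return lin * 12.92 if lin <= 0.0031308 else 1.055 * lin ** (1 / 2.4) - 0.055
```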

        • Re: (Score:3, Informative)

          by reub2000 (705806)
          Depends. Standard sRGB used in a jpeg file stores the data in non-linear format. If data was stored in a linear format, then a disproportionate part of the gamut would be in the highlights. On the other hand, a raw file contains the data in a linear format.
        • by spitzak (4019)

          Linear data needs a lot more than 8 bits to work correctly. Most "raw" formats use 12 or more.

          So I would think anything storing white as 255 is going to use a gamma curve and 127 is not 1/2 white but some darker value.

        • But the file formats don't specify this. The image from a camera will set gray 127 to pretty much half the number of photons hitting the sensor as white 255, won't it?

          If I understood TFA correctly, the problem is that it won't.

    • Gamma and sRGB (Score:5, Insightful)

      by smasch (77993) on Tuesday February 23, 2010 @11:02PM (#31254802)
      The basic issue here has to do with gamma curves and the way they're being handled (they're not).

      Most image files on your computer (BMP, JPG, PNG, etc.) are stored in the sRGB [wikipedia.org] color space. sRGB defines the use of a gamma curve, which is a nonlinear transformation applied to each of the components (R, G, and B). The issue here is that most scalers make the assumption that the components are linear, rather than try to process the gamma curve. While this does save processing time (undoing the gamma curve then redoing it), it does add some error, especially when the values being scaled are not near each other.

      So does this matter? Well, in some pathological cases where there are repeated sharp boundaries (such as alternating black-white lines or fine checkerboard patterns), this would make a difference. This is because the linear average of the pixels (what most image scalers use) yields a different result than if the gamma value was taken into account. For most images (both photographic and computer generated), this shouldn't be a big problem. Most samples are close in value to other nearby samples, so the error resulting from the gamma curve is very small. Sparse light-dark transitions also wouldn't be noticeable as there would only be an error right on the boundary. Only when you exercise this case over a large area does it become obvious.

      One final point: this gamma scaling effect would occur regardless of the actual scaling algorithm. Bilinear, bicubic, and sinc would all have the same issue. Nearest neighbor interpolation would be unaffected, but in these cases, the output would look far worse.
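The size of that error is easy to check directly. (A rough sketch, again approximating sRGB with a plain 2.2 power curve.)

```python
# The linear-average error is negligible for nearby pixel values and
# large for high-contrast neighbours, which is why ordinary photos
# mostly hide the bug while fine checkerboards expose it.
GAMMA = 2.2

def gamma_aware_avg(a, b):          # average two 8-bit values in linear light
    lin = ((a / 255) ** GAMMA + (b / 255) ** GAMMA) / 2
    return 255 * lin ** (1 / GAMMA)

close_err = abs(gamma_aware_avg(120, 130) - 125)    # a fraction of a code value
far_err = abs(gamma_aware_avg(0, 255) - 127.5)      # tens of code values
```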
    • It seems crazy to me to embed a particular Gamma value into an image. Surely the point for Gamma correction is just before it is output to a device, and then the adjustment should be device specific (if possible). It is even crazier these days to use a Gamma based on the attributes of CRTs.

      In fact it seems so crazy I must be missing something. Am I?

      • Re:Monitor gamma? (Score:5, Informative)

        by Idarubicin (579475) <allsquiet.hotmail@com> on Wednesday February 24, 2010 @12:14AM (#31255378) Journal

        It seems crazy to me to embed a particular Gamma value into an image. ...In fact it seems so crazy I must be missing something. Am I?

        The article actually touches on this point. The sensitivity of the human eye isn't linear. If you use a linear scale to store luminosity information for an image, you waste a lot of bit depth at high luminosities - the eye has difficulty distinguishing between very bright and very bright plus a little tiny bit. On the other hand, the eye is very good at telling the difference between very dark and black. You need a lot of finely-graduated steps at low luminosity or else your shadows get jaggy.

        If you uniformly (linearly) space out luminosities on an 8-bit (256-shade) scale, you store a lot of uninteresting information at the high end, and lose out on visible detail at the low end. A scale with gamma of 2.2 (typical these days) fits a full twenty-eight grey values between 0 and 1 on our hypothetical linear scale. To maintain that kind of luminosity resolution (down where it matters), you'd have to store an extra five bits on your linear scale. That's an extra sixty percent in storage cost.

      • Re: (Score:2, Informative)

        by Anonymous Coward

        Yes, you are missing something. Human perception isn't linear either. Twice the amount of light does not look twice as bright. Our eyes see differences between dark tones more clearly. The result is that we need many more dark tones than light tones for an "evenly" distributed tone curve (which is a tone curve where two neighboring light tones appear to be the same brightness difference as two neighboring dark colors). A physically linear gradient has the perceptual half tone shifted close to the black poin

  • I'm not a pro photosoftware user so I guess none of what's discussed here really affects me. But still, I'm curious about Irfanview and Picasa, the two programs that I use for my photo needs. Are these affected? How do I detect the effect?
    • by tenton (181778)

      I know this is /. and to say "RTFA" is kind of pointless, but, please, RTFA. There's a tweaked sample there for you to try. It will be obvious if you try their sample with whatever graphics program you want to use.

      • Re: (Score:3, Funny)

        by AngryNick (891056)
        Well, I RTFA and I completely understand why this is happening...the so called "sample photo" is covered with annoying little gray lines. No wonder the picture looks bad when you scale it. The dude needs a new camera.
    • by Holmwood (899130)

      If you read the fine article, you'll see they give a sample image you can test against your applications. I tested Irfanview (4.25) and it appears to suffer from the problem. Haven't tried Picasa yet; don't have it installed.

  • short version (Score:5, Informative)

    by Trepidity (597) <delirium-slashdot@@@hackish...org> on Tuesday February 23, 2010 @10:22PM (#31254438)

    Most scaling algorithms treat brightness as a linear space: e.g. when downscaling to 1/2 the size in each dimension, they collapse 4 pixels into 1 by setting that pixel to the numerical average of the original 4. But most images are displayed with an assumption that brightness is a nonlinear space, i.e. gamma > 1. Therefore, scaling changes the perceived brightness, an unexpected result.
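A minimal gamma-aware 2x downscaler along these lines, sketched for grayscale data. (Illustrative only; it uses a plain 2.2 power curve as an approximation of sRGB.)

```python
# 2x box downscale done in linear light: decode, average each 2x2
# block, re-encode.
GAMMA = 2.2

def downscale_2x(img):
    """img: rows of 8-bit grey values; width and height must be even."""
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[0]), 2):
            block = (img[y][x], img[y][x + 1], img[y + 1][x], img[y + 1][x + 1])
            lin = sum((v / 255) ** GAMMA for v in block) / 4
            row.append(round(255 * lin ** (1 / GAMMA)))
        out.append(row)
    return out

# A black/white checkerboard collapses to gray 186, not the naive 127.
result = downscale_2x([[255, 0], [0, 255]])
```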

  • by Cassini2 (956052) on Tuesday February 23, 2010 @10:25PM (#31254478)

    This is only a bug depending on what you are doing with your final images. One of the things that annoys me is that many image manipulation programs do not actually explain the primitives they are using. The result can be a complete mess depending on what you are trying to accomplish. This article is an example of this effect.

    If you want photo-realistic results, then you need to take gamma into account. However, very few file formats specify the gamma, the grey level, the white level, the black level, or the colour space of the original image. The result is that many imaging operations must be wrong, as they can never be accomplished the way intended. For the most part, no one cares. This person found an application where people care.

    • by johndoe42 (179131) on Tuesday February 23, 2010 @10:41PM (#31254614)

      But most people who use images expect to look at them eventually. And most image files are meant to be viewed at gamma 2.2. (Printer drivers will at least approximately emulate a gamma of 2.2, and LCDs emulate it intentionally.) If you view the image at some other gamma, you don't see quite what was intended.

      Another way of looking at it is that most standard image formats are stored with a nonlinear representation, and people who do math should realize that. For an untagged image, gamma=2.2 is a good bet. gamma=1.0 is a terrible bet.

      Of course, if we really want our software to do a good job, then that software should be aware that specifying colors like #FF0000 isn't a good idea -- they look very different on different screens. What the user probably meant to do was specify a particular color, which means that the numbers need to be marked with a color space. (For a great demo, get an HP LP2475w or some other good wide-gamut display, don't install a profile, and look at anyone's photo album. Everyone looks freakishly red-faced.)

    • Re: (Score:3, Insightful)

      by dimeglio (456244)

      Well if it's a 20 year old bug, it should be renamed to a feature and patented. "Method to produce photo realistic scaling without taking gamma into account."

    • Re: (Score:3, Insightful)

      by Waccoon (1186667)

      I'm a cartoonist who runs a web comic, and I've known about this problem for years.

      To cut bandwidth, I have a set palette of indexed colors that I use as a base for each strip, and I pick a few extra colors to finish the anti-aliasing. It doesn't take as much effort or time as it sounds, and it cuts bandwidth considerably compared to a trucolor PNG, and doesn't have the artifacts of a JPEG.

      However, I noticed that when I reduced one of my strips in size, the brightness always changed, so when I applied my c

  • HA! (Score:5, Funny)

    by TheDarkener (198348) on Tuesday February 23, 2010 @10:27PM (#31254504)

    Well, I am SURE glad I'm using Linux^H^H^H^H^HWindows^H^H^H^H^H^H^HMac^H^H^Hshit.

    • Actually, according to TFA, Apple's built-in toolkits (used by Aperture and Pixelmator) seem to be immune to this bug. Photoshop ceased being a mac-like application a very long time ago.

      This is one of many reasons why creative professionals prefer macs over PCs --- and I'm not saying this as platform evangelism -- for one, you'd be hard pressed to disagree that Mac OS X's font-rendering, kerning, and anti-aliasing abilities are far superior to those provided by Windows when presented with side-by-side exam

      • Re:HA! (Score:5, Informative)

        by Mr2001 (90979) on Wednesday February 24, 2010 @01:24AM (#31255804) Homepage Journal

        This is one of many reasons why creative professionals prefer macs over PCs --- and I'm not saying this as platform evangelism -- for one, you'd be hard pressed to disagree that Mac OS X's font-rendering, kerning, and anti-aliasing abilities are far superior to those provided by Windows when presented with side-by-side examples.

        Mac OS X's font rendering is different [joelonsoftware.com], but calling it "far superior" is simply platform evangelism.

        OS X renders text so that the on-screen representation looks more like the printed representation, which is good for tasks like designing print advertisements (where you want to approximate the finished product as closely as possible). Windows takes liberties with the shape and spacing of on-screen text in order to line it up with the pixel grid, which is good for tasks like word processing and programming (where legibility on screen is more important). When you're used to Windows, Mac text looks blurry; when you're used to the Mac, I imagine Windows text looks thin and lanky.

    • Re:HA! (Score:5, Funny)

      by Korin43 (881732) on Wednesday February 24, 2010 @12:28AM (#31255474) Homepage
      Well I tested the site in lynx and I didn't see any problems..
  • by ipquickly (1562169) on Tuesday February 23, 2010 @10:34PM (#31254570) Homepage

    I've been telling people for years that I look better in person.
    I told them that there's something wrong with pictures of me.

    HA!

    Now I know.

    It's the Scaling Algorithm BUG!

    • Re: (Score:3, Funny)

      by grcumb (781340)

      It's the Scaling Algorithm BUG!

      Man you gotta come up with a better line than that. Even 'I just went swimming', or 'It's cold in here' is more convincing than that!

      In any case, your girlfriend will still be disappointed....

      ... If you ever get one.

  • for people with poor quality displays!

    I now have a much better understanding of why I have to constantly adjust the angle of my (laptop) monitor every time I move my head. Some of the demos on that page are great for illustrating the effect of a poor quality display (or poor scaling algorithm) on picture quality. I'll keep that page in mind the next time I shop for a laptop.

  • by Animaether (411575) on Tuesday February 23, 2010 @10:47PM (#31254664) Journal

    Come on, this isn't news...

    Helmut Dersch (of Panorama Tools fame) certainly posted about this before;
    http://www.all-in-one.ee/~dersch/gamma/gamma.html [all-in-one.ee] - Interpolation and Gamma Correction

    There's no factual error in the scaling algorithm, as the /. headline would like you to believe - it's a color space (linearity) issue; you have to do your calculations in linear space which means a typical photo off of a camera/scanner gets the inverse of an sRGB curve applied (a gamma of 0.454545 is 'close enough' if you can't do the proper color bits). Then scale. Then re-apply the curve.

    And no - for real life imagery, nobody really cares - the JPEGs out of the cameras and subsequent re-compression to JPEG after scaling will have 'destroyed' far more data than the linearity issue.

    They're nice example images in the story, but they should be called 'academic'.
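How "close enough" the 0.454545 power really is can be checked numerically. (A sketch; the exact piecewise curve follows IEC 61966-2-1.)

```python
# Worst-case linear-light difference, over all 8-bit codes, between
# the pure 2.2-power approximation and the exact piecewise sRGB curve.
def srgb_decode(v):                 # encoded [0, 1] -> linear light
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

worst = max(abs((v / 255) ** 2.2 - srgb_decode(v / 255)) for v in range(256))
# under one percent of full scale -- small next to JPEG re-compression losses
```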

    • by evanbd (210358) on Tuesday February 23, 2010 @11:02PM (#31254798)

      The example images that make it really clear are academic examples. But the scaled photos are all enough of a change to be worth noticing and caring about if you're a serious amateur photographer (never mind professional). And they don't look particularly unusual to me (I haven't looked for odd trickery, but I assume he's being honest here).

      • by Animaether (411575) on Tuesday February 23, 2010 @11:13PM (#31254906) Journal

        If you're a *serious* amateur photographer, then you should already know about this and either not be using those apps or be using them in the right color modes (to use Photoshop parlance, as I guess most serious amateur photographers will have a copy of that, legit or otherwise).

        I guess the argument would hinge on who is a serious amateur photographer and who is just a regular amateur photographer.

        As for the actual examples - sure, you can see the difference.. especially since they're in before/after -style swappable pages. If I presented you a random image off of a random image gallery online, though, would you be able to tell the difference?

        If I showed you an online photo album and pointed at an image's thumbnail, had you click the thumbnail, and open up the full size image.. would you notice that it was scaled to thumbnail incorrectly?

        "Nobody really cares" may have been too broad a statement - but those who really care, already know.. or reasonably should know.
        imho.

        Note that I'm not excusing the software programs from handling this better - certainly not Photoshop - but it's 1. not a new revelation and 2. certainly not a "scaling algorithm bug".

        • Re: (Score:3, Insightful)

          by evanbd (210358)

          Note that I'm not excusing the software programs from handling this better - certainly not Photoshop - but it's 1. not a new revelation and 2. certainly not a "scaling algorithm bug".

          In what sense is it not a scaling algorithm bug? The images look different after scaling than before, when interpreted in accordance with the appropriate specs. It seems to me that the specification for the scale function is something like "returns an image that is as visually similar as possible to the original, but reduced in size by the specified amount." It might be known, and it might be better described as using the wrong algorithm than an algorithm bug, but it's definitely a bug in the program.

          The

    • by Skapare (16644) on Tuesday February 23, 2010 @11:21PM (#31254956) Homepage

      It's basically an implementation issue. The algorithms may be fine as intended ... in linear space. The programmers that implemented them didn't understand linear vs. gamma, or didn't care, or had a fire breathing PHB on their back. Hence we get junk software.

      At least all MY image processing code always works in linear space. But merely converting 8-bit gamma to 8-bit linear is no good, because that introduces serious quantizing artifacts (major banding effects). So I convert the 8-bit gammas to at least 30- or 31-bit integer if I need processing speed, or all the way to double precision floating point if I need as close to correct as possible. After processing, I convert back to 8-bit gammas. Even then, you can't totally eliminate some banding effects that result from being in 8-bit. If you can get more bits from the raw images from your camera, that's the best to use. Apparently many JPEG compressors also do their DCT calculations in the non-unit gamma space instead of the linear space (which reduces the effectiveness of the compression somewhat, and may add more compression artifacts).
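The quantization problem described above is easy to demonstrate. (A sketch with a plain 2.2 power curve; thresholds would shift slightly with the exact sRGB curve.)

```python
# Decoding 8-bit gamma values into 8-bit *linear* values destroys the
# shadows: many distinct dark shades collapse into the same linear code
# (banding). Decoding into floats keeps them all distinct.
GAMMA = 2.2

decode_to_8bit_linear = [round(255 * (v / 255) ** GAMMA) for v in range(256)]
dark_buckets = len(set(decode_to_8bit_linear[:32]))   # 32 shades -> a handful

decode_to_float_linear = [(v / 255) ** GAMMA for v in range(256)]
dark_floats = len(set(decode_to_float_linear[:32]))   # all 32 preserved
```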

  • Is it just me or do some of the examples look _better_ with the "incorrect" scaling?

    For example, this one of the NASA image:
    http://www.4p8.com/eric.brasseur/gamma_21.html [4p8.com]

    Look right at the center of that picture on the incorrect scaling one, and without moving your eyes switch it to correct. When I do it at least, it feels like my eyes instantly lose focus. With the incorrect scaling, everything looks perfectly crisp and clear. With the corrected one it takes a significant amount of effort to focus anywhere

    • Re: (Score:2, Insightful)

      by BoppreH (1520463)
      That might be true, but it's no reason to turn this into an undocumented and unavoidable feature.
    • by PitaBred (632671)

      Look at the islands and the lakes. They're crisp at the edges, whereas those islands disappear when you use the incorrect scaling. I wouldn't use the incorrectly scaled image for anything important.

  • by drewm1980 (902779) on Tuesday February 23, 2010 @10:57PM (#31254764)

    Gamma is often poorly understood even by people doing scientific and engineering work using images.

    Does your algorithm depend (explicitly or implicitly) on the light intensity -> pixel data mapping?

    If NO: You're probably wrong. Go read about gamma. Just because the picture looks right to you, doesn't mean it looks right to your code.

    If YES:

    Do you have the luxury of taking the pictures yourself?

    If NO: You're stuffed. Pretty much all images on the internet and most public research databases have unknown/unreliable gamma curves.

    If YES:

    1. Spend a lot of time calibrating your camera yourself. This is only cheap if your time is worthless

    or

    2. Buy a machine vision camera. A $600 machine vision camera will have the specs of a $50 consumer camera, but at least you will know the gamma curve.

    or

    3. Ignore the gamma issue, cross your fingers, hope it's not hurting your performance, and publish your results knowing that you're in good company and nobody will call you out.

    • by Anonymous Coward on Wednesday February 24, 2010 @05:01AM (#31256954)

      1. Spend a lot of time calibrating your camera yourself. This is only cheap if your time is worthless

      Typically this should be done during the initial set-up of any new camera that a photographer purchases. A Gretag Macbeth colour checker is cheap, the required shot to evaluate the performance of the sensor is quick and easy to set up, and the processing of this test image is fast with the right tools (like this script for photoshop [rags-int-inc.com]). It should take under an hour to do it right and make it part of the automatic stage of processing your RAW files (basically setting ACR/Lightroom's demosaicing stage), but the benefit is that every picture taken from then onwards does not need extra calibration. Thus your prints look like your shots, assuming the rest of your workflow is equally calibrated.

      While your time is valuable, if you do not calibrate like this, you're wasting time further down the line for EACH image, and thus it's more expensive to not do it...

  • by JackHoffman (1033824) on Tuesday February 23, 2010 @11:06PM (#31254840)

    Ok, so he made a very informative page about it, but this is still a well known effect. It affects practically everything you can do in image editing. Blurs, etc. Most people neither notice nor care. It's rooted in the fact that most images come with undefined black and white points and a gamma chosen for artistic effect rather than physical accuracy. Thus correctly converting to linear gamma is hardly ever possible. You can still correct for monitor gamma to avoid some rarely seen inconsistencies and artifacts, but most people don't even notice, so why bother? However, Photoshop does have everything you need to avoid the effect completely, even in the ancient Photoshop 6.0.

    Here's how to properly resize in Photoshop:

    1. Convert mode to 16 bit (to avoid tone aliasing in the next step, no other influence on the calculations)
    2. Convert to profile, select "Custom RGB", set Gamma to 1.0 (this converts the internal image data to linear gamma, no visible change because the image is color managed and corrected back to monitor gamma on the fly)
    3. Image Size
    4. Convert to profile, select "Custom RGB", set Gamma to 2.2 (default)
    5. Convert mode to 8 bit

    Done. You can substitute your favorite image filter for the image resize. Unsharp mask works much better at gamma 1.0, for example. Of course you can use several filters before converting back to monitor gamma and 8 bit.

    • Never heard of this one.... I tried it and found absolutely no difference:

      Open RAW file from a Nikon D2x in Adobe Camera Raw 5.6
      Open another copy - no changes in ACR (both are native size for camera)
      Copy 1 - Change color profile from ProPhoto (my working RGB) to Custom RGB with Gamma set a 1.0
      Copy 2 - Leave in ProPhoto
      Resize to half the original size, same pixel depth (both files)
      Run unsharp mask to taste on Copy 2 (ProPhoto)
      Run same unsharp mask on Copy 1
      Convert Copy 1 back to ProPhoto or custom
    • by jcupitt65 (68879) on Wednesday February 24, 2010 @05:20AM (#31257042)

      Yes, it depends on the colourspace you use for the resize. If you resize in a non-linear colourspace, you will get (usually) tiny errors.

      I'm the author of one of the programs listed as defective (nip2). If you care about this issue, all you need to do is work in a linear space, such as XYZ (Click Colour / Colourspace / XYZ).

  • Old news (Score:5, Interesting)

    by spitzak (4019) on Tuesday February 23, 2010 @11:33PM (#31255056) Homepage

    My software has been calculating in linear space for over a decade now (this is the Nuke Compositor currently produced by The Foundry, but at the time it was used by Digital Domain for Titanic). You can see some pages I wrote on the effect here: http://mysite.verizon.net/~spitzak/conversion/composite.html [verizon.net]. See here for the overall paper: http://mysite.verizon.net/~spitzak/conversion/index.html [verizon.net] and a Siggraph paper on the conversion of such images here: http://mysite.verizon.net/~spitzak/conversion/sketches_0265.pdf [verizon.net]. In fact a lot more work went into figuring out how to get such linear images to show on the screen on hardware of that era than into the obvious need to do the math in linear. Initial work on this was done for Apollo 13, as the problems with gamma were quite obvious when scaling images of small bright objects against the black of space.

    For typical photographs the effect is not very visible in scaling, as the gamma curve is very close to a straight line between two nearby points, so the result is not much different. Only widely separated points (ie very high contrast images with sharp edges) will show a visible difference; in practice that usually means line art, and there are screenshots in the html pages showing the results of this. Far worse errors can be found in lighting calculations and in filtering operations such as blur. At the time even the most expensive professional 3D renderers were doing lighting completely wrong, but things have gotten better now that they can use floating point intermediate images.

    One big annoyance is that you better do the math in floating point. Even 16 bits is insufficient for linear light levels as the black points will be too far apart and visible (the space is wasted on many many more white levels than you ever would need). A logarithmic system is needed, and on modern hardware you might as well use IEEE floating point, or the ILM "half" standard for 16-bit floating point.

  • I have a Shareware app that I purchased something like 15 years ago. It exhibits the bug; but it also has a gamma correction function. I corrected the gamma to 0.46 (approx inverse of 2.2), then scaled the test image of the Dalai Lama, then corrected back to 2.2. It looks fine. YMMV I suppose; but if your app supports gamma correction then by all means try this trick before doing anything more drastic. That's assuming of course that it's really critical for you; which as others have pointed out it probably

  • now that this is fixed, can we finally have those infinite resolution zoom-in functions they have in the movies?

  • Looks ok to me. Besides, I think I just read a Slashdot article that all digital media is going to rot in the long run.

    In the short run, look for a new marketing bullet point.
  • Wrong (Score:2, Insightful)

    by sootman (158191)

    "There is an important error in most photography scaling algorithms."

    No, there isn't. If millions of professional users haven't been bothered by it over the course of two decades, it is CLEARLY not important.

  • Has anyone else noticed the “dark gamma” cancer that came over the Internet since the dawn of cheap LCD displays?

    I have a couple of very carefully calibrated displays here, and pretty much everything on YouTube and nearly every image on the net has far too dark a gamma.
    This is caused by LCDs (especially the cheap ones) all being extremely bright and having a distorted gamma by default.

    It’s really annoying, since I always have to switch the color profile when I want to see anything in those vi

  • by buzzn (811479) on Wednesday February 24, 2010 @12:59AM (#31255648)
    Several people have spoken about "linear" RGB. That's nice, and it gets rid of some of the distortion introduced by the non-linearity, but that is only a start. For example, the eye sees R, G, and B differently: it is more sensitive to green than to red, and to red more than to blue, but it's not even that simple, as the equations in your eye's processor are much more complicated. Many algorithms that treat the three "equally" are going to change the perceptual mixture. One can use other color spaces, such as HSV, Yuv, xyY, etc., with different advantages and disadvantages.

    Sound makes a good analogy. When you play music through any given combination of source, amp and speakers, it sounds different. Sometimes we actually like a particular type of sonic "distortion". It's never exactly like the "original" live music, though.

    Likewise, any graphics manipulation is "distorting" the original. In fact, when I take a digital image and run it through Lightroom, do a range expansion/equalization, and do a bunch of tweaks to make the image look good, I'm making much larger changes than those little scaling problems listed in the article. The point is, do you think the result looks good?

    There are other important variables, such as what colors are next to other colors in the image, how long you look at the image, what else is around you, how tired you are, etc. There's no such thing as color fidelity; there are only approximations to it. Color is hard, and I mean really hard. See Hunt, "The Reproduction of Colour", or any number of other fine texts to learn more.
    • Re: (Score:3, Informative)

      by eggnoglatte (1047660)

      Actually, different color spaces are OK, so long as they are just linear transformations from cone space. That is the case for (linear) RGB, XYZ, Yuv, and a few others (though not HSV, which involves non-linear max/min operations). As long as the transformation is linear (i.e. just a matrix times the color vector to give you a color vector in the cone space), you can apply any linear operation (such as scaling, blurring, and other weighted sums), and the order of transformation is exchangeable.

      For example, say LMS = M * RGB and you want to average two pixels.
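      The commutativity being described can be checked numerically. The matrix below is a made-up 3x3 example, not a real RGB-to-LMS matrix:

```python
M = [[0.4, 0.5, 0.1],
     [0.2, 0.7, 0.1],
     [0.1, 0.2, 0.7]]

def transform(m, v):
    """Apply a 3x3 matrix to a 3-component color vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def average(a, b):
    return [(x + y) / 2.0 for x, y in zip(a, b)]

rgb1 = [0.8, 0.2, 0.1]
rgb2 = [0.1, 0.6, 0.9]

# Average then transform, versus transform then average: for a linear
# map the two orders agree (up to floating-point noise).
path1 = transform(M, average(rgb1, rgb2))
path2 = average(transform(M, rgb1), transform(M, rgb2))
print(max(abs(x - y) for x, y in zip(path1, path2)))
```

      The same swap fails for a non-linear transform such as gamma encoding, which is the whole point of doing the averaging in a linear space.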

  • Hummm, can you turn this bug around and come up with an image that appears totally gray at normal size but looks like something recognizable when you scale it up or down? If so, that could be the basis of a cool steganographic hack.
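    A toy sketch of the idea, using made-up numbers: fill the "hidden" regions with alternating 0/255 pixels and the background with solid 186 (roughly the gamma-2.2 linear-light equivalent of that pattern). At full size the two regions look like about the same gray, but a buggy encoded-space downscale darkens only the alternating regions, revealing the mask:

```python
def encode_mask(mask):
    """Two source pixels per hidden bit: a 0/255 pair for 1, solid 186s for 0."""
    row = []
    for bit in mask:
        row.extend([0, 255] if bit else [186, 186])
    return row

def naive_downscale_2x(row):
    """The buggy scaling: averaging adjacent pixels in encoded space."""
    return [round((a + b) / 2) for a, b in zip(row[0::2], row[1::2])]

row = encode_mask([0, 1, 0, 1])
print(naive_downscale_2x(row))  # the hidden bits reappear as 128 vs 186
```

    A gamma-correct scaler would instead map the 0/255 pairs to about 186 as well, and the mask would stay hidden.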

  • By accident, I happen to have the aforementioned (TFA) Lightroom 1.4.1.
    That wasn't skill, that was luck.

    I am a Pro Photographer and I've just purchased (well, the Mrs. got it for me for Christmas) a *Nikon Super Coolscan 9000 WITH matching photon torpedo sleds, complete with very lame software and support.

    I'm about to begin transferring a considerable amount of slides to digital and this is news to me. I still shoot film and plan to continue until pried from my cold dead hands etc., etc.
    The article illustra

  • by Xabraxas (654195) on Wednesday February 24, 2010 @11:01AM (#31259342)
    I noticed this bug the other day but I thought perhaps I made a mistake somewhere. I am creating a Drupal site for photos and it has a dark background. I was just testing out the image upload and I used an unscaled image. Later I scaled the same image down to save space and re-uploaded the image. The brightness was noticeably different. It's actually very hard to tell in a lot of cases, especially with a brighter background. A dark background really makes the bug apparent.
