High Dynamic Range (HDR) Technology Analysis
THG writes "CoolTechZone.com has published an analysis of Valve's High Dynamic Range, or HDR, technology that enhances graphics in video games. This new video/gaming graphics technology is expected to debut soon with Valve's Half-Life 2: Lost Coast title. According to the article, 'HDR, or High Dynamic Range, is a lighting process that's been designed to emulate in-game or artificially generated lighting to closely mirror the changes we see in the real world. In simpler terms, HDR allows you to make the objects brighter by allowing them to use the full brightness capabilities of the monitor and not just the brightness level at which they have been shot with (or rendered with) in the scene.'"
Article text, non-paginated, for your convenience (Score:2, Informative)
Written by Varun Dubey
Manufacturer: Various
Monday, 31 October 2005
(Review) - We've all played Half-Life and its sequel Half-Life 2. The difference between the two games, in terms of graphics, is tremendous, and now Valve has gone ahead and updated the gaming engine to give you a level of detail and realism that you thought wouldn't be possible until perhaps the next round of game releases.
HDR, or High Dynamic Range, is a lighting process that's been designed
Re:Article text, non-paginated, for your convenien (Score:5, Funny)
So you're saying that this one goes to eleven?
Re:Article text, non-paginated, for your convenien (Score:5, Insightful)
Re:Article text, non-paginated, for your convenien (Score:5, Informative)
"HDR allows you to make the objects brighter by allowing them to use the full brightness capabilities of the monitor."
Pretty bad lie. By using a #ffffff color you already "use the full brightness capabilities of the monitor", unless you count turning up the brightness setting on your monitor. As has already been said, it lets the objects be brighter in the internal calculations, not on the monitor.
AnandTech's review from a month ago was better... (Score:5, Informative)
Re:AnandTech's review from a month ago was better. (Score:3, Informative)
I'm still waiting for the updated Source SDK so I can build maps using HDR - it's something I'm really looking forward to. Eat your heart out, darkness-obsessed Doom 3 and friends!
Re:AnandTech's review from a month ago was better. (Score:2)
Re:AnandTech's review from a month ago was better. (Score:5, Informative)
Here, we see how the bloom effect starts to put a strain on the lower memory cards. The X800 and, in particular, the 6600 GT are the most memory-limited of these cards, but ATI's X800 does significantly better than the 6600 GT.
Welcome to Video Rendering 101. Tell me, class, which card will be faster, and by how much:
The 12-pipe, 400 MHz core clock card (x800), or the 8-pipe, 500 MHz core card (6600 GT).
This isn't hard. The x800, when core-limited, should produce speeds 20% faster than the 6600 GT...and lord almighty, it's a miracle: the x800 is 20% faster than the 6600 GT with full HDR enabled! It must be the EXTRA 128MB RAM, or the 40% FASTER MEMORY SUBSYSTEM. It couldn't be the damn raw pixel processing power advantage.
And now class, why would the lower-end cards in this test show greater performance loss? Is it because "Here, we see how the bloom effect starts to put a strain on the lower memory cards"?
HELL NO.
It's called CPU-LIMITED. You can't measure true relative performance drops because the scene is CPU-limited to approximately 70fps. The 6600 GT is not even able to reach the 70fps mark without HDR, and suffers noticeably with it on. The other cards scale as you would expect them to according to raw core clock speed, once you turn up the pixel processing requirements (full HDR), and the 7800 GTX is STILL CPU-limited.
And then, after mentioning it CLEARLY in the breakdown above that Valve's HDR implementation supports FSAA, AND after seeing plain-as-day that the 7800 GTX is still CPU-limited, the author doesn't try out FSAA performance. A 5-year-old could write the same review.
I wouldn't be surprised at all if most of the language and pictures are verbatim from a Valve-supplied press pack.
Let's do the time warp again... (Score:5, Informative)
It's okay to post old news, but Lost Coast is already out, as is DoD:S, which also uses HDR.
HDR Wizards (Score:5, Informative)
Wish List (Score:5, Funny)
- Ending world hunger
- Finding a cure for AIDS
- Making objects brighter by allowing them to use the full brightness capabilities of the monitor
Only two more to go! Thanks, Slashdot, for bringing this to my attention!
Re:Wish List (Score:2)
Most nerds don't have AIDS and aren't starving.
Re:Wish List (Score:2, Funny)
Re:Wish List (Score:2)
Re:Wish List (Score:5, Funny)
Re:Wish List (Score:4, Funny)
But I couldn't find one that fit my monitor.
Re:Wish List (Score:3, Funny)
Re:Wish List (Score:2, Funny)
Re:Wish List (Score:3, Insightful)
http://www.signonsandiego.com/uniontrib/20050902/
"Debut soon"? (Score:5, Informative)
For another, the first Valve game to use HDR is DOD:Source, and that's been out quite a while already.
And finally, Valve didn't actually invent HDR, so other stuff has already used it.
Re:"Debut soon"? (Score:2)
Different games such as FarCry have used what they've called HDR, but Valve came up with their own list of features which they felt should be present. Several of which hav
Re:"Debut soon"? (Score:2)
The technology was introduced by Valve in DoD Source (as the GP stated), and was fully implemented in at least 2 of the 4 original maps released (Anzio and Avalanche used it to full effect, not sure about the other two).
The tech demo was released specifically to showcase the lighting technology in the HL2 engine,
FarCry v1.3 too! (Score:2)
More than what was intended? (Score:3, Interesting)
I think that striving for accuracy and balance of the elements is probably more important than striving for the maximum ____ your system can deliver.
Re:More than what was intended? (Score:5, Informative)
Re:More than what was intended? (Score:5, Insightful)
This isn't about altering what any "author" intended. On the contrary, HDR is a new tool which lets the "author" do what's intended more easily, assuming what's intended is to achieve realistic lighting in the rendered scenes. Try Anandtech's recent article on the topic; they explain it very well.
Re:More than what was intended? (Score:5, Insightful)
It's a lot more than just a bass boost, since it's not just a brightness increase, but an increase in the range of brightness, allowing for very high contrast. If you go back and look at a Source game without HDR after seeing HDR for a while, it looks like it has a dark film over it, similar to how a digital camera picture looks before being run through auto-contrast in Photoshop.
Re:More than what was intended? (Score:3, Insightful)
Re:More than what was intended? (Score:5, Insightful)
When you wake up at night and you can see the room in nearly pitch black, things appear to be as bright as your room in the morning with the shades closed. Actually, the room in the morning is 1000 times brighter. The "author" of the real world (God?) "intended" the room to be 1000 times brighter by slamming 1000 times more photons onto your retina, but your brain normalizes things to make the world easier to comprehend.
Now, when you are playing a video game, and you go into a dark room with almost no light, current algorithms don't make it easier to see anything - they present you with a black screen. When you walk out into the sunlight, you get a white screen. This is not the way our brain sees the world, and it makes the experience less realistic.
Audio, on the other hand, can be presented to us nearly perfectly. My headphones can range from 20 Hz to 20 KHz, all the frequencies our ears can hear. The mega-bass thing is therefore useless if you have a decent pair of speakers.
I guess my point can be summed up with this example: you can make a really good set of speakers as loud as an airplane if you want. Your monitor, however, cannot shine the sun in your eyes.
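The grandparent's point about the brain normalizing brightness is essentially auto-exposure. Here is a toy sketch of a temporally adapting exposure; the 0.18 "middle grey" key, the adaptation rate, and all function names are illustrative assumptions, not any game's actual code:

```python
# Hypothetical sketch of eye adaptation: exposure drifts toward the current
# scene's average luminance, so a dark room brightens over a few "frames"
# instead of staying a black screen.
import math

def adapt_exposure(current_exposure, scene_avg_luminance, rate=0.5):
    """Move exposure toward the value that maps the scene average to
    middle grey (0.18). `rate` controls how fast the eye 'adapts'."""
    target = 0.18 / max(scene_avg_luminance, 1e-6)
    # Interpolate in log space, roughly matching logarithmic perception.
    return math.exp((1 - rate) * math.log(current_exposure)
                    + rate * math.log(target))

# Walking from sunlight (avg luminance 10.0) into a dark room (avg 0.01):
exposure = 0.18 / 10.0                  # fully adapted to the bright outdoors
for _ in range(10):
    exposure = adapt_exposure(exposure, 0.01)
# After a few frames the exposure approaches 0.18 / 0.01 = 18.0,
# and the dark room becomes visible again.
```

Whether to adapt per-frame or over seconds is an artistic choice; Lost Coast's effect is the same idea applied gradually.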
Re:More than what was intended? (Score:2)
Isn't adding more brightness than what the author originally intended..
As mentioned before, THE AUTHORS ARE THE ONES WHO ADDED THE FEATURE!!!!!! CHRIST!!!11ONEONE
NOT EVEN REMOTELY!!!!! I write a bit of electronic music, so I know the general mechanics of production. The closest analogue I can come up with is running audio through an expander (or a compressor set to a ratio < 1; same thing). What do
Re:More than what was intended? (Score:2)
Re:More than what was intended? (Score:2)
In short, any scene in the real world can have luminance (basically, brightness) values anywhere from
Re:More than what was intended? (Score:2)
HDR corresponds more closely to a musician deciding to play his guitar through an overdrive pedal rather than 'clean', making the quiet parts louder at the expense of the loud parts maxing out the available range. It's a decision made at the a
Re:More than what was intended? (Score:2)
Re:More than what was intended? (Score:2)
Basically, the human eye can see an incredibly wide range of brightnesses. While it may appear only half as bright indoors on a bright sunny day, it may actually be dozens of times brighter outside. We process brightness logarithmically.
The traditional video display concept is to map a list of colo
THANK YOU (Score:4, Interesting)
Re:THANK YOU (Score:5, Informative)
Ah, I see you have never designed any graphics related software or hardware whatsoever. While it is not possible to make truly unbounded colour brightness levels in graphics, it can be approximated with floating point arithmetic, clever gamma curves or just really big integers (32 bits per channel or so). All of these take a lot of processing power, a lot of memory or both. It is only recently that graphics card manufacturers have had powerful enough technology at their disposal to even think about this, let alone implement these techniques. If we had not had the hack known as 24-bit colour for the last twenty years we would have had nothing.
Now I come to think about it, the author of that article doesn't know much better: "Radiosity is a way of rendering a scene using only visible light sources" WTF? Radiosity is a way of rendering a scene by taking into account light bouncing between surfaces, being absorbed by surfaces and emitted at different wavelengths, etc. Pretty much the opposite of what it describes, since real radiosity will create an effect similar to ambient lighting. The article is written by idiots for idiots.
Re:THANK YOU (Score:2)
Re:THANK YOU (Score:2)
There's also the question of how you flatten that down to the gamut of the monitor. Your typical monitor has maybe 2 to 3 orders of magnitude. Figuring out the best way to compress HDR down to that scale (i.e. tone-mapping) is still an active area of researc
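Tone-mapping operators vary widely; as a minimal global sketch (a logarithmic curve, chosen here purely for illustration, not a claim about what any engine uses), compressing a wide luminance range into a monitor's 2-3 orders of magnitude can look like this:

```python
# Toy global tone-mapping operator: compress ~6 orders of magnitude of scene
# luminance into [0, 1] for display, preserving relative ordering.
import math

def tonemap_log(luminance, white_point):
    """Map [0, white_point] to [0, 1] logarithmically."""
    return math.log1p(luminance) / math.log1p(white_point)

scene = [0.001, 0.1, 1.0, 100.0, 10000.0]   # six orders of magnitude
white = max(scene)
display = [tonemap_log(v, white) for v in scene]
# Every value now fits in [0, 1]; the brightest maps exactly to 1.0.
```

Local operators (which vary the curve per image region) do better perceptually but cost far more, which is part of why this is still research.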
Re:THANK YOU (Score:2)
Re:THANK YOU (Score:2)
Re:THANK YOU (Score:2)
Re:THANK YOU (Score:2)
What's your point?
need more graphics control (Score:2, Interesting)
I remember years ago, I could still play the games fine if I just turned the graphics down - but that doesn't work anymore! My GeForce 2 lasted more than 2 years, but this one b
Re:need more graphics control (Score:2)
OK, it's not quite as portable as a laptop, but if you want portable, get a PSP or DS; they're way cheaper than a decent graphics card.
Re:need more graphics control (Score:2)
Re:need more graphics control (Score:2)
HDR and Lost coast (Score:5, Informative)
HDR is a hack (Score:2, Insightful)
Furthermore, we really need to increase contrast ratios
Re:HDR is a hack (Score:2)
It's more like the situation with CDs -- they're sampled at "only" 44.1 kHz, and thus unable to capture frequencies higher than half of that. And they have "only" 16-bit resolution. There are newer formats with ridiculous sample rates and resolutions.
Witness people yawn. It quite simply doesn't interest the overwhelming majority of the people. What intere
Re:HDR is a hack (Score:3, Informative)
I'd have to disagree with you about that.
CD-quality audio gets pretty close to the limits of what the average person can hear. It's not perfect, but as you say it meets the "good enough" threshold.
Current display technology doesn't. Look at this representation of what's lost with sRGB [wikipedia.org]. See what a tiny portion of green colours (which our eyes are most sensitive to!) in particular are represented?
I went to a concert a few months ago (Dead Can Dance) and their stage lightin
fade to black (Score:2)
Re:HDR is a hack (Score:2)
Re:HDR is a hack (Score:2)
Sure, for some people, some of the time, it would make sense. Just as there are some people who, some of the time, need sound recordings with higher frequency ranges and/or more sample accuracy than standard CD.
I'm just saying there's no significant market for it. In the sense that there's not a large population of people who are willing to pay significantly more to ge
Re:HDR is a hack (Score:5, Insightful)
The alternative (the traditional 8-bit path) corresponds to a reality where "no light can be brighter than the white of the monitor", including when adding up light from several sources! Trying to get real photorealism that way is a lost cause. There's a reason why even Hollywood CG until recently always looked really 'flat', except in very dark scenes (lower dynamic range to model).
And for the record: those blooming effects are not part of HDR. They are simply post-process SFX, emulating scattering and other effects in the eye and in cameras. Sure, you couldn't do them really well without HDR, so they're a nice poster child for what it lets you do. But they are not what makes the process HDR.
Volumetric effects are of course not inherently HDR either; they've been around a long time, just too heavy for most games to bother with until now, and they look much better with HDR (and bloom).
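The bright-pass-then-blur structure of a typical bloom pass can be sketched as follows. This is a toy 1-D version with made-up function names; real implementations work on downsampled 2-D buffers, and none of this is Valve's actual code:

```python
# Bloom as a post-process on an HDR buffer: extract the overbright part,
# blur it, and add the halo back on top of the image.
def bright_pass(pixels, threshold=1.0):
    """Keep only the overbright part of each pixel (needs HDR values > 1.0)."""
    return [max(p - threshold, 0.0) for p in pixels]

def box_blur(pixels):
    """3-tap box blur, clamped at the edges."""
    n = len(pixels)
    return [(pixels[max(i - 1, 0)] + pixels[i] + pixels[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def add_bloom(hdr_pixels):
    halo = box_blur(bright_pass(hdr_pixels))
    return [p + h for p, h in zip(hdr_pixels, halo)]

hdr_row = [0.2, 0.3, 4.0, 0.3, 0.2]    # one overbright pixel (a light source)
bloomed = add_bloom(hdr_row)
# Neighbours of the bright pixel pick up a halo. On an 8-bit buffer nothing
# exceeds 1.0, so the bright pass finds nothing to spread.
```

This is exactly why bloom is a poster child for HDR without being HDR itself: the effect only has something to work with when overbright values survive into the post-process stage.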
Re:HDR is a hack (Score:2)
http://sharp-world.com/corporate/news/051003.html [sharp-world.com]
uses exactly your suggestion, I'm pretty sure: a "backlight" made of an array of LEDs
and to all the responses to your comment by people with a "24-bit colour is good enough" and "CD quality is good enough", what the hell are you doing on Slashdot if you're not interested in improving technology to the point where it's perfect, not just "good enough" (and no, it's NOT good enough, many people with keen ears and good eyes, myself
Re:HDR is a hack (Score:2)
That's why the backlight should be distributed in LCDs. You would have a perfect black level if you only lit the areas where there was any significant light source. And LEDs produce very minimal heat, so temperature is not an issue. The cool thing is, if you had that much brightness your eyes would naturally adjust like they would if
Re:HDR is a hack (Score:2)
LEDs, especially in large numbers, produce plenty of heat for the light output they have. You don't notice that heat because usually they are alone and dim. Get one of the LED lamps from ThinkGeek and you will see that LEDs aren't much more efficient than regular lamps, and are less efficient than compact fluorescent or fluorescent tubes
Inaccurate definition (Score:5, Informative)
"High Dynamic Range (HDR) rendering is a technique used to retain color precision of a rendered scene as it goes through the rendering pipeline...
For applications, especially games, this means that our scenes will be rendered in a more realistic manner in terms of lighting. Using high dynamic range rendering we can add a great deal of detail to our applications by retaining as much light information as possible. This will then cause our objects and surfaces to be displayed in a way that comes closer to resembling real life than ever before.
The problem with non-HDR games is that traditionally, the color precision of a rendered scene is lost, and the rendered display is limited to a low dynamic range of color values between 0 and 255. In the past, this limitation was mainly a result of PC or console hardware only supporting integer buffers, which have a limited range of precision when compared to floating-point buffers. Thus, to perform HDR rendering we will need to render our scene to an off-screen floating-point surface, so that the data can be manipulated and made ready to be displayed on the screen."
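The quoted point about integer versus floating-point buffers can be illustrated with a toy sketch. The function names and the Reinhard-style display curve here are my own illustrative assumptions, not anything from the quoted text:

```python
# Contrast the traditional clamped path with a floating-point (HDR) buffer.
def render_ldr(luminances):
    """Traditional path: every value is clamped to [0, 1] as soon as it is
    written, so anything overbright collapses to display white."""
    return [min(max(v, 0.0), 1.0) for v in luminances]

def render_hdr(luminances, exposure=0.5):
    """Float buffer keeps the full range; a Reinhard-style curve compresses
    it into [0, 1) only at display time."""
    return [v * exposure / (1.0 + v * exposure) for v in luminances]

scene = [0.2, 1.0, 4.0, 8.0]   # relative luminances; two are "overbright"
ldr = render_ldr(scene)        # 4.0 and 8.0 both collapse to 1.0
hdr = render_hdr(scene)        # 4.0 and 8.0 stay distinguishable
```

The clamped path destroys the distinction between the two bright values immediately; the float path keeps it all the way to the final display mapping, which is the whole point of rendering to a floating-point surface.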
Also, it's not Valve's technology. They've implemented it in the Source engine now, but they didn't invent it and I'm pretty sure they're not the first to use it.
Re:Inaccurate definition (Score:2)
Re:Inaccurate definition (Score:2, Insightful)
Re:Inaccurate definition (Score:2)
After all, who can forget the Cock Monster on the roof scene? :-)
Hardware? (Score:2, Insightful)
Technique not Technology (Score:2)
False (Score:2)
Not Valve's HDR... (Score:5, Insightful)
And that would be inaccurate, too (Score:4, Insightful)
However, 'HDR' as the storage format being used most frequently already existed in the rendering application Radiance for a long, long time before that.
In fact, -most- rendering applications render in HDR - but are forced to clip values so that you can actually output it to a regular display (e.g. your TFT) or storage format (e.g. JPG).
In fact, Valve's HDR isn't an HDR display technology. It's a partial HDR pipeline for rendering (making sure that glints of the sun are bright on water surfaces, and not dull), processing (bloom effects) and simple tone mapping a la a LUT (look into a room from a skylit outside, and the room may appear dark. Walk inside, and the room appears normal whilst the outside world will appear very bright indeed. Note that a more proper tone mapping algorithm would, besides being computationally very expensive, show the room normally and the outside world bright - but not so bright as to be blown out.)
Once we've all got HDR displays (search on Slashdot for these - I've seen them, they're awesome), we can do away with all these basic gimmicks, as the human visual perception system will simply do all the interpreting of what should be 'correct' HDR values coming from the display.
HDR is used similarly in film/digital photography (Score:5, Informative)
Photoshop CS2 includes this technology out of the box (Photoshop CS2 HDR [luminous-landscape.com]) -- in the demo page, notice that the sky is properly exposed as well as the vegetation on the hill in the foreground; this would be impossible to capture with many cameras. As the article linked by the original post states,
"HDR, or High Dynamic Range, is
And indeed that's what the photographic equivalent does. Unlike a camera, our eyes can properly "expose" the ground as well as they can the sky in the same scene. In fact, this is mentioned on pages 2 and 3 of the linked article in the original post.
More:
HDR - High Dynamic Range Compression - a Photoshop plugin [powerretouche.com]
The Future of Digital Imaging - High Dynamic Range Photography (HDR) [cybergrain.com]
Aizu University's Atrium High Dynamic Range Source Images [mpi-inf.mpg.de]
High dynamic range imaging - Wikipedia, the free encyclopedia [wikipedia.org]
Stitched HDRI [gregdowning.com]
If you would like to try this yourself, many digital cameras have a bracketing feature. I'd suggest at least five exposures, separated by one half stop or one full stop. However, it does not work well for moving objects since there will be a short amount of time that elapses between exposures.
Here is my first attempt:
High Dynamic Range Candy Corn [buran.org]
This particular shot was taken with a Canon EOS 1Ds MkII camera and manual bracketing, although I've made other successful attempts with the bracketing feature of my Nikon D70.
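For the curious, the merging step that tools like Photoshop CS2 perform on bracketed shots can be sketched roughly as follows. This is a simplified take on the Debevec/Malik weighted average, assuming a linear camera response (which real cameras do not have, so real tools calibrate it first); the function and weighting are illustrative:

```python
# Merge one scene point's values from several bracketed exposures into a
# single high-dynamic-range radiance estimate.
def merge_bracketed(pixel_values, exposure_times):
    """pixel_values in [0, 1], one per exposure; returns estimated radiance."""
    def weight(v):
        # Trust mid-tones most; clipped shadows and highlights least.
        return max(1e-6, 1.0 - abs(2.0 * v - 1.0))
    num = sum(weight(v) * (v / t) for v, t in zip(pixel_values, exposure_times))
    den = sum(weight(v) for v in pixel_values)
    return num / den

# The same scene point shot at 1/4 s, 1 s and 4 s. In the longest exposure
# the pixel has clipped to 1.0, so it contributes almost nothing.
radiance = merge_bracketed([0.1, 0.4, 1.0], [0.25, 1.0, 4.0])
```

The mid-tone weighting is why bracketed merges recover both the sky and the shadowed foreground: each region is read from whichever exposure captured it without clipping.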
Re:HDR is used similarly in film/digital photograp (Score:2)
Graduated neutral density filter. Cokin makes a nice kit.
Actually, a whole lot of what gets done in Photoshop can be done much faster at exposure time. Like lighting colour adjustments (warming/cooling filters), increasing colour saturation (polarizers), selective focus (large aperture and/or vaseline-smeared UV filter), perspective correction (proper came
Re:HDR is used similarly in film/digital photograp (Score:2)
Re:HDR is used similarly in film/digital photograp (Score:2)
Sidenote: Your username makes me wonder if you're a VW driver. Are you?
Umm (Score:2)
Auto-auto-levels (Score:2, Insightful)
HDR needs HDR display... (Score:3, Interesting)
Too bad the BrightSide [brightsidetech.com] display is "a little costly"... (Think 'several small cars' costly.)
Re:HDR needs HDR display... (Score:2)
If you're on a bright beach, it looks bright, but not nearly as bright as when you first look out your hotel room at it. A good HDR implementation would make the beach brighter when you first look at it, for example.
Burnout 3 Takedown on the PS
HDR in action (Score:4, Interesting)
Fundamental definition (Score:4, Insightful)
There is something missing though, which I think would be beneficial. Basically, the eye has rods and cones for luminance and colour respectively. The rods are far more sensitive than the cones, with the result being that in very low light conditions, we see in greyscale. I have never seen this effect in a game (or film), and I think it would really enhance the realism, especially in darker games like Doom3. It would be even better if the display could become slightly blurry and noisy as the rods are not as high resolution as the cones.
new technology? (Score:2, Informative)
HDR is not a new thing (Score:3, Informative)
HDR Glow in Unreal 3 [unrealtechnology.com]
Although some say light blooms are NOT high-dynamic range (which is true for the case where you just make something radiate light in a way that washes out details of objects around it - see here [wikipedia.org]), light blooms can be done with high-dynamic range color, which is what the Unreal 3 Engine [unrealtechnology.com] page mentions in a brief caption for the above picture.
Anyways, there are other games that ALREADY do HDR, such as Far Cry (with patch 1.3 or above). The best place to get a good view of it is ON a beach in Far Cry that is directly in the sun. It is funny that Far Cry has been ignored as the first of its kind in many things, but it really did do a lot of stuff that Doom 3, Half Life 2, etc. did, except earlier. It was also virtually bugless, compared to, for example, the stuttering bug common in Half Life 2. Most are misinformed in crediting games such as HL2 or D3 with bringing in the generation of shader-heavy games (aka 'next gen' games).
That being said, if you don't know what HDR is, the Anandtech Article [anandtech.com] on HL2:TLC is a good read.
Good graphics = Bad graphics (Score:2, Insightful)
I can see it now- Unreal Tournament 2007: Pre-Order and get a FREE pair of Eagle-Eye sunglasses using patented NASA anti-glare technology!
Just give me the damn Kryptonite fog already. Serves us right for letting game designers use that
my brightness goes to eleven (Score:2, Funny)
WTF - no screenshots?!? (Score:2, Informative)
Half Life 2: Lost Coast HDR overview [bit-tech.net]
Half-Life 2: Lost Coast review [bit-tech.net]
Half-Life 2: Lost Coast Benchmarks [bit-tech.net]
Day of Defeat: Source review [bit-tech.net]
See also (Score:2)
Doesnt actually explain HDR (Score:2)
Crap article. Check out OpenEXR (Score:3, Informative)
Re:Crap article. Check out OpenEXR (Score:2)
Re:Crap article. Check out OpenEXR (Score:2)
Article with pictures (Score:2)
this is new? (Score:2)
Slightly different tack---HDR Compression? (Score:2)
HDR is marketing speak (Score:3, Informative)
A second use for more bits is various image-based rendering techniques. For these, 16 bits is often not enough, unless you go floating point -- and even then, 32-bit floats will produce better results. These techniques often use "blacker than black" (negative values) and "whiter than white" (values > 1.0) as intermediate results of calculations.
As a side note lamenting the demise/withering into obscurity of a once great company, starting around 1992 with the reality engine, SGI made graphics pipelines with 12 bit/channel RGBA support from end to end. It is only recently that we see support for more than 8 bits/channel in the pc world.
The Future of Graphics (Score:3, Insightful)
Re:Sounds better than "turning up the contrast" (Score:5, Interesting)
HDR is a technique that uses floating point values rather than integers to represent luminance values within the pre-rendered scene. These values are then compared to each other before the scene is actually rendered, and the luminance of each portion of the rendered scene is assigned based on the relative brightness of each light source compared to the others. Basically, if you have a bright floodlight and a small flashlight visible to the camera, the floodlight should vastly overpower the flashlight and should probably max out the brightness of the physical display device. However, if you move the camera angle up a little bit and include the sun in the scene, then HDR would dynamically darken all the other lights in order to make the sun look like the brightest light source, and the sun would then take the highest brightness setting of the display device.
Another effect that is created using HDR is glare. An example of this in the real world is when you look directly at a bright light source, like the sun (I don't really recommend trying this out with the sun because it might cause eye damage but a flashlight or a light bulb should work too). The light source tends to look larger than it actually is because the light drowns out anything around it.
HDR rendering has been hardware accelerated on the last few generations of video cards, but only recently has performance been acceptable enough to actually implement it in a commercial game.
-GameMaster
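The parent's normalize-to-the-brightest-source idea can be sketched as a toy example; all luminance numbers here are made up for illustration, and the function name is hypothetical:

```python
# Scale a scene so its brightest visible light source takes the display's
# maximum value, pushing everything else down relative to it.
def normalize_to_brightest(sources):
    """sources: dict of name -> luminance; returns display values in [0, 1]."""
    peak = max(sources.values())
    return {name: lum / peak for name, lum in sources.items()}

# The floodlight dominates the flashlight...
frame1 = normalize_to_brightest({"flashlight": 50.0, "floodlight": 2000.0})
# ...until the sun enters the frame and everything else is pushed way down.
frame2 = normalize_to_brightest({"flashlight": 50.0, "floodlight": 2000.0,
                                 "sun": 1.6e9})
```

Tilting the camera to include the sun changes which source maps to display white, which is exactly the "dynamic" part of high dynamic range that the parent describes.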
Re:Sounds better than "turning up the contrast" (Score:2)
Re:Sounds better than "turning up the contrast" (Score:2)
To do a decent job at compressing dynamic range and
Re:Sounds better than "turning up the contrast" (Score:3, Informative)
Believe it or not, CRTs really do get good contrast and colour, better than LCDs. Many people prefer LCD
Re:Monitor burnout (Score:2)
Your monitor probably dimmed quickly after your adjustment because it was really starting to fail. If you have to boost the brightness just to see an image, your tube is already dead. It will get darker at an exponential rate.
Re:Monitor burnout (Score:3, Funny)
> running very high brightness areas on screen is going
> to seriously reduce the lifespan of crts.
So I should rather play Doom 3 than read Slashdot on that evil white background?
Re:Monitor burnout (Score:2)
A plasma screen (more of an issue for console games) on the other hand will age in the same way that a CRT does since every pixel has a coloured phosphor. Plasma can get that "burned image" effect as well.
Re:Monitor burnout (Score:2)