Graphics Displays Intel

Improved Image Quality For HMDs Like Oculus Rift 55

An anonymous reader writes "The combination of smartphone panels with relatively cheap, lightweight lenses enabled affordable wide-angle Head Mounted Displays like the Oculus Rift. However, these optics introduce distortions when viewing the image through the HMD. So far these have been compensated for in software with post-processing pixel shaders that warp the image, but the warp's resampling causes a perceptible loss of image quality (sharpness). Now researchers from Intel have found a way around this error by sampling differently during rendering, potentially increasing the image quality of all current and future HMDs with a wide field of view." Rather than applying barrel distortion to the final raster image, the researchers warp the scene geometry during rasterization. However, this currently requires ray tracing, so it's a bit computationally expensive. Note that a vertex transformation can be used instead (with tessellation to enhance the approximation), but the results are of variable quality.
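The post-processing approach described above can be sketched as a per-pixel inverse warp. This is a minimal NumPy illustration with made-up distortion coefficients, not the Oculus SDK's actual shader:

```python
import numpy as np

def barrel_warp(image, k1=0.22, k2=0.24):
    """Resample a rendered frame with a radial (barrel) distortion,
    approximating what an HMD post-process pixel shader does.
    k1 and k2 are illustrative polynomial coefficients, not Rift values."""
    h, w = image.shape[:2]
    # Normalized coordinates centered on the image, in [-1, 1]
    ys, xs = np.mgrid[0:h, 0:w]
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    # For each output pixel, fetch from the radially scaled source location
    sx = np.clip((x * scale + 1) * (w / 2), 0, w - 1).astype(int)
    sy = np.clip((y * scale + 1) * (h / 2), 0, h - 1).astype(int)
    return image[sy, sx]
```

Nearest-neighbor sampling is used for brevity; real implementations use bilinear filtering, and it is exactly this resampling step that costs sharpness.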
This discussion has been archived. No new comments can be posted.

  • by Beardydog ( 716221 ) on Wednesday October 23, 2013 @12:01PM (#45213129)
    Oculus Rift is one of the greatest products ever, and Ima let you finish, but this is even better for multi-monitor gaming.

    At least Oculus Rift had identified and addressed the problem of distortion, even though their solution loses image quality. Multi-monitor gaming has been garbage for a decade because everyone seems content with horrific distortion at large FOVs.

    I know, it's all a matter of screen placement and eye positioning. That's dumb. I want a wrap-around image. I want to aim a projector at each of three walls and have the result make sense.

    If you've tried Fisheye Quake, you know it's hell on your system, and still doesn't look great. If this technique is at all performant, everyone needs to start shipping support for it, and they need to start yesterday.
  • How about you use better lenses so you don't have distortions in the first place?
    Stop trying to fudge shit in software. It doesn't work with projectors with shitty lenses, telescopes with shitty lenses, or camera phones with shitty lenses.
    If you want a good image use good optics. Yes, that means added cost and weight. Deal with it.

    • Yes, that means added cost and weight. Deal with it.

      Sure, if you don't mind producing a headset that no one wants to buy because it's too expensive, no one wants to use because it's too heavy, and no one wants to support because of the first two things. Or, in other words, if you want to be a complete and utter failure.

      • Yes, that means added cost and weight. Deal with it.

        Sure, if you don't mind producing a headset that no one wants to buy because it's too expensive, no one wants to use because it's too heavy, and no one wants to support because of the first two things. Or, in other words, if you want to be a complete and utter failure.

        Except the people who are interested in these headsets (I am not among them) all excrete into their panties at the mere mention of the Oculus Rift. The only thing they complain about is the image quality. Fixing that should be their priority, even if it makes the units slightly heavier or slightly more expensive.

    • It's not a lens problem. The lenses are trying to correct for the fact that current games render 3D images meant for display on flat surfaces. The lens is there to distort the image and make it wrap around your eyes, but the portion of the image you're wrapping is distorted and lacking detail, even before the lens smears it across your peripheral vision. This is a method for making the initial image much better and full of data, so that less aggressive smearing is necessary and the pre-smear image has more data in it to begin with.
      • It's not a lens problem. The lenses are trying to correct for the fact that current games render 3D images meant for display on flat surfaces. The lens is there to distort the image and make it wrap around your eyes, but the portion of the image you're wrapping is distorted and lacking detail, even before the lens smears it across your peripheral vision. This is a method for making the initial image much better and full of data, so that less aggressive smearing is necessary and the pre-smear image has more data in it to begin with.

        Wouldn't the next-step solution be to use curved OLED screens and develop rendering engines which take into account the spherical nature of the monitors?

      • by Guspaz ( 556486 )

        Except it's not just the image that is distorted and needs correcting, it's also the focal plane. And to those of us with vision problems like myopia, the curved focal plane of the Oculus Rift means only part of the image will be in focus. You can't correct for focus in software.

  • Hi,

    Can't one just subdivide (tessellate) polygons that appear relatively large in screen space so that they consist of many small polygons, with a few pixels each? This would allow for doing the barrel distortion entirely in the vertex shader, with no ray tracing required. The challenge is to perform the dynamic tessellation without requiring constant updating of the geometry (vertex buffers) on the GPU.
    Maybe newer APIs like DirectX 10 or 11 would support this dynamic tessellation approach in hardware.
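The idea in this comment can be sketched on the CPU: barrel-distort each vertex, and subdivide triangles so that the linear interpolation the rasterizer does between distorted vertices stays close to the true curved warp. The coefficient and helpers below are illustrative, not from any real SDK:

```python
import numpy as np

def distort(p, k1=0.22):
    """Barrel-distort a 2D vertex in normalized device coordinates,
    as a vertex shader might. k1 is an illustrative coefficient."""
    r2 = p[0] ** 2 + p[1] ** 2
    return p * (1.0 + k1 * r2)

def subdivide(tri):
    """Split one triangle into four by its edge midpoints.
    Applying distort() to the finer mesh approximates the curved
    warp better than warping only the three original corners."""
    a, b, c = tri
    ab, bc, ca = (a + b) / 2, (b + c) / 2, (c + a) / 2
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
```

For a long edge, the midpoint of the two distorted endpoints differs noticeably from the distortion of the true midpoint; subdividing the edge shrinks that gap, which is why tessellation improves the approximation.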

    • by Nemyst ( 1383049 )
      That's the last line of TFS. Tessellating the mesh so that triangles are approximately the size of a pixel is nearly as costly as raytracing, though.
  • From TFA (emphasis added):

    Consumer, wide-angle HMDs use relatively cheap lenses to bring the screen into focus for the human eye and enable the wide field of view. The drawback is that there are spatial distortions and chromatic aberrations. Luckily, those can be compensated for in software; this is done, e.g. in the Oculus SDK, as a post-processing pixel shader. The default implementations warp the image in a barrel-distorted way, using bilinear texture access to do so. The result is that the

    • Having a screen display your prescription so you don't have to wear lenses or glasses? Is that even possible?
    • by Arkh89 ( 2870391 )

      For the first thought: it is extremely hard (~impossible, depending on your specifications) to make good-quality optics with few elements that are (almost) free from distortion, chromatic aberration, and other aberrations (to keep it simple: blur) at the same time.

      For the second, it might not be possible. The problem with our eyes is that they image a perfect point not as a point but as a small blur. The imaging quality is best when the size of this blur is kept minimal (it's more complicated in practice, but

      • by Arkh89 ( 2870391 )

        My bad, the cornea is not the eye lens, shame on me...

      • by martyb ( 196687 )

        Thank-you for your thoughtful reply! It's been decades since my last physics class that dealt at all with optics, and we never got into the various distortions and aberrations. It was also in light of a theoretically perfect lens (or two) and a bit with reflection and refraction. Your explanation of the wavefront delays made perfect sense - I think I see it now (pun intended!).

    • by lxs ( 131946 )

      I recommend an optics course as your very first step because no distortion of the image on the screen will correct for the failure of your eyes to form a sharp image. What you want could possibly be done with liquid lens technology, but it will take decades for that to be anywhere close to affordable for the large lenses needed in this application.

      • by martyb ( 196687 )

        It's been decades since my last physics course that dealt at all with optics, so a course that dealt specifically with optics is not a bad idea. We only touched upon ideal lenses, reflection, and refraction. Never touched on aberrations or distortions.

        Liquid lenses would be nice. Bifocals are a pain. Full-lens near and far vision would be wonderful!

    • My first thought was "Why not use better quality lenses?" Sure, they'd be more expensive, but there is an expense involved in the software having to correct Every Single Frame. Why not fix it once, at the source, and obviate the need for continuous real-time updates?

      Because the corrections are needed no matter how good your lenses are - it's a remapping of pixels so those pixels appear in the correct place in your plane of vision, and doesn't have anything to do with compensating for low quality lenses.

      I'd imagine a learning session where certain scenes (e.g. grids) are displayed and the system would apply software corrections under my control until it looked good to me.

      Not possible, I'm afraid. Whatever the screen puts out is going to get blurred by your astigmatism and short-sightedness, and only a physical lens can pre-correct it for your eyes.

      • by martyb ( 196687 )

        Because the corrections are needed no matter how good your lenses are - it's a remapping of pixels so those pixels appear in the correct place in your plane of vision, and doesn't have anything to do with compensating for low quality lens

        Yes, I get that now. The corners of a flat display are farther away than the center, and the pixels there subtend a smaller arc on the eye than those at the center do. Duh!

        I'd imagine a learning session where certain scenes (e.g. grids) are displayed and the system wou

    • by Nemyst ( 1383049 )
      Sadly, you can't apply that much correction in software. Warping can be done to a certain extent, but you cannot fix chromatic aberrations, which are inherent to any wide-angle lens, and other such optical effects. Even quality lenses would not eliminate everything and will still cause uneven pixel density across your field of view.

      The Oculus Rift, like most VR head gear, is based on two small screens, one per eye. Those screens are rectangular and that's it. If you output a rectangular image, the image,
      • You can correct for chromatic aberration in software, to a varying degree. You can approximate it (so the aberration is ~1/3 of what it would normally be, by aligning the centers of the primary colors) for arbitrary inputs, e.g. a photograph captured with an imperfect lens (image editing software can do this). You can do it on the output side with perfect accuracy if you're displaying an image using three monochromatic light sources (e.g. a laser display), since the three wavelengths involved would then be
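The approximate correction described above (aligning the centers of the primary colors) can be sketched as sampling each channel at a slightly different radial scale, so that after the lens spreads the wavelengths apart the channels land closer together. The scale factors here are made up for illustration:

```python
import numpy as np

def prescale_channels(image, scales=(1.000, 1.007, 1.014)):
    """Radially rescale the R, G, B channels of an HxWx3 image by
    slightly different factors, a rough software pre-correction for
    lens chromatic aberration. Scale values are illustrative only."""
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalized coordinates centered on the image, in [-1, 1]
    x = (xs - w / 2) / (w / 2)
    y = (ys - h / 2) / (h / 2)
    out = np.empty_like(image)
    for c, s in enumerate(scales):
        # Each channel samples from its own radially scaled location
        sx = np.clip((x * s + 1) * (w / 2), 0, w - 1).astype(int)
        sy = np.clip((y * s + 1) * (h / 2), 0, h - 1).astype(int)
        out[..., c] = image[sy, sx, c]
    return out
```

As the comment notes, this only approximates the fix for broadband displays, since each channel still spans a range of wavelengths.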

      • by martyb ( 196687 )

        Sadly, you can't apply that much correction in software. Warping can be done to a certain extent, but you cannot fix chromatic aberrations, which are inherent to any wide-angle lens, and other such optical effects. Even quality lenses would not eliminate everything and will still cause uneven pixel density across your field of view.

        My bad. I focused on the word "cheap" in "relatively cheap lenses", and assumed that was the cause of (at least some) of the problems they were trying to fix in software.

        The

    • My first thought was "Why not use better quality lenses?" Sure, they'd be more expensive, but there is an expense involved in the software having to correct Every Single Frame.

      The idea behind the Rift is to produce an HMD that does NOT cost $1,300 to build.
      It does this by using cheap off-the-shelf parts.

      Whereas things like the "eMagin Z800 3D Visor [wikipedia.org]" use special-purpose display units (OLEDs) and need very special, complex optics so that the virtual display seems square (at 60°, it had one of the widest fields of view of its time), Oculus went:
      - Fuck this expensive shit, let's use the same Retina-level display as any other smartphone on the market, and throw a rather simple lens at it.

      • by martyb ( 196687 )

        The idea behind the Rift is to produce an HMD that does NOT cost $1,300 to build. It does this by using cheap off-the-shelf parts. [brevity snip]

        Thanks! Early adopters are willing to pay more for something, but it helps to get the price down as soon as possible so as to build sales volume. They've put their money where they get the best bang for the buck, and lenses are not it.

        The bad news is that your problem can't be fixed in software. You're near-sighted + astigmatic, meaning that your eye fails to focus on the picture (and can't focus on a single point at all, actually). Software fixes are for distortion, meaning that the eye is capable of focusing on a pixel, but it gets the wrong pixel in that position. The type of eyesight problem that *could* be fixed in software is an eye-mobility problem, where one of the eyes isn't able to point in the correct direction and thus gets a "shifted" view, giving a doubled picture. That kind of problem is fixed with "prismatic" lenses, and could be fixed by shifting the image on the Rift in the opposite direction.

        Failure to focus correctly... got it, thanks!

        The good news is: well, read the first paragraph again: the Rift uses plain, simple, cheap lenses. Just swap the lenses for another set of cheap lenses adapted to your near-sightedness and voilà, glasses-free 3D.

        That makes sense. I'd assumed that it would be difficult to fit the OR over my glasses.

      • > (And has been done in part in the past: Fish-eye Quake does similar kind of distortions [strlen.com], too)

        Actually it looks like Fisheye Quake uses something somewhat more similar to the Rift's approach than to Intel's. Basically it renders "normally" to multiple 90° FOV screens that make up the faces of a cubic environment map, and then uses a lookup table to sample those images to create the smooth distortion of a fisheye lens. It's using multiple rendering frustums and a lookup table rather tha
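The cube-face-plus-lookup-table scheme described here can be sketched as follows; the projection model, face names, and resolutions are toy stand-ins, not Fisheye Quake's actual code:

```python
import math

def fisheye_lookup(out_w, out_h, face_res, fov_deg=180.0):
    """Build a lookup table mapping each output pixel of an equidistant
    fisheye image to (face, u, v) texel coordinates on one of five
    90-degree cube faces (front/left/right/up/down). Toy version."""
    table = {}
    half = math.radians(fov_deg) / 2
    for j in range(out_h):
        for i in range(out_w):
            # Offset from image center, normalized to [-1, 1]
            nx = (i + 0.5) / out_w * 2 - 1
            ny = (j + 0.5) / out_h * 2 - 1
            r = math.hypot(nx, ny)
            if r > 1:
                continue  # outside the fisheye circle
            theta = r * half      # equidistant fisheye projection
            phi = math.atan2(ny, nx)
            # Direction vector; +z is the view axis
            dx = math.sin(theta) * math.cos(phi)
            dy = math.sin(theta) * math.sin(phi)
            dz = math.cos(theta)
            # Pick the cube face with the dominant axis component
            ax, ay, az = abs(dx), abs(dy), abs(dz)
            if az >= ax and az >= ay:
                face, u, v = 'front', dx / dz, dy / dz
            elif ax >= ay:
                face = 'right' if dx > 0 else 'left'
                u, v = dz / ax, dy / ax
            else:
                face = 'down' if dy > 0 else 'up'
                u, v = dx / ay, dz / ay
            tu = int((u + 1) / 2 * (face_res - 1))
            tv = int((v + 1) / 2 * (face_res - 1))
            table[(i, j)] = (face, tu, tv)
    return table
```

The trick is that the table is built once: all the per-pixel trigonometry happens at startup, and each rendered frame only pays for table-driven texel fetches from the cube faces.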

    • by gl4ss ( 559668 )

      I don't think that kind of correction can be done in software. Correction for eyes pointing at the wrong place, yes; focus, no.

      Anyhow, the Oculus comes with a couple of lenses for people with different eyesight, and I'd imagine the consumer model will as well.

      I didn't think lens quality was the problem with the Oculus, though. The problem with the Rift right now, rather than the blue/red shift from the lens, is that the resolution is rather small and stretched over a large area (that the FOV is large is a plus, though), so you

      • by martyb ( 196687 )

        I don't think that kind of correction can be done in software. Correction for eyes pointing at the wrong place, yes; focus, no.

        Yes, I can see that now. I was kind of hoping that *I* was misunderstanding something and that it was, indeed, possible.

        Anyhow, the Oculus comes with a couple of lenses for people with different eyesight, and I'd imagine the consumer model will as well.

        I didn't know that! Thanks!

        I didn't think lens quality was the problem with the Oculus, though. The problem with the Rift right now, rather than the blue/red shift from the lens, is that the resolution is rather small and stretched over a large area (that the FOV is large is a plus, though), so you see the pixels (it's like playing at 320x200 all over again).

        We HAVE come a long way from the old CGA graphics! But it seems there's still a long way to go before we have totally immersive displays. Time will tell. Thanks for the reply!

    • by mcgrew ( 92797 ) *

      I wear glasses (near-sighted and have astigmatism, too). It would be *so* nice if there were a way to correct for that in software so I could wear VR goggles without my glasses.

      If you have $15,000 you can have your vision fixed completely, but it involves surgery and each eye costs about $7k. It's a mechanical replacement for your eyes' natural focusing lenses, and if you get the surgery you not only won't need glasses, because it cures both your myopia and your astigmatism, but you won't need reading glass

      • by martyb ( 196687 )

        Thanks for the reply! No cataracts (yet).

        As a consumer, I generally avoid release 1.0 of anything. I realize you mentioned CrystaLens and not Lasik (spelling?) laser surgery. A relative who had Lasik done reported nighttime halos from oncoming car headlamps. I'll let others get the bugs out of the process and find out what long[er]-term consequences may arise. My eyes are well-corrected with conventional glasses, so I can afford to wait.

        Had not heard of this, but will keep it in mind for when the ne

        • by mcgrew ( 92797 ) *

          There's a little lens flare in the CrystaLens, too, but not enough to matter. Oh, and you spelled "lasik" right.

  • So they've taken something with variable quality and improved it by implementing a solution that itself has variable quality with the downside that it's horrendously expensive to render.

    Brilliant.
  • ...Or worse, with bifocals?

    When something like this becomes available, it'd be awesome even if it was only used for watching a movie on an airplane, but I worry it would be worthless for people who have an eyeglasses prescription.
