Smart Rendering For Virtual Reality
An anonymous reader writes: Researchers from Intel have been working on new methods for improving rendering speed on modern wide-angle head-mounted displays like the Oculus Rift and Google Cardboard. Their approach exploits the fact that the relatively cheap, lightweight lenses introduce astigmatism: only the central area is perceived sharply, while perception gets blurrier with increasing distance from the center. So what happens if you don't spend the same amount of computation and quality on every pixel? The blog entry hints at future rendering architectures and shows performance numbers.
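The core idea, caricatured in a few lines: since the periphery is blurred by the lens anyway, spend fewer shading samples there. This is a toy sketch, not Intel's actual algorithm; the linear falloff and `max_samples` parameter are my own assumptions.

```python
import math

def samples_per_pixel(x, y, width, height, max_samples=4):
    """Toy radial quality falloff: full sampling at the lens center,
    fewer shading samples toward the blurry periphery."""
    cx, cy = width / 2, height / 2
    # Normalized distance from the optical center: 0 at center, 1 at the corner.
    r = math.hypot(x - cx, y - cy) / math.hypot(cx, cy)
    # Linearly drop from max_samples at the center to 1 sample at the edge.
    return max(1, round(max_samples * (1.0 - r)))
```

On a 1920x1080 eye buffer this gives 4 samples at the center pixel and 1 at the corners; a real implementation would shade coarse tiles rather than decide per pixel.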
What if you move your eyes (Score:1)
instead of your head?
Re: (Score:2)
The experience would be less than ideal.
Future iterations with better lens quality and a higher field of view could use eye tracking to determine which region of the viewable area should be rendered at full quality.
Re:What if you move your eyes (Score:4, Interesting)
Eye tracking isn't all that hard to do. I worked in a lab way back in the early 80s that did it with a couple of phototransistors and an IR light source to measure corneal reflections.
Re: (Score:2)
Ideally, it is re-rendered at that rate or higher, and you have large memory bandwidth on the video card or GPU (anywhere from 20 GB/s to 300 GB/s).
Re: (Score:1)
It's being rendered at around that rate as well.
The VR software already includes some ability to shift an already-rendered frame to compensate for head tracking; the same approach could probably be used to compensate for eye motion. I'm not sure how far an eye really moves in 1/60th of a second. Eye motion also has a micro-stutter that is probably fairly unpredictable. Gross motor movement takes a while to start and stop, so the viewport of the next frame can generally be predicted with reasonable accuracy.
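The frame-shifting idea described above can be sketched as a plain 2D translation of an already-rendered buffer. This is a deliberate simplification: real reprojection (what Oculus calls timewarp) re-samples the frame with the full head rotation, not a pixel offset, and the function name here is my own.

```python
def shift_frame(frame, dx, dy, fill=0):
    """Crude reprojection: translate a rendered frame (a list of rows)
    by (dx, dy) pixels to approximate a small head or eye rotation
    that happened between rendering and display."""
    h, w = len(frame), len(frame[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = x - dx, y - dy  # where this output pixel came from
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = frame[sy][sx]
    return out
```

Pixels shifted in from outside the rendered frame have no data (here filled with 0), which is exactly why real systems render with some overscan margin.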
Re: (Score:2)
The problem is how human eyes move. We dart them quickly and unpredictably about.
You are wildly underestimating the speed of modern GPUs. They can easily generate a scene in less time than your eye can perceive it, especially if all the polygons are preloaded into graphics memory. All you have to do is run the polygons through the shaders, which are massively parallel, typically 1024 cores or more.
Re: (Score:2)
Eye tracking with high latency is not hard... now please do it accurately in under 10ms, 90 times a second, so that the rendering pipeline can use that in the next frame.
A lot of 'solved' tasks become a lot harder under the constraints imposed by VR's need for speed.
Re: (Score:2)
Which is why the current crop of displays won't last long, if VR really catches on. Magic Leap is already well on the way to developing consumer-level retinal displays. I'm pretty sure Oculus and Apple are working on their own; other companies likely are, as well. There are some significant challenges, particularly with making it economical, but nothing insurmountable. Advances in MEMS and fiber-coupled diode lasers will play a critical role. I expect to see consumer-ready, variable-focus retinal displays i
God's Prior Art, we sim (Score:1)
So it's reinventing quantum physics: it's fuzzy until you look at it more carefully. P.S. don't look at cats.
Re: (Score:2)
So it's reinventing quantum physics: it's fuzzy until you look at it more carefully. P.S. don't look at cats.
He's not kidding. I looked at my cat through VR glasses and saw this [all-that-i...esting.com].
That's not what astigmatism is - is it? (Score:3)
According to the linked Wikipedia page:
An optical system with astigmatism is one where rays that propagate in two perpendicular planes have different focus.
What the article seems to be about, though, is the way images as viewed in a VR headset get blurrier as you move away from the center, seemingly equally in all directions.
Re: (Score:2)
What the article seems to be about, though, is the way images as viewed in a VR headset get blurrier as you move away from the center, seemingly equally in all directions.
What you're missing is that astigmatism tends to get worse as you move off-axis, and astigmatism causes blurring and other similar effects. Thus, more blur at the edges of the field than in the centre. You can correct for this, but it involves a lot of heavy and expensive glass.
Re: (Score:2)
I got that, I just don't see that it fits the definition of astigmatism. It sounds a lot more like spherical aberration [wikipedia.org].
Give up! (Score:1)
Aspheric lenses ! (Score:2)
If there is an optical problem, why not solve it optically rather than trying to get software to (maybe) compensate and eat up battery life?
Just design aspheric lenses specifically for the headsets. Yes, the injection molds might be a bit more complex, but I believe those have to be CNC-machined anyway. A bit more setup, but no increase in production cost.
Foveated rendering (Score:1)