NVIDIA Doubts Ray Tracing Is the Future of Games
SizeWise writes "After Intel's prominent work in ray tracing in both the desktop and mobile spaces, many gamers might be thinking that the move to ray-tracing engines is inevitable. NVIDIA's Chief Scientist, Dr. David Kirk, thinks otherwise, as revealed in this interview on rasterization and ray tracing. Kirk counters many of Intel's claims of ray tracing's superiority, such as the inherent benefit to polygon complexity, while pointing out areas where ray-tracing engines would falter, such as basic antialiasing. The interview concludes with discussions of mixing the two rendering technologies and whether NVIDIA hardware can efficiently handle ray-tracing calculations as well."
Counterpoint (Score:5, Funny)
Re:Counterpoint (Score:5, Funny)
Re:Counterpoint (Score:5, Funny)
Re: (Score:2)
Crisco??!?!!!! (Score:2)
Yeah- that's right - mother-f**king Crisco - everyone had like 2 pound cans of the stuff, and wore old clothes.
Pretty damn fun throwing a big greaseball and hitting someone upside the head with it, 'cause it sticks. Then the walk home from the local baseball field with big things of grease stuck to us.
Being a bored kid can be really fricken cool sometimes.
Re: (Score:3, Insightful)
The ray-tracing images are super slick, but they're non-real-time, highly processed work.
The comparison rasterized images, on the other hand, are real-time, game-generated examples. If you allowed the pro-rasterization side the same amount of time to produce a single picture, it would be super fancy too.
Re:Counterpoint (Score:5, Interesting)
Graphics hardware has evolved into huge parallel general-purpose stream processors capable of obscene numbers of FLOPs per second... yet we're still clinging tenaciously to the old safety blanket of a mesh of tessellated triangles and projecting textures onto them.
And it makes sense: the industry is really, really good at pushing them around. Sort of like how internal combustion engines are pretty much the only game in town until alternatives prove themselves to be more than vapor.
Nvidia, whether out of wisdom or shortsightedness, is discounting ray-tracing. ATI is trailing right now, so they'd probably do well to bet on the polar opposite of where Nvidia is going.
3D modelling starts out in abstractions anyway, with deformations and curves and all sorts of things that are relatively easy to describe with pure mathematics. Converting it all to mesh approximations of what was sculpted was, and still is, pretty much just a hack to get things to run at acceptable real-time speeds.
Re: (Score:2)
Snark aside, I think that the true future is a combination of both methods, with ray-tracing being used for light effects over the top of rasterized 3d models.
After all, that's (pretty much) how it works in real life....
Re:Counterpoint (Score:4, Interesting)
Personally, I would much prefer Intel winning this. It's hit or miss, but Intel often does provide open specs and/or open drivers. nVidia provides drivers that mostly work, except when they don't, and then you're on your own...
Re: (Score:2)
I don't see why ray tracing would necessarily tip things in Intel's favor. Ray tracing is lots of parallel, repetitive floating-point calculations, not so unlike vertex shading. When polygonal 3d graphics started to catch on, I'm sure Intel assumed their CPUs would grow to encompass that work as well.
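For the curious, here's roughly what that repetitive per-pixel floating-point work looks like: a ray-sphere intersection, the classic inner loop of a ray tracer. A minimal illustrative sketch; the names and setup are mine, not from the article:

```python
import math

# Minimal sketch of the per-pixel math a ray tracer repeats millions of
# times per frame: intersect one ray with one sphere.
def intersect_sphere(origin, direction, center, radius):
    """Return the distance t to the nearest hit, or None on a miss.
    `direction` is assumed normalized, so the quadratic's a == 1."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0     # nearer of the two roots
    return t if t > 0.0 else None        # hits behind the origin don't count
```

Nothing about one pixel's result depends on its neighbours, which is exactly the shape of workload that both GPUs and multi-core CPUs are good at.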
Re: (Score:2)
Not an issue, really. The question is whether it can be done faster, better, or cheaper on that GPU than on a CPU.
If it's the case that there's not really an advantage to the GPU, then it makes more sense to have a really low-end GPU, reserve it maybe for some cool desktop effects, and spend the money you saved on a quad-core CPU -- or a second CPU (which might itself be quad-core, so you'd now have eight cores).
Re: (Score:3, Informative)
yet we're still clinging tenaciously to the old safety blanket of a mesh of tessellated triangles and projecting textures onto them.
Tessellation is also frequently used in ray-tracers as it makes things much simpler and faster.
Converting it all to mesh approximations of what was sculpted was, and still is, pretty much just a hack to get things to run at acceptable real-time speeds.
It also makes things much simpler for ray-tracers. Really. Intersecting a line with an arbitrarily curved surface is decidedly harder than intersecting it with a triangle.
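To make the contrast concrete, here's a sketch of the standard ray-triangle test (Möller-Trumbore): a handful of cross and dot products with no iteration, which is a big part of why tessellation helps ray tracers too. Intersecting a curved surface generally needs iterative root finding instead. Illustrative code, not from the comment:

```python
# Möller-Trumbore ray-triangle intersection: constant-time, branch-light math.
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Return the distance t along the ray, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:              # ray is parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv             # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv     # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None
```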
Re: (Score:2)
Re:Counterpoint (Score:4, Informative)
What you may not realise is that NVIDIA sells a renderer/raytracer which uses the GPU for acceleration, targeted at the animation and VFX market.
They are not discounting ray-tracing. They are embracing it. And they know, from lots of their own R&D, that it's not going to be competitive in the real-time market at any point in the foreseeable future.
Steve Jobs also uses this trick (Score:5, Insightful)
Relief Texture Mapping (Score:5, Informative)
A good way to mix both techniques is Relief Texture Mapping [ufrgs.br]. It gets you smooth surfaces thanks to the texture interpolation hardware, with no extra polygons.
Re: (Score:2)
Re:Relief Texture Mapping (Score:4, Informative)
Incidentally, Steep Parallax Mapping and Interval Mapping also use pseudo ray tracers, but the surface curvature is unique to Relief Mapping (the quadric technique doesn't work with them). Most other techniques can be adapted to support soft shadows, however, and most game developers I know (three of them professionals) think soft shadows are more important than curved-surface relief maps. If surface curvature is important, they'd rather tessellate it out.
Re: (Score:2)
Re: (Score:3, Interesting)
Ray tracing is firing rays from each screen pixel, testing for collision against the geometry, and figuring out the proper color. RTM is firing rays from each polygon's "pixel", testing for collision against the "texture geometry", and figuring out the proper color. It's just ray tracing in a subset of the screen pixels, against a geometry (heightmap) represented by a texture (or multiple textures) on a polygonal face. Why do you think this is not related to ray tracing?
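For anyone who wants it spelled out, here's a sketch of that texture-space ray march, in the spirit of relief mapping's linear-search-plus-binary-refinement. All names and conventions here are illustrative, not from the original paper:

```python
# March a ray through a heightfield stored as a texture: step until the ray
# dips below the surface, then binary-search to sharpen the hit point.
def relief_march(depth_map, u, v, du, dv, steps=32, refine=8):
    """depth_map(u, v) returns depth in [0, 1] (0 = surface, 1 = deepest).
    The ray enters at depth 0 and sinks 1/steps per sample, shifting the
    texture coordinate by (du, dv) each step."""
    d, step = 0.0, 1.0 / steps
    while d < 1.0 and d < depth_map(u, v):   # linear search: still above
        u, v, d = u + du, v + dv, d + step
    for _ in range(refine):                  # binary refinement of the hit
        step, du, dv = step * 0.5, du * 0.5, dv * 0.5
        if d < depth_map(u, v):              # above the surface: advance
            u, v, d = u + du, v + dv, d + step
        else:                                # below the surface: back up
            u, v, d = u - du, v - dv, d - step
    return u, v                              # texel to shade
```

Structurally it's the same loop a full ray tracer runs, just confined to one polygon's heightfield.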
Like we were expecting something else (Score:5, Insightful)
Re: (Score:2)
True. I believed Intel when they explained the superiority of ray-tracing, and now I believe Nvidia when they say the opposite.
From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction. But the explanation from the Nvidia dude makes it seem like that point is many years away, owing to the excellent results available now with rasterization, a
Re: (Score:3, Insightful)
From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction.
Not necessarily; humans aren't so picky that the virtual reality has to be a perfect reality. For example, I don't ever expect to pick up a laser, find a thin enough slit, and see the quantum effects on the wall, not the real simulated deal anyway. More natural than modelling it as a perfect beam? Sure, but what's the point? Graphics cards and Copperfield are in the same business: making the grandest possible illusion with a minimum of resources and maximum immersion. If you get the macroscopic effects close
Re: (Score:2)
On the one hand, quantum interference can easily be exempted from the potential ray-tracing future, should it prove hard to model. On the other, what's so hard about it?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It sounds like ray tracing is better, but slower. If that is the case, a move to ray tracing might be more likely if we see a "leveling off" of scene complexity while hardware performance continues to increase. It might get to where a ray traced game looks better and is fast enough.
Re: (Score:3, Insightful)
Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking."? I personally have no idea whether or not ray tracing is the future of games, but I really don't think that Nvidia is the right company to ask either (just as Intel isn't).
One could argue that the Nvidia folks have been well aware of ray-tracing for a long time, and if they thought it was reaching the point where it was going to be useful
Re: (Score:2)
Re:Like we were expecting something else (Score:4, Interesting)
Not only that; you get a performance boost from your server. And from your kids' computers. You get a performance boost from your HTPC, and potentially any other computer on your network.
The highly parallelizable nature is the most interesting aspect of raytracing, IMO; with distributed engines one could do some very cool things.
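As a sketch of how naturally this decomposes: split the frame into independent tiles and farm them out to a pool of workers. A distributed engine would hand the same tiles to machines instead of local processes; the tracer itself is a placeholder here, and all names are illustrative:

```python
from multiprocessing import Pool

WIDTH, HEIGHT, TILE = 640, 480, 64

def trace_pixel(x, y):
    return (x ^ y) & 0xFF  # placeholder shading, not a real tracer

def render_tile(args):
    """Render one tile; needs nothing from any other tile."""
    x0, y0 = args
    return (x0, y0, [[trace_pixel(x, y)
                      for x in range(x0, min(x0 + TILE, WIDTH))]
                     for y in range(y0, min(y0 + TILE, HEIGHT))])

if __name__ == "__main__":
    tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                    for x in range(0, WIDTH, TILE)]
    with Pool() as pool:            # could just as well be networked machines
        results = pool.map(render_tile, tiles)
```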
Re: (Score:3, Insightful)
Re: (Score:3, Interesting)
So, Quake 3 runs easily using raytracing, and that was just something to show how fast it can be... it's feasible in the latest games too. So why is raytracing so slow again?
Oh, you're thinking
Re:Like we were expecting something else (Score:5, Informative)
Besides, the whole reason to have separate, specialized gear for doing things like audio/visual processing is to free up the main CPU for doing other things. Heck, we're even seeing specialized, third-party hardware for doing things like physics and AI calculations, not to mention accelerators for H.264 decoding, etc. As such, I see no reason to move graphics rendering back to the main CPU(s).
Re: (Score:2)
Graphics cards already do something similar with deferred rendering. They sort the projected triangles according to position on the screen, and render groups of triangles into a local cached copy of the framebuffer. Onl
Re: (Score:2)
Whoops, I think you meant "hacks", there.
This is the same thing that's been going on with rasterization for years. Developers and hardware designers have built hack upon hack in order to implement what raytracing does so easily. The result is a library of tricks that often can't be easily intermixed, and are always a pain to work with.
So, if you can switch to raytracing and get similar performance, and end up with a larger feature
Re: (Score:2)
No one is saying you can, today, and bringing it up constitutes a strawman. The point is that, in terms of available technology, we're getting to the point where it *will* be possible.
The rest of your post is based on the same erroneous presumption, so there seems little point in addressing it.
How would it obsolete their products? (Score:2)
That is the problem, as the article noted. You get research-oriented things that are very pie-in-the-sky about rendering techniques. They concentrate on what is theoretically possible and so on. nVidia isn't in that position; they are
Re: (Score:2)
Re: (Score:2, Insightful)
1) Ray-tracing isn't going to solve all problems (it doesn't for movie rendering, why would it for real-time?)
2) Existing software still needs to run.
3) A hybrid approach will end up making the most sense (since it has for everything else).
He's not just talking "party line"
Re: (Score:2)
Re: (Score:2)
Why yes, I do expect Nvidia to say that all their current products will be obsolete in a couple years.
Hmmm - who to believe, who to believe?? (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
Option 1: Co-processor with a nice fat pipe to the graphics memory (GPU)
Or
Option 2: Co-processor transferring both model AND image data over the system bus.
Obey your thirst... (Score:5, Insightful)
Re: (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
And, by the way, I thought I'd laugh my ass off first time I went on a camping trip and saw the Starcraft conversion van.
Re: (Score:3, Informative)
Re: (Score:2)
But it's all about the lighting and animation in a changing environment (StarCraft was isometric with static lighting). It's hard to light and animate sprites convincingly unless you have a lot of artists willing to draw a lot of frames.
Re: (Score:3, Informative)
Games like Ratchet and Clank or Jak and Daxter pull this off well.
Re: (Score:2)
That would be the Chuck Jones/Tex Avery Effect. The movie "Madagascar" also made good use of this.
Re: (Score:2)
Re:Obey your thirst... (OT) (Score:2)
Re: (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
This just in... (Score:2, Insightful)
Buggy manufacturers pooh-pooh the new horseless carriage
etc, etc.
Translation (Score:2, Insightful)
What do the people that make the software say? (Score:5, Interesting)
Re: (Score:3, Informative)
IAAGD. My current paid project has both a raytracing module and a rasterizing module, and is designed to use them completely interchangeably. Personally, I'm a much bigger fan of raytracing than rasterization, and I'm going to a great deal of effort to make sure it can be done efficiently with my engine.
-:sigma.SB
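A hypothetical sketch of what "completely interchangeably" might look like at the code level: both back ends implement one narrow interface, and the engine never looks behind it. This is my illustration, not sigma.SB's actual engine:

```python
from abc import ABC, abstractmethod

class Renderer(ABC):
    @abstractmethod
    def render(self, scene, camera):
        """Return a framebuffer for the given scene and camera."""

class Rasterizer(Renderer):
    def render(self, scene, camera):
        ...  # stub: project triangles, z-buffer, shade fragments

class RayTracer(Renderer):
    def render(self, scene, camera):
        ...  # stub: fire a ray per pixel against an acceleration structure

def make_renderer(kind: str) -> Renderer:
    # The rest of the engine only ever sees a Renderer.
    return RayTracer() if kind == "raytrace" else Rasterizer()
```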
Re: (Score:2)
Re:What do the people that make the software say? (Score:5, Interesting)
On the other hand, the film industry could be a good place to look for the future technology of computer games. And as it is right now, it's preferable to avoid the slowness of raytracing and fake the effects instead, even when making big blockbuster movies.
Re: (Score:2)
Abstract from Ray Tracing for the Movie 'Cars' (pdf warning) [pixar.com]
This paper describes how we extended Pixar's RenderMan renderer with ray tracing abilities. In order to ray trace highly complex scenes we use multiresolution geometry and texture caches, and use ray differentials to determine the appropriate resolution. With this method we are able to efficiently ray trace scenes
Re: (Score:2)
I am an amateur game developer, so I probably can't speak for large devs, but I can speak for the community. The biggest problem facing game devs (and graphics people in general) is lighting. Doom 3 created an elegant, unified lighting model (i.e. the environment and characters are lit with the same algorithm, making them consistent), but it had severe limitations. Gone were the days when lights could be soft, and bounce around, and generally look nice; replacing them was a very harsh lighting system tha
Re: (Score:2)
Re: (Score:2)
A proper raytracing implementation will automatically account for things like the shadow penumbra (soft shadows), indirect lighting (light bounces, aka radiosity), and ambient occlusion. We're definitely not talking about the raytracers of yesteryear, which were very functionally limited.
I suppose the argument isn't even really about raytracing vs. not. It's about whether it's worthwhile to brute-force the problem (thereby keeping the solution elegant and simple) with sheer CPU power, or to try and fake your way around it
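Soft shadows, for example, fall out of the basic algorithm almost for free: instead of one shadow ray to a point light, you average several rays toward points on an area light. A minimal sketch with hypothetical scene hooks:

```python
import random

def shadow_factor(hit_point, light_center, light_radius, occluded, n=16):
    """occluded(a, b) -> True if any geometry blocks the segment from a to b.
    Returns 0.0 (fully shadowed) .. 1.0 (fully lit)."""
    visible = 0
    for _ in range(n):
        # Crude square jitter around the light; a real tracer samples
        # the light's actual shape.
        target = tuple(c + random.uniform(-light_radius, light_radius)
                       for c in light_center)
        if not occluded(hit_point, target):
            visible += 1
    return visible / n   # fraction of the light the point can "see"
```

Points near a shadow edge see only part of the light, which is exactly what a penumbra is; no special-case hack required.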
Re: (Score:2)
Re: (Score:2)
Those additional techniques used by modern "ray tracers" to avoid the limitations of pure ray tracing are not ray tracing (the term has a very specific meaning). They belong to a different family of algorithms.
Probably right on this one... (Score:5, Insightful)
Sure, ray-tracing has its place in a lot of areas, but real-time gaming would be a terrible misuse of processing horsepower... especially when you could be applying it to other areas of gaming that actually affect gameplay itself. For example, how about more robust AIs for in-game elements, or high-end physics processing that can combine things like fabric/hair/fluid/fire physics with the ability to decimate objects completely as vector-calculated chunks based on the surrounding environment, rather than all this predetermined destruction we currently see in games. (For example, a surface could be eroded incrementally by having a fluid run across it until a hole forms in the shape of the fluid path...)
Re: (Score:2)
I think they need to be working more on the physics of the environment than on making it all look pretty. Hardware like the PhysX card is a step in the right direction, and I would like to see that trend continue.
FYI, nVidia bought the company that makes the PhysX cards and said they plan to add physics support via a driver update to all 8xxx-series cards (so you can use your GeForce 8xxx as a PhysX card as well as a video card). It'll probably be several generations of game engines before any REALLY take advantage of technology like that, though.
Re: (Score:2)
But the suspension of disbelief is what makes us enjoy most forms of screen entertainment; when it is strained too much, the game or movie collapses into a moving set of pixels and its "magic" suffers.
This is so ridiculous in so many games. I mean, you have that ultra modern main battle tank rolling along a
Hardly anything new (Score:4, Interesting)
For years, the movie studios were using Pixar's PRMan, which is in many ways a high-quality software equivalent of a modern graphics card. It takes a huge and complex scene, divides it into screen-space buckets of equal size, sorts the geometric primitives in some way, and then (pay attention now!) tessellates and dices the primitives into micropolygons about one pixel each in size (OpenGL fragments, anyone?), shades and deforms them using simple (or not-so-simple) algorithms written in the RenderMan shading language (hey, vertex and pixel shaders!), and then rasterizes them using stochastic algorithms that allow for high-quality motion blur, depth-of-field and antialiasing.
Now, the last part is slightly easier for a raytracer, which can cast a ray from any place in the scene - of course, it needs this functionality anyway. But a raytracer also needs random access to the scene, which means that you need to keep the whole scene in memory at all times, along with spatial indexing structures. The REYES algorithm of PRMan needs no such thing (it easily handles 2 GB of scene data on a 512 MB machine, along with gigs of textures), and it allows for coherent memory access patterns and coherent computation in general (there is of course some research into coherency in raytracing, but IIRC the results were never that good). This is a big win for graphics cards, as the bandwidth of graphics card RAM has never been spectacular - it's basically the same story as with the general-purpose CPU of your computer and its main memory. And with the 128 or so execution units of a modern graphics card, the problem is even more apparent.
Unless the Intel engineers stumbled upon some spectacular breakthrough, I fail to see how raytracing is supposed to have any advantage on a modern graphics card. And if I had to choose between vivid forest imagery with millions of leaves flapping in the wind and reflective balls on a checkered floor, I know what I would choose.
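For reference, the REYES flow described above reduces to a short skeleton. The helpers below are stubs; the control flow and the per-bucket memory behaviour are the point, and all names are illustrative:

```python
def primitives_in(bucket, prims):
    return list(prims)  # stub: really, bound-test each primitive vs. bucket

def dice(prim):
    return []           # stub: really, yield grids of ~1-pixel micropolygons

def shade(grid):
    pass                # stub: run displacement and surface shaders

def sample(grid, bucket):
    pass                # stub: stochastic sampling (motion blur, DoF, AA)

def resolve(bucket):
    pass                # stub: filter samples, emit final pixels

def reyes(primitives, buckets):
    for bucket in buckets:                  # screen-space tiles, in order
        for prim in primitives_in(bucket, primitives):
            for grid in dice(prim):
                shade(grid)
                sample(grid, bucket)
        resolve(bucket)
        # This bucket's memory can now be freed: REYES never needs random
        # access to the whole scene, unlike a ray tracer.
```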
Re: (Score:2)
All true. But, of course, Pixar has converted RenderMan to a full ray tracer because it's now feasible and because they wanted that extra bit of realism that ray tracing elegantly delivers (see Cars and Ratatouille).
Depending on design decisions, ray tracers can represent complex scenes with far less memory than most rasterizers. Rasterizers generally rely on increasing numbers of triangles (or perhaps other polygons) for complexity. Ray tracers can use all sorts of parametric surfaces that require much less memory.
Re: (Score:2)
That's
Are they doing on-the-fly level of detail to minimize the amount of data needed for each section of the image?
If so, then partitioning of images among the processors would no longer be scanline based but probably rectangular sections.
If I have this right you could have a 1 meg gaming type model re
Radiosity, not raytracing (Score:2)
Radiosity does more for indoor scene quality than does raytracing. Radiosity gives you the visual cue of a dark band at an inside corner, which is subtle and a basic part of the human visual mechanism for resolving depth. Raytracing makes shiny things look cool.
Oh, right, this is for gamers.
If any alien race ever receives that (Score:2)
A matter of speed (Score:4, Insightful)
"Scripted languages are much too slow, to get good performance you must use compiled languages."
As computers get faster, there is always a move from technologies that are easier for the computer to technologies that are easier for the developer. Since ray tracing involves fewer hacks and is a more direct model of the effects developers want to create, it seems inevitable.
Re: (Score:2)
Re:A matter of speed (Score:4, Informative)
Ray tracing scales almost perfectly. The same processor with 4 cores will perform about 3.9x as well as one with 1 core (in pixels pushed per second, so either higher FPS or higher resolution, or a mix). This is why Intel is pushing this.
The Future of Gaming (Score:2)
I'll call it the mind-generated graphics system, or MGGS for short.
Patented and Copyrighted and Trademarked!
Strange (Score:2)
Re: (Score:2)
No, anti-aliasing is fairly commonly used with raytracing to remove artifacts that would otherwise crop up.
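Concretely, the usual approach is jittered supersampling: several randomly offset rays per pixel, averaged. A minimal sketch, where `trace` stands in for the whole ray tracer and returns a scalar brightness for simplicity:

```python
import random

def render_pixel(trace, px, py, samples=4):
    """trace(x, y) -> brightness for a ray through continuous image
    point (x, y). Averages samples*samples jittered rays per pixel."""
    total = 0.0
    for _ in range(samples * samples):
        x = px + random.random()   # jitter within the pixel footprint
        y = py + random.random()
        total += trace(x, y)
    return total / (samples * samples)
```

The random offsets trade the regular stair-step artifacts of aliasing for noise, which the eye forgives far more readily.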
Re: (Score:2)
How do you tell a hundred people all the same thing? Call a meeting and announce it with a loudspeaker, play it over the radio, broadcast it on TV.
The point is, memory bandwidth is NOT the problem if you're trying to transport the same data to multiple places at the same time. You broadcast it on a bus to multiple memory stores. One other cool thing about a memory architecture like that is that it could act like Content Addressable Memory [wikipedia.org]. Database people would then start buying
Re: (Score:2)
what's rendered obsolete here? (Score:2)
Ok, that's to torque the mod that decided I was being redundant. Now to expand on it: though I can see where ray tracing might be able to describe a perfectly curved surface, or how it could replace some amount of texture mapping by modeling the reflections off it, it's not like the model can exactly describe geometry itself or the starting color
Earlier interview: David Kirk & Philipp Slusallek (Score:3, Informative)
I'd appreciate a better translation of the German part of the text.
Re: (Score:2)
Re: (Score:2)
Tell that to Pixar.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Such as?
Re: (Score:2)
Steel, plastic and rubber will bend then break.
Wood and fiberglass will leave long fibers sticking out.
Each of the above will have a different end texture.
I mean if you crash one car into another both cars don't dissolve into colored grains of sand.
Other than that, I have been quite frustrated at how little the average gamer can impact his environment, and if voxels can help that sounds really cool.
Re: (Score:2)
Re: (Score:2, Offtopic)
Re: (Score:2)
Re: (Score:2)
nVidia seems to be hedging their bets (Score:3, Interesting)
According to Dr. Slusallek's LinkedIn profile [linkedin.com] he's currently a "Visiting Professor at NVIDIA".
The performance of Dr. Slusallek's real-time raytracing engine at only 90 MHz was quite impressive: "In contrast we have recently implemented a prototype of a custom ray tracing based graphics card using a single Xilinx FPGA chip. The first results sho