NVIDIA Doubts Ray Tracing Is the Future of Games

SizeWise writes "After Intel's prominent work in ray tracing in both the desktop and mobile spaces, many gamers might be thinking that the move to ray-tracing engines is inevitable. NVIDIA's Chief Scientist, Dr. David Kirk, thinks otherwise, as revealed in this interview on rasterization and ray tracing. Kirk counters many of Intel's claims of ray tracing's superiority, such as the inherent benefit to polygon complexity, while pointing out areas where ray-tracing engines would falter, such as basic antialiasing. The interview concludes with discussions on mixing the two rendering technologies and whether NVIDIA hardware can efficiently handle ray-tracing calculations as well."
  • by peipas ( 809350 ) on Friday March 07, 2008 @12:27PM (#22677248)
    Kirk should talk to Picard [demoscene.hu] who is quite enthused about real time raytracing.
    • by moderatorrater ( 1095745 ) on Friday March 07, 2008 @12:39PM (#22677428)
      That lends a lot of weight to the raytracing argument, since I generally prefer Picard over Kirk...
    • Re: (Score:3, Insightful)

      As a counter-counterpoint, the article has quite misleading pictures:

      The Ray-Tracing images are super slick, but are non real-time, highly processed work.

      Whereas the comparison Rasterized images are real-time, game-generated examples. If you were to allow the pro-rasterization side the same time to produce a single picture, it would be super fancy.
      • Re:Counterpoint (Score:5, Interesting)

        by Applekid ( 993327 ) on Friday March 07, 2008 @01:46PM (#22678504)
        We're not talking about the current technology, we're talking about the future. As in whether Ray Tracing is the Future of Games.

        Graphics hardware has evolved into huge parallel general-purpose stream processors capable of obscene numbers of FLOPs per second... yet we're still clinging tenaciously to the old safety blanket of a mesh of tessellated triangles and projecting textures onto them.

        And it makes sense: the industry is really, really good at pushing them around. Sort of like how internal combustion engines are pretty much the only game in town until the alternatives save themselves from the vapor.

        Nvidia, either by being wise or shortsighted, is discounting ray-tracing. ATI is trailing right now so they'd probably do well to hedge their bets on the polar opposite of where Nvidia is going.

        3D modelling starts out in abstractions anyway with deformations and curves and all sorts of things that are relatively easy to describe with pure mathematics. Converting it all to mesh approximations of what was sculpted was, and still is, pretty much just a hack to get things to run at acceptable real-time speeds.
        • Wait.... Applekid lecturing me about the Future of Games?

          Snark aside, I think that the true future is a combination of both methods, with ray-tracing being used for light effects over the top of rasterized 3d models.

          After all, that's (pretty much) how it works in real life....
          • Wait.... Applekid lecturing me about the Future of Games?

            Snark aside, I think that the true future is a combination of both methods, with ray-tracing being used for light effects over the top of rasterized 3d models.

            After all, that's (pretty much) how it works in real life....
            Perhaps they're denouncing it because they know the CPU+GPU integration AMD will have going on (and Intel soon too) will leave Nvidia out in the cold with no friends.
        • Re:Counterpoint (Score:4, Interesting)

          by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Friday March 07, 2008 @03:56PM (#22680638) Journal
          From what I read, I think part of this is nVidia assuming GPUs still stay relevant. If we ever do get to a point where raytracing -- done on a CPU -- beats out rasterization, done on a GPU, then nVidia's business model falls apart, whereas Intel suddenly becomes much more relevant (as their GPUs tend to suck).

          Personally, I would much prefer Intel winning this. It's hit or miss, but Intel often does provide open specs and/or open drivers. nVidia provides drivers that mostly work, except when they don't, and then you're on your own...
          • If we ever do get to a point where raytracing -- done on a CPU -- beats out rasterization, done on a GPU, then nVidia's business model falls apart, whereas Intel suddenly becomes much more relevant (as their GPUs tend to suck).

            I don't see why ray tracing would necessarily tip things in Intel's favor. Ray tracing is lots of parallel, repetitive floating-point calculations, not so unlike vertex shading. When polygonal 3d graphics started to catch on, I'm sure Intel assumed their CPUs would grow to encompa

        • Re: (Score:3, Informative)

          yet we're still clinging tenaciously to the old safety blanket of a mesh of tessellated triangles and projecting textures onto them.

          Tessellation is also frequently used in ray-tracers as it makes things much simpler and faster.

          Converting it all to mesh approximations of what was sculpted was, and still is, pretty much just a hack to get things to run at acceptable real-time speeds.

          It also makes things much simpler for ray-tracers. Really. Intersecting a line with an arbitrarily curved surface is de
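
          For a sense of scale: the standard Möller-Trumbore ray/triangle test is just a few dot and cross products per triangle. A minimal C++ sketch (the tiny Vec3 type is a made-up stand-in, not any particular engine's):

            #include <cmath>

            struct Vec3 { float x, y, z; };  // minimal stand-in vector type

            static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x-b.x, a.y-b.y, a.z-b.z}; }
            static Vec3  cross(Vec3 a, Vec3 b) { return {a.y*b.z-a.z*b.y, a.z*b.x-a.x*b.z, a.x*b.y-a.y*b.x}; }
            static float dot(Vec3 a, Vec3 b)   { return a.x*b.x + a.y*b.y + a.z*b.z; }

            // Moller-Trumbore: does origin + t*dir hit triangle (v0, v1, v2)?
            bool intersect(Vec3 origin, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2, float& t) {
                const float EPS = 1e-7f;
                Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
                Vec3 p = cross(dir, e2);
                float det = dot(e1, p);
                if (std::fabs(det) < EPS) return false;   // ray parallel to triangle
                float inv = 1.0f / det;
                Vec3 s = sub(origin, v0);
                float u = dot(s, p) * inv;                // first barycentric coord
                if (u < 0.0f || u > 1.0f) return false;
                Vec3 q = cross(s, e1);
                float v = dot(dir, q) * inv;              // second barycentric coord
                if (v < 0.0f || u + v > 1.0f) return false;
                t = dot(e2, q) * inv;                     // distance along the ray
                return t > EPS;
            }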

        • This may be what you would like to believe, but is not necessarily reality, which is the point of the entire article. Ray tracing is still avoided when possible even in high end film because it is so expensive. There are two applications that I see it filling in games, dealing with transparency and shadows, which would make it a hybrid like the article talks about. Ray tracing everything throws away many cheap and easy anti-aliasing techniques too. I am not really sure what people hope to gain from ray
        • Re:Counterpoint (Score:4, Informative)

          by Pseudonym ( 62607 ) on Saturday March 08, 2008 @04:23AM (#22685674)

          Nvidia, either by being wise or shortsighted, is discounting ray-tracing.

          What you may not realise is that NVIDIA sells a renderer/raytracer which uses the GPU for acceleration, targeted at the animation and VFX market.

          They are not discounting ray-tracing. They are embracing it. And they know, from lots of their own R&D, that it's not going to be competitive in the real-time market at any point in the foreseeable future.

  • by OrangeTide ( 124937 ) on Friday March 07, 2008 @12:28PM (#22677266) Homepage Journal
    You can't very well say something sucks if you're already developing a product for it.
  • by DeKO ( 671377 ) <(moc.liamg) (ta) (iramsoleinad)> on Friday March 07, 2008 @12:30PM (#22677292)

    A good way to mix both techniques is Relief Texture Mapping [ufrgs.br]. It's a good way to get smooth surfaces thanks to the texture interpolation hardware, with no extra polygons.
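
    The core trick is just marching a ray through a heightfield texture in the pixel stage until it dips below the stored surface. A rough CPU-side sketch of that loop (names, conventions and step count are invented for illustration; the real thing is a fragment shader):

      // Step a ray across a heightfield until it sinks below the stored
      // height, then report the hit texel. In relief mapping this runs
      // per-fragment on the GPU; plain C++ here just to show the loop.
      struct HeightMap {
          int w, h;
          const float* data;                  // heights in [0,1]
          float at(float u, float v) const {  // nearest-neighbour lookup
              int x = int(u * (w - 1)), y = int(v * (h - 1));
              return data[y * w + x];
          }
      };

      // (u,v): entry texel; (du,dv): projected view direction per step.
      bool reliefMarch(const HeightMap& hm, float u, float v,
                       float du, float dv, float& hitU, float& hitV) {
          float depth = 1.0f;                 // ray starts at the top of the volume
          const int   STEPS  = 64;            // arbitrary illustrative count
          const float dDepth = 1.0f / STEPS;
          for (int i = 0; i < STEPS; ++i) {
              if (u < 0 || u > 1 || v < 0 || v > 1) return false;  // left the texture
              if (depth <= hm.at(u, v)) {     // ray has entered the surface
                  hitU = u; hitV = v;
                  return true;
              }
              u += du; v += dv; depth -= dDepth;
          }
          return false;                       // ray passed over everything
      }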

    • Except it can't do jack-squat at the edges of geometry...
  • Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking." I personally have no idea as to whether or not ray tracing is the future of games, but I really don't think that Nvidia is the right person to ask either, (just as Intel isn't).
    • but I really don't think that Nvidia is the right person to ask either, (just as Intel isn't).

      True. I believed Intel, when they explained the superiority of Ray-Tracing, and now I believe Nvidia, when they say the opposite.

      From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction. But the explanation from the Nvidia dude makes it seem like that point is many years away, owing to the excellent results available now with rasterization, a

      • Re: (Score:3, Insightful)

        by Kjella ( 173770 )

        From what I can tell, Ray-Tracing is closer to 'reality', and so you'd expect the technology eventually to tend in that direction.

        Not necessarily, humans aren't so picky that the virtual reality must be a perfect reality. For example, I don't ever expect to pick up a laser, find a thin enough slit and see the quantum effects on the wall, not the real simulated deal anyway. More natural than modelling it as a perfect beam? Sure, but what's the point. Graphic cards and Copperfield are in the same business, making the grandest possible illusion with a minimum of resources and maximum immersion. If you get the macroscopic effects close

        • I don't ever expect to pick up a laser, find a thin enough slit and see the quantum effects on the wall, not the real simulated deal anyway.
          I'm assuming you mean within a simulated 'reality'.

          On the one hand, quantum interference can easily be exempted from the potential ray-tracing future, should it prove hard to model. On the other, what's so hard about it?
          • by MenTaLguY ( 5483 )
            Ray tracing is based on tracing the paths of particles, not wavefronts.
            • But it intends to be able to reproduce reality via a set of starting assumptions, no?
              • by MenTaLguY ( 5483 )
                No. Ray-tracing solves a small subset of the rendering equation, based on a purely particle-oriented approximation of photon behavior. It is not at all a suitable technique for modeling photon wavefronts (the interaction of which gives us interference patterns). If you want to model photon wavefronts, you need to use other techniques.
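
                For reference, the full rendering equation (Kajiya, 1986) that every renderer approximates is

                  L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, \mathrm{d}\omega_i

                and classic ray tracing samples only a couple of directions out of that integral per hit (the mirror/refraction directions plus shadow rays toward the lights), which is exactly why wave effects, and even soft indirect lighting, need different techniques.
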
      • True. I believed Intel, when they explained the superiority of Ray-Tracing, and now I believe Nvidia, when they say the opposite.

        It sounds like ray tracing is better, but slower. If that is the case, a move to ray tracing might be more likely if we see a "leveling off" of scene complexity while hardware performance continues to increase. It might get to where a ray traced game looks better and is fast enough.

    • Re: (Score:3, Insightful)

      Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking." I personally have no idea as to whether or not ray tracing is the future of games, but I really don't think that Nvidia is the right person to ask either, (just as Intel isn't).

      One could argue that the Nvidia folks have been well aware of ray-tracing for a long time, and if they thought it was reaching the point where it was going to be usef

      • My understanding is that part of the threat to Nvidia and other dedicated graphics card makers is that ray tracing doesn't lend itself as well to dedicated solutions. Or rather, the type of processor needed tends to be the type that is already being used as a CPU with some minor tweaks to optimize performance. So instead of buying a separate chip for graphics, you get the same performance boost from just getting a second CPU or one with more cores. Instead of a graphics card with more RAM, you just add m
        • by Znork ( 31774 ) on Friday March 07, 2008 @01:16PM (#22678010)
          So instead of buying a separate chip for graphics, you get the same performance boost from just getting a second CPU or one with more cores.

          Not only that; you get a performance boost from your server. And from your kids' computers. You get a performance boost from your HTPC. And potentially any other computer on your network.

          The highly parallelizable nature is the most interesting aspect of raytracing IMO; with distributed engines one could do some very cool things.
          • Re: (Score:3, Insightful)

            by jcrash ( 516507 ) *
            If I try to offload to my HTPC something tells me that network latency is sure gonna send my FPS to crap. But, in the current world where raytracing takes a LONG time, sure you can offload all you want.
            • Re: (Score:3, Interesting)

              by Mac Degger ( 576336 )
              You should check out the quake3 engine modded to use raytracing. Quite nice, and the computer scientist also has some interesting points about the efficiencies of raytracing (e.g. those problems Portal had with its portal rendering, which you didn't see because they had to hack around them, aren't a problem with raytracing).

              So, quake3 runs easily using raytracing, and that was just something to show how fast it can be... it's feasible in the latest games too. So why is raytracing so slow again?

              Oh, you're thinkin
        • by Abcd1234 ( 188840 ) on Friday March 07, 2008 @01:35PM (#22678338) Homepage
          Actually, I don't think that's true at all. Raytracing, just like today's rasterizers, can greatly benefit from dedicated hardware for doing vector operations, geometry manipulation, and so forth. This is particularly true as raytracing benefits greatly from parallelization, and it would be far easier to build a dedicated card with a nice fat bus for shunting geometry and texture information between a large number of processing units than it would be to use a stock, general multicore processor which isn't really designed with those specific applications in mind.

          Besides, the whole reason to have separate, specialized gear for doing things like audio/visual processing is to free up the main CPU for doing other things. Heck, we're even seeing specialized, third-party hardware for doing things like physics and AI calculations, not to mention accelerators for H.264 decoding, etc. As such, I see no reason to move graphics rendering back to the main CPU(s).
    • by mikael ( 484 )
      When it comes to ray-tracing complex shapes using spline patches, the recommended approach is to tessellate the geometry into triangles and then ray-trace the collection of triangles, using octrees for optimization (just test a single cube for ray intersection rather than a whole set of triangles).

      Graphics cards already do something similar with deferred rendering. They sort the projected triangles according to position on the screen, and render groups of triangles into a local cached copy of the framebuffer. Onl
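
      The octree payoff is a cheap ray/box rejection test; the standard slab method costs one divide and two min/max operations per axis. A self-contained sketch (plain arrays, no particular engine assumed):

        #include <algorithm>

        // Slab test: does origin + t*dir hit the axis-aligned box [lo, hi]
        // for some t in [tmin, tmax]? If not, every triangle inside that
        // octree cell is skipped without being touched.
        bool hitsBox(const float origin[3], const float dir[3],
                     const float lo[3], const float hi[3],
                     float tmin, float tmax) {
            for (int a = 0; a < 3; ++a) {
                float inv = 1.0f / dir[a];
                float t0 = (lo[a] - origin[a]) * inv;
                float t1 = (hi[a] - origin[a]) * inv;
                if (inv < 0.0f) std::swap(t0, t1);  // keep t0 as the near plane
                tmin = std::max(tmin, t0);
                tmax = std::min(tmax, t1);
                if (tmax < tmin) return false;      // slabs no longer overlap
            }
            return true;
        }
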
      • although there are programming techniques to do that with graphics cards too

        Whoops, I think you meant "hacks", there.

        This is the same thing that's been going on with rasterization for years. Developers and hardware designers have built hack upon hack in order to implement what raytracing does so easily. The result is a library of tricks that often can't be easily intermixed, and are always a pain to work with.

        So, if you can switch to raytracing and get similar performance, and end up with a larger feature
    • Just because their products now are focused on rasterization (their current GPUs can do raytracing as well) doesn't mean their next generation ones have to be. I'm sure they'd be happy to produce raytracing hardware, if there was a demand for it and if they could make it fast.

      That is the problem, as the article noted. You get research oriented things that are very pie in the sky about rendering techniques. They concentrate on what is theoretically possible and so on. nVidia isn't in that position, they are
      • by ivan256 ( 17499 )

        Just because their products now are focused on rasterization...
        Ray tracing is a form of rasterization... In other words, it translates a description of a scene into an array of pixels.
    • Re: (Score:2, Insightful)

      by podperson ( 592944 )
      I think his response was pretty reasonable and balanced, actually.

      1) Ray-tracing isn't going to solve all problems (it doesn't for movie rendering, why would it for real-time?)

      2) Existing software still needs to run.

      3) A hybrid approach will end up making the most sense (since it has for everything else).

      He's not just talking "party line" ... he's talking common sense. Ray-tracing everything is just an inefficient way to get the job done. It produces great mirror-finished objects but ugly shadows and medioc
    • by xeoron ( 639412 )
      No I don't think they will, but they might say, "We see this becoming the next trend, which we are committed to supporting."
    • by Hatta ( 162192 )
      Seriously though, does anyone expect Nvidia to say, "Yes, we really do think that our products will all be obsolete and outdated in a few years. Thank you for asking."

      Why yes, I do expect Nvidia to say that all their current products will be obsolete in a couple years.
  • by $RANDOMLUSER ( 804576 ) on Friday March 07, 2008 @12:37PM (#22677400)
    Intel says to do it with the CPU, and nVidia says to do it with the GPU. What a surprise.
    • Intel says to do it with the CPU, and nVidia says to do it with the GPU.
      AMD does it in court!
  • by Itninja ( 937614 ) on Friday March 07, 2008 @12:39PM (#22677424) Homepage
    Personally, I prefer spites to either ray-trace or polygons. I still think Starcraft (the game, not the conversion van) had some of the best graphics. But then I am kind of a fuddy-duddy. I also think River Raid was an awesome game.
    • by Wabin ( 600045 )
      Agreed. Doing anything out of spite is a bad idea. Long live Space Invaders!
    • Re: (Score:3, Funny)

      by Belial6 ( 794905 )
      I still remember, as a boy in the early 80's, getting the opportunity to take a cruise on a navy ship. Seeing the targeting equipment on that ship gave me a real appreciation for just how realistic the graphics on Missile Command were. They were darn near indistinguishable from the real thing.
    • Very good point! Maybe it doesn't have that super-realistic "I shot the guy in the head and he's actually BLEEDING from the head", but it just looks... nice! The same with the old Black Isle games, such as Baldur's Gate (especially II), Fallout, Planescape: Torment, etc... Clean, simple and nice.

      And, by the way, I thought I'd laugh my ass off first time I went on a camping trip and saw the Starcraft conversion van. :)
    • Re: (Score:3, Informative)

      by Stormwatch ( 703920 )
      Talking about sprites, did you see the teaser video for King of Fighters XII? So. Fuckin'. Beautiful. [kotaku.com]
    • by GreggBz ( 777373 )
      If you were talking about pixel detail, I might agree.

      But it's all about the lighting and animation in a changing environment (Starcraft was isometric with static lighting). It's hard to light and animate sprites convincingly, unless you have a lot of artists willing to draw a lot of frames.
    • Re: (Score:3, Informative)

      It takes a lot of work to get a 3D model to look as good as a 2D sprite. You gain more freedom and, as the number of actions increases, can create new animations with a lot less hassle. But it remains very difficult to get a really good "animated" feel with 3D models which need to look good from all angles, and nowadays under all lighting conditions. 2D sprites, while laborious to create, invariably display precisely as the animator intended.

      Games like Ratchet and Clank or Jak and Daxter pull this off well.
      • by nuzak ( 959558 )
        Apparently the games use a Naughty Dog technique whereby the models' "bones", i.e. canonically fixed points, are themselves allowed to warp and distort, meaning that the models do not simply consist of fixed points rotating on joints.

        That would be the Chuck Jones/Tex Avery Effect. The movie "Madagascar" also made good use of this.
    • Check out Odin Sphere [atlus.com] (link goes to links to the trailer). That game is just beautiful.
    • But then I am kind of a fuddy-duddy.
      That's the word I was thinking of! I really didn't mean fuddle-duddle boss, I swear. God my brain hates me.
  • This just in... (Score:2, Insightful)

    by Majik Sheff ( 930627 )
    IBM doubts the future of the "personal computer"
    Buggy manufacturers poo-poo the new horseless carriage
    etc, etc.
  • Translation (Score:2, Insightful)

    by snarfies ( 115214 )
    We can't do it as well as Intel yet, therefore it sucks. BUY NVIDIA.
  • by 9mm Censor ( 705379 ) on Friday March 07, 2008 @12:46PM (#22677526) Homepage
    What about game and engine devs? Where do they see the future going?
    • Re: (Score:3, Informative)

      by Solra Bizna ( 716281 )

      What about game and engine devs? Where do they see the future going?

      IAAGD. My current paid project is to have both a raytracing module and a rasterizing module, and is designed to use them completely interchangeably. Personally, I'm a much bigger fan of raytracing than rasterization, and I'm going to a great deal of effort to make sure that it can be done efficiently with my engine.

      -:sigma.SB

    • by Squapper ( 787068 ) on Friday March 07, 2008 @12:57PM (#22677700)
      I am a game-developing 3D artist, and our biggest problem right now is the lack of ability to render complex per-pixel shaders (particularly on the PS3). But this is only a short-term problem, and what would we do with the ability to create more complex shaders? Fake the appearance of the benefits of ray-tracing (more complex scenes and more realistic surfaces), of course!

      On the other hand, the film industry could be a good place to look for the future technology of computer games. And as it is right now, it's preferable to avoid the slowness of raytracing and fake the effects instead, even when making big blockbuster movies.
      • by jcnnghm ( 538570 )
        Take a look at http://graphics.pixar.com/ [pixar.com], they sure are producing a lot of papers about ray tracing if that isn't a technique they are using.

        Abstract from Ray Tracing for the Movie 'Cars' (pdf warning) [pixar.com]

        This paper describes how we extended Pixar's RenderMan renderer with ray tracing abilities. In order to ray trace highly complex scenes we use multiresolution geometry and texture caches, and use ray differentials to determine the appropriate resolution. With this method we are able to efficiently ray trace s
    • by p0tat03 ( 985078 )

      I am an amateur game developer, so I probably can't speak for large devs, but I can speak for the community. The biggest problem facing game devs (and graphics people in general) is lighting. Doom 3 created an elegant, unified lighting model (i.e. the environment and characters are lit with the same algorithm, making them consistent), but it had severe limitations. Gone were the days where lights could be soft, and bounce around, and generally look nice, and replacing it was a very harsh lighting system tha

      • by MenTaLguY ( 5483 )
        Well, sort of. Raytracing isn't a complete solution to the rendering equation -- you'll still need hacks to get nice soft shadows, indirect lighting, and ambient occlusion.
        • by p0tat03 ( 985078 )

          A proper raytrace implementation will automatically account for things like the shadow penumbra (soft shadows), indirect lighting (light bounces, aka radiosity), and ambient occlusion. We're definitely not talking about the raytracers of yesteryear, which were very functionally limited.

          I suppose the argument isn't even really about raytracing vs. not. It's about whether it's worthwhile to brute force the problem (thereby keeping the solution elegant and simple) with sheer CPU power, or to try and fake you
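
          The penumbra part, for instance, is usually distributed ray tracing: instead of one shadow ray to a point light you fire several to jittered points on an area light and average. A sketch (the occluded() stub stands in for a real scene query):

            #include <random>

            // Stand-in scene hook: a real tracer would test the segment
            // from p to q against all geometry. Stubbed so this compiles.
            static bool occluded(const float p[3], const float q[3]) {
                (void)p; (void)q;
                return false;
            }

            // Fraction of a square area light (corner plus two edge vectors)
            // visible from p: 0 = umbra, 1 = fully lit. Averaging jittered
            // shadow rays is what produces the soft penumbra.
            float softShadow(const float p[3], const float corner[3],
                             const float edgeU[3], const float edgeV[3],
                             int samples) {
                static std::mt19937 rng{42};
                std::uniform_real_distribution<float> uni(0.0f, 1.0f);
                int visible = 0;
                for (int i = 0; i < samples; ++i) {
                    float u = uni(rng), v = uni(rng);
                    float q[3];
                    for (int a = 0; a < 3; ++a)     // jittered point on the light
                        q[a] = corner[a] + u * edgeU[a] + v * edgeV[a];
                    if (!occluded(p, q)) ++visible;
                }
                return float(visible) / float(samples);
            }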

          • by Creepy ( 93888 )
            I know this isn't what you meant, at least by how you phrased the rest of your comment, but technically light bounces are handled very well by raytracers, as is proper color absorption from nearby reflections as long as you are referring to Specular (shiny) lighting. What isn't handled well is non-point source diffuse (soft), which is why many ray tracers bolt on photon mapping or radiosity.
          • by MenTaLguY ( 5483 )

            A proper raytrace implementation will automatically account for things like the shadow penumbra (soft shadows), indirect lighting (light bounces, aka radiosity), and ambient occlusion. We're definitely not talking about the raytracers of yesteryear, which were very functionally limited.

            Those additional techniques used by modern "ray tracers" to avoid the limitations of pure ray tracing are not ray tracing (the term has a very specific meaning). They belong to a different family of algorithms.

            I supp

  • by Bones3D_mac ( 324952 ) on Friday March 07, 2008 @12:48PM (#22677550)
    For the most part, I really don't see ray-tracing adding much to the world of gaming that isn't being handled well enough by current methods. Unless someone was specifically creating games that somehow directly incorporated either the benefits or the added calculations involved with ray-tracing itself, it would only be a costly and highly inefficient gimmick of an alternative to current techniques.

    Sure, ray-tracing has its place in a lot of areas, but real-time gaming would be a terrible misuse of processing horsepower... especially when you could be applying it to other areas of gaming that actually affect gameplay itself. For example, how about more robust AIs for in-game elements, or high-end physics processing that can combine things like fabric/hair/fluid/fire physics with the ability to decimate objects completely as vector-calculated chunks based on the surrounding environments, rather than all this predetermined destruction we currently see in games. (Example: a surface could be eroded incrementally by having a fluid running across it until a hole forms in the shape of the fluid path...)
  • Hardly anything new (Score:4, Interesting)

    by K. S. Kyosuke ( 729550 ) on Friday March 07, 2008 @12:58PM (#22677714)

    For years, the movie studios were using Pixar PRMan, which is in many ways a high-quality software equivalent of a modern graphics card. It takes a huge and complex scene, divides it into screen-space buckets of equal size, sorts the geometrical primitives in some way, and then (pay attention now!) tessellates and dices the primitives into micropolygons about one pixel each in size (OpenGL fragments, anyone?), shades and deforms them using simple (or not-so-simple) algorithms written in the RenderMan shading language (hey, vertex and pixel shaders!), and then rasterizes them using stochastic algorithms that allow for high-quality motion blur, depth-of-field and antialiasing.

    Now that last part is slightly easier for a raytracer, which can cast a ray from any place in the scene - of course, it needs this functionality anyway. But a raytracer also needs random access to the scene, which means that you need to keep the whole scene in memory at all times, along with spatial indexing structures. The REYES algorithm of PRMan needs no such thing (it easily handles 2 GB of scene data on a 512 MB machine along with gigs of textures), and it allows for coherent memory access patterns and coherent computation in general (there is of course some research into coherency in raytracing, but IIRC, the results were never that good). This is a big win for graphics cards, as the bandwidth of graphics card RAM has never been spectacular - it's basically the same case as with the general-purpose CPU of your computer and its main memory. But with the 128 or so execution units of a modern graphics card, the problem is even more apparent.

    Unless the Intel engineers stumbled upon some spectacular breakthrough, I fail to see how raytracing is supposed to have any advantage in a modern graphics card. And if I had to choose between vivid forest imagery with millions of leaves flapping in the wind and reflective balls on a checkered floor, I know what I would choose.
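
    For anyone who hasn't seen it, the flow described above looks roughly like this in code (structure only; every type and method here is a stand-in, not Pixar's API):

      #include <deque>
      #include <vector>

      struct Grid { /* a grid of ~pixel-sized micropolygons */ };
      struct Prim {
          bool overlapsBucket() const     { return true; }   // stand-in bound test
          bool tooBigToDice() const       { return false; }
          std::vector<Prim> split() const { return {}; }
          Grid dice() const               { return {}; }
      };
      static void shade(Grid&) {}             // run the RenderMan SL shaders
      static void stochasticSample(Grid&) {}  // jittered sampling: blur, DoF, AA

      // One screen-space bucket at a time; nothing outside the bucket
      // ever needs to be resident, which is where the memory win is.
      void renderBucket(std::deque<Prim> work) {
          while (!work.empty()) {
              Prim p = work.front(); work.pop_front();
              if (!p.overlapsBucket()) continue;  // rejected without loading data
              if (p.tooBigToDice()) {
                  for (Prim& half : p.split())    // split until small on screen
                      work.push_back(half);
              } else {
                  Grid g = p.dice();              // micropolygons (fragments!)
                  shade(g);                       // displace + shade
                  stochasticSample(g);            // into the framebuffer
              }
          }
      }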

    • All true. But, of course, Pixar has converted RenderMan to a full ray tracer because it's now feasible and because they wanted that extra bit of realism that ray tracing elegantly delivers (see Cars and Ratatouille).

      Depending on design decisions, ray tracers can represent complex scenes with far less memory than most rasterizers. Rasterizers generally rely on increasing numbers of triangles (or perhaps other polygons) for complexity. Ray tracers can use all sorts of parametric surfaces that require muc

    • by IdeaMan ( 216340 )
      Woahhh. If you can ray-trace a scene that is larger than physical RAM, that would mean that in a massively parallel graphics card you don't need to have the whole scene loaded into each GPU's memory.
      That's ... HUGE.
      Are they doing on the fly level of detail to minimize the amount of data needed for each section of the image?
      If so, then partitioning of images among the processors would no longer be scanline based but probably rectangular sections.
      If I have this right you could have a 1 meg gaming type model re
  • Radiosity does more for indoor scene quality than does raytracing. Radiosity gives you the visual cue of a dark band at an inside corner, which is subtle and a basic part of the human visual mechanism for resolving depth. Raytracing makes shiny things look cool.

    Oh, right, this is for gamers.

  • They will set back our "first contact" date by at least 500 years.
  • A matter of speed (Score:4, Insightful)

    by CopaceticOpus ( 965603 ) on Friday March 07, 2008 @02:51PM (#22679616)
    "C is much too slow, to get good performance you must use assembly."

    "Scripted languages are much too slow, to get good performance you must use compiled languages."

    As computers get faster, there is always a move from technologies that are easier for the computer to technologies that are easier for the developer. Since ray tracing involves fewer hacks and is a more direct model of the effects developers want to create, it seems inevitable.
    • But if you haven't noticed, computers aren't getting faster like they used to. They are getting the ability to do more in parallel. The question is if ray tracing scales to parallel processing as well as current methods do.
      • Re:A matter of speed (Score:4, Informative)

        by batkiwi ( 137781 ) on Friday March 07, 2008 @05:27PM (#22681838)
        Current methods do not scale in parallel. SLI does not give you 2x the resolution at the same framerate, or double the framerate at the same resolution. Same for any sort of quad SLI solution (two dual-chip gfx cards, etc).

        Ray tracing scales almost perfectly. The same processor with 4 cores will perform about 3.9x as well (pixels pushed per second, so either higher FPS or higher resolution, or a mix) as one with 1 core. This is why Intel is pushing this.
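
        The reason it scales so cleanly is that every pixel is an independent task. A sketch with OpenMP (trace() is a trivial stand-in for the whole per-ray computation):

          #include <cstdint>
          #include <vector>

          // Stand-in for shooting the camera ray for pixel (x, y) and
          // returning a packed colour; the real work all happens in here.
          static uint32_t trace(int x, int y) { return uint32_t(x ^ y); }

          // One pragma parallelizes the whole frame, because no pixel
          // depends on any other. Rasterization pipelines have serial
          // stages in between, which is why SLI can't split them as well.
          std::vector<uint32_t> renderFrame(int width, int height) {
              std::vector<uint32_t> image(size_t(width) * height);
              #pragma omp parallel for schedule(dynamic)  // rays vary in cost
              for (int y = 0; y < height; ++y)
                  for (int x = 0; x < width; ++x)
                      image[size_t(y) * width + x] = trace(x, y);
              return image;
          }
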
  • Considering how I've seen people going back to '50s style music, clothing, etc., it'd make sense if they suddenly just went back to text-gaming. Retro's the way to go! Stranded on an alien planet without a six pack of beer, baby!

    I'll call it the mind-generated graphics system, or MGGS for short.

    Patented and Copyrighted and Trademarked!
  • Intel's original studies stated that currently raytracing requires too much of a performance hit to be viable. They're expecting it when 8-16 core processors become available at a commodity level, and that's at least 2-3 years from now. As for anti-aliasing, I thought that raytracing removed the need for it entirely because of how graphics are drawn?
    • As for anti-aliasing, I thought that raytracing removed the need for it entirely because of how graphics are drawn?


      No, anti-aliasing is fairly commonly used with raytracing to remove artifacts that would otherwise crop up.
    • Anti-aliasing can be done basically the same way in either technique. You supersample the scene at a higher resolution and then use a filter (e.g. bilinear or bicubic) to shrink it down to the desired pixel resolution.
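
      A box filter is the simplest version of this: render at N times the resolution, then average each NxN block down to one output pixel. A sketch (grayscale floats for brevity; a real resampler would use the bilinear/bicubic kernels mentioned above):

        #include <vector>

        // Shrink an image rendered at (width*factor) x (height*factor)
        // back to width x height by averaging each factor x factor block.
        std::vector<float> downsample(const std::vector<float>& hi,
                                      int width, int height, int factor) {
            std::vector<float> lo(size_t(width) * height);
            const int hiW = width * factor;
            for (int y = 0; y < height; ++y)
                for (int x = 0; x < width; ++x) {
                    float sum = 0.0f;
                    for (int sy = 0; sy < factor; ++sy)      // gather the block
                        for (int sx = 0; sx < factor; ++sx)  // of supersamples
                            sum += hi[size_t(y * factor + sy) * hiW + (x * factor + sx)];
                    lo[size_t(y) * width + x] = sum / float(factor * factor);
                }
            return lo;
        }
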
  • I'm not too up on 3D graphics, but what's with these comparisons of ray-tracing vs polygons, or ray-tracing vs rasterizing? Isn't ray tracing just a lighting model?

    Ok, that's to torque the mod that decided I was being redundant. Now to expand on it: though I can see where ray tracing might be able to describe a perfectly curved surface or how it could replace some amount of texture mapping by modeling the reflections off it, it's not like the model can exactly describe geometry itself or the starting colo
  • There used to be an interesting debate between Professor Philipp Slusallek of the University of Saarbrücken and chief scientist David Kirk of nVidia at GameStar.de. The original article has been taken down, but I found a slightly mangled version on the Wayback machine and I've cleaned it up a bit and put it up on my not-a-blog: link [scarydevil.com].

    I'd appreciate a better translation of the German part of the text.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...