
Crytek Shows 4K 30 FPS Ray Tracing On Non-RTX AMD and NVIDIA GPUs (techspot.com)

dryriver writes: Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no raytracing-specific circuitry and costs only around $450 -- real-time ray tracing a complex 3D city environment at 4K 30 FPS. Crytek says the technology demo runs fine on most normal NVIDIA and AMD gaming GPUs. As if this weren't impressive enough, the software real-time ray tracing technology is still in development and not yet final. The framerates achieved may thus rise further, raising the question of precisely what the benefits of owning a super-expensive NVIDIA RTX 20xx series GPU are. Nvidia has claimed over and over again that without its amazing new RT cores and AI denoiser, GPUs will choke on real-time ray tracing tasks in games. Crytek appears to have shown that with some intelligently written code, bog-standard GPU cores can handle real-time ray tracing just fine -- no RT cores, AI denoiser or anything else NVIDIA touts as necessary.
  • Fuck me. (Score:5, Funny)

    by PopeRatzo ( 965947 ) on Tuesday March 19, 2019 @09:02PM (#58301866) Journal

    Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no raytracing specific circuitry and only costs around $450

    Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming, my wallet would jump up and slap me on the head. Visa would call me and ask if my credit card had been stolen.

    • In six months that will be a $300 GPU. In a year it will be $150.

      • And remember: That's at 4k resolution. At 1080p you only need half the graphics card to do the same thing.

        • Less than half.

          • True... it's actually more like a quarter, i.e. my $150 graphics card should be able to do 1080p at the same frame rate.

            (assuming I have enough GRAM, etc)

        • Good point. 4k is wasted on my old-ass eyes.

          • Re:Fuck me. (Score:4, Interesting)

            by NerdENerd ( 660369 ) on Tuesday March 19, 2019 @09:52PM (#58302110)
            I was super excited to play in 4K the day I bought my shiny new 4K TV. But since then I have chosen 1080p over 4K on both my GTX 1080 and my PS4 Pro, as the frame rate is waaaaaaaaayyyyyyyy more important than resolution.
            • 4K is nice for some things. I recently (Christmas) bought myself a 4K60 TV for my monitor, and it looks cool as shit for nature videos and such. But as you say, in gaming with settings maxed out and my aging 1070, the 1% lows will kill you. I just drop down to 1440p, which brings most games to a more than playable frame rate and still looks good. 1080p is a little low when you sit close to a 43" screen -- like back in the CRT days when you could count the horizontal lines.

            • frame rate is waaaaaaaaayyyyyyyy more important than resolution.

              Indeed it is. Right up until you hit a limit on either of them.

            • It's not that frame rate is more important than resolution. It's that both need to be good enough; for you, 1080 is good enough a lot of the time. I tend to run 2560x1440 at about 100-120 fps. That 4K monitor could still have been a good choice since you could render at lower resolutions, if you used it for other work that did benefit (perhaps video, CAD, or many windows). Just not at all costs (such as too steep a downgrade in refresh rate).

      • Prices used to fall like that, but not any more. They drop at maybe a third of the rate of the good ol' days now, and it's still slowing.

    • Re:Fuck me. (Score:4, Funny)

      by ArchieBunker ( 132337 ) on Tuesday March 19, 2019 @09:11PM (#58301906)

      Aren't you missing your nightly cup of sleepy time tea and Jeopardy?

    • That's at 4K, meaning at 1080p it'll run you $200-$250 today. And the equivalent of this $450 card will be about $250 just a few months from now (hooray hardware leaks!)
    • which sounds like a lot, but I paid $300 for a Voodoo Rush in 95. That's $500 in today's money and it wasn't a big deal back then.

      I'm not sure if it's the $229 GTX 1060 6GB or just plain the worse economy (it is a lot shittier: you could make $12/hr starting at a call center in my dirt-poor town back then, which is $20/hr now, for a job a high school dropout could get; now the same job pays $9.50/hr, or about $5.70/hr in 1995 money), but these prices didn't use to seem all that nuts.
    • The RTX cards from Nvidia can cost twice as much as that or more. $450 (and really they're closer to $300 [newegg.com], with 3 free games thrown in on top of that) is a bargain by comparison. It's still stupid, but I blame all of the yahoos mining crypto-currencies and driving up the prices.
    • Then perhaps you shouldn't have spent all that money on a super-intelligent wallet with inbuilt head-slapping capabilities.

      That's pretty cool though. I bet it saves you a lot of money when you go to IKEA.

    • Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming

      I'm sure you have actual hobbies you spend money on. Clearly gaming isn't one of them.

    • by mjwx ( 966435 )

      Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no raytracing specific circuitry and only costs around $450

      Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming, my wallet would jump up and slap me on the head. Visa would call me and ask if my credit card had been stolen.

      Fairy nuff...

      You're not the intended audience. For a PC gamer, US$450 is mid-range for a graphics card. High-end cards easily cost in the $700 to $1000 range. I tend to buy mid-range cards and replace them more often as they become a bottleneck. A CPU and mainboard will keep trucking for 6 or more years; graphics cards tend to last 2-3.

      Yes, there are heaps of memes about how much we spend on gaming, it's the same as any hobby, motorsports, golfing... in fact high end PC ownership is significantly ch

    • by tlhIngan ( 30335 )

      Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming, my wallet would jump up and slap me on the head. Visa would call me and ask if my credit card had been stolen.

      $500 cards have been mid-high end for a long time now. The high end cards cost around $1000 with the top of the line costing $2000 or so.

      Of course, today's $500 card was yesterday's $1000 card, so it does rapidly cost a lot less money.

      A mid-tier card costs around $300 or so.

  • Real-time ray tracing on mobile can supposedly be done via a patented method -- ref: Venturebeat article [venturebeat.com]
  • by Anonymous Coward on Tuesday March 19, 2019 @09:51PM (#58302108)

    Turing (Nvidia) has exactly ZERO ray tracing units. All 'ray' algorithm maths calculations are done on the standard shader cores- same as with this demo on the AMD Vega 56. So what gives?

    An ex-Nvidia engineer's post on Beyond3D gave the game away. This engineer was partially responsible for the so-called 'ray tracing' enhancement on Turing. Put simply, this is what Turing does:

    a = b + c * d

    say the above is a ray maths calc (obviously it is not). The '+' and '*' are the maths operations done on the usual shaders. What every tech site misses is that there is another issue -- the process that gets the VARIABLES to the shader ALU blocks.

    What Nvidia did with Turing was to add tiny ASIC circuits that allow the VARIABLES that represent triangles and rays to be moved more efficiently to the STANDARD shader units from the other units (like geometry). No RT 'cores' (there is no such thing in Turing), just a tiny logic hack that allows the GPU to be reconfigured to move certain kinds of data much faster to the shader cores.

    However, the significance of this is that if one arranges for ray/triangle data to be held in a more efficient form on a 'normal' GPU, the same ray 'acceleration' can be achieved. (A concrete sketch of what that per-ray arithmetic looks like follows at the end of this comment.)

    Not that real-time ray tracing actually happens even on Turing. Real ray tracing needs far too many 'bouncy' rays per screen pixel to ever be possible on any ordinary GPU -- and the problem with real ray tracing is MEMORY COHERENCE, not the maths of the ray/triangle collision.

    Turing 'ray tracing' is actually simple ray algorithms applied to real-time reflection maps (NOT true reflection) and shadows. Metro Exodus tried a very very basic form of ray averaging for lighting, which was no better than simply using more traditional light sources.

    And the tensor cores on Turing? Well unlike the non-existent ray tracing cores, the Tensor cores are real and use vast numbers of transistors. Why are the Tensor cores real -- and the main reason Turing exists? Because Nvidia spent hundreds of millions of dollars developing new crypto currency mining algorithms to run exclusively on Tensor. However, crypto currency collapsed between Turing's design and release. Nvidia's future GAMING GPUs will not have tensor cores.

    Turing was 100% designed to displace AMD in the PC crypto currency mining space. Nvidia lost an absolute fortune with Turing cos of the collapse (and Nvidia actually reported this fact at its investor conferences in 2018).

    PS: all practical 'ray' algorithms can be done far better (faster, with less energy) using traditional raster techniques. Light-probe lighting methods with voxel data sets handle real-time lighting more than well enough. Real-time reflection maps do not need 'ray tracing' to improve reflections. The same applies to shadows -- where 'good enough' beats extreme GPU power/processing requirements.

    Indeed, with shadows, the minor improvements to near shadows are not the issue -- the issue is shadows being disabled beyond a certain z-distance, something ray methods actually make worse. Better shadows = MORE shadows, and shadows across more of the scene.
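
    To make the "getting variables to the ALUs" point concrete, here is a minimal C++ sketch of a textbook ray/triangle intersection test (Moller-Trumbore). This is not Nvidia's or Crytek's code; the intersect function and toy Vec3 type are made up for illustration. Under the comment's own framing, the per-ray arithmetic is nothing but multiplies, adds and one divide that any shader ALU can execute; the expensive part on a real GPU is traversing the acceleration structure and streaming triangle data to those ALUs.

        #include <cstdio>
        #include <optional>

        struct Vec3 { float x, y, z; };

        static Vec3  sub(Vec3 a, Vec3 b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
        static float dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
        static Vec3  cross(Vec3 a, Vec3 b) { return {a.y * b.z - a.z * b.y,
                                                     a.z * b.x - a.x * b.z,
                                                     a.x * b.y - a.y * b.x}; }

        // Moller-Trumbore ray/triangle test: returns the hit distance t if the ray
        // (origin o, direction d) hits triangle (v0, v1, v2). Only mul/add/div --
        // exactly the arithmetic ordinary shader cores already do all day.
        std::optional<float> intersect(Vec3 o, Vec3 d, Vec3 v0, Vec3 v1, Vec3 v2) {
            const float kEps = 1e-7f;
            Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
            Vec3 p = cross(d, e2);
            float det = dot(e1, p);
            if (det > -kEps && det < kEps) return std::nullopt;  // ray parallel to triangle
            float inv = 1.0f / det;
            Vec3 s = sub(o, v0);
            float u = dot(s, p) * inv;
            if (u < 0.0f || u > 1.0f) return std::nullopt;
            Vec3 q = cross(s, e1);
            float v = dot(d, q) * inv;
            if (v < 0.0f || u + v > 1.0f) return std::nullopt;
            float t = dot(e2, q) * inv;
            if (t <= kEps) return std::nullopt;                  // hit is behind the origin
            return t;
        }

        int main() {
            // One ray fired straight down -z against a triangle lying in the z = -1 plane.
            auto t = intersect({0, 0, 0}, {0, 0, -1}, {-1, -1, -1}, {1, -1, -1}, {0, 1, -1});
            std::printf("hit: %s, t = %f\n", t ? "yes" : "no", t.value_or(-1.0f));
        }

    In a real renderer this test sits inside a BVH traversal loop, which is memory-bound rather than ALU-bound -- consistent with the comment's claim that data movement, not the maths, is what Turing accelerates.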

    • You don't need FP16 for crypto. The tensor cores do neural network acceleration and lots of universities are buying massive amounts of them because it's the new hotness.

    • by im_thatoneguy ( 819432 ) on Wednesday March 20, 2019 @02:11PM (#58305336)

      As someone who works with raytracers you're spouting gibberish.

      There are two components to raytracing: ray intersection and shading. No shit that Nvidia uses their SHADING hardware to shade raytraced rays. What they added was ray-intersection hardware which yes does take a good bit of processing power on non-trivial scenes.

      Yes, ray coherence is important, and no, Nvidia didn't address it (like, arguably, Caustic/IMG did with their now-dead OpenRL raytracer), but they are in fact tracing rays.

      As to raytracing needing "bouncy rays": you're confusing Global Illumination with "Ray Tracing". Ray tracing is just firing rays and returning the results. You are using a no-true-Scotsman fallacy to equate one with the other. If you raytrace only primary visible rays (like a rasterizer does) you're still raytracing, even if you have 0 bounces. And bounces are exactly where the RT cores help boost the trace rate on the RTX GPUs. (A toy sketch of primary rays versus bounces follows at the end of this comment.)

      As to why there are Tensor Cores: you already mentioned that you need millions of bouncing rays to deliver global illumination. Imagination Technologies almost delivered ray counts high enough for true GI, but it was still too noisy. Nvidia, without coherence handling for its shaders, asked "how can we do global illumination without ray counts high enough?" and the answer was to apply a denoising neural net. So RT cores plus a large tensor unit mean they can denoise their low-sample-count raytracing into something useful.

      As to "Rasterization can do anything raytracing can do but better". That's a load of bullocks. Raytraced shadows are infinitely more memory efficient. You also can't have self-reflections with reflection maps. There is a reason every single production renderer in existence today for high end visual effects is now a raytracer. Ray tracing is more efficient, it's faster and it produces far superior quality to rasterization hacks.

  • by igotmybfg ( 525391 ) on Tuesday March 19, 2019 @10:48PM (#58302304) Homepage
    This video shows "classical" raytracing [wikipedia.org], in which rays are traced coherently, and it has long been doable in realtime on a GPU because it is trivially parallelizable. It looks impressive because it can do mirror- and glass-type effects (specular reflection and refraction), but there is more to photorealism than just those effects. In particular, while ray tracing does simulate light bouncing around a scene, it doesn't do so in a physically-accurate way.

    What Nvidia means by "ray tracing" with their RTX thing and the AI denoiser is actually path tracing [wikipedia.org], which uses incoherent rays and actually does simulate light bounces in a physically accurate way. Effects like depth of field, soft shadows, caustics, ambient occlusion, and diffuse interreflection are a natural result of the path tracing algorithm, but have to be specially accounted for in other algorithms like ray tracing. A good reference for this is Physically Based Rendering [pbrt.org], by Matt Pharr. Because the rays in a path tracer are incoherent, it's an inherently noisy algorithm that requires many samples to reduce variance to acceptable levels. That's where the AI denoiser comes in -- it's able to take a noisy image made with fewer path-traced samples and reduce variance to an acceptable level in realtime. (A toy illustration of that sampling noise is sketched below.)

    The guys over at Brigade [otoy.com] also have an actual realtime path tracer, and while the work is world-class and jaw-droppingly impressive, you can see how noisy it still is.
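
    As a minimal, purely illustrative C++ sketch of why low-sample path tracing is noisy (not any particular renderer's code; estimateIrradiance is a made-up helper): estimate the diffuse irradiance at a point under a uniformly bright hemispherical sky. The exact answer is 1.0, and the Monte Carlo estimate wobbles around it with error shrinking roughly as 1/sqrt(N), which is exactly the gap the AI denoiser is meant to paper over.

        #include <cstdio>
        #include <random>

        // Monte Carlo estimate of diffuse irradiance under a uniform white "sky"
        // covering the hemisphere (Lambertian BRDF 1/pi, incoming radiance 1).
        // Exact value: 1.0. Each sample contributes integrand / pdf, with
        // pdf = 1 / (2*pi) for uniform hemisphere sampling.
        float estimateIrradiance(int numSamples, std::mt19937& rng) {
            std::uniform_real_distribution<float> uni(0.0f, 1.0f);
            float sum = 0.0f;
            for (int i = 0; i < numSamples; ++i) {
                // For uniform hemisphere sampling, cos(theta) is itself uniform on [0, 1].
                float cosTheta = uni(rng);
                sum += 2.0f * cosTheta;   // (1/pi) * 1 * cosTheta / (1/(2*pi))
            }
            return sum / numSamples;
        }

        int main() {
            std::mt19937 rng(42);
            const int counts[] = {1, 4, 16, 256, 4096};
            for (int n : counts) {
                std::printf("%5d samples -> estimate %.3f (exact 1.000)\n",
                            n, estimateIrradiance(n, rng));
            }
        }

    A real path tracer fires samples like these per pixel per bounce, which is why even RTX-class hardware leans on denoising rather than brute-force sample counts.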

    • by Anonymous Coward

      This video shows "classical" raytracing [wikipedia.org]

      What nvidia means by "ray tracing" with their RTX thing and the AI denoiser is actually path tracing [wikipedia.org],

      Whether its "classical" raytracing [wikipedia.org] or "path" tracing [wikipedia.org],

      I bet 99% of the game players can't feel the difference

      • I'll believe realtime raytracing -- classic, physically-based, or otherwise -- has really arrived when a future version of Windows gives us everything Aero was supposed to have been in the first place, including accurately-blurred semi-translucency with refraction, crystal-like window chrome, and everything else. Not faked translucency, but literal "window content is rendered onto semi-translucent dark smoked virtual glass, with obscured content that's somewhat visible through it accurately updated in realt

        • by Megol ( 3135005 )

          4+ TFlops wasted because you want a raytraced desktop?

          • Well, if you have an expensive video card that can do it without breaking a sweat & would otherwise just sit there unused going to waste, why NOT?

            Effects like that might be wasteful if the computer is struggling to do the task at hand, but if you have that much raw horsepower just sitting around unused going to waste, you might AS WELL take full advantage of it.

            It would be like running a stripped-down minimalist Linux distro on a 5GHz 8-core i9 with 1tb SSD instead of Ubuntu. You certainly could... but

      • You'd be wrong. Raytraced reflections, compared to our classical ways of faking them (screen-space reflections), only show differences in a few minor areas. Path-traced lighting compared to conventional global illumination, however, makes a huge visual difference -- so much so that many players actively complained about it in Metro Exodus, because it turns out realism actually sucks. What did you think a moonlit night indoors was going to look like? Of course it looks much darker in the shadows than with traditional GI.

        • Completely agreed, GI is definitely the bigger deal. Less hacky reflections (and shadows) are nice of course but GI completely transforms how a scene feels. One could say almost night and day difference!

    • In the end, nobody cares about how it's done, what matters is what the end result looks like. That's a pretty sweet demo they have there, if anyone can take that demo and apply that rendering engine to a game... Does it still look as good? Are there unforeseen complications? Practice is the criterion of truth.
      • The point he was making is that the visual benefits here are only a very minor part of what RTX (and Microsoft's DXR) actually achieve. Hence it's not surprising that it performs well.

        Reflections are a minor benefit of ray tracing. Accurate lighting is where the real benefit is, and at the moment that comes with crippling performance for non-wealthy gamers.

  • by OneHundredAndTen ( 1523865 ) on Wednesday March 20, 2019 @09:14AM (#58303812)
    Why is it the case that objects in videos that show off graphics capabilities always look shiny, brand-new, crisp, all the time? Even when they are supposed to be old and dusty, they manage to look very shiny, brand-new and crisp.
    • by ffkom ( 3519199 )
      Because non-shiny objects don't look any better when using ray-tracing over the usual texture mapping.

      So it's understandable why demos concentrate on environments where you can actually see a benefit.
  • that I start on my yt channel to summarise the latest IT Bits & pieces: https://www.youtube.com/watch?... [youtube.com]
  • ...into the trash that will go.
