Crytek Shows 4K 30 FPS Ray Tracing On Non-RTX AMD and NVIDIA GPUs (techspot.com) 140
dryriver writes: Crytek has published a video showing an ordinary AMD Vega 56 GPU -- which has no raytracing-specific circuitry and costs only around $450 -- ray tracing a complex 3D city environment in real time at 4K and 30 FPS. Crytek says the technology demo runs fine on most normal NVIDIA and AMD gaming GPUs. As if this weren't impressive enough, the software real-time ray tracing technology is still in development and not yet final. The framerates achieved may thus climb further, raising the question of what exactly the benefits of owning a super-expensive NVIDIA RTX 20xx series GPU are. Nvidia has claimed over and over again that without its amazing new RTX cores and AI denoiser, GPUs will choke on real-time ray tracing tasks in games. Crytek appears to have shown already that with some intelligently written code, bog-standard GPU cores can handle real-time ray tracing just fine -- no RTX cores, AI denoiser, or anything else NVIDIA touts as necessary.
Re: (Score:2)
They just implement raytracing in the way most favourable to their architecture.
Since they have a lot of area dedicated to neural network acceleration, they use a neural network to denoise, then they don't bother trying to find any more efficient algorithms for legacy hardware. They have BVH acceleration, so they don't bother trying to find any more efficient ray acceleration structures for legacy hardware. Etc. etc.
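For context, the inner loop of a BVH traversal boils down to a ray vs. axis-aligned box "slab" test like the generic C++ sketch below. It only illustrates the kind of work being accelerated; it is not anyone's actual implementation.

#include <algorithm>
#include <array>

// Minimal ray vs. axis-aligned bounding box "slab" test -- the operation a BVH
// traversal repeats millions of times per frame. Generic illustration only.
struct Ray {
    std::array<float, 3> origin;
    std::array<float, 3> invDir;   // 1 / direction, precomputed per ray
};

struct Aabb {
    std::array<float, 3> lo, hi;
};

bool hitAabb(const Ray& r, const Aabb& b, float tMax) {
    float tNear = 0.0f, tFar = tMax;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (b.lo[axis] - r.origin[axis]) * r.invDir[axis];
        float t1 = (b.hi[axis] - r.origin[axis]) * r.invDir[axis];
        if (t0 > t1) std::swap(t0, t1);   // handle negative ray directions
        tNear = std::max(tNear, t0);
        tFar  = std::min(tFar, t1);
        if (tNear > tFar) return false;   // slab intervals don't overlap: miss
    }
    return true;
}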
They don't lie, they just don't really try to make things shine on older/competitor hardware ... and since
Re: (Score:1)
Re: (Score:2)
Since they have a lot of area dedicated to neural network acceleration, they use a neural network to denoise
And it looks like crap as you would expect.
Re: (Score:2)
They don't lie, they just don't really try to make things shine on older/competitor hardware
That's an oversimplification. So far, nothing of what NVIDIA is doing shines on older hardware, and this demo is no exception. What is being shown here is ray tracing of reflections: rays are cast from the camera to evaluate reflection and refraction at surfaces. This has been demonstrated in real time on traditional hardware many times before (I believe Intel showed it off 4 years ago on standard, albeit top-end, hardware), and it is the least computationally intensive part of ray tracing a scene.
So
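For the curious, the per-hit work behind those reflections is tiny: a camera ray hits a surface and is mirrored about the surface normal. A generic C++ sketch of that step only (not Crytek's actual shader code):

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)    { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b)    { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Mirror an incoming (camera) ray direction d about surface normal n:
// r = d - 2(d.n)n. Computing the reflection direction is the cheap part;
// the expensive part is tracing where the new ray goes.
Vec3 reflect(Vec3 d, Vec3 n) {
    return sub(d, scale(n, 2.0f * dot(d, n)));
}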
Re: (Score:2)
Apart from reflections, the rest is just single-bounce dynamic GI with a couple of rays per pixel and lots of hacks; it's not as if NVIDIA has the power to do Monte Carlo light transport with thousands of multi-bounce rays per pixel and order-dependent effects.
Crytek does cone tracing, with lots of hacks, to get single-bounce dynamic GI as well; either way it's a very coarse approximation of physical lighting.
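To make "a couple of rays per pixel" concrete, a single-bounce diffuse GI gather looks roughly like the C++ sketch below (cosine-weighted hemisphere sampling). It's a generic illustration only -- traceRay here is a placeholder for whatever the engine actually uses to return radiance along a ray -- not Crytek's or NVIDIA's code.

#include <cmath>
#include <functional>
#include <random>

struct V3 { float x, y, z; };

static V3 add(V3 a, V3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static V3 mul(V3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static V3 cross(V3 a, V3 b)  { return {a.y * b.z - a.z * b.y,
                                       a.z * b.x - a.x * b.z,
                                       a.x * b.y - a.y * b.x}; }

// Rotate a local-frame sample (z = up) into the frame of surface normal n.
static V3 toWorld(V3 local, V3 n) {
    V3 t = std::fabs(n.x) > 0.5f ? V3{0, 1, 0} : V3{1, 0, 0};
    V3 u = cross(t, n);
    float len = std::sqrt(u.x * u.x + u.y * u.y + u.z * u.z);
    u = mul(u, 1.0f / len);
    V3 v = cross(n, u);
    return add(add(mul(u, local.x), mul(v, local.y)), mul(n, local.z));
}

// One-bounce diffuse indirect gather with only a few rays. With cosine-weighted
// sampling the estimator reduces to a plain average of the returned radiance
// (times albedo, applied by the caller). The low sample count is exactly why
// the raw result is noisy and needs heavy filtering or temporal reuse.
V3 oneBounceGI(V3 p, V3 n, int samples, std::mt19937& rng,
               const std::function<V3(V3, V3)>& traceRay) {
    std::uniform_real_distribution<float> u01(0.0f, 1.0f);
    V3 sum{0, 0, 0};
    for (int i = 0; i < samples; ++i) {
        float u = u01(rng), v = u01(rng);
        float r = std::sqrt(u), phi = 6.2831853f * v;
        V3 local{r * std::cos(phi), r * std::sin(phi), std::sqrt(1.0f - u)};
        sum = add(sum, traceRay(p, toWorld(local, n)));
    }
    return mul(sum, 1.0f / samples);
}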
Re: (Score:3)
Nvidia has always been full of it. To see them exposed as the frauds they are _this_ quickly is very nice.
Re: (Score:3)
The biggest lie they ever told is that their stock is worth ten times annual revenue, which explains why they were willing to outbid Intel for Mellanox: they were basically paying half price.
Re: (Score:2)
Companies do get to lie a lot if they're ethically so inclined. Have you ever read an Nvidia 10-Q report? Promotional does not begin to describe it.
Fuck me. (Score:5, Funny)
Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming, my wallet would jump up and slap me on the head. Visa would call me and ask if my credit card had been stolen.
Re: (Score:3)
Re: (Score:1)
A lot of people also live with the mentality that you should buy things based on the value they have to you. It doesn't mean that computers are a waste of money. It does, though, heavily question price vs. performance. It questions whether one should wait 5 years until certain graphics cards drop in price substantially and at the same time (or somewhere in between, if possible) buy 5-year-old games when they're on discount. It argues that long-term thrift puts you in a better position economically, and the only
Re: (Score:1)
I'd be surprised if someone who does paintball on occasion spent a similar amount on a single piece of paintball equipment. I'd expect most slashdotters to be in the casual hobby region rather than the hardcore enthusiast when it comes to video games.
Re: (Score:2)
Yes but a car or paintball gun isn't useless in a few years. Video cards depreciate fast and hard just like most PC hardware. I am less willing to spend $450 on something that has a shelf life.
I remember there was a time when the high end model GPUs were about $350. Now Nvidia wants over a grand. Now you can argue that things like Tensor cores and RTX are value adds that make it worthwhile. But frankly I don't care about ray tracing right now. I've seen the demos and it doesn't really change enough to matt
Re: (Score:2)
Re: (Score:2)
You also need to consider life insurance however.
Re: Fuck me. (Score:5, Funny)
It has nothing to do with how much money I have or don't have. They're bragging about this $450 video card being able to do 30 fps at 4K (but only if you have it attached to a PC that already cost you over $1,000). Before I do that, I'd buy a $450 PS4 Pro and spend the balance on an eight-ball of coke and take your mom out for a nice dinner and then anal sex.
Re: (Score:3)
Second, in terms of USD the Vega 56 is around $310, not $450. The PS4 Pro is $400. An entire computer, with the GPU, could be built for around $700-800. Best part: it can also be used to do everyday computer things.
Third, the Vega 56 is 50%+ faster in terms of gpu performance than a PS4 Pro. You're basically getting a better gaming experience, plus computer, for twic
Re: (Score:2)
I'm with you. I play exclusively on a PC, but I don't believe you could build a capable gaming PC that includes a Vega 56 for $700-800. If you want to play current games, add a good processor, 16 GB of memory, SSDs, a power supply and a case, and you're over $1,000.
But really, why are we talking about the Vega 56 for gaming? Isn't it more of a prosumer card for video editing and CAD and such? I would think you'd be better off with one of the sub-$300 Radeon cards.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
And 30 FPS is a joke with 144 Hz monitors being so cheap.
Re: (Score:2)
and spend the balance on an eight-ball of coke and take your mom out for a nice dinner and then anal sex.
You're not such a bad guy after all. I take back most of that bad shit I said about you.
Re: (Score:2)
To know me is to love me.
Re: (Score:2)
What, AGAIN?
Re: (Score:2)
Before I do that, I'd buy a $450 PS4 Pro and spend the balance on an eight-ball of coke and take your mom out for a nice dinner and then anal sex.
If you've ever met my mother it would be you pleading to keep her out of this. Me, I'm just entertained, ... and grossed out.
Re: (Score:2)
Then you have to use a controller to play video games. I'd rather cut off my hands; it would probably be easier to control the camera that way.
Re: (Score:3)
In six months that will be a $300 GPU. In a year it will be $150.
Re: (Score:2)
And remember: That's at 4k resolution. At 1080p you only need half the graphics card to do the same thing.
Re: (Score:2)
Less than half.
Re: (Score:2)
True... it's actually more like a quarter, i.e. my $150 graphics card should be able to do 1080p at the same frame rate.
(assuming I have enough GRAM, etc)
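Rough pixel-count arithmetic only (it says nothing about where the real bottleneck is): 3840 x 2160 = 8,294,400 pixels at 4K versus 1920 x 1080 = 2,073,600 at 1080p -- exactly 4x fewer pixels to shade, hence "a quarter".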
Re: (Score:2)
Only if you consider nothing but pixel fill rate; there are many other factors in rendering a scene.
That's why I said "Assuming I have enough..." - we don't know where the bottleneck is in that demo.
But that doesn't really matter. Polygon counts can be reduced, number of reflections can be reduced, it can be tweaked to fit lesser cards and it shows that NVIDIA is exaggerating the need for special instructions.
Re: (Score:1)
Good point. 4k is wasted on my old-ass eyes.
Re:Fuck me. (Score:4, Interesting)
Re: (Score:2)
4K is nice for some things. I recently (at Christmas) bought myself a 4K60 TV for my monitor. It looks cool as shit for nature videos and the like, but as you say, with settings maxed out in games and my aging 1070, the 1% lows will kill you. I just drop down to 1440p, which brings most games to a comfortably playable frame rate and still looks good. 1080p is a little low when you sit close to a 43" screen -- like back in the CRT days when you could count the horizontal lines.
Re: (Score:2)
frame rate is waaaaaaaaayyyyyyyy more important than resolution.
Indeed it is. Right until you hit a limit when discussing either of them.
one more important than the other... (Score:2)
It's not that frame rate is more important than resolution. It's that both need to be good enough; for you, 1080 is good enough a lot of the time. I tend to run 2560x1440 at about 100-120 fps. That 4K monitor could still have been a good choice since you could render at lower resolutions, if you used it for other work that did benefit (perhaps video, CAD, or many windows). Just not at all costs (such as too steep a downgrade in refresh rate).
Re: (Score:2)
Prices used to fall like that, but not any more. Maybe they fall at a third of the rate of the good ol' days now, and they're still slowing.
Re:Fuck me. (Score:4, Funny)
Aren't you missing your nightly cup of sleepy time tea and Jeopardy?
Re: (Score:2)
Some people spend 50% of their free time gaming so coughing up a few extra bucks is justifiable.
Only 50%? Filthy casual.
Re: (Score:2)
I've seen Vega 64s going for $350 (Score:3)
I'm not sure if it's the $229 GTX 1060 6GB or just the worse economy (it is a lot shittier -- you could make $12/hr starting at a call center in my dirt-poor town back then, which is about $20/hr in today's money, for a job a high-school dropout could get; now the same job pays $9.50/hr, or about $5.70/hr in the money of back then), but these prices didn't used to seem all that nuts.
Re: (Score:2)
Re: (Score:2)
AM4 motherboard is $50 new and will scale to the sixteen-thread $200-$300 CPUs.
If you put even a 1st-gen R7 CPU in a $50 AM4 motherboard and actually use it, that motherboard is going to go up in smoke and possibly take your PSU and CPU with it when the VRM melts -- and that's not even considering the 2nd-gen 2700X.
Re: (Score:1)
Then perhaps you shouldn't have spent all that money on a super-intelligent wallet with inbuilt head-slapping capabilities.
That's pretty cool though. I bet it saves you a lot of money when you go to IKEA.
Re: (Score:2)
Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming
I'm sure you have actual hobbies you spend money on. Clearly gaming isn't one of them.
Re: (Score:2)
Wait a fucking minute. "Only" costs around $450? If I tried to spend $450 on a video card for gaming, my wallet would jump up and slap me on the head. Visa would call me and ask if my credit card had been stolen.
Fairy nuff...
You're not the intended audience. For a PC gamer, US$450 is mid-range for a graphics card. High-end cards easily cost in the $700 to $1,000 range. I tend to buy mid-range cards and replace them more often as they become a bottleneck. A CPU and mainboard will keep trucking for six or more years; graphics cards tend to last two or three.
Yes, there are heaps of memes about how much we spend on gaming, it's the same as any hobby, motorsports, golfing... in fact high end PC ownership is significantly ch
Re: (Score:2)
$500 cards have been mid-high end for a long time now. The high end cards cost around $1000 with the top of the line costing $2000 or so.
Of course, today's $500 card was yesterday's $1000 card, so it does rapidly cost a lot less money.
A mid-tier card costs around $300 or so.
Re: Suspicious (Score:2)
Re: (Score:2)
Imagine a beowulf cluster of these rendering Natalie Portman, stoned and petrified, with a bowl of hot grits and a greased up Yoda doll shoved up my ass.
In Soviet Russia, beowulf cluster of Natalie Portmans with hot grits down pants feeding Yoda dolls imagines you!
[Hmm. Soviet Russia sounds damn good...]
Stencil mapped shadowing (Score:1)
No 'ray tracing' units on Turing- it's a con (Score:3, Interesting)
Turing (Nvidia) has exactly ZERO ray tracing units. All 'ray' algorithm maths calculations are done on the standard shader cores -- same as with this demo on the AMD Vega 56. So what gives?
An ex-Nvidia engineer's post on Beyond3D gave the game away. This engineer was partially responsible for the so-called 'ray tracing' enhancement in Turing. Put simply, this is what Turing does:
a = b + c * d
Say the above is a ray maths calc (obviously it is not). The '+' and '*' are the maths operations done on the usual shaders. What every tech site misses is that there is another issue -- the process that gets the VARIABLES to the shader ALU blocks.
What Nvidia did with Turing was add tiny ASIC circuits that allow the VARIABLES that represent triangles and rays to be moved more efficiently to the STANDARD shader units from the other units (like geometry). No RT 'cores' (there is no such thing in Turing), but a tiny logic hack that allows the GPU to be reconfigured to move certain kinds of data much faster to the shader cores.
However the significance of this is that if one arranges for ray/triangle data to be held in a more efficient form on a 'normal' GPU, the same ray 'acceleration' can be achieved.
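A software analogue of "getting the variables to the ALUs efficiently" is the usual array-of-structures vs. structure-of-arrays choice, sketched below in generic C++. This only illustrates the data-layout idea; it is not a description of what Turing actually does in silicon.

#include <vector>

// Array-of-structures: each ray's fields are interleaved in memory, so a loop
// that only needs origins and directions drags everything else through the
// cache as well.
struct RayAoS {
    float ox, oy, oz;   // origin
    float dx, dy, dz;   // direction
    float tMax;
    int   pixel;
};

// Structure-of-arrays: each field is contiguous, so an intersection loop
// streams exactly the data it needs -- the same idea as moving ray/triangle
// variables to the shader units more efficiently.
struct RaysSoA {
    std::vector<float> ox, oy, oz;
    std::vector<float> dx, dy, dz;
    std::vector<float> tMax;
    std::vector<int>   pixel;
};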
Not that real-time ray tracing happens even on Turing. Real ray tracing needs far too many 'bouncy' rays per screen pixel to ever be possible on any ordinary GPU -- and the problem with real ray tracing is MEMORY COHERENCE, not the maths of the ray/triangle collision.
Turing 'ray tracing' is actually simple ray algorithms applied to real-time reflection maps (NOT true reflection) and shadows. Metro Exodus tried a very very basic form of ray averaging for lighting, which was no better than simply using more traditional light sources.
And the tensor cores on Turing? Well, unlike the non-existent ray tracing cores, the Tensor cores are real and use vast numbers of transistors. Why are the Tensor cores real -- and the main reason Turing exists? Because Nvidia spent hundreds of millions of dollars developing new cryptocurrency mining algorithms to run exclusively on Tensor. However, cryptocurrency collapsed between Turing's design and release. Nvidia's future GAMING GPUs will not have tensor cores.
Turing was 100% designed to displace AMD in the PC cryptocurrency mining space. Nvidia lost an absolute fortune with Turing because of the collapse (and Nvidia actually reported this fact at its investor conferences in 2018).
PS: all practical 'ray' algorithms can be done far better (faster, with less energy) using traditional raster algorithms. Light-probe lighting methods with voxel data sets handle real-time lighting more than well enough. Real-time reflection maps do not need 'ray tracing' to deploy in reflection enhancement. The same applies to shadows, where good enough beats extreme GPU power/processing requirements.
Indeed, with shadows, the minor improvements to near shadows are not the issue -- the issue is shadows being disabled beyond a certain z-distance, something ray methods actually make worse. Better shadows = MORE shadows, and shadows across more of the scene.
Re: (Score:2)
You don't need FP16 for crypto. The tensor cores do neural network acceleration and lots of universities are buying massive amounts of them because it's the new hotness.
Re:No 'ray tracing' units on Turing- it's a con (Score:4, Informative)
As someone who works with raytracers, I can tell you you're spouting gibberish.
There are two components to raytracing: ray intersection and shading. No shit that Nvidia uses their SHADING hardware to shade raytraced rays. What they added was ray-intersection hardware which yes does take a good bit of processing power on non-trivial scenes.
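For scale, the arithmetic behind each ray/triangle test is short -- the standard Moller-Trumbore intersection test, written out generically in C++ below (an illustration of the operation, not a claim about how the RT hardware implements it):

#include <cmath>

struct Vec { float x, y, z; };

static Vec   sub(Vec a, Vec b)   { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec   cross(Vec a, Vec b) { return {a.y * b.z - a.z * b.y,
                                           a.z * b.x - a.x * b.z,
                                           a.x * b.y - a.y * b.x}; }
static float dot(Vec a, Vec b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Standard Moller-Trumbore ray/triangle test: returns true and the hit
// distance t if ray (orig, dir) hits triangle (v0, v1, v2).
bool intersect(Vec orig, Vec dir, Vec v0, Vec v1, Vec v2, float& t) {
    const float eps = 1e-7f;
    Vec e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec p = cross(dir, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < eps) return false;       // ray parallel to the triangle
    float inv = 1.0f / det;
    Vec s = sub(orig, v0);
    float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return false;       // outside the triangle (u)
    Vec q = cross(s, e1);
    float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return false;   // outside the triangle (v)
    t = dot(e2, q) * inv;
    return t > eps;                               // hit in front of the ray origin
}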
Yes, ray coherence is important, and no, Nvidia didn't address it (as arguably Caustic/IMG did with their now-dead OpenRL raytracer), but they are in fact tracing rays.
As for raytracing needing "bouncy rays": you're confusing Global Illumination with "Ray Tracing". Ray tracing is just firing rays and returning the results. You are using a no-true-Scotsman fallacy to equate one with the other. If you raytrace only primary visible rays (like a rasterizer), you're still raytracing even if you have zero bounces. And bounces are exactly where the RT cores do help boost the trace rate on the RTX GPUs.
As to why there are Tensor Cores: because, as you already mentioned, you need millions of bouncing rays to deliver global illumination. Imagination Technologies almost delivered ray counts high enough for true GI, but it was still too noisy. Nvidia, without coherence handling for its shaders, asked "how can we do global illumination without ray counts high enough?" and the answer was to apply a denoising neural net. So RT cores plus a large tensor unit mean they can denoise their low-sample-count raytracing into something useful.
As to "Rasterization can do anything raytracing can do but better". That's a load of bullocks. Raytraced shadows are infinitely more memory efficient. You also can't have self-reflections with reflection maps. There is a reason every single production renderer in existence today for high end visual effects is now a raytracer. Ray tracing is more efficient, it's faster and it produces far superior quality to rasterization hacks.
All raytracing is not equal (Score:5, Informative)
What nvidia means by "ray tracing" with their RTX thing and the AI denoiser is actually path tracing [wikipedia.org], which uses incoherent rays and actually does simulate light bounces in a physically accurate way. Effects like depth of field, soft shadows, caustics, ambient occlusion, and diffuse interreflection are a natural result of the path tracing algorithm, but have to be specially accounted for in other algorithms like ray tracing. A good reference for this is Physically-Based Rendering [pbrt.org], by Matt Pharr. Because the rays in a path tracer are incoherent, it's an inherently noisy algorithm that requires many samples to reduce variance to acceptable levels. That's where the AI denoiser comes in - it's able to take a noisy image made with fewer path-traced samples and reduce variance to an acceptable level in realtime.
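As a toy illustration of why sample counts (and therefore the denoiser) matter: the error of any Monte Carlo estimate only shrinks like 1/sqrt(N). The generic C++ sketch below estimates a trivial integral and prints the standard error -- it has nothing to do with NVIDIA's denoiser itself, it just shows the convergence rate.

#include <cmath>
#include <cstdio>
#include <random>

// Monte Carlo estimate of the integral of x^2 over [0,1] (true value 1/3).
// The standard error shrinks like 1/sqrt(N), which is why path tracing with
// a few samples per pixel is noisy and needs denoising to look acceptable.
int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> u01(0.0, 1.0);
    for (int n : {16, 256, 4096, 65536}) {
        double sum = 0.0, sumSq = 0.0;
        for (int i = 0; i < n; ++i) {
            double x = u01(rng);
            double f = x * x;
            sum += f;
            sumSq += f * f;
        }
        double mean = sum / n;
        double se = std::sqrt((sumSq / n - mean * mean) / n);
        std::printf("N=%6d  estimate=%.4f  std.err=%.4f\n", n, mean, se);
    }
    return 0;
}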
The guys over at Brigade [otoy.com] also have an actual realtime path tracer, and while the work is world-class and jaw-droppingly impressive, you can see how noisy it still is.
No difference to average eyeballs (Score:1)
This video shows "classical" raytracing [wikipedia.org]
What nvidia means by "ray tracing" with their RTX thing and the AI denoiser is actually path tracing [wikipedia.org],
Whether its "classical" raytracing [wikipedia.org] or "path" tracing [wikipedia.org],
I bet 99% of the game players can't feel the difference
Re: (Score:2)
I'll believe realtime raytracing -- classic, physically-based, or otherwise -- has really arrived when a future version of Windows gives us everything Aero was supposed to have been in the first place, including accurately-blurred semi-translucency with refraction, crystal-like window chrome, and everything else. Not faked translucency, but literal "window content is rendered onto semi-translucent dark smoked virtual glass, with obscured content that's somewhat visible through it accurately updated in realt
Re: (Score:2)
4+ TFlops wasted because you want a raytraced desktop?
Re: No difference to average eyeballs (Score:2)
Well, if you have an expensive video card that can do it without breaking a sweat & would otherwise just sit there unused going to waste, why NOT?
Effects like that might be wasteful if the computer is struggling to do the task at hand, but if you have that much raw horsepower just sitting around unused going to waste, you might AS WELL take full advantage of it.
It would be like running a stripped-down minimalist Linux distro on a 5GHz 8-core i9 with 1tb SSD instead of Ubuntu. You certainly could... but
Re: (Score:2)
Re: (Score:2)
You go first and tell me if it hurts.
Re: No difference to average desktops. (Score:2)
> Ah played with Enlightenment, have you?
Pfft. 20 years ago, I went on a multi-week holy quest to try and find a way to make Enlightenment work with networked X11 and a headless Linux box. It was an act of hopeless futility (Enlightenment needed capabilities that X11 can't provide over a network, and I later discovered that, for various unintuitive reasons, networked X11 usually has worse performance than VNC).
BTW, for anybody who's wondering, networked X11 is NOT the Linux equivalent of RDP with Windows.
Re: (Score:2)
You'd be wrong. Raytraced reflections, compared to our classical ways of faking them (screen-space reflections), only show differences in a few minor areas. Path-traced lighting compared to conventional global illumination, however, makes a huge visual difference -- so much so that many players actively complained about it in Metro Exodus, because, hey, realism actually sucks: what did you think was going to happen indoors on a moonlit night? Of course it looks much darker in the shadows than with the old GI.
Re: (Score:2)
Completely agreed, GI is definitely the bigger deal. Less hacky reflections (and shadows) are nice of course but GI completely transforms how a scene feels. One could say almost night and day difference!
Re: (Score:2)
Re: (Score:2)
The point he was making is that the visual benefits here are only a very minor part of what RTX (and Microsoft's DXR) actually achieve. Hence it's not surprising that it performs well.
Reflections are a minor benefit of ray tracing. Accurate lighting is where the real benefit is, and at the moment it comes with crippling performance for non-wealthy gamers.
Impressive. However... (Score:3)
Re: (Score:2)
So it's understandable why demos concentrate on environments where you can actually see a benefit.
Yep, discussed in my last weeks news summary (Score:1)
30 FPS... (Score:1)
Re: (Score:1)
... How many years have you been on this Slashdot spamming campaign, now?
When you pick a lost cause, you really commit. My hat's off to you.