
Nvidia Ends 32-Bit CUDA App Support For GeForce RTX 50 Series (tomshardware.com) 41
Nvidia has confirmed on its forums that the RTX 50 series GPUs no longer support 32-bit PhysX. Tom's Hardware reports: As far as we know, there are no 64-bit games with integrated PhysX technology, thus terminating the tech entirely on RTX 50 series GPUs and newer. RTX 40 series and older will still be able to run 32-bit CUDA applications and thus PhysX, but regardless, the technology is now officially retired, starting with Blackwell. [...]
The only way now to run PhysX on RTX 50 series GPUs (or newer) is to install a secondary RTX 40 series or older graphics card and slave it to PhysX duty in the Nvidia control panel. As far as we are aware, Nvidia has not disabled this sort of functionality. But the writing is on the wall for PhysX, and we doubt there will be any future games that attempt to use the API.
Quit deving with proprietary (Score:5, Funny)
If you develop using proprietary APIs then you're going to lose over and over when the company that owns it simply decides it's not worth maintaining it.
However, if you're already developing using CUDA then you've already lost that game.
Stop using proprietary APIs.
Re:Quit deving with proprietary (Score:4, Insightful)
Most game studios don't care what happens after the first few years of sales because the numbers are usually too low to matter.
Initially, people used PhysX because it was the quickest way to ship a game with decent physics performance. But the vast majority of budget and indie games that require PhysX do so because they're built on Unity or Unreal Engine.
Some poor developer that hasn't updated their game since Unity 5 is going to have some decisions to make.
Re: (Score:2)
It's going to be somewhat bad when people who have built imaging pipelines for medical research realise that they can no longer use their sm32-coded binaries.
That's already happened, mind... the Kepler SM target is no longer supported by H-series cards. They just don't know that it affects them yet, and we can still work around it by putting their workflows on older cards. Sooner or later that won't be possible.
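To make that concrete, here's a rough sketch (not from any real imaging pipeline; the threshold is purely illustrative) of the kind of runtime check that surfaces the problem, using the plain CUDA runtime API:

    // Rough sketch: report each GPU's compute capability and flag devices that
    // can no longer run Kepler-era (sm_3x) binaries. The threshold below is
    // illustrative only, not taken from any real pipeline.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
            std::printf("No CUDA-capable device found.\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop{};
            if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) continue;
            std::printf("Device %d: %s (compute capability %d.%d)\n",
                        i, prop.name, prop.major, prop.minor);
            // A cubin built only for sm_35 loads only on matching hardware;
            // embedding PTX as well (e.g. nvcc -gencode arch=compute_35,code=compute_35)
            // lets newer drivers JIT it forward, at least until the driver drops
            // that architecture entirely.
            if (prop.major >= 9) {
                std::printf("  Warning: Kepler-era binaries will not run here.\n");
            }
        }
        return 0;
    }

The longer-term fix is shipping fat binaries with PTX for the oldest architecture you care about, but that only helps if the source (or at least a rebuild) is still available.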
Re: (Score:2)
Guess they should have paid for a long-term support contract.
Re: (Score:2)
They'd avoid paying me if they could. They've "forgotten" to do it in the past.
That said, most science is actually done on an inadequate budget, and the idea that folks have that big science is wealthy is not grounded in reality... and that's actually compounded by the fact that big name researchers really do attract the majority of the funding.
Re: (Score:2)
This is so true. My work gets cellular providers and automotive companies to pony up a decade of support contracts in advance. But despite the cost of a medical imaging device, the manufacturers aren't really rolling in dough.
Re: (Score:2)
Re:Quit deving with proprietary (Score:4, Interesting)
PhysX is middleware, so there's nothing technically preventing another developer from creating a replacement. It might take a while (considering there are multiple versions), but it's possible to get something working (perhaps a PhysX->CUDA mapping).
But it can happen to open source software as well, and I've encountered two examples.
The BSD Games collection [linuxquestions.org] often uses curses, which is still around, but the latest available version removes the ability to replace stdscr, so plenty of the old games won't even compile without scouring the source.
Then there's Cardinal Quest [github.com], which used Haxe; something changed in the language, so the game can't be compiled without extensive changes to the source code.
But the main issue is that the older games don't get maintained or updated whenever something like this happens.
Re:Quit deving with proprietary (Score:5, Informative)
god, curses is such a pain in the ass.
One thing I repeatedly encounter in open source communities is this over-dependence on libraries that are a house of cards because of their inability to be separated from the OS.
So the OS removes a library, and now EVERY SINGLE program that uses that library is broken unless you recompile it. Good luck if you don't have the source code.
The alternative is to always statically compile games, so that a game survives as long as the OS and the GPU/sound/input drivers installed in the OS remain, which has generally not been a problem. But if Nvidia is removing PhysX entirely, then ANY game using PhysX, be it 32-bit or 64-bit, is dead. The game will not work.
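To illustrate the trade-off (the library and flags here are just an example, not anything from the games above), compare the two ways of building a trivial curses program:

    // Trivial curses program; how you link it determines whether it survives
    // the distro dropping or changing libncurses.so.
    //
    //   dynamic: g++ game.cpp -lncurses -o game
    //            -> breaks if the shared library disappears or changes ABI
    //   static:  g++ game.cpp -o game -Wl,-Bstatic -lncurses -Wl,-Bdynamic
    //            -> the curses code travels inside the binary (assuming the
    //               static archive was installed at build time)
    #include <ncurses.h>

    int main() {
        initscr();                  // take over the terminal
        printw("Press any key.");   // draw via whatever curses got linked in
        refresh();
        getch();                    // wait for input
        endwin();                   // restore the terminal
        return 0;
    }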
Fortunately, we have precedent for "fixing this":
DXVK - Direct3D 9/10/11 on Vulkan
DXWND - Override "forced full screen" and screen resolution changes that no longer work (eg switching to 8-bit/16-bit color modes or 640x400 or 320x200 VGA resolutions)
nGlide - Wrapper for the 3DFX Glide API
DOSBox - runs 16-bit MS-DOS software
There are precedents for solving this. Fortunately, at least with PhysX, there is also a CPU mode, so when you run a game that uses PhysX and doesn't see Nvidia hardware, it either doesn't turn PhysX on, or only operates in CPU mode.
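As a rough sketch of what that detection amounts to (real games go through the PhysX SDK and driver, not raw CUDA, and the physics functions here are hypothetical stand-ins):

    // Sketch only: pick a physics path based on whether a CUDA device is visible.
    #include <cstdio>
    #include <cuda_runtime.h>

    static bool gpuPhysicsAvailable() {
        int count = 0;
        // Fails cleanly on machines with no NVIDIA driver or GPU.
        return cudaGetDeviceCount(&count) == cudaSuccess && count > 0;
    }

    int main() {
        if (gpuPhysicsAvailable()) {
            std::printf("Enabling GPU-accelerated effects.\n");
            // runGpuPhysics();  // hypothetical stand-in
        } else {
            std::printf("Falling back to CPU physics (or disabling the extras).\n");
            // runCpuPhysics();  // hypothetical stand-in
        }
        return 0;
    }

The catch on Blackwell is that a 32-bit process can no longer reach CUDA at all, so for those old games the decision effectively always lands in the CPU path.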
https://www.reddit.com/r/Amd/comments/a2qzt5/physx_is_now_open_source_radeon_implementation/
So since the source code to PhysX is available, this just means no more GPU support for it, and it's not like that was ever that beneficial. It's actually been removed from Unreal Engine, so the chances of running into a compatibility issue now are pretty low; it will mostly be encountered in games released in the Half-Life 2 era.
Re: (Score:3)
To clarify, the PhysX middleware is bundled with the game. It's usually statically compiled in, but there are also some instances where it's shipped as part of a DLL.
The issue is that the API PhysX uses to access the GPU to execute GPU-accelerated effects is CUDA. And NVIDIA is dropping 32-bit CUDA support. That means there's no way for the PhysX middleware to talk to the GPU. As you co
Re: (Score:1)
Fortunately, at least with PhysX, there is also a CPU mode, so when you run a game that uses PhysX and doesn't see Nvidia hardware, it either doesn't turn PhysX on, or only operates in CPU mode.
performance is so bad using CPU mode it's practically useless.
So since the source code to PhysX is available, that just means no more GPU support for it, not like it was ever that beneficial.
it's only the 5000 series that lost support; older GPUs are unaffected. if you really wanted, you could have a dual-GPU setup where an old GPU is a dedicated physx processor.
i loved seeing fluid physics, soft-body physics, etc.
shame it was never properly superseded.
Re: (Score:2)
However, if you're already developing using CUDA then you've already lost that game.
So what, you should use... ROCm instead?
Stop using proprietary APIs.
Why is CUDA so much better than OpenCL?
Re: (Score:2)
https://github.com/NVIDIA-Omni... [github.com]
okay
It doesn't matter. (Score:5, Insightful)
PhysX has a software fallback. After all, those games still worked on AMD GPUs. GPU-accelerated PhysX was only relevant in an era where CPUs were dramatically slower, and CPUs had a single core. Those games will run just as well today with the CPU fallback as they did on the GPU in 2008.
Re:It doesn't matter. (Score:4, Informative)
Those games will run just as well today with the CPU fallback as they did on the GPU in 2008.
Not really.
I've seen tests showing that a Threadripper 1950X couldn't keep up with a GTX 970 in PhysX.
As for the relevance of PhysX- games come out to this day that use it.
This, however, is about 32-bit PhysX.
The article's claim that there are no 64-bit games using PhysX (that they know of) is ignorant. Off the top of my head? Any Unity game using Unity.Physics.
Re: (Score:2)
Both the 970 and 1950X are out of date. How about a 9800X3D or upcoming 9950X3D? Will they struggle with old 32-bit PhysX titles while running a 5090?
Re:It doesn't matter. (Score:4, Informative)
From the reports I've seen (from the people who got Nvidia to confirm the issue in the first place), games that ran at 4K/120 on the 4000 series are running at under 60 fps on the 5000 series, using either 7000- or 9000-series X3D CPUs.
Yes, it's literally that bad.
And as far as the "CPU fallback is good enough" thing goes, there is comparison video footage out there showing that what devs did was literally remove objects from the game entirely, to reduce the computational burden when GPU PhysX isn't available.
So, you either get worse performance when you force-enable PhysX on the CPU, or you get reduced detail in game. GREAT experience for the "latest and greatest" GPU generation.
Re: (Score:2)
Interesting. What are the specific titles that have this problem? Most recent software should be 64bit PhysX which is apparently still supported by the 5090. Unless someone's still releasing 32-bit Unity software?
Re: (Score:2)
Recent games are fine; the affected games all date from 2013 or earlier. People are complaining about Borderlands 2 and Batman: Arkham Origins.
Re: (Score:2)
That's stuff that could probably be run in a vm.
Re: (Score:2)
How is running them in a VM going to help?
Re: (Score:2)
Hardware emulation. Might not be worth it to try.
Re: (Score:2)
That's what's already in the driver. Much easier to just write an improved DLL, then. Running in a VM won't change anything, and would make it harder, since that would mean writing a full emulation of an older Nvidia card instead. I doubt there is enough information available to even do that, let alone a well-performing emulation.
Re: (Score:2)
The driver currently has emulation that relies on the CPU, does it not?
Re: (Score:2)
I wrote clumsily, which is my bad. I was referring to the DLL. But yes, the implementation which was written back in the day was CPU-driven. Nothing limits it to that, though. It could use the GPU if desired; just code it that way. Heck, it could use CUDA, OpenCL or Vulkan. It would be possible to make it perform a hell of a lot better than some kind of emulation in a VM, and with much less work.
Nobody is going to make one though. It's way too much work, especially when the NVidia driver just
Re: (Score:2)
That's stuff that could probably be run in vim.
(fixed the typo)
I've seen DOOM-vim, but are you saying there's a Borderlands2-vim now?!?!?
Re: (Score:2)
The card no longer supports CUDA32 compute shaders, period, VM or not.
I think this is a bummer, because there are still some games that use it (and look fucking amazing when it's enabled, like Metro Last Light), but I mean, it had to happen sometime. They weren't going to support a dead architecture forever.
Re: (Score:2)
You could emulate the hardware. Maybe.
Re: (Score:2)
GREAT experience for the "latest and greatest" GPU generation.
To be fair, if you're the type of person running out and buying a high-end 5000-series GPU, you're unlikely to be doing it to run 12-year-old games. And that's putting it mildly, since on the list I've seen even the most recent affected games are already 12 years old.
That said, the internet group brain seems to suggest that the games are perfectly playable. As for "removing objects from the game", there seems to be only one game confirmed to do this: Batman Arkham Origins, where the CPU PhysX literally doesn't let you
Re: (Score:2)
If you're the type of person to buy a 50xx series GPU you are likely to be a gamer, and that means you are quite likely to still be playing many older titles. And you're also likely to expect rather massive graphical fidelity, especially in 12 year old games.
Re: (Score:2)
there seems to be only one game confirmed to do this: Batman Arkham Origins, where the CPU PhysX literally doesn't let you put it on max settings.
Literally all games will do this.
Metro Last Light, Borderlands 2- all of them.
There is no choice.
CPU cannot do the work of even a very old GPU in PhysX.
You shouldn't argue by Google search. It's a bad look.
Re: (Score:2)
Well, for me - I was having issues with Borderlands 2 PhysX - on my 1070's driver panel there is an option to run PhysX on the GPU or the CPU. I noticed a negligible difference in FPS on either setting (on a 6-year-old Core i7), and the PhysX particles (e.g. the rubble when you drive over rocks) are all still there either way.
Surely a game that old won't have too many problems FPS-wise on new hardware, and newer games will be using 64-bit PhysX.
Re: (Score:2)
Most games will definitely have a gimped mode when PhysX runs on the CPU. [youtube.com]
Re: (Score:2)
Both the 970 and 1950X are out of date.
Completely fair. It's hard to find modern examples, though.
How about a 9800X3D or upcoming 9950X3D?
Still not even *close* to a GPU from the PhysX GPU era.
A 9800X3D can pump out around 0.58 TFLOPS, which is good for a CPU.
A GTX 970 does around 4. That's roughly seven 9800X3Ds' worth of number-crunching capacity.
Will they struggle with old 32-bit PhysX titles while running a 5090?
That's hard to say. If their physics load was designed for the GPU, which has far more performance than any CPU, then yes, they probably will.
Sure, a modern CPU physics library like modern PhysX or Havok could probably do the job just fine, but we're not wor
Re: (Score:2)
You said "struggle", OP said "run just as well".
Those are very different goalposts.
I'm quite certain that most games will have at least some mode in which they'll still run perfectly fine... they'll just run worse than they would have on a card that still supported CUDA32, in that they'll most likely sacrifice the quality of their physics.
Re: It doesn't matter. (Score:2)
Re:It doesn't matter. (Score:4, Informative)
PhysX has a software fallback. After all, those games still worked on AMD GPUs. GPU-accelerated PhysX was only relevant in an era where CPUs were dramatically slower, and CPUs had a single core. Those games will run just as well today with the CPU fallback as they did on the GPU in 2008.
That part about CPU fallback running as well as GPU in 2008 is simply not true. https://www.dsogaming.com/news... [dsogaming.com]
So, I went ahead and downloaded the Cryostasis Tech Demo. I remember that tech demo running smoothly as hell with the RTX 4090. So, how does it run on the NVIDIA RTX 5090 with an AMD Ryzen 9 7950X3D? Well, see for yourselves. Behold the power of CPU PhysX. 13FPS at 4K/Max Settings. Thanks NVIDIA. Ironically, the RTX 4090 (which still has GPU PhysX support) was able to push over 100FPS at 4K/Max Settings. Let this sink in.
PhysX prior to version 3 is locked to a single core and was compiled with x87. It's horribly inefficient and not usable even on modern CPUs. From 100 fps down to 13 fps? The only option is to disable PhysX in games that use the 32-bit version, forever changing how these games are experienced.
Linus middle-finger time (Score:2)
4080Super Last Year was my best decision. (Score:1)
I'm so glad I bought a 4080Super when they launched. They were available at MSRP for several weeks (months?) and now it looks more and more like the 5080 would have been a downgrade in many ways. 5-10% performance uplift for fewer features, black screen issues, and melting power cables?
Naaaaahhhhh.
Smart decision for everyone would be to just skip the 50-series entirely. Buy AMD or Intel and whenever Nvidia gets serious about re-entering the gaming market, give them another look.
Just to add another data point (Score:2)
I decided to try this out myself on the modern Intel Core Ultra 265K, with the Cryostasis tech demo.
With the different Intel architecture and all the microcode updates, I thought this might be a better test, because the architecture is more robust overall, whereas AMD is simply stacking its chips for traditional games.
At ultra settings, everything maxed out, at 2560x1600, I get an average of 28 frames per second, with a low of 11.
I wish I could run it at 4K and see how well it does. However, the point being that there is no way a modern C