NVIDIA To Enable PhysX For Full Line of GPUs
MojoKid brings news from HotHardware that NVIDIA will be enabling PhysX for some of its newest graphics cards in an upcoming driver release. Support for the full GeForce 8/9 line will be added gradually. NVIDIA acquired PhysX creator AGEIA earlier this year.
Hentai (Score:5, Funny)
Re:Hentai (Score:5, Funny)
They're having difficulty realistically modelling penetration. Close contact like that tends to lead to numerical instabilities in physics engines. There's not much PhysX can do to help, though.
Re:Hentai (Score:5, Funny)
That's why there are teams of researchers working night and day to improve the state of tentacle modeling.
If you have what it takes to advance the state of the art there could be a big government grant and a PhD in it for you.
Comment removed (Score:4, Funny)
Re: (Score:3, Funny)
Re: (Score:3, Funny)
Re:Hentai (Score:5, Insightful)
That's just disturbing.
Re:Hentai (Score:4, Funny)
Not as disturbing as the Chronicles of Goatse.cx Part IV: Rick Astley's Revenge
Re: (Score:2)
You won this story.
Happy Friday.
Works on just the one card? (Score:4, Interesting)
I read TFA, but it didn't really give many details as to how this works, just some benchmarks that don't really reveal much.
Will this work on single cards or will it require an SLi system where one card does the PhysX and the other does the rendering?
Plus, how does handling PhysX affect framerates? Will a PhysX enabled game's performance actually drop because the GPU is spending so much time calculating it and not enough time rendering it, or are they essentially independent because they're separate steps in the game's pipeline?
Re: (Score:1, Flamebait)
The effect on framerate doesn't matter - the target audience for this will have at least one spare graphics card to run physics on.
Re: (Score:2)
Are you sure that's the target audience, though?
See I've only got 1 card and I'd love hardware accelerated physics, but I sure as hell wouldn't buy a separate card for it.
Re: (Score:1, Insightful)
Previously you had to buy a $200+ physics card from Ageia. I'm not sure how well a graphics card can do physics, but it'd be neat if I could take an older graphics card and repurpose it to do physics instead of throwing it away.
Re: (Score:1)
Re: (Score:2)
Re:Works on just the one card? (Score:5, Informative)
That's not true at all. It works in a single-card configuration as well. Modern GPUs have more than enough spare parallel processing power to chug away at some physics operations. Guys are already modifying the beta drivers to test it out on their GeForce 8 cards. The OP in this thread is using a single-card configuration:
http://forums.overclockers.com.au/showthread.php?t=689718 [overclockers.com.au]
Re: (Score:2)
Re: (Score:2)
No, don't be stupid. Any half-decent game engine nowadays does everything with parallel threads.
They (cpu and gpu) still have to wait on each other if they finish early (to synchronise the frames), but they will spend at least 50% of their time both running at once. Ideally it would be 95%+, but games are often unbalanced in favour of graphics complexity these days.
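To make that overlap concrete, here's a minimal sketch of that kind of frame loop. The stub functions are hypothetical stand-ins for a real graphics API (the sleeps just simulate work); the point is that draw submission returns quickly because the driver queues commands, so the CPU can simulate the next frame while the GPU draws the current one:

    #include <chrono>
    #include <cstdio>
    #include <thread>

    using namespace std::chrono_literals;

    void submit_draw_calls() { std::this_thread::sleep_for(2ms); } // cheap: just queues GPU work
    void step_physics()      { std::this_thread::sleep_for(5ms); } // CPU-side simulation
    void step_ai()           { std::this_thread::sleep_for(3ms); }
    void wait_for_vsync()    { std::this_thread::sleep_for(6ms); } // blocks until the GPU flips

    int main() {
        for (int frame = 0; frame < 60; ++frame) {
            submit_draw_calls();   // GPU starts chewing on frame N
            step_physics();        // meanwhile the CPU simulates frame N+1
            step_ai();
            wait_for_vsync();      // synchronise: both are idle only here
        }
        std::printf("done\n");
    }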
Re:Works on just the one card? (Score:5, Informative)
Yes, it works on one card. I enabled it on my 8800GT earlier today. The CUDA/PhysX layer gets time-sliced access to the card. Yes, it will drop framerates by about 10%.
OTOH if you have 2 cards, you can dedicate one to CUDA and one to rendering so there won't be a hit. The cards need to NOT be in SLI (if they're in SLI, the driver sees only one GPU, and it will time-slice it like it does with a single card). This is actually the preferred configuration.
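For the curious, dedicating a CUDA workload to a specific card is just a device-selection call in the CUDA runtime API. A minimal sketch; which device is the display GPU varies by system, so picking device 1 below is only an assumption for illustration:

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);          // two non-SLI cards show up as two devices
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("device %d: %s, %d multiprocessors\n",
                        i, prop.name, prop.multiProcessorCount);
        }
        if (count > 1)
            cudaSetDevice(1);                // subsequent kernels/allocations go to card 1
        return 0;
    }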
Re:Works on just the one card? (Score:4, Informative)
You need the latest, as-yet-unreleased 177.39 drivers for the GTX 2xx series. Then edit the nv4_disp.inf file and add an entry for device ID 0611 (= 8800 GT). You will then be able to install the driver on the 8800 GT. Next, install the new (also unreleased, but Google is your friend) 8.06 PhysX software. That's it.
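For reference, the added INF entry looks roughly like the existing GeForce lines. The install-section name varies by driver release, so these two lines are illustrative rather than copied from the actual 177.39 INF:

    %NVIDIA_DEV.0611% = nv4_NV3x, PCI\VEN_10DE&DEV_0611

    [Strings]
    NVIDIA_DEV.0611 = "NVIDIA GeForce 8800 GT"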
Re: (Score:3, Interesting)
According to the Maximum PC Podcast [maximumpc.com] they saw significant framerate hits with single-card setups, but it was much better under SLI. They did stress that they had beta drivers, so things may drastically improve once Nvidia gets final drivers out the door.
Kinda old news (Score:2)
Does anyone else remember... (Score:1, Flamebait)
...how much gamers used to shit all over PhysX cards? Now, they can't wait to get their hands all over it.
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
I don't need to Google. Anything built on the Unreal 3 engine has PhysX support built in.
Re: (Score:2)
Re:Does anyone else remember... (Score:4, Informative)
Reading comprehension...anything built on the Unreal 3 engine.
Like one of these many licensees:
http://www.unrealtechnology.com/news.php [unrealtechnology.com]
Native PhysX Support:
http://www.theinquirer.net/en/inquirer/news/2007/05/30/unreal-3-thinks-threading [theinquirer.net]
Re:Does anyone else remember... (Score:5, Insightful)
Unreal 3 is an engine that's used on LOTS of games - technically ALL of them have PhysX support, so no, not "just" Unreal 3, because there is no game called Unreal 3.
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Your point is moot, as "Unreal 3" and "Unreal Tournament 3" are two completely different things. One's an engine, one is a game based on that engine.
But sure, if you want a list of games, how about Mass Effect, Huxley, Gears of War, and RoboBlitz? Those are just the Unreal Engine games off the top of my head that I know have hardware PhysX support; there are plenty of other titles that use it as well, such as both GRAW games and a few other Tom Clancy games (Rainbow Six: Vegas among them).
Re: (Score:2)
At first I thought you were playing coy, but now I just think you are retarded. UE3 is the engine. PhysX support is native to the engine. Any game developed around that engine will inherently support PhysX.
http://en.wikipedia.org/wiki/List_of_Unreal_Engine_games [wikipedia.org]
Unreal Engine 3
* 50 Cent: Blood on the Sand - (2008) Swordfish Studios
* A4 (sequel to A3) - (2009) AniPark
* Aliens: Colonial Marines - (Late 2008) Gearbox Software
* Alpha Protocol - (Spring 2009) Obsidian Entertainment
* America's Army 3.0 - (2008) US Army
* American McGee's Grimm - (2008) Spicy Horse
* APB - (2008) Realtime Worlds
* Army of Two - (2008) Electronic Arts
* Alliance of Valiant Arms - (2007) Pmang
* Black College Football: BCFX: The Xperience - (2007) Nerjyzed Entertainment
* Black Powder Red Earth - (TBA) Echelon Software
* Brothers in Arms: Hell's Highway - (2008) Gearbox Software
* BioShock - (2007) 2K Boston/2K Australia
* BlackSite: Area 51 - (2007) Midway Austin
* Blitz - (2008) CJIG
* Borderlands - (TBA) Gearbox Software
* Crimecraft - (2008) Vogster Entertainment
* Damnation - (TBA) Blue Omega / Point of View
* DC Comics MMO - (TBA) Sony Online Entertainment
* Earth No More - (2009) Recoil Games / 3D Realms
* Empire - (TBA) Chair Entertainment
* Empire: Alpha Complex - (TBA) Chair Entertainment
* Ender's Game - (TBA) Chair Entertainment
* Elveon - (TBA) 10tacle Studios
* End - (TBA) Faramix Enterprises
* Fatal Inertia - (2007) Koei
* Free Realms - (TBA) Sony Online Entertainment
* Frontlines: Fuel of War - (2008) Kaos Studios
* Fury - (2007) Auran
* Gears of War - (2006) Epic Games
* Gears of War 2 - (2008) Epic Games
* Global Agenda - (TBA) Hi-Rez Studios
* Hail to the Chimp - (2008) Wideload Games
* Highlander: The Game - (TBA 2008) TBA
* Hei$t - (2007) inXile Entertainment
* Hour of Victory - (2007) Midway Games
* Huxley - (2008) Webzen Games
* Lost Odyssey - (2007) Mistwalker
* Mass Effect - (2007) BioWare
* Magna Carta 2 - (TBA) Softmax
* Medal of Honor: Airborne - (2007) Electronic Arts
* Mirror's Edge - (TBA) DICE
* Monster Madness: Battle for Suburbia - (2007) Artificial Studios
* Mortal Kombat vs. DC Universe - (2008) Midway
* Mortal Online - (2009) Star Vault
* Parabellum - (2008) Acony
* Project M - (TBA) NC Soft
* Red Steel sequel - (TBA) Ubisoft
* Rise of the Argonauts - (2008) Liquid Entertainment
* Robert Ludlum's The Bourne Conspiracy - (2008) High Moon Studios
* RoboBlitz - (2006) Naked Sky Entertainment
* Rogue Warrior: Black Razor - (2007) Bethesda Softworks
* Section 8 - (2009) TimeGate Studios
* Sephiroth 2 - (TBA) IMagic Entertainment
* Sin City - (TBA) Transmission Games / RedMile Entertainment
* Stargate Worlds - (2007) Cheyenne Mountain Entertainment
* Stranglehold - (2007) Midway Chicago
* The Agency - (2008) Sony Online Entertainment
* The Last Remnant - (2008) Square Enix
* The Scourge Project - (N/A) Tragnarion Studios
* Tiberium - (TBA) Electronic Arts
* TNA iMPACT! - (2008) Midway Games / Point of View
* To End All Wars - (2008) Kuju Entertainment
* Tom Clancy's EndWar - (2008) Ubisoft
* Tom Clancy's Rainbow Six: Vegas - (2006) Ubisoft
* Tom Clancy's Rainbow Six: Vegas 2 - (2008) Ubisoft
* Too Human - (TBA) Silicon Knights (until ditched in favour of an internal engine)
* Turning Point: Fall of Liberty - (2008) Spark Unlimited
* Turok - (2008) Propaganda Games
* Undertow - (2007) Chair Entertainment
* Unreal Tournament 3 - (2007) Epic Games
* The Wheelman - (2008) Midway Games
* Warmonger - (2008) NetDevil
* Project M (codename for NC Soft's new MMORPG) - (2009) NC Soft
* Project M (codename for Red Duck's new MMORPG) - (2010) Red Duck
* MU2 - (2010) Webzen
* New IP designed in collaboration with People Can Fly - Epic Games and People Can Fly
* Unannounced project - (TBA) NetDragon
* Unannounced project - (TBA) 9YOU
Re: (Score:2)
I would be embarrassed to post as myself too, urbanriot. It's native to the engine. What don't you understand about that? Do you know how game engines work?
You do know games use physics, right? So in your mind it makes perfect sense to rewrite it, license an additional engine, or just not use the physics code that is native to the engine? Man, you truly are dense.
Maybe in your mind it makes sense to pay the outrageous licensing fees for the Unreal 3 engine, and then pay additional licensing fees for the Havok physics engine.
Re: (Score:2)
Re: (Score:2, Funny)
Duke Nukem Forever.
Re: (Score:2)
Re: (Score:2, Interesting)
Except modern physics engines (see: Quake 1 for MS-DOS) use threads for each individual moving physics object, and the render thread that manages control of the graphics card uses 1 thread itself (hard to split that up...), so with new quad-core and 8- and 16-core systems you've got a much better physics processing engine running on your CPU.
Re: (Score:3, Informative)
When we're talking about game worlds in which there could easily be 50 or 100 objects on the screen at once, it makes much more sense to have maybe one physics thread (separate from the render thread, and the AI thread) -- or maybe one per core. I very much doubt one real OS thread per object would work well at all.
Re:Does anyone else remember... (Score:4, Informative)
Um, except if you have exactly 1 physics thread you have to juggle complex scheduling considerations about who needs how much CPU, handle the prioritization against the render and AI threads, handle intermixing them, etc. You have to implement a task scheduler... which is exactly what Quake 1 did. Carmack wrote a userspace thread library and spawned multiple threads. Since DOS didn't have threads, this worked rather well.
An OS scheduler gives any thread a base priority, and then raises that priority every time it passes the thread over in the queue while it wants CPU time. It lowers the priority back to the base when the thread runs. If a task sleeps, it gets passed over and left at the lowest priority; if it wakes up and wants CPU, it climbs the priority tree. In this way, tasks which need a lot of CPU wind up getting run regularly -- as often as possible, actually -- and when multiple ones want CPU they're split up evenly.
If you do all your physics in one thread, you have to implement this logic yourself. Further, the OS will see your thread as exactly one thread, and act accordingly. If you have 10,000 physics objects and 15 AIs, keeping both threads CPU-hungry, then the OS will give 1/3 of the CPU to the physics engine, 1/3 to the AI, and 1/3 to the render thread. This means your physics engine starves, and your physics start getting slow and choppy well before you reach the physical limits of the hardware. The game breaks down.
You obviously don't understand either game programming or operating systems.
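For what it's worth, the "one physics thread for all objects" layout being argued over looks something like this minimal sketch. All names are illustrative, not taken from any real engine; the atomic just gives the renderer a race-free value to read:

    #include <atomic>
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    struct Body { float pos = 0.f, vel = 1.f; };

    int main() {
        std::vector<Body> bodies(10000);     // 10,000 objects, ONE thread
        std::atomic<bool> running{true};
        std::atomic<float> snapshot{0.f};    // race-free value for the renderer

        // One OS thread steps every object at a fixed timestep.
        std::thread physics([&] {
            const float dt = 1.f / 120.f;
            while (running) {
                for (Body& b : bodies) b.pos += b.vel * dt;
                snapshot = bodies[0].pos;    // publish something to draw
                std::this_thread::sleep_for(std::chrono::milliseconds(8));
            }
        });

        // The main thread "renders" whatever was last published.
        for (int frame = 0; frame < 60; ++frame) {
            std::printf("frame %d: body 0 at %.2f\n", frame, snapshot.load());
            std::this_thread::sleep_for(std::chrono::milliseconds(16));
        }
        running = false;
        physics.join();
    }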
Re: (Score:2)
Um, except if you have exactly 1 physics thread you have to juggle complex scheduling considerations about who needs how much CPU, handle the prioritization against the render and AI threads, handle intermixing them, etc.
Which people do.
Or simpler: Give the render thread priority, and set it to vsync. Anything above 60 fps is a waste.
If you have 10000 physics objects and 15 AIs, keeping both threads CPU-hungry, then the OS will give 1/3 CPU to the physics engine; 1/3 CPU to the AI; and 1/3 CPU to the render thread.
Assuming the render thread needs that 1/3rd.
Keep in mind that ideally -- that is, if you're not lagging -- none of these are pegging the CPU, and you're just making whatever calculations you make every tick.
You obviously don't understand either game programming or operating systems.
Well, let's see -- most games I know of won't take advantage of more than one CPU. In fact, when Quake 3 was ported to dual-core, it took a 30% performance hit -- and keep in mind, that's Carmack doing it.
Re: (Score:2)
Well, let's see -- most games I know of won't take advantage of more than one CPU. In fact, when Quake3 was ported to dual-core, it took a 30% performance hit -- and keep in mind, that's Carmack doing it.
Really? Quake 3 was already threaded when released; I looked through the code myself. On Windows, however, the scheduler pegs all of a program's threads to one CPU unless you manage that part of the scheduler yourself (you have to give threads CPU affinity, or they inherit affinity to CPU 0). I ran it on Linux (which, if threads have no CPU affinity, will distribute them to the next available CPU when scheduling), and it pushed both cores just fine.
And this is hardly the first place this argument has been made -- green threads are inherently more efficient than OS threads.
Ah, the green threads argument.
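For reference, explicitly pinning a thread is a one-liner on either OS; whether the scheduler actually needs this help is exactly what's disputed above. A minimal sketch using the standard APIs (pinning to CPU 1 is arbitrary, for illustration):

    #define _GNU_SOURCE                 // glibc: exposes pthread_setaffinity_np
    #ifdef _WIN32
    #include <windows.h>
    void pin_to_cpu1() {
        // bit 1 set => this thread may only run on CPU 1
        SetThreadAffinityMask(GetCurrentThread(), ((DWORD_PTR)1) << 1);
    }
    #else
    #include <pthread.h>
    #include <sched.h>
    void pin_to_cpu1() {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(1, &set);               // allow CPU 1 only
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }
    #endif

    int main() { pin_to_cpu1(); return 0; }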
Re: (Score:2)
1 core to AI, 1 to physics, 1 to render, 1 idle. Or 13 idle, since Intel believes 16 cores is the "Sweet Spot" and will hit the desktop in a few years.
Re: (Score:2)
"WHAT I have to buy a second card, it's not free/can't run off the bios chip in the MB, WTF!!??"
The funny thing is that now that the PhysX cards are Ragú ("it's in there"), you're still going to have to buy a second video card to keep your frame rate up and increase the number of PhysX objects. Of course, with this arrangement your GPU is less specialized than the PhysX hardware, and can be used for all the CUDA applications.
I'm going to end up
I called it (Score:4, Insightful)
Re:I called it (Score:5, Funny)
Re:I called it (Score:5, Funny)
Hi
We need an address for your 'Sarcastic Achievement - Level 3' certificate - you'll have to pay postage, but I'm sure you won't mind that, right?
Ned Again
COO - Sarcasm Society
Level 5 Sarcasm Ninja (certified)
Re: (Score:1, Troll)
That is kind of a non-sequitur when you work in a whorehouse.
Re: (Score:2)
Of course, NV was packaging a GPU-accelerated Havok engine with TWIMTBP for developers (look at Company of Heroes for that kind of thing). Their Havok plans dropped out when Intel bought the engine tech, so NV secured Ageia so that this time its tech can't be yanked out from under it.
A fun thing to do: load up all your favorite games and actually watch the intros. How many have TWIMTBP? How many of the new games from these makers will require an NV card for their physics to run well?
Do they do their accounting like Apple does? (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
title ? (Score:2)
No, actually they are adding it to new editions of their cards, not current cards already in machines. It is not a driver update.
Re: (Score:2)
Re:Linux Support (Score:4, Funny)
Re:Linux Support (Score:4, Funny)
And hopefully when it does I'll get first post in the /. article about it.
Re:Linux Support (Score:4, Funny)
Re:Linux Support (Score:5, Funny)
And hopefully the comments in the article won't all be attempts at +5, Funny.
Re: (Score:2)
And hopefully everyone won't just be stuck at +4 Funny and some negative karma mods that make the whole thing feel worthless.
Re:Linux Support (Score:5, Funny)
And hopefully the story won't be posted 4/1/2009.
-J
Re: (Score:2)
Re: (Score:2)
Re: (Score:1, Flamebait)
Hopefully they'll include their Linux drivers.
Re:Linux Support (Score:5, Funny)
Re: (Score:3, Funny)
Re:Linux Support (Score:5, Interesting)
That's not a useless comment at all, unless I'm missing something. UT3 hasn't been able to put out the long-promised Linux client because AGEIA has been so unwilling to loosen the license stranglehold they have over the PhysX engine. This is a legitimate concern. Unless their stance changes, Linux drivers will not be possible.
Re: (Score:2)
And the Mac OS X drivers (Score:2)
And the Mac OS X drivers?
Re: (Score:2, Informative)
Re:Linux Support (Score:4, Interesting)
What Linux application/game uses Havok?
Re:I didn't RTFA (Score:4, Informative)
Hardware-accelerated physics: acceleration, gravity, and particle stuff, if I remember correctly. At least the old examples used to be throwing items around or exploding walls and such.
Re:I didn't RTFA (Score:5, Funny)
Mmmmm.. hardware accelerated litter..
Re:I didn't RTFA (Score:4, Interesting)
It makes City of Heroes look all awesome, particularly if you use Gravity, Storm, Kinetics or Assault Rifle power sets.
Having bullet casings, leaves, newspapers and the like drop and swirl around in response to player actions is actually pretty nifty from an immersion standpoint, particularly for a game that's essentially set in something that resembles the real, modern world.
Re:I didn't RTFA (Score:5, Funny)
"Having bullet casings, leaves, newspapers and the like drop and swirl around in response to player actions is actually pretty nifty from an immersion standpoint"
That's it. I'm done with immersion games. I'm going outside to stand in the rain. Back later.
--
BM0
Re:I didn't RTFA (Score:4, Funny)
Re: (Score:2)
I'm not sure the bullet casings or newspapers did, and given that essentially every PC game that's not City of Heroes is either a D&D ripoff, a Doom clone or a WWII shooter, I didn't want there to be any confusion.
Re: (Score:2)
Don't forget futuristic shooters, futuristic RTSes, and driving games.
Re: (Score:3, Insightful)
Doom is a futuristic shooter. We had it back in this thing called the 90s ;) And an RTS is an RTS. Driving games on the PC have never been quite as prolific as on consoles either.. something I used to lament, but things are improving these days.
Re: (Score:2)
a WWII shooter
You mean a Wolfenstein Clone?
Re: (Score:2)
People still actually play that piece of crap?
I went out and bought that quite a few years ago, and my friends all did too so they could play with me, and many of them won't speak to me anymore.
I didn't realize people actually liked it though.
Re:I didn't RTFA (Score:4, Interesting)
Basically exactly what it sounds like: it's a real-time physics-calculating engine.
Used in games for things like shooting the limbs off of creatures, or even wind on trees, or water...
Likewise for other 3D applications. I'm not sure how extensive it is or what its limitations are, but I'm looking forward to it, more because calculating physics-type things in most 3D software takes a lot of CPU power, so if the GPU can handle that, it takes a great load off the main CPU (from what I would assume).
Re: (Score:2, Interesting)
What's next? "Graphic" cards with hardware accelerated AI support?
Re: (Score:1, Offtopic)
Yes... quite obviously. That's why AMD merged with ATI, and why Intel has its own graphics stuff, VIA, etc.
Given the power that can be crammed into millimeters of space, you may as well combine them; eventually your entire "PC" will be the size of an average processor today. We just aren't there yet, so there are some naysayers used to older tech, and the junkies waiting for the next best thing.
Which is both exciting and scary, welcomed and feared... but right now it's just interesting, like watching two planets collide.
Re: (Score:3, Insightful)
Re:I didn't RTFA (Score:4, Interesting)
Not exactly true; Unreal Engine 3 games consistently use all four cores on my Intel Q6600, with over a dozen threads spread across the cores. The most notable examples would be UT3, BioShock, and Mass Effect, three of the biggest games of 2007 and 2008. I can typically max out settings in UE3 engine games.
On the other hand, performance-demanding games like Crysis are total douchebags and peg just one core, sometimes using one more if it feels like it every now and then. Although it's not a very good comparison, since there are so many different factors involved, I would wager that if Crysis had optimized better for dual- and quad-core CPUs, its publisher would have far fewer complaints about performance from gamers.
Re: (Score:2)
And now you know why the people at Intel have been pushing raytracing so hard recently: they know this, and are trying to avoid becoming irrelevant.
Re: (Score:3, Informative)
Except that general-purpose CPUs aren't really particularly great for raytracing. GPUs are simply special-purpose processors designed with raster graphics in mind. The newest fad is, of course, using all that special-purpose horsepower in more imaginative ways, but it's still a raster graphics processor at heart.
Why is it that they're raster-graphics special-purpose processors? Because raster dominates the playing field. What's the logical conclusion there? As soon as raytraced graphics engines start becoming dominant, GPU designs will shift toward raytracing too.
Re: (Score:2)
I think we will have to wait a few generations until game developers see profit in expensive "tech".
Eye candy is what sells. Why waste time and cash on AI?
Re: (Score:3, Interesting)
"What's next? "Graphic" cards with hardware accelerated AI support?"
Actually this isn't a bad idea; it's a good idea, since pathfinding in games like Supreme Commander is just a nightmare as you add more units. I've wondered about using the GPU for pathfinding acceleration.
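As a sketch of why pathfinding maps well to a GPU: a grid "flow field" can be relaxed in parallel, one CUDA thread per cell, with the kernel repeated until nothing changes; thousands of units then just follow the gradient downhill on one shared field. Grid layout and names here are illustrative:

    // One relaxation pass: each cell takes min(neighbour distance) + its own
    // traversal cost. Launch over a w x h grid, ping-ponging the two buffers
    // until the field converges.
    __global__ void relax(const float* dist_in, float* dist_out,
                          const float* cost, int w, int h) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= w || y >= h) return;
        int i = y * w + x;
        float best = dist_in[i];
        if (x > 0)     best = fminf(best, dist_in[i - 1] + cost[i]);
        if (x < w - 1) best = fminf(best, dist_in[i + 1] + cost[i]);
        if (y > 0)     best = fminf(best, dist_in[i - w] + cost[i]);
        if (y < h - 1) best = fminf(best, dist_in[i + w] + cost[i]);
        dist_out[i] = best;
    }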
Re: (Score:2)
What's next? "Graphic" cards with hardware accelerated AI support?
If the problem can be represented as an array or matrix of data elements, and the core algorithm looks at two or more data elements together, then it can be solved using GPU techniques.
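That rule of thumb in kernel form: a naive N-body force pass is "an array of elements where the core computation combines pairs," so each GPU thread owns one body and loops over all the others. A minimal CUDA sketch (O(N^2), unoptimized, names illustrative):

    __global__ void accumulate_forces(const float3* pos, float3* acc, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        float3 a = make_float3(0.f, 0.f, 0.f);
        for (int j = 0; j < n; ++j) {
            if (j == i) continue;
            float dx = pos[j].x - pos[i].x;
            float dy = pos[j].y - pos[i].y;
            float dz = pos[j].z - pos[i].z;
            float r2 = dx*dx + dy*dy + dz*dz + 1e-6f;  // softening avoids div by 0
            float inv_r3 = rsqrtf(r2) / r2;
            a.x += dx * inv_r3;  a.y += dy * inv_r3;  a.z += dz * inv_r3;
        }
        acc[i] = a;
    }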
Re: (Score:3, Insightful)
But going from a little physics demo to full blown kick ass 3d game with any meaningful results is a whole 'nother matter.
Re:PhysX? (Score:5, Informative)
http://en.wikipedia.org/wiki/PhysX [wikipedia.org]
Realtime hardware-accelerated physics. It used to require a separate, expensive board which few games supported, but Nvidia is implementing it on CUDA so it can run on their graphics cards instead.
Re:PhysX? (Score:4, Insightful)
Nvidia bought out the company, so they own it and can put it on their cards; games that decide to add support for it will benefit Nvidia.
Re: (Score:2)
I did hear in an interview with an NV engineer recently that they are working on a CUDA environment running on a standard x86 CPU, just at reduced speed (since there are only 1-8 CPUs).
This stuff they designed specifically for their GeForce shader units (or vice versa); why should they do the work to key AMD or anyone else in? When AMD built their own GPU processing API, do you think they offered to port it to NV cards?
The big question: how hard is NV going to push TWIMTBP (The Way It's Meant To Be Played)?
Re: (Score:2)
The real big question, for people in the Real World who need to be able to support Nvidia and ATI GPUs, is when there's going to be a standard GPGPU and/or physics API that works on both.
Until then, all this shit's entirely useless.
Re: (Score:2)
The number of people in the world you speak of is not as large as (or indeed any substantial subset of) the end users of the devices in question.
As I said in another post, fire up all your games one at a time and look through the start credits; TWIMTBP shows up a lot now on the "big" titles.
And I am not even going near their new CUDA processing array servers, which stand on their own merits with this tech.
Re: (Score:2)
It's not useless. There are a few graphics engines out there that are capable of scaling to different capabilities for different cards. For example: id Tech 4, Havok, Source engine, etc.
It does make development more difficult, though.
Re: (Score:3, Informative)
The Source engine, while "capable" of scaling to multiple cores, does a very poor job on current x86 chips. The games become very unstable with mat_queue_mode 2 enabled, and there are problems with jerky motion under any sort of latency.
It's a shame, too, because the engine works with multicore on various consoles, and it's a lot faster when it does work on PC.
Re: (Score:2)