
NVIDIA To Enable PhysX For Full Line of GPUs 140

MojoKid brings news from HotHardware that NVIDIA will be enabling PhysX for some of its newest graphics cards in an upcoming driver release. Support for the full GeForce 8/9 line will be added gradually. NVIDIA acquired PhysX creator AGEIA earlier this year.
  • Re:I didn't RTFA (Score:4, Interesting)

    by Vectronic ( 1221470 ) on Friday June 20, 2008 @08:32PM (#23881133)

    Basically exactly what it sounds like: it's a real-time physics calculating engine.

    It's used in games for things like shooting the limbs off creatures, or even wind on trees, or water...

    Likewise for other 3D applications. I'm not sure how extensive it is, or what its limitations are, but I'm looking forward to it, all the more because calculating physics-type things in most 3D software takes a lot of CPU power, so if the GPU can handle that, it takes a great load off the main CPU (or so I would assume).
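A minimal sketch of the kind of per-frame work such an engine does (names and numbers are illustrative, not PhysX's actual API): every moving object gets the same independent integration step, which is exactly the shape of workload that maps well onto a GPU.

```python
# Hedged sketch: one semi-implicit Euler step for a batch of particles.
# Illustrative only -- not PhysX's actual API or data layout.
GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one 60 Hz frame

def step(particles):
    """Update velocity first, then position, for every particle."""
    for p in particles:
        p["vy"] += GRAVITY * DT       # integrate acceleration
        p["x"] += p["vx"] * DT        # integrate velocity
        p["y"] += p["vy"] * DT
        if p["y"] < 0.0:              # crude ground-plane collision
            p["y"] = 0.0
            p["vy"] = -0.5 * p["vy"]  # lose half the speed on the bounce

particles = [{"x": 0.0, "y": 10.0, "vx": 1.0, "vy": 0.0} for _ in range(1000)]
step(particles)
```

Because no particle's update depends on another's, the loop body can run for thousands of objects in parallel, which is the argument for doing it on the GPU instead of the CPU.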

  • by neokushan ( 932374 ) on Friday June 20, 2008 @08:37PM (#23881177)

    I read TFA, but it didn't really give many details as to how this works, just some benchmarks that don't really reveal much.
    Will this work on single cards or will it require an SLi system where one card does the PhysX and the other does the rendering?

    Plus, how does handling PhysX affect framerates? Will a PhysX enabled game's performance actually drop because the GPU is spending so much time calculating it and not enough time rendering it, or are they essentially independent because they're separate steps in the game's pipeline?

  • Re:I didn't RTFA (Score:2, Interesting)

    by trooperer ( 1305425 ) on Friday June 20, 2008 @09:08PM (#23881397)
    I'm beginning to wonder what the use of a multi-core CPU is if the GPU will be doing all the hard work.

    What's next? "Graphic" cards with hardware-accelerated AI support?
  • by bluefoxlucid ( 723572 ) on Friday June 20, 2008 @09:48PM (#23881605) Homepage Journal

    Except modern physics engines (see: Quake 1 for MS-DOS) use a thread for each individual moving physics object, while the render thread that drives the graphics card is a single thread itself (hard to split that up), so with new quad-core and 8- and 16-core systems you've got a much better physics-processing engine running on your CPU.
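The parent's idea of spreading per-object physics across CPU cores can be sketched like this (illustrative only; real engines batch work per core rather than literally spawning a thread per object):

```python
# Hedged sketch: dividing independent per-object physics updates across
# a pool of worker threads, e.g. on a quad-core CPU.
from concurrent.futures import ThreadPoolExecutor

DT = 1.0 / 60.0  # one 60 Hz frame

def update(obj):
    # One object's integration step; independent of every other object.
    obj["y"] += obj["vy"] * DT
    return obj

objects = [{"y": 10.0, "vy": -1.0} for _ in range(64)]

with ThreadPoolExecutor(max_workers=4) as pool:  # 4 workers ~ quad core
    objects = list(pool.map(update, objects))
```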

  • Re:I didn't RTFA (Score:4, Interesting)

    by slaker ( 53818 ) on Friday June 20, 2008 @09:59PM (#23881641)

    It makes City of Heroes look all awesome, particularly if you use Gravity, Storm, Kinetics or Assault Rifle power sets.

    Having bullet casings, leaves, newspapers and the like drop and swirl around in response to player actions is actually pretty nifty from an immersion standpoint, particularly for a game that's essentially set in something that resembles the real, modern world.

  • Re:Linux Support (Score:5, Interesting)

    by keithjr ( 1091829 ) on Friday June 20, 2008 @10:08PM (#23881691)

    That's not a useless comment at all, unless I'm missing something. UT3 hasn't been able to put out the long-promised Linux client because AGEIA has been so unwilling to loosen the license stranglehold it has on the PhysX engine. This is a legitimate concern. Unless their stance changes, Linux drivers will not be possible.

  • by nobodyman ( 90587 ) on Friday June 20, 2008 @10:23PM (#23881803) Homepage

    According to the Maximum PC Podcast [maximumpc.com], they saw significant framerate hits with single-card setups, but it was much better under SLI. They did stress that they had beta drivers, so things may improve drastically once NVIDIA gets final drivers out the door.

  • Re:Linux Support (Score:4, Interesting)

    by Zymergy ( 803632 ) on Friday June 20, 2008 @10:42PM (#23881887)
    So ATI has Havok technology in their new Linux drivers, and it works under Linux for the new ATI cards?
    What Linux application/game uses Havok?
  • Re:I didn't RTFA (Score:4, Interesting)

    by ya really ( 1257084 ) on Saturday June 21, 2008 @04:17AM (#23883097)

    "There hasn't been for a while, that's why buying a quad-core CPU is largely useless for gamers and one of the best uses of a dual-core CPU is running a single-threaded application alongside Windows."

    Not exactly true. All of the Unreal Engine 3 games consistently use all four cores of my Intel Q6600, with over a dozen threads spread across the cores. The most notable examples would be UT3, BioShock and Mass Effect, three of the biggest games of 2007 and 2008. I can typically max out settings for Unreal Engine 3 games.

    On the other hand, performance-demanding games like Crysis are total douchebags and peg just one core, sometimes using one more if it feels like it every now and then. Although it's not a very good comparison since there are so many different factors involved, I would venture to say that if Crysis took the approach of optimizing better for dual- and quad-core CPUs, its publisher would have far fewer complaints about performance from gamers.
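Why pegging one core hurts so much can be illustrated with Amdahl's law (the fractions below are invented for illustration, not measured from Crysis): the speedup from N cores is capped by whatever fraction of the frame stays single-threaded.

```python
# Hedged illustration (numbers invented): Amdahl's law caps multi-core
# speedup by the fraction of work that remains serial.
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A game that parallelizes 90% of its frame vs. one that parallelizes 10%:
print(round(amdahl_speedup(0.9, 4), 2))  # ~3.08x on a quad core
print(round(amdahl_speedup(0.1, 4), 2))  # ~1.08x -- barely better than one core
```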

  • Re:I didn't RTFA (Score:3, Interesting)

    by blahplusplus ( 757119 ) on Saturday June 21, 2008 @08:17AM (#23883957)

    "What's next? "Graphic" cards with hardware-accelerated AI support?"

    Actually this isn't a bad idea at all, since pathfinding in games like Supreme Commander is just a nightmare as you add more units. I've wondered about using the GPU for pathfinding acceleration.
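The per-unit cost being complained about looks roughly like this (a minimal breadth-first search on a grid, purely illustrative; it is not Supreme Commander's actual algorithm, which is generally described as flow-field/hierarchical search). Since each unit's search is independent of the others, it is at least plausible GPU-offload material.

```python
# Hedged sketch: shortest-path search of the kind an RTS repeats for
# every unit, every time orders change. 1 = wall, 0 = walkable.
from collections import deque

def bfs_path(grid, start, goal):
    """Return a shortest path of (row, col) cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # visited set doubling as parent links
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # walk parent links back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = bfs_path(grid, (0, 0), (0, 2))  # route around the wall
```

Running hundreds of such independent searches at once is exactly the kind of embarrassingly parallel workload a GPU is good at, which is the intuition behind "hardware-accelerated AI".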
