NVIDIA To Buy AGEIA

The two companies announced today that NVIDIA will acquire PhysX maker AGEIA; terms were not disclosed. The Daily Tech is one of the few covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed, and that the combination positions NVIDIA to compete with Intel on a second front, beyond the GPU — as Intel purchased AGEIA competitor Havok last September. While NVIDIA talked about supporting the PhysX engine on their GPUs, it's not clear whether AGEIA's hardware-based physics accelerator will play any part in that. AMD declared GPU physics dead last year, but NVIDIA at least presumably begs to differ. The coverage over at PC Perspective goes into more depth on what the acquisition portends for the future of physics, on the GPU or elsewhere.
This discussion has been archived. No new comments can be posted.

  • by mrxak ( 727974 ) on Monday February 04, 2008 @08:43PM (#22300822)
    That's just it, really. Games need to support it in large enough numbers, and need to do it well enough to make a difference between those without the cards and those that have them. Most people seem to think this is a joke, and the way CPUs are going anyway with extra cores, I think we'd be better off seeing multithreaded games instead of physics cards.
  • by Scareduck ( 177470 ) on Monday February 04, 2008 @08:45PM (#22300838) Homepage Journal
    I don't pay close attention to the GPU market in general, though lately I've been interested in a few numerical modeling projects that could benefit from high-performance computing. The AMD Firestream 9170 [amd.com] is supposed to be released in the first quarter of this year, with a peak speed of 500 GFLOPS, most likely single-precision, but the beauty part is that it should also support double-precision, the numeric standard for most computational modeling. NVidia's option in this space is the Tesla C870 [nvidia.com]; I wonder whether this move to purchase another GPU line will divert resources away from their number-crunching-first GPUs.
  • by Rival ( 14861 ) on Monday February 04, 2008 @08:51PM (#22300910) Homepage Journal
    I see your floating point.

    The way I picture things, a Physics Processing unit (PPU?) will end up like FPUs: at first an optional, narrow-use add-on, then integrated on premium products, then more widespread as software vendors feel comfortable relying on it, and finally ubiquitous and practically indispensable.

    And then Slashdotters will be able to say, "You kids with your integrated PPUs nowadays -- when I was your age, we had to calculate trajectories and drag coefficients by hand, and we liked it that way!"
  • by Cathoderoytube ( 1088737 ) on Monday February 04, 2008 @08:57PM (#22301006)
    So, I'm assuming I'm not getting all the physics simulation quality I can get out of my games? The whole deal with the bridges collapsing in real time and all sorts of junk bouncing around isn't the ultimate physics experience? Is there... another level of ragdoll I'm not experiencing? Is there some dynamic to a flaming barrel rolling down a hill my computer can't handle?! Or... or... is it Nvidia making one of its patented cash grabs?! Considering all the physics simulations in games to date have been done on the processor with no performance hit (have you played the last level of Half-Life 2: Episode Two?), I'm finding the notion of a dedicated physics card fairly stupid. But that's just me.
  • by milsoRgen ( 1016505 ) on Monday February 04, 2008 @08:59PM (#22301022) Homepage
    Just look at Intel's rather quick turnaround from the P4 to the Core architecture. They were headed down the same road GPU makers are going, yet reversed course. Sure, it's mostly thanks to the Israeli development team that produced the Pentium M, which was in turn based on the Pentium III. The fact of the matter is nVidia has shown time and time again they can make a killer product. I believe they could make a highly efficient CPU with performance-per-watt ratios well in line with current products, if not even better.

    But on another note... the heat issue with GPUs really does need to be resolved. I'm using an X1800 XT ATI card, and I've come pretty close to 100C at times. I'm not quite sure how current-gen cards are doing in this area, but I doubt it's been anything like the P4 > Core turnaround.
  • by Rival ( 14861 ) on Monday February 04, 2008 @09:15PM (#22301190) Homepage Journal
    Games are great at motivating the development of better video cards, and to some extent bus speeds, processors and other non-gaming-specific components. This is a good thing, though I have some old-man opinions on how Moore's Law is spoiling many developers.

    That being said, I don't believe games drive the adoption of hardware as much as you might be thinking. As a case in point, look at Vista. Ugly and bloated, yes, but perforce nearly everywhere. And the minimum requirements for Aero (which is the one feature your average user is going to jump on -- ooh, it's pretty!) are going to do more to push the next large jump in base video card standards than any given game.

    Retailers don't have enough fiscal incentives to stop pushing Vista, even if they do try to gain positive PR by selling Ubuntu or XP on a few low-end models. And if they're pushing Vista, they want to support the pretty interface the public expects. By making hardware-accelerated rendering a practical requirement of the OS, Microsoft has raised the bar of the "minimum acceptable" video card.

    Right now we see physics cards as a niche product, barely supported. It has been the same with all technical developments. But if we're heading toward 3D interfaces (which I believe we are), then physics can only play an increasing role in such an environment. If that should become the case, then a dedicated processor will be much more valuable than assigning a generic CPU core to try and handle the calculations.
  • by idonthack ( 883680 ) on Monday February 04, 2008 @09:30PM (#22301328)
    With dual-core coming standard now on all new PCs, and multi-core rapidly approaching, physics cards are done for. Graphics cards are still a good idea because the kind of calculations they do can be heavily hardware-optimized in a way that general purpose CPUs are not, but physics cards don't do anything a second (or fourth) full speed CPU isn't capable of doing better and faster.
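The parent's claim, that a spare full-speed core can do a physics card's job, can be sketched with nothing but the standard library. This is a toy illustration, not any engine's actual code; the integrator, the particle format, and the worker count are all invented for the example:

```python
# Toy sketch: farming a particle-integration step out to CPU cores,
# the way a multithreaded game might, instead of using a dedicated PPU.
from concurrent.futures import ProcessPoolExecutor

def integrate(chunk, dt=0.016, gravity=-9.8):
    """Semi-implicit Euler over (height, velocity) pairs for one 60 Hz frame."""
    out = []
    for pos, vel in chunk:
        vel = vel + gravity * dt
        pos = pos + vel * dt
        out.append((pos, vel))
    return out

if __name__ == "__main__":
    particles = [(10.0, 0.0)] * 10_000
    chunks = [particles[i::4] for i in range(4)]  # one slice per core
    with ProcessPoolExecutor(max_workers=4) as pool:
        updated = [p for chunk in pool.map(integrate, chunks) for p in chunk]
    print(len(updated))  # all 10,000 particles advanced one frame
```

Whether this actually beats dedicated hardware depends on how many bodies you have and how much the cores are already doing, which is exactly the argument in this thread.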
  • by mikael ( 484 ) on Monday February 04, 2008 @09:31PM (#22301340)
    With the current PC architecture, the CPU has to send data to the Physics card, read the data back, then finally send it down to the GPU. This would have to be done for things like character animation (ragdoll motion), particle systems for visual effects (bouncing off the scenery/characters). Ideally, you would want the Physics processor to have a direct path to the GPU. Then you could avoid two of these steps.

    And if nothing else, Nvidia also gets a team of engineers who have worked together and have both DSP and current game-industry technology experience.
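The round trip described above can be put into a toy cost model. The numbers here are invented purely for illustration and reflect no real bus latencies; the point is only that a direct PPU-to-GPU path removes the readback through the CPU:

```python
# Hypothetical per-frame transfer costs in microseconds; purely illustrative.
CPU_TO_PPU = 400  # upload scene state to the physics card
PPU_TO_CPU = 400  # read simulation results back to the CPU
CPU_TO_GPU = 400  # send updated geometry on to the video card
PPU_TO_GPU = 400  # hypothetical direct physics-to-graphics path

def frame_overhead(direct_path):
    """Transfer overhead per frame, with and without a PPU->GPU link."""
    if direct_path:
        return CPU_TO_PPU + PPU_TO_GPU  # CPU -> PPU -> GPU: readback avoided
    return CPU_TO_PPU + PPU_TO_CPU + CPU_TO_GPU  # CPU -> PPU -> CPU -> GPU

print(frame_overhead(False))  # 1200
print(frame_overhead(True))   # 800
```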
  • by 644bd346996 ( 1012333 ) on Monday February 04, 2008 @09:35PM (#22301380)
    Don't forget that PhysX has software out there, too. It hasn't been doing well against Havok, but it's obviously in NVidia's best interests to promote the use of physics engines in games, seeing as they could provide the hardware acceleration for them. I expect the PhysX engine will soon have the ability to use NVidia GPUs, and it will be pushed as a more viable competitor to Havok, especially since Intel cancelled Havok FX.
  • by BagOBones ( 574735 ) on Monday February 04, 2008 @10:33PM (#22301884)
    A year ago both Nvidia and ATI/AMD showed off their GPUs doing Havok acceleration equal to or better than AGEIA's, with ATI claiming to have a 7-month lead... Could this be a catch-up move or patent grab by NVIDIA?

    http://www.reghardware.co.uk/2006/06/06/ati_gpu_physics_pitch/ [reghardware.co.uk]
  • by sssssss27 ( 1117705 ) on Tuesday February 05, 2008 @12:27AM (#22302788)
    I don't think you are thinking grand enough. I remember the days when you didn't need a dedicated graphics card to play games, and I'm only 21. You really didn't get improved gameplay per se, but it did look a heck of a lot better. A dedicated physics processor, though, has the potential to vastly improve gameplay and realism.

    Imagine instead of designers creating models of buildings, they actually built them. That is, a brick building would have individual bricks all stacked on each other. Whenever you hit it with an explosive it would actually crumble like a real building, or burn like a real building. That's a lot of calculations, which a general-purpose CPU isn't the best at.

    The thing is, not enough people have PPUs in their computers, so you can't build it into core gameplay yet. Hopefully nVidia acquiring Ageia will allow them to start bundling it with their GPUs or, better yet, embedding it on their motherboards. While graphics are easily scaled, gameplay elements are not. I wouldn't be surprised if you see PPUs being crucial to the game on consoles before PCs.
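The "individual bricks" idea above is exactly where a general-purpose CPU struggles: a naive broad phase has to consider every pair of bricks, and the pair count grows quadratically. A back-of-envelope sketch (the figures are just arithmetic, not engine benchmarks):

```python
def pair_checks(n):
    """Collision tests per step if every brick is checked against every other."""
    return n * (n - 1) // 2

# A 10-brick crate stack vs. a fully simulated 10,000-brick building:
for n in (10, 100, 10_000):
    print(n, pair_checks(n))
# 10 bricks -> 45 checks; 10,000 bricks -> 49,995,000 checks per step.
# Real engines cut this down with spatial partitioning, but the sheer
# parallel arithmetic left over is what dedicated hardware is good at.
```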
  • Re:Not Good (Score:2, Interesting)

    by DeKO ( 671377 ) <danielosmari.gmail@com> on Tuesday February 05, 2008 @12:51AM (#22302956)
    That's pretty much infeasible. Every game needs a different physics simulation. Rigid bodies, ropes, soft bodies, particles, cloth, and so on; each requires a very different strategy. And there are many special cases where you can customize the algorithms for your specific simulation; using a more general algorithm when a specialized one is possible is less efficient.

    And this doesn't even get into the details of strategy: continuous vs. fixed time steps, different orders of integration, collision detection and so on. Each has its own quirks, and Nintendo keeps proving that you can create superb games using almost no physics.
  • by Wesley Felter ( 138342 ) <wesley@felter.org> on Tuesday February 05, 2008 @02:41AM (#22303658) Homepage
    There's nothing stopping you from buying a low-end PC and installing a real GPU. AFAIK, most systems with integrated graphics still have a PCI Express slot so you can upgrade.

    I also don't see any gouging going on in gaming PCs. I recently built a $1000 gaming PC, and prebuilt models with similar specs were selling for $1100-1200, which is not much of a markup.
  • by Charcharodon ( 611187 ) on Tuesday February 05, 2008 @02:43AM (#22303670)
    Actually the thing you are missing is the bridge pieces bending before it collapses, the barrel being dented as it rolls down the hill, or the rag doll limbs breaking or being ripped off with the proper application of force. Those things cannot be done in real time on a CPU.

    Unfortunately most of those things are only available in demos atm. UT3 has a couple of special maps that do some neat stuff, but then you start running into problems with the video card trying to keep up with the 100 or so bricks that just came crashing down from the wall you just demolished.

    In the main game the only PhysX I noticed was the cloth simulation of the flags and the main character's outfit; of course you don't exactly have a whole lot of time to take this all in, since everyone is trying to kill you.

    Ageia is not just hardware physics; the software they make does a pretty good job. The vehicles in UT3 are some of the best I've ever seen. I started laughing during one game because I managed to get a small vehicle wedged under my tank and kept going, dragging it along, versus instantly getting stuck while the CPU sits there trying to figure out the clipping and collision detection.

    The premium idea in my book would be for Nvidia to integrate the function into their video cards but keep it dormant, so that it is only used as a video card at first; then when you upgrade, the new card takes over the video and the old one moves over a slot and becomes a dedicated PPU in the second SLI slot.

  • by TheThiefMaster ( 992038 ) on Tuesday February 05, 2008 @09:12AM (#22305516)
    The biggest problem in FPS game physics at the moment is a ridiculously trivial one:
    When the player moves forward he pushes whatever is under him forward.

    I'm not kidding, try standing on something and moving. In reality there's no real problem, as long as the centre of gravity of you+object stays over the object's base. If it's light enough, you might kick it out behind you. In a game, 99% of the time you will kick the object under you forwards and out from under you.

    Try standing on a barrel in Half-Life 2 for example. The game even gives you a puzzle at the start where you have to stack crates to get through a window, where you can notice this problem.
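The stability condition in the parent, that a stacked pile only tips when the combined centre of gravity leaves the supporting object's base, takes just a few lines to state. The 1D base and the masses below are illustrative assumptions, not anything from Half-Life 2's actual engine:

```python
def combined_com(bodies):
    """Centre of mass of several bodies: sum(m * x) / sum(m)."""
    total = sum(m for m, _ in bodies)
    return sum(m * x for m, x in bodies) / total

def is_stable(com_x, base_min, base_max):
    """A stacked pile stays put while the combined COM sits over the base."""
    return base_min <= com_x <= base_max

# Player (80 kg) standing slightly off-centre on a barrel (20 kg, base 0.6 m wide):
com = combined_com([(80, 0.1), (20, 0.0)])
print(is_stable(com, -0.3, 0.3))  # True: no physical reason to kick the barrel away
```

A game engine that resolved the player-object contact with this check, instead of applying a raw friction impulse to the object, wouldn't shoot the barrel out from under you.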
