
NVIDIA To Buy AGEIA

The two companies announced today that NVIDIA will acquire PhysX maker AGEIA; terms were not disclosed. The Daily Tech is one of the few outlets covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed, and that the combination positions NVIDIA to compete with Intel on a second front, beyond the GPU — as Intel purchased AGEIA competitor Havok last September. While NVIDIA talked about supporting the PhysX engine on their GPUs, it's not clear whether AGEIA's hardware-based physics accelerator will play any part in that. AMD declared GPU physics dead last year, but NVIDIA at least presumably begs to differ. The coverage over at PC Perspectives goes into more depth on what the acquisition portends for the future of physics, on the GPU or elsewhere.

  • by Macfox ( 50100 ) * on Monday February 04, 2008 @08:35PM (#22300710)
    This won't float unless they bundle it with the next generation GPU. AGEIA haven't been able to get traction with a dedicated card and neither will nVidia, unless a heap of games support it overnight.
  • by Tpl2000 ( 1174767 ) on Monday February 04, 2008 @08:38PM (#22300750) Journal
    I, for one, welcome our new fairy overlords. I also welcome whoever gets rid of this joke.
  • by Kyrubas ( 991784 ) on Monday February 04, 2008 @08:43PM (#22300816) Journal
    It might be that nVidia doesn't even intend to use the overall PhysX stuff at all, but instead wants to tear it apart for the patents on specific designs to further optimize their GPUs.
  • by Crypto Gnome ( 651401 ) on Monday February 04, 2008 @08:52PM (#22300932) Homepage Journal
    1. Purchase AGEIA
    2. Continue selling dedicated physics add-on cards
    3. Integrate PPU onto Graphics Cards
    4. (somewhere along the line, get full Microsoft Direct-Something endorsement/support of dedicated physics processing)
      • possibly by licensing to AMD "PPU included on Graphics Card" rights, thus invoking the power of Least Common Denominator
    5. Integrate PPU circuitry/logic into GPU (making it faster/more efficient/cheaper than the equivalent solution licensed to AMD)
    6. ?? Profit ??
    In the end, for this to *really* succeed, it needs to be a "Least Common Denominator" factor. So it *requires* full support from Microsoft and DirectX (them being The Big Factors in the games industry). And in order to get full support from The Windows Monopolist, you'll probably (not absolutely necessary, but it'd make it much easier to convince Microsoft) need to enable AMD/ATI to leverage this technology, to some degree.

    Remember folks, Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor which excludes their latest-gen tech implementations.
  • Interesting news. (Score:3, Insightful)

    by Besna ( 1175279 ) * on Monday February 04, 2008 @08:56PM (#22300982)
    The computing industry is seeing a dramatic shift towards single-package parallelism. Yet again, the x86 architecture largely holds back the CPU from becoming more all-purpose and doing GPU and PPU activities. There are actual engineering reasons you can't have a truly general-purpose ASIC (you can with an FPGA, but that would be too slow for the purpose). The GPU and PPU are where the interesting stuff is. They can actually write new macroarchitecture! They can design on-chip parallelism with far greater complexity without the need for a backwards-compatible architecture.

    The exciting aspect of this acquisition is the stronger fusion of two companies that have the ability to harness processing power without historical limitations. ATI/AMD really didn't have this, with AMD stuck with x86. Something like Cell is interesting in this space. However, it lacks flexibility in matching up the main core with the secondary cores. Why bring in PowerPC, for that matter?

    This will lead to great things. It is fun again to follow computer architecture.
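    For a concrete taste of the kind of parallelism being described, here is a minimal CUDA sketch: one thread per particle, no serial x86 bottleneck. The struct, kernel name, and numbers are purely illustrative, not taken from PhysX or any shipping engine.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Toy "PPU-style" workload: every particle is independent, so the
    // hardware can run as wide as it likes.
    struct Particle { float3 pos; float3 vel; };

    __global__ void integrate(Particle* p, int n, float dt) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        // Explicit Euler step under gravity.
        p[i].vel.y -= 9.81f * dt;
        p[i].pos.x += p[i].vel.x * dt;
        p[i].pos.y += p[i].vel.y * dt;
        p[i].pos.z += p[i].vel.z * dt;
    }

    int main() {
        const int n = 1 << 20;                    // ~1 million particles
        Particle* d_p;
        cudaMalloc(&d_p, n * sizeof(Particle));
        cudaMemset(d_p, 0, n * sizeof(Particle));
        integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
        cudaDeviceSynchronize();
        printf("stepped %d particles for one 60 Hz frame\n", n);
        cudaFree(d_p);
        return 0;
    }

    A CPU has to carve that same loop into a handful of threads and SSE lanes; the GPU/PPU just throws a thread at every particle.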
  • by webmaster404 ( 1148909 ) on Monday February 04, 2008 @08:57PM (#22301008)
    I don't really think that all this will be better in the long run. While faster GPUs and better cards mean faster games, all the DRM that Vista has makes them more expensive and perform worse. Linux lacks the games to really put the cards through their paces, getting drivers for ATI/Nvidia is a pain to say the least, and OS X really doesn't support non-Apple internal hardware very well, so that's not a test either. Technology-wise, the hardware department is making leaps and bounds every day; however, with the lack of a decent OS to test the new cards on, their true potential will be lost to DRM/Vista/driver issues.
  • by RelliK ( 4466 ) on Monday February 04, 2008 @09:13PM (#22301164)
    Every GPU is similar in design to IBM Cell. It's just a simple but massively parallel DSP with very fast local memory.
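    To make the analogy concrete, here's a minimal CUDA sketch of the "fast local memory" part; __shared__ memory plays roughly the role of a Cell SPE's local store. The kernel and names are illustrative only.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Stage a tile of data in on-chip shared memory (the GPU's analogue of a
    // local store), operate on it, then write results back to global memory.
    __global__ void scaleTile(const float* in, float* out, int n, float k) {
        __shared__ float tile[256];
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) tile[threadIdx.x] = in[i];    // global -> on-chip
        __syncthreads();
        if (i < n) out[i] = tile[threadIdx.x] * k;
    }

    int main() {
        const int n = 4096;
        float *d_in, *d_out;
        cudaMalloc(&d_in, n * sizeof(float));
        cudaMalloc(&d_out, n * sizeof(float));
        cudaMemset(d_in, 0, n * sizeof(float));
        scaleTile<<<(n + 255) / 256, 256>>>(d_in, d_out, n, 2.0f);
        cudaDeviceSynchronize();
        printf("scaled %d values through on-chip memory\n", n);
        cudaFree(d_in);
        cudaFree(d_out);
        return 0;
    }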
  • Fab capability... (Score:4, Insightful)

    by Junta ( 36770 ) on Monday February 04, 2008 @09:56PM (#22301560)
    I would disagree with your characterization of the migration from P4 to Core as 'quick'. I would also not say Intel had successfully turned things around with a product that was competitive across the board with AMD until Core 2, when they pulled together good instructions per clock and the 64-bit instruction set. It took years for Intel to develop something that *almost* completely dominates the AMD equivalents (one could still make a case for the AMD memory architecture at scale, which Intel will counter with QPI this year). And the clock didn't start ticking until AMD forced their hand.

    If it takes a company like Intel, arguably with the best fabrication capabilities in the world, years to crank out something like that, what are nVidia's chances, given that only now are they feasibly able to leverage 65 nm fabrication processes to manufacture their chips? Fabrication processes aren't everything, but they're a decent indicator of how the cards would be stacked for nVidia going into that market.

    I personally would love to see nVidia enter the market with a viable offering, if only because I fear AMD is blowing the situation and the market desperately needs comparable vendors to compete, but I'm not optimistic about nVidia's capabilities.
  • by Tolkien ( 664315 ) on Monday February 04, 2008 @10:04PM (#22301646) Journal
    I didn't think of this until after I posted, but how do you think graphics cards came about? They started off integrated with motherboards too; then it was discovered that dedicated hardware can perform MUCH better, relatively cheaply. Same deal with sound cards, ditto network cards, what with the KillerNIC now. It's pure logic really: specialization leads to better performance.
  • by NeMon'ess ( 160583 ) * <flinxmid&yahoo,com> on Tuesday February 05, 2008 @02:35AM (#22303642) Homepage Journal
    With physics acceleration, the little things that currently don't feel real could be done properly.

    Running through grass could cause it to deform and brush against the character, and some of it would get stepped on and stay bent down. Or in sports games, each limb could have a better-defined clipping box and rules for how it can flex.

    Then when two players collide going for a ball, they hit more realistically and don't clip through each other. Especially on the slow motion replays it would look nice.

    Or in a racing game, when cars crash, they could really crash. Imagine bodywork deforming and "real" parts going flying, instead of only a flash of sparks.

    Also, it would be cool for grenades and other explosives to properly damage the room and buildings in games that want realism. Walls that crumble into rubble. Tables that break into chunks and splinters. Ceilings that collapse when the supports are destroyed or weakened too much.

    Then outside, no more indestructible walls. When I ram a truck or tank into an unreinforced building, something actually happens. As in the vehicle crashes through the wall, or continues through the building with momentum.
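    A rough sketch of the destructible-wall idea (the structures and numbers are hypothetical, not from PhysX or any real engine): a blast test over wall chunks is exactly the kind of embarrassingly parallel work a PPU or GPU could run every frame.

    #include <cuda_runtime.h>
    #include <cstdio>

    // Hypothetical wall chunk: a piece of masonry with a position and a
    // strength; "intact" flips to 0 when a blast overcomes it.
    struct WallChunk { float3 pos; float strength; int intact; };

    // One thread per chunk: does the explosion at `blast` with power `power`
    // knock this chunk loose? (Inverse-square falloff, purely illustrative.)
    __global__ void applyBlast(WallChunk* w, int n, float3 blast, float power) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n || !w[i].intact) return;
        float dx = w[i].pos.x - blast.x;
        float dy = w[i].pos.y - blast.y;
        float dz = w[i].pos.z - blast.z;
        float d2 = dx * dx + dy * dy + dz * dz + 1e-3f;
        if (power / d2 > w[i].strength)
            w[i].intact = 0;   // becomes free-flying rubble for the solver
    }

    int main() {
        const int n = 100000;                       // chunks in one building
        WallChunk* d_w;
        cudaMalloc(&d_w, n * sizeof(WallChunk));
        cudaMemset(d_w, 0, n * sizeof(WallChunk));  // toy init; real data would come from the level
        float3 grenade = make_float3(1.0f, 0.5f, 2.0f);
        applyBlast<<<(n + 255) / 256, 256>>>(d_w, n, grenade, 50.0f);
        cudaDeviceSynchronize();
        printf("blast applied to %d wall chunks\n", n);
        cudaFree(d_w);
        return 0;
    }

    The freed chunks would then be handed to the rigid-body solver, so rubble piles up instead of vanishing.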
  • by Antarius ( 542615 ) on Tuesday February 05, 2008 @04:26AM (#22304204)

    Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor which excludes their latest-gen tech implementations.
    I wish I still had Mod-Points, 'cos that deserves a +1, Insightful!

    Yes, people seem to forget that business doesn't have to be ruthless. Sure, you can take that path and it has been proven to be effective by people in many industries, including IT. Punctuating your sentences with chairs can also help emphasise a point.

    Many successful large companies quickly learn that the "Us vs Them" mentality isn't always necessary - and licensing IP or standards in this fashion can be quite lucrative! (Oh no... I made a positive reference on Slashdot to valid IP & standards being all right to license for profit... There goes my Karma!*)

    Intel's licensing of its SSE extensions to their competitors is a good example of how a standard can be strengthened and made more effective by 'working with' the competition, as was AMD's licensing of x86-64 to Transmeta.

    Of course, this is NVIDIA we're talking about. The likelihood of them licensing it, even for profit, is about as high as Microsoft donating millions (of dollars, not bugs) to the WINE project...

    *For the FRZs, I am against Patent Trolls, but for a company/individual's right to profit from a defined standard if another company wants to benefit from their R&D rather than re-invent the wheel! This is, of course, completely different to Joe Scumbag getting a Patent for some-general-nose-picking-device (idea only, no intention to develop) and then extorting any companies that then try to develop a real nose-picking-device. That would be "Just Plain Wrong(tm)"

    So you see, I'm a good sycophAnt... I hate Darl McBride too! Don't take it out on my posts, please!
  • by TheLink ( 130905 ) on Tuesday February 05, 2008 @06:56AM (#22304872) Journal
    The difference between graphics and physics isn't that subtle.

    For most games if you turn down the graphics the gameplay isn't supposed to change that much. So people with cheaper video cards can still play the game.

    Whereas what happens if you turn down the physics? For the gameplay not to change, the crap that's bouncing around can't matter at all.

    I'd rather the physics mattered.

    But if the physics mattered, people with cheaper physics cards might not be able to play the game.

    The game makers won't like that :).
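    One way around that (a hypothetical sketch, not how any actual engine does it) is to split the simulation: gameplay-critical physics always runs the same way for everyone, and only the cosmetic debris budget scales with the hardware, so a cheaper card drops eye candy rather than game logic.

    #include <algorithm>
    #include <cstdio>

    // Hypothetical "physics detail" settings: the gameplay-relevant body count
    // is fixed so every player sees the same game; only cosmetic debris scales.
    struct PhysicsSettings {
        int gameplayBodies  = 200;     // doors, crates, ragdolls that matter
        int maxDebrisBodies = 20000;   // sparks, rubble, grass blades
    };

    int debrisBudget(const PhysicsSettings& s, float hardwareScore /* 0..1 */) {
        int debris = static_cast<int>(s.maxDebrisBodies * hardwareScore);
        return std::max(debris, 500);  // even low-end hardware gets some flair
    }

    int main() {
        PhysicsSettings s;
        printf("low-end card:  %d gameplay + %d debris bodies\n",
               s.gameplayBodies, debrisBudget(s, 0.1f));
        printf("high-end card: %d gameplay + %d debris bodies\n",
               s.gameplayBodies, debrisBudget(s, 1.0f));
        return 0;
    }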
