NVIDIA To Buy AGEIA

The two companies announced today that NVIDIA will acquire PhysX maker AGEIA; terms were not disclosed. The Daily Tech is one of the few covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed, and that the combination positions NVIDIA to compete with Intel on a second front, beyond the GPU — as Intel purchased AGEIA competitor Havok last September. While NVIDIA talked about supporting the PhysX engine on their GPUs, it's not clear whether AGEIA's hardware-based physics accelerator will play any part in that. AMD declared GPU physics dead last year, but NVIDIA at least presumably begs to differ. The coverage over at PC Perspectives goes into more depth on what the acquisition portends for the future of physics, on the GPU or elsewhere.
Comments Filter:
  • by User 956 ( 568564 ) on Monday February 04, 2008 @07:34PM (#22300690) Homepage
    The Daily Tech is one of the few covering the news to go much beyond the press release, mentioning that AMD considered buying AGEIA last November but passed

    Well, that's because they were pondering a similar strategy to Microsoft, and were going to buy Yahoo.
    • by ma1wrbu5tr ( 1066262 ) on Monday February 04, 2008 @08:54PM (#22301538) Journal
      Here's a strange tangent... or two

      It's almost like some bizarre comic.

      Let's imagine that AMD and ATI teamed up to be the Super Friends.

      And Intel and nVidia are the Legion of Doom.

      Now, let the battle for the universe begin.


      At least that's how I feel when I read ./ers comments sometimes.
      We geeks tend to take ourselves entirely too seriously,

      Grammatical and spelling errors are bonuses.

      • Re:off on a tangent (Score:5, Informative)

        by mabhatter654 ( 561290 ) on Monday February 04, 2008 @11:20PM (#22302720)
        Except Intel doesn't think they need Nvidia... that's why they've got nearly all the notebook vendors pumping out crappy built-in graphics that just barely run Windows Vista. ATI saw the writing on the wall and got themselves bought by AMD. Now AMD competes on CPUs, integrated graphics, and high-end graphics... Intel can never buy Nvidia because they'd be instantly sued. Nvidia priced themselves too high, even with all the work they did for AMD and the matching logos... their stock was just too rich for AMD.

        This makes Nvidia the "odd man out" because they don't make processors. Both Intel and AMD have integrated solutions and obviously want physics processing on the CPU so that they can sell 7-core 3.21GHz processors. NVidia has to break the mold if they want sales... they got shunned the last round of consoles for IBM and ATI, and Microsoft pretty much let ATI write the book for DX10 this round. NVidia + Ageia only makes sense if they'll make an open-source console that runs either AMD or Intel CPUs. Games would need to run flawlessly, without "installing", just like a console. There's a hole for PC gaming right now: Apple's not filling it (they think it's stupid), Wintel is not helping (Microsoft only wants Vista gaming, and Intel wants to sell integrated graphics), so a well-done Linux console could help... but there's too much IP in the way to make it happen.
        • Re: (Score:2, Informative)

          by greazer ( 165571 )

          they got shunned the last round of consoles for IBM and ATI, and Microsoft pretty much let ATI write the book for DX10 this round.
          Last I checked, the graphics in PS3, a.k.a. RSX, was NVIDIA designed.
          • Both the Xbox 360's and the Wii's graphics were designed by ATI... you see how the PS3 is selling? nVidia got the short straw this round.
        • What about nVidia and Sun? Sun has plenty of intellectual property in processors area - too bad its current processors are optimised for many threads of lower performance
          • Re:off on a tangent (Score:4, Informative)

            by default luser ( 529332 ) on Tuesday February 05, 2008 @01:33PM (#22309772) Journal
            But Sun doesn't have an x86 processor, and this is the key.

            Nvidia needs an x86 processor to compete. Sure, Nvidia could just adapt their GPU architecture and expand the language to make a general-purpose VLIW processor. They could package it and sell it as an Itanium competitor. But nobody wants to use a non-x86 chip in mainstream markets, and that's where the long-term money is.

            This is why Windows, Linux, Solaris, BSD, and now even OS X run on x86: if your OS has redeeming or unique qualities, more people will buy based on OS features alone if the OS is hardware-agnostic.

            And herein lie two problems: one, while you can make x86 processors without a license, you are constantly in danger of litigation from Intel's massive patent portfolio. In the last two decades, every x86 chipmaker has eventually negotiated a cross-license agreement with Intel. The other problem is that it is hard to build a new x86 processor from scratch. Thus, a takeover bid for an x86 processor manufacturer is likely the best way to solve Nvidia's problem: they get a license to keep Intel at bay, and a solid starting point.

            I'm thinking Via, personally. Their sales have slumped in the last year, and they've stopped making Intel chipsets. In fact, Intel has been bullying poor Via for the last year, offering a new Intel chipset license if they just stop manufacturing CPUs. Either Nvidia will buy Via, or Via will spin off their processor division for some cash. Thanks to the Intel cross-license Via purchased along with IDT, their processor arm is a goldmine in the long run.
            • Via likes their processors too much... it's their president's pet project. Intel has already squeezed ATI out of OEM contracts, just like Via on motherboards. Via is a bad match for nVidia though; their way of thinking is "good enough and cheap". nVidia and SUN sound good, if SUN would let somebody else make UltraSparc processors for cheap. That is what nVidia needs to do... combined with a Linux platform, they could pull it off, but Sun doesn't really want to open up Sparc either... nVidia is just stuck fo
  • by Macfox ( 50100 ) * on Monday February 04, 2008 @07:35PM (#22300710)
    This won't float unless they bundle it with the next generation GPU. AGEIA haven't been able to get traction with a dedicated card and neither will nVidia, unless a heap of games support it overnight.
    • by Kyrubas ( 991784 ) on Monday February 04, 2008 @07:43PM (#22300816) Journal
      It might be that nVidia doesn't even intend to use the overall PhysX stuff at all, but instead wants to tear it apart for the patents on specific designs to further optimize their GPUs.
      • by 644bd346996 ( 1012333 ) on Monday February 04, 2008 @08:35PM (#22301380)
        Don't forget that PhysX has software out there, too. It hasn't been doing well against Havok, but it's obviously in NVidia's best interests to promote the use of physics engines in games, seeing as they could provide the hardware acceleration for them. I expect the PhysX engine will soon have the ability to use NVidia GPUs, and it will be pushed as a more viable competitor to Havok, especially since Intel cancelled Havok FX.
    • Re: (Score:3, Interesting)

      by mrxak ( 727974 )
      That's just it, really. Games need to support it in large enough numbers, and need to do it well enough to make a difference between those without the cards and those that have them. Most people seem to think this is a joke, and the way CPUs are going anyway with extra cores, I think we'd be better off seeing multithreaded games instead of physics cards.
      • by Rival ( 14861 ) on Monday February 04, 2008 @08:15PM (#22301190) Homepage Journal
        Games are great at motivating the development of better video cards, and to some extent bus speeds, processors and other non-gaming-specific components. This is a good thing, though I have some old-man opinions on how Moore's Law is spoiling many developers.

        That being said, I don't believe games drive the adoption of hardware as much as you might be thinking. As a case in point, look at Vista. Ugly and bloated, yes, but perforce nearly everywhere. And the minimum requirements for Aero (which is the one feature your average user is going to jump on -- ooh, it's pretty!) are going to do more to push the next large jump in base video card standards than any given game.

        Retailers don't have enough fiscal incentives to stop pushing Vista, even if they do try to gain positive PR by selling Ubuntu or XP on a few low-end models. And if they're pushing Vista, they want to support the pretty interface the public expects. By making hardware-accelerated rendering a practical requirement of the OS, Microsoft has raised the bar of the "minimum acceptable" video card.

        Right now we see physics cards as a niche product, barely supported. It has been the same with all technical developments. But if we're heading toward 3D interfaces (which I believe we are), then physics can only play an increasing role in such an environment. If that should become the case, then a dedicated processor will be much more valuable than assigning a generic CPU core to try to handle the calculations.
        • Intel has already killed that train... they made the GMA950 and X3100 to be "just enough" that Microsoft would certify them for "full" Vista effects. Once that happened, gaming on any store-bought PC under $1,000 was pretty much dead. Microsoft, Intel, and the OEMs want to milk the market and charge twice the profit for "gaming" PCs, even though the low-end PC now is twice as fast as three years ago... except for the five-year-old graphics chip!
          • by Wesley Felter ( 138342 ) <wesley@felter.org> on Tuesday February 05, 2008 @01:41AM (#22303658) Homepage
            There's nothing stopping you from buying a low-end PC and installing a real GPU. AFAIK, most systems with integrated graphics still have a PCI Express slot so you can upgrade.

            I also don't see any gouging going on in gaming PCs. I recently built a $1000 gaming PC, and prebuilt models with similar specs were selling for $1100-1200, which is not much of a markup.
            • But most users who buy PCs never upgrade their GPUs. Most game developers have two choices: write their software for the lowest common denominator, limiting graphics complexity on the PC, or work on console titles where all of the hardware is guaranteed to be equally capable.

              PC gaming enthusiasts aren't a very large market compared to console gamers, therefore most studios are likely more focused on developing games for consoles.

              This hurts PC gaming, no matter how you look at it.
            • There's nothing stopping you from buying a low-end PC and installing a real GPU. AFAIK, most systems with integrated graphics still have a PCI Express slot so you can upgrade.
              Some do, some don't. Really depends on the model. For those that don't, your upgrade options are limited to PCI cards, or maybe a PCI-Ex1 card if you have those slots. None of the cards that will work on such slots are really going to do much for your gaming options.
    • Game makers won't require something people don't have, and the vast majority won't buy something they don't need. NVIDIA, however, could throw this on their cards and make it a big part of "The Way It's Meant to Be Played". This would really give them something over ATI (other than faster cards :p)
    • Re: (Score:2, Interesting)

      by Rival ( 14861 )
      I see your floating point.

      The way I picture things, a Physics Processing unit (PPU?) will end up like FPUs: at first an optional, narrow-use add-on, then integrated on premium products, then more widespread as software vendors feel comfortable relying on it, and finally ubiquitous and practically indispensable.

      And then Slashdotters will be able to say, "You kids with your integrated PPUs nowadays -- when I was your age, we had to calculate trajectories and drag coefficients by hand, and we liked it that way!"
      • Re: (Score:2, Funny)

        by Grave ( 8234 )

        And then Slashdotters will be able to say, "You kids with your integrated PPUs nowadays -- when I was your age, we had to calculate trajectories and drag coefficients by hand, and we liked it that way!"
        But I already say that...
    • by RelliK ( 4466 ) on Monday February 04, 2008 @07:56PM (#22300970)
      I always thought that GPU + physics engine would be a perfect combination. Ultimately, the AGEIA card is just a DSP + software driver for calculating physics. A GPU is... also a DSP + software driver for calculating graphics. It wouldn't be too hard to write a driver that does both: some of the pipelines could be allocated to graphics, and some to physics. It might even be made software-configurable, to dedicate more or fewer units to physics.
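      A minimal sketch of that idea, assuming a purely hypothetical driver interface where a fixed pool of shader units can be split between the two workloads (the struct and function names are illustrative, not any real NVIDIA or AGEIA API):

      ```cpp
      #include <algorithm>
      #include <cstdio>

      // Hypothetical view of one card's compute resources.
      struct DeviceConfig {
          int totalUnits;    // shader/stream processors on the card
          int physicsUnits;  // how many are currently reserved for physics
      };

      // Ask the (imaginary) driver to reserve some units for physics work;
      // whatever is left keeps doing graphics.
      bool allocatePhysicsUnits(DeviceConfig& dev, int requested) {
          int maxForPhysics = dev.totalUnits / 2;  // never starve rendering entirely
          dev.physicsUnits = std::min(requested, maxForPhysics);
          return dev.physicsUnits == requested;
      }

      int main() {
          DeviceConfig dev{128, 0};        // e.g. a 128-unit part
          allocatePhysicsUnits(dev, 32);   // dedicate a quarter of it to physics
          std::printf("graphics: %d units, physics: %d units\n",
                      dev.totalUnits - dev.physicsUnits, dev.physicsUnits);
          return 0;
      }
      ```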
      • Re: (Score:2, Informative)

        by milsoRgen ( 1016505 )
        The interesting thing about the processor on an AGEIA card is that it's similar in design to an IBM Cell processor [wikipedia.org], just with fewer SPEs...

        I can't seem to find the link to the paper that discussed it in detail, if I can find it I'll post it later...
      • Graphics and physics are subtly different tasks. GPUs aren't good at physics and vice versa. A chip which can do both will be a jack-of-all-trades, master of neither. They need to be separate/parallel processors, even if they're on the same chip.

        AGEIA's problem is that they're kinda obscure and don't make custom chips.

        What NVIDIA brings to the table is a strong brand name and a big manufacturing process. If they can get the price of the PPU down to half of what it is now (by integration into the graphics ca
        • Re: (Score:3, Insightful)

          by TheLink ( 130905 )
          The difference between graphics and physics isn't that subtle.

          For most games if you turn down the graphics the gameplay isn't supposed to change that much. So people with cheaper video cards can still play the game.

          Whereas what happens if you turn down the physics? For the gameplay to not change the crap that's bouncing around can't matter at all.

          I'd rather the physics mattered.

          But if the physics mattered, people with cheaper physics cards might not be able to play the game.

          The game makers won't like that :
    • by mikael ( 484 ) on Monday February 04, 2008 @08:31PM (#22301340)
      With the current PC architecture, the CPU has to send data to the Physics card, read the data back, then finally send it down to the GPU. This would have to be done for things like character animation (ragdoll motion), particle systems for visual effects (bouncing off the scenery/characters). Ideally, you would want the Physics processor to have a direct path to the GPU. Then you could avoid two of these steps.

      And if nothing else, Nvidia also get a team of engineers who have worked together and have both DSP and current game industry technology experience.
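      Roughly, the round trip described above looks like this each frame (just a sketch; the transfer functions are placeholder stubs, not a real PPU or GPU API):

      ```cpp
      #include <vector>

      struct RigidBodyState { float pos[3]; float vel[3]; };

      // Placeholder stubs for the three devices involved; none of these are real APIs.
      static void uploadToPhysicsCard(const std::vector<RigidBodyState>&) { /* CPU -> PPU transfer */ }
      static std::vector<RigidBodyState> readBackFromPhysicsCard(const std::vector<RigidBodyState>& sent) {
          return sent;   // pretend the PPU returned updated states
      }
      static void uploadToGpu(const std::vector<RigidBodyState>&) { /* CPU -> GPU transfer */ }

      static void frame(std::vector<RigidBodyState>& world) {
          uploadToPhysicsCard(world);              // CPU -> PPU
          world = readBackFromPhysicsCard(world);  // PPU -> CPU: the first step you'd like to skip
          uploadToGpu(world);                      // CPU -> GPU: the second step you'd like to skip
          // With a direct PPU -> GPU path, the read-back and the re-upload go away
          // (at least for results that only the renderer needs).
      }

      int main() {
          std::vector<RigidBodyState> world(1000);
          frame(world);
          return 0;
      }
      ```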
      • This is true as far as it goes, but not all the results of physics computations can simply be left on the GPU. This is OK for visual effects and improving animation, but if the outcome of a physics computation has an effect on the game world as a whole, then it needs to be sent to the CPU anyhow (as a rough example, suppose a defeated enemy drops his sword, which bounces down a cliff - this is more than just eye candy).

        'Jiggle physics' and particle systems, of course, can stay on-GPU.

      • With the current PC architecture, the CPU has to send data to the Physics card, read the data back, then finally send it down to the GPU. This would have to be done for things like character animation (ragdoll motion), particle systems for visual effects (bouncing off the scenery/characters). Ideally, you would want the Physics processor to have a direct path to the GPU. Then you could avoid two of these steps.

        That's assuming you run your main loop on either the GPU or the physics card, which doesn't seem like a sensible idea... but then I don't really have a clue about game programming.

  • I, for one, welcome our new fairy overlords. I also welcome whoever gets rid of this joke.
  • by calebt3 ( 1098475 ) on Monday February 04, 2008 @07:41PM (#22300792)

    the future of physics
    I am personally hoping that the future of physics leads to warp engines.
  • by Scareduck ( 177470 ) on Monday February 04, 2008 @07:45PM (#22300838) Homepage Journal
    I don't pay close attention to the GPU market in general, though lately I've been interested in a few numerical modeling projects that could benefit from high-performance computing. The AMD Firestream 9170 [amd.com] is supposed to be released in the first quarter of this year, with a peak speed of 500 GFLOPS, most likely single-precision, but the beauty part is that it should also support double-precision, the numeric standard for most computational modeling. NVidia's option in this space is the Tesla C870 [nvidia.com]; I wonder whether this move to purchase another GPU line will divert resources away from their number-crunching-first GPUs.
  • Intel has Havok, Nvidia has Ageia, AMD/ATI (DAAMIT) has nothing.

    So developers will have to make 3 versions of the game, then?

    Can't wait for DirectX 11(tm) Now with Fizziks Power (tm).
    • Oh noes! There's no hope for Duke Nukem Forever if this happens! They can't complete one game let alone three.

      At least Satan can put away his parka for another year, and we can rest assured the sun is not going red giant anytime soon.
    • Can't wait for DirectX 11(tm) Now with Fizziks Power (tm).

      You say that sarcastically, but I actually would like to see something like "OpenPL" (Open Physics Language).

      • Re: (Score:2, Interesting)

        by DeKO ( 671377 )
        That's pretty much unfeasible. Every game needs a different physics simulation. Rigid bodies, ropes, soft bodies, particles, cloth, and so on; each requires a very different strategy. And there are many special cases where you can customize the algorithms for your specific simulation; using a more general algorithm when a specialized one is possible is less efficient.

        And this doesn't even get into the details about strategy: continuous vs. fixed time steps, different orders of integration, collision detectio
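        For what it's worth, even the simplest of those choices shows up in a few lines of code. Below is a generic fixed-timestep, semi-implicit Euler integrator, just to illustrate the kind of strategy decision (time-stepping scheme and integration order) being referred to; it's a textbook sketch, not anything taken from PhysX or Havok:

        ```cpp
        #include <vector>

        struct Body { float px, py, vx, vy, mass; };

        // One semi-implicit (symplectic) Euler step: update velocity first, then position.
        static void step(std::vector<Body>& bodies, float dt) {
            const float g = -9.81f;              // gravity, m/s^2
            for (Body& b : bodies) {
                b.vy += g * dt;                  // integrate acceleration into velocity
                b.px += b.vx * dt;               // then integrate velocity into position
                b.py += b.vy * dt;
            }
        }

        // Fixed-timestep loop: simulate in constant dt slices regardless of frame time,
        // which keeps results stable and repeatable across machines.
        static void advance(std::vector<Body>& bodies, float frameTime, float& accumulator) {
            const float dt = 1.0f / 60.0f;
            accumulator += frameTime;
            while (accumulator >= dt) {
                step(bodies, dt);
                accumulator -= dt;
            }
        }

        int main() {
            std::vector<Body> bodies{{0, 10, 1, 0, 1}};
            float accumulator = 0;
            advance(bodies, 0.016f, accumulator);  // one ~16 ms frame
            return 0;
        }
        ```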
      • Re: (Score:3, Informative)

        Well, there's already Open Dynamics Engine [ode.org].
  • by Crypto Gnome ( 651401 ) on Monday February 04, 2008 @07:52PM (#22300932) Homepage Journal
    1. Purchase Aegia
    2. Continue selling dedicated Physics addon-cards
    3. Integrate PPU onto Graphics Cards
    4. (somewhere along the line, get full Microsoft Direct-Something endorsement/support of dedicated physics processing)
      • possibly by licensing to AMD "PPU included on Graphics Card" rights, thusly invoking the power of Least Common Denominator
    5. Integrate PPU circuitry/logics into GPU (making it faster/more efficient/cheaper than equivalent solution licensed to AMD)
    6. ?? Profit ??
    In the end, for this to *really* succeed, it needs to be a "Least Common Denominator" factor. So it *requires* full support by Microsoft and Direct-X (them being The Big Factors in the games industry). And in order to get full support from The Windows Monopolist, you'll probably (it's not absolutely necessary, but it'd make it much easier to convince Microsoft) need to enable AMD/ATI to leverage this technology, to some degree.

    Remember folks, Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor which excludes their latest-gen tech implementations.
    • Re: (Score:2, Insightful)

      by Antarius ( 542615 )

      Nvidia don't need to *kill* AMD/ATI, they only need to stay one or two generations ahead of them in technology. So they *could* license them "last year's tech" for use on their cards, to make "Least Common Denominator" not a factor which excludes their latest-gen tech implementations.

      I wish I still had Mod-Points, 'cos that deserves a +1, Insightful!

      Yes, people seem to forget that business doesn't have to be ruthless. Sure, you can take that path and it has been proven to be effective by people in many in

  • Interesting news. (Score:3, Insightful)

    by Besna ( 1175279 ) * on Monday February 04, 2008 @07:56PM (#22300982)
    The computing industry is seeing a dramatic shift towards single-package parallelism. Yet again, the x86 architecture largely holds back the CPU from becoming more all-purpose and doing GPU and PPU activities. There are actual engineering reasons you can't have a truly general-purpose ASIC (you can with an FPGA, but that would be too slow for the purpose). The GPU and PPU are where the interesting stuff is. They can actually write new macroarchitecture! They can design on-chip parallelism with far greater complexity without the need for a backwards-compatible architecture.

    The exciting aspect to this acquisition is the stronger fusion of two companies that have the ability to harness processing power without historical limitations. ATI/AMD really didn't have this, with AMD stuck with x86. Something like Cell is interesting in this space. However, it lacks flexibility in matching up the main core with the secondary cores. Why bring in PowerPC, for that matter?

    This will lead to great things. It is fun again to follow computer architecture.
  • So, I'm assuming I'm not getting all the physics simulation quality I can get out of my games? The whole deal with the bridges collapsing in real time and all sorts of junk bouncing around isn't the ultimate physics experience? Is there... Another level of ragdoll I'm not experiencing? Is there some dynamic to a flaming barrel rolling down a hill my computer can't handle?! Or.. Or.. Is it Nvidia making one of its patented cash grabs?! Considering all the physics simulations in games to date have been done o
    • by dbIII ( 701233 )
      Consider something like WoW - lots of pretty pictures, but you can run through people or monsters. Collision detection requires a bit of effort when there are a lot of objects.
    • by NeMon'ess ( 160583 ) * <flinxmidNO@SPAMyahoo.com> on Tuesday February 05, 2008 @01:35AM (#22303642) Homepage Journal
      With physics acceleration, the little things that currently don't feel real could be done right.

      Running through grass could cause it to deform and brush the character, and some of it would get stepped on and stay bent down. Or in sports games, each limb could have a better-defined clipping box and rules for how it can flex.

      Then when two players collide going for a ball, they hit more realistically and don't clip through each other. Especially on the slow motion replays it would look nice.

      Or in a racing game, when cars crash, they could really crash. Imagine bodywork deforming and "real" parts going flying, instead of only a flash of sparks.

      Also, it would be cool for grenades and other explosives to properly damage the room and buildings in games that want realism. Walls that crumble into rubble. Tables that break into chunks and splinters. Ceilings that collapse when the supports are destroyed or weakened too much.

      Then outside, no more indestructible walls. When I ram a truck or tank into an unreinforced building, something actually happens. As in the vehicle crashes through the wall, or continues through the building with momentum.
      • Tables that break into chunks and splinters.

        Almost every game nowadays has breakable objects. The chunks are generally pre-determined, not generated based on where you hit, but it works.

        Or in a racing game, when cars crash, they could really crash. Imagine bodywork deforming and "real" parts going flying, instead of only a flash of sparks.

        If you want something like that that's out now, check out FlatOut and FlatOut 2 on Steam. The cars are fully physics simulated, and a lot of the track is destructible.
        But still, the reason cars can't be damaged is normally a licensing issue: "You can use our cars, but they can't be damaged." FlatOut (and FlatOut 2) has no licensed vehicles.

        Destructible buildings though, that WO

        • by grumbel ( 592662 )

          Almost every game nowadays has breakable objects. The chunks are generally pre-determined, not generated based on where you hit, but it works.

          It "works" in the sense that it looks rather ridiculous. You might not notice it in a fast action game, because you are in the next room before the chunks and splinters hit the ground, but for actual interaction with the game world the current stuff is just way to simple. One of the easiest examples is rag-doll animation, which looks nowhere near what a real human would look, its not even close and the reason is simply that accurate simulation would need a bunch more processing power then is available.

          But still, the reason cars can't be damaged is normally a licensing issue.

          Lic

          • by TheThiefMaster ( 992038 ) on Tuesday February 05, 2008 @08:12AM (#22305516)
            The biggest problem in FPS game physics at the moment is a ridiculously trivial one:
            When the player moves forward he pushes whatever is under him forward.

            I'm not kidding, try standing on something and moving. In reality there's no real problem, as long as the centre of gravity of you+object is under the object's base. If it's light enough, you might kick it out behind you. In a game, 99% of the time you will kick the object under you forwards and out from under you.

            Try standing on a barrel in Half-Life 2 for example. The game even gives you a puzzle at the start where you have to stack crates to get through a window, where you can notice this problem.
            • That's because most game programmers implement their character as a solid object (until they ragdoll due to collision or death). The object is moved by pushing from behind rather than the "feet" pulling forward. It's not a problem with the physics so much as a problem with optimised implementation in games.
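              In other words, the common shortcut is a kinematic controller that resolves overlap by shoving whatever it touches along its own direction of travel, which is exactly the "barrel kicked out from under you" behaviour. A rough illustration of that shortcut (purely hypothetical code, not from any particular engine):

              ```cpp
              #include <cmath>
              #include <cstdio>

              struct Vec2 { float x, y; };
              struct Prop { Vec2 pos, vel; };   // e.g. the barrel you're standing on

              // Naive kinematic controller: on overlap it shoves the prop along the
              // character's movement direction, instead of treating it as the surface
              // the feet push back against (which would nudge it the other way).
              void moveCharacter(Vec2& charPos, Vec2 move, Prop& underfoot) {
                  charPos.x += move.x;
                  charPos.y += move.y;
                  float len = std::sqrt(move.x * move.x + move.y * move.y);
                  if (len > 0.0f)
                      underfoot.vel.x += move.x / len * 2.0f;   // arbitrary shove strength
              }

              int main() {
                  Vec2 player{0.0f, 1.0f};
                  Prop barrel{{0.0f, 0.0f}, {0.0f, 0.0f}};
                  moveCharacter(player, {0.1f, 0.0f}, barrel);
                  std::printf("barrel velocity after one step: %.2f\n", barrel.vel.x);
                  return 0;
              }
              ```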
      • Running through grass could cause it to deform and brush the character

        Crysis [wikipedia.org] has this; it isn't entirely accurate, but the grass and tree branches move as you push through them.

        grenades and other explosives to properly damage the room ....Tables that break into chunks and splinters.

        Most games from the last few years have destructible objects; as said elsewhere, the 'pieces' are pre-determined, but the effect is good enough.

        Ceilings that collapse when the supports are destroyed or weakened too much......W

        • by Justus ( 18814 )
          Ah, yes, that makes sense. Let's use Crysis as an example for why dedicated physics processors wouldn't have an impact on performance. After all, high-end systems running Crysis on higher quality settings (which presumably impacts the physics simulation) get an average of 30fps. No room for improvement there, right?
    • Re: (Score:3, Interesting)

      Actually the thing you are missing is the bridge pieces bending before it collapses, the barrel being dented as it rolls down the hill, or the rag doll limbs breaking or being ripped off with the proper application of force. Those things cannot be done in real time on a CPU.

      Unfortunately most of those things are only available in demos atm. UT3 has a couple of special maps that do some neat stuff, but then you start running into problems with the video card trying to keep up with the 100 or so bricks

      • Comment removed based on user account deletion
        • I wouldn't get too excited about Valve; last I read, they are skipping over support for 2- and 3-core CPUs and going straight to 4 cores for all their future goodness, so it's going to be a while before that hits the mainstream.

          I still like the idea of a PPU, but it would be nice if it were integrated straight into the GPU or the CPU, or a chip on the mobo, just like so many other dedicated devices that have come before it.

          I just wish Nvidia and ATI and 3rd party software and start seeing some neat GPU based t

  • by idonthack ( 883680 ) on Monday February 04, 2008 @08:30PM (#22301328)
    With dual-core coming standard now on all new PCs, and multi-core rapidly approaching, physics cards are done for. Graphics cards are still a good idea because the kind of calculations they do can be heavily hardware-optimized in a way that general purpose CPUs are not, but physics cards don't do anything a second (or fourth) full speed CPU isn't capable of doing better and faster.
    • This is the key: quad-core CPUs are around the corner, so use one core for physics. Sure, a CPU core may not be great at physics (well, not as good as a dedicated chip), but consumers won't see much difference; a game may use one of the cores of a quad-core CPU. Anyway, the dedicated physics card is dead in the water. Pull the patents out and plug them into the GPU.
    • by XaXXon ( 202882 )
      Shoot. With all those available cores, let's move everything back to the CPU. Get rid of graphics cards.

      Oh wait. General purpose CPUs aren't very good at certain types of workloads.
    • by Tolkien ( 664315 )
      You're missing the point.

      The whole point of a physics card is to move the calculations away from the CPU (which is so generalized that it can't be optimized as well as dedicated hardware). Having a card dedicated to processing physics simulations means the work gets 100% of the PPU's attention instead of 10% of CPU1's attention, 11% of CPU2's, 5% of CPU3's, and 13% of CPU4's (this is, after all, a PPU, not a CPU). Not only that, the PPU and the hardware on which it is set is optimiz

      • Re: (Score:2, Insightful)

        by Tolkien ( 664315 )
        I didn't think of this until after I posted, but how do you think graphics cards came about? They started off integrated with motherboards too, then it was discovered that dedicated hardware can perform MUCH better, relatively cheaply. Same deal with sound cards, ditto network cards, what with the KillerNIC now. It's pure logic, really: specialization leads to better performance.
        • I didn't think of this until after I posted, but how do you think graphics cards came about?
          How did graphics cards come about? Wow, you must be young. Let me tell you.
          Sometimes, when a mainframe and a television set love each other very much...

          Naw, man. At least in terms of the PC architecture, graphics hardware has been available in expansion card form since the original IBM PC offered your choice of MDA (text only) and CGA (limited four-color) cards.
      • I get the point of a PPU. But at this point brute force is cheaper and easier. CPUs do well enough at processing physics, and they are fast enough and cheap enough that it's smarter to run respectable physics code there instead of spending an extra $100 to $200 on an expansion card that provides almost zero gameplay enhancement. PPUs may have been a good idea ten or fifteen years ago when CPUs were slower, but game physics engines haven't improved significantly since 2003. Faster processors are available ch

        • by sssssss27 ( 1117705 ) on Monday February 04, 2008 @11:27PM (#22302788)
          I don't think you are thinking grandly enough. I remember the days when you didn't need a dedicated graphics card to play games, and I'm only 21. You really didn't get improved gameplay per se, but it did look a heck of a lot better. A dedicated physics processor, though, has the potential to vastly improve gameplay and realism.

          Imagine if, instead of designers creating models of buildings, they actually built them. That is, a brick building would have individual bricks all stacked on each other. Whenever you hit it with an explosive, it would actually crumble like a real building, or burn like a real building. That is a lot of calculations, which a general CPU isn't the best at.

          The thing is, not enough people have PPUs in their computers, so you can't build it into core gameplay yet. Hopefully nVidia acquiring Ageia will allow them to start bundling it with their GPUs, or better yet offer it embedded on their motherboards. While graphics are easily scaled, gameplay elements are not. I wouldn't be surprised to see PPUs being crucial to the game on consoles before PCs.
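          To give a feel for why individually simulated bricks get expensive fast: with a naive pairwise broad phase, the collision-checking work grows roughly with the square of the brick count. A toy illustration (brute force over axis-aligned boxes; real engines use spatial partitioning, and the numbers here are made up):

          ```cpp
          #include <cstdio>
          #include <vector>

          struct Aabb { float min[3], max[3]; };   // axis-aligned bounding box for one brick

          static bool overlaps(const Aabb& a, const Aabb& b) {
              for (int i = 0; i < 3; ++i)
                  if (a.max[i] < b.min[i] || b.max[i] < a.min[i]) return false;
              return true;
          }

          int main() {
              // A small wall: 50 x 20 bricks = 1000 rigid bodies.
              std::vector<Aabb> bricks;
              for (int x = 0; x < 50; ++x)
                  for (int y = 0; y < 20; ++y)
                      bricks.push_back({{x * 1.0f, y * 0.5f, 0.0f},
                                        {x * 1.0f + 1.0f, y * 0.5f + 0.5f, 0.5f}});

              // Brute-force broad phase: ~n^2/2 pair tests per simulation step.
              long pairTests = 0, contacts = 0;
              for (size_t i = 0; i < bricks.size(); ++i)
                  for (size_t j = i + 1; j < bricks.size(); ++j) {
                      ++pairTests;
                      if (overlaps(bricks[i], bricks[j])) ++contacts;
                  }
              std::printf("%zu bricks -> %ld pair tests, %ld touching pairs per step\n",
                          bricks.size(), pairTests, contacts);
              return 0;
          }
          ```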
    • Sure, but why do those cores need to be symmetrical? Wouldn't it make sense to have some cores for computationally intense work and others for tasks that require more memory bandwidth and shorter pipelines (and more of them)?
  • They've had a Linux version of their SDK for a long time, but it was a software-only version and didn't support their hardware. Given NVidia's lack of enthusiasm for Linux, I suppose if there was any chance that Ageia might have listened to those of us that wanted hardware support on Linux, it's gone now.
    • by moosesocks ( 264553 ) on Monday February 04, 2008 @08:41PM (#22301428) Homepage
      Unless something's changed in the year or two since I stopped using Nvidia, their drivers always tended to be quite good.

      They were binary-only, but they were good in that they were fast, stable, and supported all the major functions of their cards. Hardly half-assed, if you ask me.
      • I completely disagree. So nvidia don't open-source their driver, but at the end of the day they release good binaries. I see no advantage to open source drivers for video cards:
        *a community isn't going to develop video card drivers as well as the people who make the cards
        *a community is much more likely to stall and slow down
        *in most cases the fact that software is open source doesn't mean much, as one company or another has complete control over the product (look at OO)
        The only arguments for it are that
        *more peopl
  • Considering that I have seen far more games that use Havok than PhysX, I think Intel is at least somewhat in the better position as far as propagation. However, Nvidia could come up with some cool integrated hardware and really push that API to the developers in order to gain some ground. On the other hand, consumers would have to bite, and it doesn't seem many have yet caught the physics fever. I have seen Havok used in numerous console games as well, but AFAIK that's only an API... will Nvidia try to push t
  • I'm surprised no one has mentioned CUDA [nvidia.com] yet, which is Nvidia's existing entry into the world of general purpose GPU computing. So far their target market is mostly dedicated calculations with limited interoperability with OpenGL/DirectX, but I expect we'll see future cards that can partition their compute resources between multiple tasks, like rendering and physics. Hopefully, porting over the PhysX SDK will help grow the GPGPU toolset, and make it easier to use.

    (CUDA already transforms the 8800 GTX i
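    For a sense of what "physics as a GPGPU task" means: the per-body update in a big particle batch is embarrassingly parallel, so it maps naturally onto thousands of GPU threads. The sketch below only shows the data-parallel shape of one step on the CPU; on CUDA the loop body would become a kernel, with the arrays copied to device memory first (illustrative only, not PhysX or CUDA code):

    ```cpp
    #include <vector>

    struct Particle { float px, py, pz, vx, vy, vz; };

    // One simulation step over a big batch of independent particles.
    // Each iteration touches only its own particle, so the loop can be handed to a
    // GPU as "one thread per particle" with no synchronization needed.
    static void stepAll(std::vector<Particle>& ps, float dt) {
        const float g = -9.81f;
        for (Particle& p : ps) {           // on a GPU: the thread index picks the particle
            p.vz += g * dt;
            p.px += p.vx * dt;
            p.py += p.vy * dt;
            p.pz += p.vz * dt;
        }
    }

    int main() {
        std::vector<Particle> particles(100000);  // the kind of count a PPU/GPU targets
        stepAll(particles, 1.0f / 60.0f);
        return 0;
    }
    ```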

  • A year ago, both Nvidia and ATI/AMD showed off their GPUs doing Havok acceleration equal to or better than AGEIA's, with ATI claiming to have a seven-month lead... Could this be a catch-up move or patent grab by NVIDIA?

    http://www.reghardware.co.uk/2006/06/06/ati_gpu_physics_pitch/ [reghardware.co.uk]
    • How great is the pipeline?

      If I have a server with an example game running and it's calculating physics, can I then grab back that batch of models I sent off for physics and then send the updated coordinates to the players' computers? Or is it just part of the graphics rendering pipeline?
      • by Animats ( 122034 )

        Can I then grab back that batch of models I sent off for physics and then send the updated coordinates to the players' computers? Or is it just part of the graphics rendering pipeline?

        It's mostly part of the graphics rendering pipeline. Ageia's "physics engine" is mostly used for particle effects (smoke, fire, rain, snowflakes, etc.) which don't affect the gameplay at all. There were attempts to use it for actual game physics, but the performance was no better than doing that on the main CPU. For part

  • Does this mean that all boats on the Aegean Sea will now have to use proprietary rudders?
  • When you can count the number of games that support hardware physics on one hand (actually I made that up, please correct me if I'm wrong), you can be pretty sure that there isn't much volume in the PPU market.

    Heck, fewer and fewer PC's come with dedicated GPUs. Integrated video can now handle dual monitor output and HDTV decoding. It's only gamers and graphics designers who need them now.
    • Heck, fewer and fewer PC's come with dedicated GPUs. Integrated video can now handle dual monitor output and HDTV decoding. It's only gamers and graphics designers who need them now.

      Correct me if I'm wrong, but doesn't integrated video use an embedded GPU to do the bulk of the work? You can't run Vista with Aero turned on without a GPU, so all the Vista PCs shipping with Aero enabled have GPUs in them. I think you meant fewer and fewer PCs come with dedicated graphics cards or high-end GPUs. Which is the s
      • by WoTG ( 610710 )
        You're right, dedicated graphics cards is what I meant.
        Though, integrating mainstream GPU functionality into the CPU core is only a few years away, IMHO. AMD has said about as much with regards to their "Fusion" core plans. Time will tell.
        • That's what I figured, just wanted to make sure.

          Yeah, Fusion should do wonders for the integrated GPU market but we'll always have the dedicated graphics card segment for the enthusiast. The more things we can get on a single chip the better for applications where size is important.
    • Ever since 3D hardware acceleration took off, leaving software rendering behind, the gaming market has been in its own world of accelerator cards.

      I think looking back at 3dfx and Glide-only games shows some important similarities. 3dfx managed to capture enough of the gamer market that games were made that would only work on their cards. Maybe only a dozen, but it was still notable that only part of the total market supported having those games. Many of the other games could be run in either Glide or Op
  • Physics engines are still relatively simplistic due to the computational difficulty involved. I'd love to see what a good game designer could do with physics capabilities comparable to modern graphics capabilities.
  • The summary makes it sound like AMD passed on buying AGEIA because they felt that GPU-Based physics acceleration was "dead".

    Actually, AMD designed this whole HyperTransport bus with dedicated hardware acceleration co-processors in mind. In their world, you wouldn't need a dedicated add-in board, just an open HT socket on the motherboard. Then if you want to add physics acceleration, just pop in the chip. Putting the accelerator on the GPU card would increase the costs of an already expensive board,
