Graphics Software Upgrades Hardware

New Multi-GPU Technology With No Strings Attached 179

Vigile writes "Multi-GPU technology from both NVIDIA and ATI has long depended on many factors, including specific motherboard chipsets, and has forced gamers to buy similar GPUs within a single generation. A new company called Lucid Logix is showing off a product that could potentially allow vastly different GPUs to work in tandem while still promising near-linear scaling on up to four chips. The HYDRA Engine is dedicated silicon that dissects DirectX and OpenGL calls and modifies them directly so they can be distributed among the available graphics processors. That means the aging GeForce 6800 GT card in your closet might be useful once again, and a future in which one motherboard supports both AMD and NVIDIA multi-GPU configurations could be very near."
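The work splitting described in the summary can be sketched in a few lines. This is purely illustrative (HYDRA is proprietary silicon and Lucid has not published its internals); the `GPU` class, the speed weights, and the least-loaded heuristic are all assumptions made up for the example:

```python
# Illustrative sketch of weighted draw-call distribution across unequal GPUs.
# NOT Lucid's actual HYDRA design; it only shows the general idea of sending
# each unit of work to whichever GPU would finish its queue soonest.

class GPU:
    def __init__(self, name, relative_speed):
        self.name = name
        self.relative_speed = relative_speed  # hypothetical performance weight
        self.assigned = []

def distribute_draw_calls(draw_calls, gpus):
    """Assign each draw call to the GPU whose pending queue, scaled by its
    speed, would drain first."""
    for call in draw_calls:
        target = min(gpus, key=lambda g: len(g.assigned) / g.relative_speed)
        target.assigned.append(call)
    return gpus

gpus = [GPU("GeForce 6800 GT", 1.0), GPU("GeForce 8800 GT", 4.0)]
distribute_draw_calls(range(100), gpus)
# The 4x-faster GPU ends up with roughly 4x the work of the slower one.
```

With the assumed 1.0 vs. 4.0 weights, 100 draw calls split 20/80, which is the "near-linear scaling with unequal chips" idea in miniature.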
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • by argent ( 18001 ) <peter@slashdot . ... t a r o nga.com> on Tuesday August 19, 2008 @08:16PM (#24666693) Homepage Journal

    Can it work with Linux or OS X?

  • Re:Interesting (Score:4, Insightful)

    by argent ( 18001 ) <peter@slashdot . ... t a r o nga.com> on Tuesday August 19, 2008 @08:19PM (#24666731) Homepage Journal

    It seems to be using feedback from the rendering itself. If one GPU falls behind, it sends more work to the other GPU. It may have some kind of database of cards to prime the algorithm, but there's no reason it has to run extra benchmarking jobs.
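The feedback loop the parent describes (no upfront benchmarking, just shift work when one GPU falls behind) can be sketched as a simple proportional controller. Everything here is assumed for illustration, including the 0.1 gain factor:

```python
# Hedged sketch of feedback-driven load balancing: adjust the work split
# between two GPUs based on their measured frame times, no card database
# or benchmark pass required. The gain value is an arbitrary example.

def rebalance(split, frame_time_a, frame_time_b, gain=0.1):
    """split = fraction of work sent to GPU A (0..1). If A took longer
    than B last frame, shift work toward B, and vice versa."""
    error = frame_time_a - frame_time_b
    total = frame_time_a + frame_time_b
    new_split = split - gain * (error / total)
    return min(1.0, max(0.0, new_split))

split = 0.5
# GPU A was twice as slow this frame (20 ms vs 10 ms), so work shifts to B.
split = rebalance(split, 20.0, 10.0)
```

Repeated every frame, a loop like this converges toward whatever split equalizes the two frame times, which is why no per-card database is strictly necessary.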

  • Re:quick (Score:3, Insightful)

    by jfim ( 1167051 ) on Tuesday August 19, 2008 @08:20PM (#24666743)

    Someone port java to opengl.

    Seriously. That would rock.

    You can use OpenGL from Java with JOGL [java.net]. Or were you thinking of running a Java or J2EE stack on your GPU? (That would be a really bad idea, in case there were any doubts)

  • by TibbonZero ( 571809 ) <Tibbon@@@gmail...com> on Tuesday August 19, 2008 @08:30PM (#24666839) Homepage Journal
    So it's obvious that these cards could have been working together for some time now. They aren't as incompatible as AMD and NVIDIA would like us to think. Of course, this leaves only one course of action: they must immediately do something "weird" in their next releases to make them no longer compatible.
  • Re:Latency. (Score:4, Insightful)

    by Bill, Shooter of Bul ( 629286 ) on Tuesday August 19, 2008 @08:35PM (#24666901) Journal
    What is this, 1996? That was true of Doom vs. WinDoom before DirectX. I don't think I've had a problem since then.
  • GeForce 6800 GT (Score:5, Insightful)

    by Brad1138 ( 590148 ) * <brad1138@yahoo.com> on Tuesday August 19, 2008 @08:36PM (#24666913)
    How many people feel this is an old card that should be in a closet? If you're not a hard-core gamer, that is a very good video card. My fastest card (out of 4 comps) is a 256 MB 7600GS (comparable to a 6800GT) on an Athlon 2500+ w/ 1 GB mem. Plays all the games I want without a prob and is more than fast enough to run any non-game app.
  • by Anonymous Coward on Tuesday August 19, 2008 @08:49PM (#24667019)

    Why does the summary include two links to the same article? If there are two links, shouldn't there be two articles?

    And why does the summary link the phrase "allow vastly different GPUs to work in tandem?" Not only isn't it a quote from the article, it actually contradicts the article. The article says "To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA)." How did anyone get "vastly different GPUs" from this?

  • Re:GeForce 6800 GT (Score:5, Insightful)

    by Pulzar ( 81031 ) on Tuesday August 19, 2008 @09:20PM (#24667223)

    Plays all the games I want without a prob and is more than fast enough to run any non-game app.

    *Any* app? Try getting an HD camcorder and editing some video of your kid/dog/girlfriend/fish and see how well your PC does. It's easy to make generalizations about other people based on personal experience. Resist the urge to do it.

  • Re:My god... (Score:4, Insightful)

    by x2A ( 858210 ) on Tuesday August 19, 2008 @09:30PM (#24667311)

    "How long until my home computer is hooked up to a 50 amp 240 volt line?"

    Mine already is... how do you usually power your computer?

  • by Anonymous Coward on Tuesday August 19, 2008 @10:16PM (#24667645)

    Ugh... the fact that Ageia didn't do so hot and was bought by NVIDIA had NOTHING to do with them not making drivers for OS X or Linux (i.e., 10% or less of the market); it's because there was no need for dedicated hardware once physics could be incorporated into the GPU, as NVIDIA has done in its latest GTX 2x0 series of cards.

  • Re:Interesting (Score:5, Insightful)

    by ozphx ( 1061292 ) on Wednesday August 20, 2008 @01:12AM (#24668983) Homepage

    Put it this way: if it were a disparate-CPU multiprocessor board, and the summary said "Perhaps my P4 will now be useful again," everyone would be laughing.

    A 6800GT would contribute an insignificant fraction of a new card's performance, and would still be under 10% of a $50 no-name 8-series card (while still sucking down the same wattage).

    Considering that matched SLI is usually a waste of money (you can buy that second card in a year's time when your current one shows its age, and end up with a better card than twice your previous one, supporting Shader Model Super Titrenderer 7, which your old card can't do), I'm not sure how this is going to be of much benefit.

    If it was useful to jam a bunch of cheap chips in then the card manufacturers would be doing it on more of a scale than the "desk heater dual" cards (which are basically SLI-in-a-box) at double price. You can't get a card with 4x 6800 chips at $5 each, because they'd be destroyed by the $10 8 series chip on a $50 card.

  • Re:Latency. (Score:4, Insightful)

    by bitrex ( 859228 ) on Wednesday August 20, 2008 @06:42AM (#24670793)
    As a recording musician I can attest to the fact that using computer based virtual instruments that have an output latency of more than about 6 or 7 milliseconds causes noticeable "lag" that's very annoying, unless one is playing the keyboard quite slowly. If the latency increases to 12-15 ms, it becomes pretty much intolerable.
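The latency figures in the comment line up with standard audio buffer math: latency in milliseconds is buffer size divided by sample rate. A quick check, using common interface buffer sizes as the examples:

```python
# Buffer latency in ms = 1000 * buffer_samples / sample_rate. The buffer
# sizes below are typical audio-interface settings, chosen to bracket the
# 6-7 ms and 12-15 ms thresholds the comment mentions.

def buffer_latency_ms(buffer_samples, sample_rate_hz):
    return 1000.0 * buffer_samples / sample_rate_hz

small = buffer_latency_ms(256, 44100)   # ~5.8 ms: near the annoyance threshold
large = buffer_latency_ms(512, 44100)   # ~11.6 ms: approaching "intolerable"
```

So at 44.1 kHz, a 256-sample buffer sits just under the ~6-7 ms annoyance threshold, and doubling it lands near the intolerable range, which is why low-latency drivers matter so much to musicians.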
  • Re:GeForce 6800 GT (Score:3, Insightful)

    by Pulzar ( 81031 ) on Thursday August 21, 2008 @01:54PM (#24692441)

    What about it? HD editing doesn't use a fast CPU.. it's disk I/O bound entirely.

    Well, that's not even close to the truth. I wouldn't even know where to start to refute it... I guess googling the bitrates of common HD video formats, and looking at CPU benchmarks for video encoding, would be a good start toward getting a little more informed on this topic.
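Some back-of-the-envelope arithmetic supports the point: consumer HD bitrates are tiny next to disk throughput, so editing and encoding are CPU-bound, not I/O-bound. The figures below are ballpark, era-appropriate assumptions, not measurements:

```python
# Rough comparison of an HD camcorder stream against disk throughput.
# Both numbers are illustrative ballparks for 2008-era consumer hardware.

def mbps_to_mb_per_s(mbps):
    return mbps / 8.0            # megabits per second -> megabytes per second

avchd_bitrate_mbps = 24          # typical consumer AVCHD camcorder stream
disk_throughput_mb_s = 80        # sustained rate of a 7200 rpm SATA drive

stream_mb_s = mbps_to_mb_per_s(avchd_bitrate_mbps)    # 3.0 MB/s
headroom = disk_throughput_mb_s / stream_mb_s         # ~26x spare capacity
```

With ~26x of disk headroom over the video stream, the bottleneck when scrubbing or encoding that footage is decoding and encoding H.264 on the CPU, not reading it off the disk.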
