New Multi-GPU Technology With No Strings Attached
Vigile writes "Multi-GPU technology from both NVIDIA and ATI has long depended on many factors, including specific motherboard chipsets, and has forced gamers to buy similar GPUs from a single generation. A new company called Lucid Logix is showing off a product that could potentially allow vastly different GPUs to work in tandem while still promising near-linear scaling on up to four chips. The HYDRA Engine is dedicated silicon that dissects DirectX and OpenGL calls and modifies them directly so they can be distributed among the available graphics processors. That means the aging GeForce 6800 GT card in your closet might be useful once again, and the future of one motherboard supporting both AMD and NVIDIA multi-GPU configurations could be very near."
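The idea of dissecting an API command stream and farming it out can be sketched in a few lines. This is purely illustrative, with hypothetical names; the real HYDRA Engine works in dedicated silicon on raw DirectX/OpenGL command streams, not in Python:

```python
# Hypothetical sketch: intercept a frame's draw calls and distribute
# them across N GPUs; each GPU renders its share, and the partial
# results are composited into one final frame. All names here are
# made up for illustration.

def distribute_draw_calls(draw_calls, gpu_count):
    """Round-robin the frame's draw calls across the available GPUs."""
    buckets = [[] for _ in range(gpu_count)]
    for i, call in enumerate(draw_calls):
        buckets[i % gpu_count].append(call)
    return buckets

frame = [f"draw_{n}" for n in range(10)]
buckets = distribute_draw_calls(frame, 4)
print([len(b) for b in buckets])  # [3, 3, 2, 2] -- a near-even split
```

A real implementation would have to split work by render cost rather than call count, and handle calls whose results later calls depend on, which is presumably where the hard engineering lives.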
Non-Windows drivers? (Score:5, Insightful)
Can it work with Linux or OS X?
Re:Interesting (Score:4, Insightful)
It seems to be using feedback from the rendering itself. If one GPU falls behind, it sends more work to the other GPU. It may have some kind of database of cards to prime the algorithm, but there's no reason it has to run extra benchmarking jobs.
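That kind of feedback loop is simple to sketch. The numbers and function below are hypothetical, just to show how a split can converge from render-time feedback alone, with no offline benchmarking:

```python
# Illustrative feedback loop: shift work toward whichever GPU finished
# its last slice faster. 'share' is the fraction of the frame sent to
# GPU A; everything here is made-up for illustration.

def rebalance(share, time_a, time_b, step=0.05):
    """Nudge GPU A's share down if it was the slower GPU last frame."""
    if time_a > time_b:
        share -= step
    elif time_b > time_a:
        share += step
    return min(max(share, 0.1), 0.9)  # keep both GPUs busy

# A mismatched pair: GPU B is twice as fast. Start at 50/50 and adapt.
share = 0.5
speed_a, speed_b = 1.0, 2.0
for _ in range(20):
    share = rebalance(share, share / speed_a, (1 - share) / speed_b)
print(round(share, 2))  # settles near 1/3, the balanced split
```

A seeded database of card performance would only change the starting guess; the loop converges either way.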
Re:quick (Score:3, Insightful)
Someone port java to opengl.
Seriously. That would rock.
You can use OpenGL from Java with JOGL [java.net]. Or were you thinking of running a Java or J2EE stack on your GPU? (That would be a really bad idea, in case there were any doubts)
Time to make them incompatible! (Score:4, Insightful)
GeForce 6800 GT (Score:5, Insightful)
Two Links, One Article? (Score:2, Insightful)
Why does the summary include two links to the same article? If there are two links, shouldn't there be two articles?
And why does the summary link the phrase "allow vastly different GPUs to work in tandem?" Not only is it not a quote from the article, it actually contradicts the article. The article says "To accompany this ability to intelligently divide up the graphics workload, Lucid is offering up scaling between GPUs of any KIND within a brand (only ATI with ATI, NVIDIA with NVIDIA)." How did anyone get "vastly different GPUs" from that?
Re:GeForce 6800 GT (Score:5, Insightful)
*Any* app? Try getting an HD camcorder and editing some video of your kid/dog/girlfriend/fish and see how well your PC does. It's easy to make generalizations about other people based on personal experience. Resist the urge to do it.
Re:My god... (Score:4, Insightful)
"How long until my home computer is hooked up to a 50 amp 240 volt line?"
Mine already is... how do you usually power your computer?
Re:Non-Windows drivers? (Score:1, Insightful)
Ugh... the fact that Ageia didn't do so hot and was bought by NVIDIA had NOTHING to do with their not making drivers for OS X or Linux (i.e., 10% or less of the market). It's because there was no need for dedicated hardware, since physics could be incorporated into the GPU, as has been done in NVIDIA's latest GTX 2x0 series of cards.
Re:Interesting (Score:5, Insightful)
Put it this way: if it were a disparate-CPU multiprocessor board, and the summary said "Perhaps my P4 will now be useful again," everyone would be laughing.
A 6800GT would add an insignificant fraction of a new card's performance, and would still deliver under 10% of a $50 no-name 8-series card (while still sucking down the same wattage).
Considering that matched SLI is usually a waste of money -- you can buy that second card in a year's time when your current one shows its age, and end up with a better card than twice your previous one (and one supporting Shader Model Super Titrenderer 7, which your old card can't do) -- I'm not sure how this is going to be of much benefit.
If it were useful to jam in a bunch of cheap chips, then the card manufacturers would be doing it on a larger scale than the "desk heater dual" cards (which are basically SLI-in-a-box) at double the price. You can't get a card with 4x 6800 chips at $5 each, because it would be destroyed by the $10 8-series chip on a $50 card.
Re:GeForce 6800 GT (Score:3, Insightful)
Well, that's not even close to the truth. I wouldn't even know where to start to refute it... I guess googling the bitrates of common HD video formats, and looking at CPU benchmarks for video encoding, would be a good start toward getting a little more informed on this topic.