Asus Crams Three GPUs onto a Single Graphics Card
Barence writes "PC Pro has up a look at Asus' concept triple-GPU graphics card. It's a tech demo, so it's not going to see release at any point in the future, but it's an interesting look at how far manufacturers can push technology, as well as just how inefficient multi-GPU graphics cards currently are. 'Asus has spaced [the GPUs] out, placing one on the top of the card and two on the underside. This creates its own problem, though: attaching heatsinks and fans to both sides of the card would prevent it from fitting into some case arrangements, and defeat access to neighbouring expansion slots. So instead, Asus has used a low-profile heat-pipe system that channels the heat to a heatsink at the back of the card, from where it's dissipated by externally-powered fluid cooling pipes.'"
Drivers first. (Score:5, Interesting)
However, it seems with all of these methods, the weak link is always driver support. I think that drivers will have to develop further before anything like this can take true form and be useful.
As an aside, did anyone notice that half of the Slashdot description sounded like an advertisement for Asus GPU cooling?
Re: (Score:2, Insightful)
Re: (Score:2, Insightful)
I am still looking for a decent high end card that does not need two slots in my case. How about fixing heat and size issues first?
Bingo. I thought the main point of multi-GPU graphics cards (and multi-core processors) was to build good gaming rigs (and workstations) without having to use a monstrous extended ATX uber-tower with multiple CPU sockets and video card slots.
Improved manufacturing processes and software/drivers have allowed us to have multiple processor cores and GPUs in a shoebox-sized Shuttle XPC. Asus's big, hot, and inefficient card just shows us that current manufacturing processes and software/drivers aren't ready
Re: (Score:1)
Re:Drivers first. (Score:5, Insightful)
I think the talk about the cooling is important, since one of the most difficult tasks is not how to get three GPUs onto a single card, but how to get a viable cooling solution that doesn't sound like a vacuum cleaner and doesn't require too much space (or it would essentially kill the whole concept).
Re:Drivers first. (Score:4, Insightful)
Multi-GPU is the only way to keep up that breakneck pace, just like the CPU world is trying to deal with hitting the wall (or, depending on who you ask, the low-hanging fruit has already been picked). But the penalties for the reach exceeding the grasp are absolutely catching up with them.
Re:Drivers first. (Score:5, Insightful)
In other words, it doesn't work! I'll worry about cooling 3 GPUs when they are at least able to do something useful! Until then I would cool this board by unplugging 2 of the GPUs and enjoying practically the same performance.
Re: (Score:2)
Re: (Score:1)
Re: (Score:1, Offtopic)
Yes, the 325i has more mileage behind it, maintenance costs are higher (as much as 4x that of a spanking new Fiat Palio), but when it comes down to it, it's a freaking BM
Re: (Score:2)
However, it seems with all of these methods, the weak link is always driver support. I think that drivers will have to develop further before anything like this can take true form and be useful.
Well, of course the weak link is driver support. I don't care how many nodes your Beowulf system has--if you (or your software) don't know how to efficiently partition the load, your system is not going to be significantly faster than a single node. Ignore software at your peril.
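To put some toy numbers on that (a back-of-the-envelope Amdahl's-law sketch in Python; the parallel fractions are made up for illustration, not measured from any driver):

    # Rough Amdahl's-law sketch: speedup from N GPUs/nodes when only a
    # fraction p of the per-frame work can actually be split across them.
    def speedup(n_units, parallel_fraction):
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / n_units)

    for p in (0.5, 0.8, 0.95):
        print(", ".join("%d units -> %.2fx (p=%.2f)" % (n, speedup(n, p), p)
                        for n in (1, 2, 3)))
    # With only half the work split by the driver, a third GPU buys you
    # ~1.5x, not 3x -- which is roughly the complaint in this thread.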
Unless nVidia or ATI publishes enough information that a competent programmer can write a third-party driver, you're stuck with a system that doesn't work. It doesn't matter how good the underlying hardware is if the drivers the
Re: (Score:3, Interesting)
Kinda sorta. Splitting rendering across multiple GPUs has, afaict, become much harder lately. GPUs used to be mostly fixed-function pipelines, while the current generation has more in common with programmable stream processors (e.g., shader programs).
Re:Drivers first. (Score:5, Insightful)
onionistic (Score:5, Funny)
Re: (Score:3, Informative)
http://www.theonion.com/content/node/33930 [theonion.com]
It pretty much invented the extreme advertising [encycloped...matica.com] meme.
Finally! (Score:2, Funny)
Re: (Score:1)
.
.
.
Not on your life, punk.
(Okay, maybe that was five words...still can't play Crysis.)
Reminds me of Razors. (Score:5, Interesting)
Same thing with CPUs and now GPUs. Problem is, at what point does it become a pissing contest rather than a way to provide more performance for an application that needs it?
And speaking of those demanding applications: am I the only one who has noticed that some of the latest video games running on the best available hardware provide no improvement in appearance or gameplay over older games of a similar type running on older hardware?
It's bad enough that I am tempted to think the programmers are just adding fat to make sure the game demands a more expensive video card.
Kevin.
Re: (Score:2, Informative)
Re: (Score:2)
Worse yet, they have Tiger Woods selling it.
I say worse because I am willing to bet good money that he either uses an electric shear or has his face waxed.
Most black guys have difficulty shaving with a razor, mostly because our beards grow curly from the root. This causes razor bumps unless we either save some stubble (my solution) or uproot the hair with wax.
Re: (Score:2)
Re: (Score:1)
Or you could try shaving with a straight razor. Once you get over the fear of it and get to the point where you don't draw blood every time it touches your face (about 3-4 weeks), you'll find that it works really well. You get a nice clean shave (cuts down on bumps), it goes just as quickly as regular
Re: (Score:1)
When you shave close, the hair is stretched out from the tension, then cut. When you remove the tension, the hair recedes (no longer being stretched outward) back under the level of the skin. The natural curl results in a razor bump when it grows back out, because it doesn't line up with the pore. Instead, as it grows under the skin, it folds onto itself and causes irritation and pus and nastiness.
You then get some tweezers and dig it out.
There
Re:Reminds me of Razors. (Score:4, Funny)
The 7-blade razor: http://www.youtube.com/watch?v=KwlKN39wlBE [youtube.com]
The 16-blade razor: http://www.youtube.com/watch?v=GjEKt5Izwbo [youtube.com]
The 18-blade razor: http://www.youtube.com/watch?v=wYyxK2vGyVw [youtube.com]
Re: (Score:3, Interesting)
In a BadAnalogyGuy way I do hope that some computers (especially laptops) move in this direction. Why do I need a 2GHz dual-core processor for my Eee-style laptop? Break it into a cheap, slower, power-efficient general processor, then have a few other small, cheap, power efficie
Re: (Score:1)
Re:Reminds me of Razors. (Score:4, Insightful)
Re:Reminds me of Razors. (Score:4, Informative)
Re: (Score:2)
Re: (Score:1)
Re:Reminds me of Razors. (Score:5, Interesting)
That being said, I used to use the 2-blade Gillette razor and have since moved on to a 3. What I have noticed is that it does the job faster and the overall blade lasts longer. What I suspect is happening is that the first blade may dull, but it's making the rough cut anyway, then one of the other blades, which is sharper, follows up with a cleaner cut.
I think the more important advancement has been all the other stuff on the blade head: the mounted springs, the lube strip, the rubber precut strips that tension the skin, etc. I suspect all those contribute more to the newer blades being better.
Re: (Score:2)
I think the more important advancement has been all the other stuff on the blade head: the mounted springs, the lube strip, the rubber precut strips that tension the skin, etc. I suspect all those contribute more to the newer blades being better.
I agree with that comment, and I understand the argument for selling lower-quality blades in order to keep selling them (e.g., they wear out), but why hasn't anyone made a good 2-blade razor with a push-to-clean bar and ceramic (or Ginsu) blades that will last years? Seriously, I'd pay top dollar for such a razor (e.g., 50 bucks or so).
Re: (Score:2)
What I suspect is happening is that the first blade may dull, but it's making the rough cut anyway, then one of the other blades, which is sharper, follows up with a cleaner cut.
Razor blades do not generally dull from use - they dull from rust. [lifehacker.com]
You can dry them with a hair-dryer to increase longevity, or use shaving oil [allaboutshaving.com] in the shower, which tends to leave the blades with a thin, protective layer of oil - I typically get at least 3 months, generally 6, out of a single blade cartridge that way.
Note - I have no financial interest in this particular shaving-oil vendor.
Re: (Score:1)
no real linux hippies (Score:2)
Re: (Score:2)
- are quicker
- don't cut you, ever
- don't require putting dumbass shaving foam on beforehand
- give just as smooth a shave as, if not smoother than, most razors.
Then I just take a shower afterwards and don't get annoying itching on my face, either.
Re: (Score:1)
- quicker? I'm not so sure about that. At first, yes, it took a lot longer to shave with a blade. Now that I've been doing it for a while, it honestly only takes 5 minutes.
- don't cut you, ever? Probably wouldn't say "ever", but I agree, it's hard to cut yourself with an electric shaver. However, I always seemed to end up with nasty red splotches, almost like a burn or rash, all over my neck after shaving. Haven't had that
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Interesting)
Also, how you roll the blade has some effect on which blade gets the first cut... similar to a surfboard: it's flat, but when you put your weight on the back, it's only the back of the board that's touching the water...
Re: (Score:1)
Re: (Score:1)
Re:Reminds me of Razors. (Score:5, Insightful)
In the old days you could see a huge difference between CGA, EGA, VGA and Super VGA. Super VGA held on for a while, then 3D cards started coming out and there have been huge improvements even up to now. But I think we are getting to a point again where the detail they can produce is beyond what is needed.
Re:Reminds me of Razors. (Score:4, Interesting)
Which is probably why we're getting a lot more chatter on the raytracing issue. I believe that'll be the next big step.
Re:Reminds me of Razors. (Score:5, Interesting)
Re: (Score:2)
You probably pissed off somebody with mod points and they're taking it out on you, regardless of the post's contents. It happens. Maybe I'll catch it in M2.
Re: (Score:1)
Trollish of me, but maybe it's because you can't spell.
Re:Reminds me of Razors. (Score:4, Insightful)
For games on the desktop, the maximum resolution you have to push (realistically) is 1920x1200 (really, anything larger at 2-3 feet away is overkill), and the maximum resolution you have to push on a television (if you're into that) is 1920x1080. Funny, midrange $150-200 cards can do that today, with high quality, in all games except Crysis.
So yeah, I can see the slowdown in graphics tech coming around. The fact that you can play any modern game on medium settings at 1280x1024 with a $75 add-in card shows us exactly why we're hitting the developmental wall. Most people are happy with the current level of graphics, and the cost of new graphics architectures rises exponentially with every revision; so, if you don't have the demand, you're not going to rush production on the next generation of GPU architectures.
Unfortunately, this leaves the 1% of hardcore gamers bitching, and they tend to bitch the loudest, so Nvidia and ATI are trying to placate them with stop-gap SLI solutions.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1, Interesting)
Re: (Score:2)
Yeah, they did this when Matrox released the G400 and brought multiple-monitor gaming to the world. No, it didn't catch on, except for flight simulators.
The fact is, for most games, multiple monitors add very little immersion and questionable utility. Basically, unless your platform makes it standard (like the Nintendo DS), you're not going to see wide support.
WHY mod this flamebait? (Score:2)
The difference between CGA and my Amiga was immense. The difference between no anisotropic filtering and 16x anisotropic filtering is best left to those with 20/10 vision.
Once the pixels got indistinguishably small, and the hues varied to the limit of human perception, we were left only with increasing art quality, animation, lighting and
Re:Reminds me of Razors. (Score:4, Insightful)
I think the biggest irony is that in competitive multiplayer, people disable all these features anyway, because
1) framerate is king
2) getting rid of advanced lighting, bump-mapped animated textures, smoke, fog, clouds, falling snow, rain, etc., makes your opponents easier to spot.
Re: (Score:2)
Re:Reminds me of Razors. (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
Let's take for example Doom 3 and John Carmack. Listen to the guy talk about game engines sometimes. Performance is king. And when it hit the market, the machine I played it with was fairly mid-range but performed very well at high resolutions at clos
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
How much power does it need? (Score:4, Funny)
Re: (Score:2)
How many power supplies are required? Does it come with a 12 KV step-down transformer and 220V three-phase power hookup? Can I heat the basement with it?
Funny, yes. Has some truth, definitely. The watts that go into some of these GPUs are more than what goes into the processor, or so it seems.
Me, I am opting for the ones without fans; they are quieter and there's less to go wrong.
I thought we all agreed... (Score:1)
It's not the GPU (Score:2, Interesting)
Re: (Score:2)
Over time the GPUs will be more flexible, and that means it will be easier to offload calculations through some common API, but I think it will be a few years yet before this potential can be realized.
2 GPUs is the limit, for now. (Score:5, Informative)
http://www.xbitlabs.com/articles/video/display/zotac-9800gx2.html [xbitlabs.com]
While there are a few key games that get no boost out of 2-way SLI, the vast majority of games do see improvement. 3-way, on the other hand, can actually cause WORSE performance.
It probably has to do with limitations on how the SLI/Crossfire drivers can fake out the game engine. There are probably limits to how many frames the game engine allows to be in flight at once, limiting how much performance boost you can get from AFR SLI. And although you can get around game-engine limitations with split-screen rendering, that mode needs specific game support and shows less potential performance increase. Plus, split-screen rendering has to be selected explicitly in Crossfire (AFR is the default).
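A toy model of that frames-in-flight point (Python; the 25ms frame time and the 2-frame render-ahead limit are assumptions for illustration, not numbers from any real driver or game):

    # Alternate-frame rendering: frame i goes to GPU i % N, but the engine
    # only tolerates `max_in_flight` unfinished frames at once, so the
    # effective parallelism is capped by whichever number is smaller.
    def afr_fps(n_gpus, gpu_frame_time_ms, max_in_flight):
        usable_gpus = min(n_gpus, max_in_flight)
        return 1000.0 * usable_gpus / gpu_frame_time_ms

    for gpus in (1, 2, 3):
        print(gpus, "GPU(s):", afr_fps(gpus, 25.0, max_in_flight=2), "fps cap")
    # With a 2-frame limit, the third GPU adds nothing in this model --
    # consistent with 3-way setups sometimes showing no gain at all.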
Re: (Score:2, Funny)
Re: (Score:2)
Where are the more efficient GPUs? (Score:5, Insightful)
I want a fanless, 5W GPU with the power of GPUs from about 3 years ago. Can the new smaller transistors allow for this or am I asking for too much?
If ATI and nVidia keep pushing for raw power, they'll get beaten to the low-power finish line by the likes of Intel and VIA.
Re: (Score:3, Informative)
The card features 20-30% more performance than the 8600GT (plenty to top GPUs from 3 years ago), and with a 65nm process, should consume around 30W or less at load.
Re: (Score:1)
You could get the HD3450 or HD3470, both of which do well enough (they are the descendants of the X1300 in terms of placement) and use a maximum of 20W-25W of power. The HD3650 does a good job too in terms of power; I think its idle use is around 10W and maximum at 30W-40W. The HD3850 idles at ~10W too. (Don't quote me on these though, it's been a while since I looked this up)
Re: (Score:2)
I want a fanless, 5W GPU with the power of GPUs from about 3 years ago. Can the new smaller transistors allow for this or am I asking for too much?
Top of the line? Then no, they were eating ~100W then and would still be eating 25W even if we inverse-applied Moore's "law". However, non-gamers look to be in for a treat; for example, the Atom's chipset does HD decoding at 120mW. Yes, it got some cripplings, but at something like 0.2W it should do full HD. Gamers are going to be pretty alone with their power rigs in not that many years...
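For what it's worth, the arithmetic behind that 25W figure looks like this (a sketch; the ~100W starting point and the 18-month halving period are assumptions, not measurements):

    # Inverse-applied "Moore's law": a ~100W top-end GPU from 3 years ago,
    # assuming power for the same performance halves every ~18 months.
    start_watts = 100.0
    years = 3.0
    halving_period_years = 1.5
    estimated_watts = start_watts * 0.5 ** (years / halving_period_years)
    print("~%.0fW today for the same performance" % estimated_watts)  # ~25W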
Re: (Score:1)
Fundamentally flawed? (Score:3, Insightful)
In fact, gaming and graphics scale amazingly well as multi-threaded applications. As many in the graphics/gaming community have been stating recently, ray tracing would benefit greatly from more GPUs: being able to trace multiple rays at a time would speed up rendering.
They state that it is fundamentally flawed when they should have said that it would be ignorant to assume that an application designed to use a single-core or dual-core GPU would benefit from extra GPUs.
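The "trace multiple rays at a time" part is easy to see, because primary rays don't depend on each other. A minimal sketch (Python; the single hard-coded sphere and the three worker processes standing in for three GPUs are assumptions purely for illustration):

    # Embarrassingly parallel ray casting: every pixel's primary ray is
    # independent, so rows can be handed out to separate workers.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 160, 120
    SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

    def trace_row(y):
        row = []
        cx, cy, cz = SPHERE_CENTER
        for x in range(WIDTH):
            # Primary ray from the origin through pixel (x, y) on a view plane.
            dx = (x - WIDTH / 2) / WIDTH
            dy = (y - HEIGHT / 2) / HEIGHT
            # Quadratic ray/sphere test: hit if the discriminant is >= 0.
            a = dx * dx + dy * dy + 1.0
            b = 2.0 * (dx * -cx + dy * -cy + 1.0 * -cz)
            c = cx * cx + cy * cy + cz * cz - SPHERE_RADIUS ** 2
            row.append(1 if b * b - 4.0 * a * c >= 0.0 else 0)
        return row

    if __name__ == "__main__":
        with Pool(3) as pool:              # three workers ~ three GPUs
            image = pool.map(trace_row, range(HEIGHT))
        print(sum(map(sum, image)), "of", WIDTH * HEIGHT, "pixels hit the sphere")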
Re: (Score:1)
Since the multicoreness of the GPUs would be a "hardware thang", chances are that only the device drivers or the graphics API would even need to worry about it.
Multithreaded graphics and multithreaded gaming are two very separate things that are not to be confused with each other. Both operate on separate domains of information, using different code, probably
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
I do recognise that, done properly, multi-GPU rendering can be very effective. But when it comes to PC games and consumer graphics
Parallelism (Score:3, Insightful)
(Of course, there's the question of global illumination. I don't know if that can be parallelized as easily, but there was a story about distributed photon mapping here some time back, where they used Blue Gene.)
Fuck Everything, We're Doing Five Blades (Score:2)
http://www.theonion.com/content/node/33930 [theonion.com]
Had to be said... (Score:2)
Re: (Score:3, Funny)
Re: (Score:2)
Re: (Score:1)
FYI (Score:3, Insightful)
The future is a really long time.
Re: (Score:1)
What ASUS said (Score:3, Funny)
Re: (Score:1)
Bigger (Score:1)
Re: (Score:1)
As much as I'd love for Antec to make a bar-fridge sized chassis, conjugated with a monstrous Asus motherboard featuring a gazillion PCI-E channels, two dozen RAM slots and fifty SATA connectors, I don't expect to see any such orgiastic concoction, not ever.
The trend is to minify, which is great for the Average Joes and Janes, but is completely
Obligatory... (Score:2, Funny)
Cramming Breakthrough! (Score:3, Funny)
Why bother ? (Score:2, Interesting)
When they come up with a multi-GPU system that appropriately virtualizes the whole thing, enablin
So we are back to the Voodoo 2 now? (Score:3, Interesting)
Re: (Score:1)
Adding a Voodoo2 12MB to a system with an S3 ViRGE... talk about night and day
Why use a slot-mounted "card" as such? (Score:2)