
Asus Crams Three GPUs onto a Single Graphics Card 115

Barence writes "PC Pro has up a look at Asus' concept triple-GPU graphics card. It's a tech demo, so it's not going to see release at any point in the future, but it's an interesting look at how far manufacturers can push technology, as well as just how inefficient multi-GPU graphics cards currently are. 'Asus has spaced [the GPUs] out, placing one on the top of the card and two on the underside. This creates its own problem, though: attaching heatsinks and fans to both sides of the card would prevent it from fitting into some case arrangements, and defeat access to neighbouring expansion slots. So instead, Asus has used a low-profile heat-pipe system that channels the heat to a heatsink at the back of the card, from where it's dissipated by externally-powered fluid cooling pipes.'"
  • Drivers first. (Score:5, Interesting)

    by San-LC ( 1104027 ) on Thursday April 10, 2008 @02:48PM (#23028016)
    The technology for multi-GPU processing is already out there (SLI, Crossfire), and now the companies are trying to increase the number of GPUs that can be daisy-chained (CrossfireX, 3-way SLI).

    However, it seems with all of these methods, the weak link is always driver support. I think that drivers will have to develop further before anything like this can take true form and be useful.

    As an aside, did anyone notice that half of the Slashdot description sounded like an advertisement for Asus GPU cooling?
    • Re: (Score:2, Insightful)

      by Vicarius ( 1093097 )
      I am content with the drivers that I have. However, I am still looking for a decent high end card that does not need two slots in my case. How about fixing heat and size issues first?
      • Re: (Score:2, Insightful)

        by MojoStan ( 776183 )

        I am still looking for a decent high end card that does not need two slots in my case. How about fixing heat and size issues first?

        Bingo. I thought the main point of multi-GPU graphics cards (and multi-core processors) was to build good gaming rigs (and workstations) without having to use a monstrous extended ATX uber-tower with multiple CPU sockets and video card slots.

        Improved manufacturing processes and software/drivers have allowed us to have multiple processor cores and GPUs in a shoebox-sized Shuttle XPC. Asus's big, hot, and inefficient card just shows us that current manufacturing processes and software/drivers aren't ready

    • Heat pipes are nifty from a modder perspective. As TFA states, however, this thing is never going to be sold, and the pipes are just a quick way to get something working.
    • Re:Drivers first. (Score:5, Insightful)

      by eebra82 ( 907996 ) on Thursday April 10, 2008 @03:00PM (#23028158) Homepage

      As an aside, did anyone notice that half of the Slashdot description sounded like an advertisement for Asus GPU cooling?
      Advertisement or announcement? Does it really matter, since most news items could be considered as advertisements?

      I think the talk about the cooling is important, since one of the most difficult tasks is not getting three GPUs onto a single card, but getting a viable cooling solution that doesn't sound like a vacuum cleaner and doesn't require too much space (or it would essentially kill the whole concept).
      • Re:Drivers first. (Score:4, Insightful)

        by Applekid ( 993327 ) on Thursday April 10, 2008 @03:33PM (#23028532)
        Unfortunately Nvidia set the pace for all video card manufacturers when they spouted that they were going to do six-month product cycles. Trying to go faster than Moore's Law has resulted in top-end cards being massively more expensive than yesteryear's top-end cards and drawing exponentially more energy than they used to... to the point where the high end can't even sustain itself without not one but TWO specifically designed supplemental power leads from the PSU.

        Multi-GPU is the only way to keep up that breakneck pace, just as the CPU world is trying to deal with hitting the wall (or, depending on who you ask, the fact that the low-hanging fruit has already been picked). But the penalties for their reach exceeding their grasp are absolutely catching up with them.
      • Re:Drivers first. (Score:5, Insightful)

        by timeOday ( 582209 ) on Thursday April 10, 2008 @04:52PM (#23029554)

        Advertisement or announcement?
        It's neither. In fact, it's nothing more than a report of a failed experiment - I quote, "A standard 512MB HD 3850 running our Crysis benchmark in high detail at 1,280 x 1,024 averaged 26fps, while switching to the X3 increased that score by just 3fps."

        In other words, it doesn't work! I'll worry about cooling 3 GPUs when they are at least able to do something useful! Until then I would cool this board by unplugging 2 of the GPUs and enjoying practically the same performance.

      • The cooling focus is reasonable...

        In case you're wondering, Asus opted for the HD 3850 GPU rather than the faster HD 3870 for reasons of power. Even with this lesser chip, our X3 test rig peaked at a whopping 296W, compared with 186W for a standard 3850 system. With an HD 3870 X2, the system exceeded 300W, and that's with just two GPUs.
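
A quick back-of-the-envelope check of the figures quoted in this thread (26fps vs. 29fps in the Crysis test, 186W vs. 296W system draw), as a small Python sketch; the numbers are PC Pro's, the derived metrics are just illustrative arithmetic:

single_fps, triple_fps = 26.0, 29.0          # HD 3850 vs. the X3 prototype (from TFA)
single_watts, triple_watts = 186.0, 296.0    # whole-system power draw (from TFA)

speedup = triple_fps / single_fps            # ~1.12x from three GPUs
print(f"speedup: {speedup:.2f}x ({speedup / 3:.0%} of an ideal 3x)")
print(f"fps per watt: single {single_fps / single_watts:.3f}, "
      f"triple {triple_fps / triple_watts:.3f}")

Roughly 37% of ideal three-way scaling, and worse frames-per-watt than the single card - which is timeOday's point in fewer words.
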
    • I didn't interpret the /. description or the underlying article as an advertisement. The subject of the piece is just a prototype or proof-of-concept. Separately, but on a related note, Popular Science has an article discussing how nVidia argues that the best bang for your computer buck is not to pay more for a faster CPU, but rather to upgrade your graphics card. Here is the link: http://www.popsci.com/gear-gadgets/article/2008-04/forget-cpu-buy-better-graphics-card [popsci.com]
      • Re: (Score:1, Offtopic)

        by menkhaura ( 103150 )
        Well... If I were a used-car salesman I'd say that the best bang for your buck is a 1995 BMW 325i with all the luxuries you can think of, for about the same price as a 2008 1.0L four-banger Fiat Palio [wikipedia.org] with, at the very most, air conditioning (which, in a 1.0L, 70hp vehicle is ridiculous, but quite essential in a country like Brazil, where I live)...

        Yes, the 325i has more mileage behind it, and maintenance costs are higher (as much as 4x that of a spanking-new Fiat Palio), but when it comes down to it, it's a freaking BM
    • However, it seems with all of these methods, the weak link is always driver support. I think that drivers will have to develop further before anything like this can take true form and be useful.

      Well, of course the weak link is driver support. I don't care how many nodes your Beowulf system has--if you (or your software) don't know how to efficiently partition the load, your system is not going to be significantly faster than a single node. Ignore software at your peril.

      Unless nVidia or ATI publishes enough information that a competent programmer can publish a third party driver, you're stuck with a system that doesn't work. It doesn't matter how good the underlying hardware is if the drivers the

    • Re: (Score:3, Interesting)

      by LarsG ( 31008 )
      the weak link is always driver support

      Kinda sorta. Splitting rendering across multiple GPUs has, afaict, become much harder lately. GPUs used to be mostly fixed-function pipelines, while the current generation has more in common with programmable stream processors (e.g., shader programs).
  • onionistic (Score:5, Funny)

    by Anonymous Coward on Thursday April 10, 2008 @02:54PM (#23028092)
    Fuck Everything, We're doing five cores.
  • Finally! (Score:2, Funny)

    by Bovius ( 1243040 )
    So does that mean we can play Crysis now?
    • by San-LC ( 1104027 )
      In a word...
      .
      .
      .
      Not on your life, punk.
      (Okay, maybe that was five words...still can't play Crysis.)
  • by Forge ( 2456 ) <kevinforge AT gmail DOT com> on Thursday April 10, 2008 @02:58PM (#23028124) Homepage Journal
    Remember when all razors had a single blade? Then double-blade razors were all the rage. These days, triple- and quad-blade razors are around. Soon we will have 5 blades, but I would call that a cheese grater.

    Same thing with CPUs and now GPUs. Problem is, at what point does it become a pissing contest rather than a way to provide more performance for an application that needs it?

    And speaking of those demanding applications: am I the only one who has noticed that some of the latest video games running on the best available hardware provide no improvement in appearance or gameplay over older games of a similar type running on older hardware?

    It's bad enough that I am tempted to think the programmers are just adding fat to make sure the game demands a more expensive video card.

    Kevin.
    • Re: (Score:2, Informative)

      by San-LC ( 1104027 )
      But...we have a razor with six blades [gillettefusion.com] already.
      • by Forge ( 2456 )
        That's so cool. Gillette is selling a cheese grater for facial use?

        Worse yet they have Tiger Woods selling it.

        I say worse because I am willing to bet good money that he either uses an electric shear or has his face waxed.

        Most black guys have difficulty shaving with a razor, mostly because our beards grow curly from the root. This causes razor bumps unless we either save some stubble (my solution) or uproot the hair with wax.
        • by Jaysyn ( 203771 )
          Who knows, maybe he got his facial hair (or lack thereof) from his mom's side of the family, who I believe is Asian.

          Most black guys have difficulty shaving with a razor, mostly because our beards grow curly from the root. This causes razor bumps unless we either save some stubble (my solution) or uproot the hair with wax.

          Or you could try shaving with a straight razor. Once you get over the fear of it and get to the point where you don't draw blood every time it touches your face (about 3-4 weeks), you'll find that it works really well. You get a nice clean shave (cuts down on bumps), it goes just as quickly as regular
          • This doesn't help - the curl is at the root (under the skin).

            When you shave close, the hair is stretched out from the tension, then cut. When you remove the tension, the hair recedes (no longer being stretched outward) back under the level of the skin. The natural curl results in a razor bump when it grows back out, because it doesn't line up with the pore. Instead, as it grows under the skin, it folds onto itself and causes irritation and pus and nastiness.

            You then get some tweezers and dig it out.

            There
      • by Zymergy ( 803632 ) * on Thursday April 10, 2008 @03:33PM (#23028530)
        Baaaah! Just more proof that MORE is always better! Right?

        The 7-blade razor: http://www.youtube.com/watch?v=KwlKN39wlBE [youtube.com]

        The 16-blade razor: http://www.youtube.com/watch?v=GjEKt5Izwbo [youtube.com]

        The 18-blade razor: http://www.youtube.com/watch?v=wYyxK2vGyVw [youtube.com]
    • Re: (Score:3, Interesting)

      In defense of the razors, I absolutely love my Fusion (4 blades). It seems each time they add a blade they reduce the spacing between the blades by a similar amount, so each blade only has to take off a quarter of what a one-blade model would.

      In a BadAnalogyGuy way, I do hope that some computers (especially laptops) move in this direction. Why do I need a 2GHz dual-core processor for my EEE-style laptop? Break it into a cheap, slower, power-efficient general processor, then have a few other small, cheap, power efficie
      • by fitten ( 521191 )
        So they fit four razor blades, with spacing, where it used to be the width of a single razor blade (like 2mm)? ;)
      • by Hatta ( 162192 ) on Thursday April 10, 2008 @03:28PM (#23028476) Journal
        I don't buy that. The first razor is always going to be doing most of the cutting because it's in front. The others are just there for show.
        • by 0100010001010011 ( 652467 ) on Thursday April 10, 2008 @03:46PM (#23028704)
          Say what you want. I used to always get the cheap $1 razors when I was in college because I was cheap. But the Mach 3 and furthermore the Fusion make a great improvement. Even when I shave every other day it still goes through it smooth and with no 'tearing' on my face.
          • by sconeu ( 64226 )
            I think we've had this before, but let me add another vote for the Mach3.
          • To be fair, this could just be because Gillette and Schick make high quality blades.
          • by Cheeko ( 165493 ) on Thursday April 10, 2008 @04:05PM (#23028974) Homepage Journal
            That probably has more to do with the quality of the individual blades. A cheap $1 razor will cut like a cheap $1 razor.

            That being said, I used to use the 2-blade Gillette razor and have since moved on to a 3. What I have noticed is that it does the job faster and the overall blade lasts longer. What I suspect is happening is that the first blade may dull, but it's making the rough cut anyway, then one of the other blades which is sharper follows up with a cleaner cut.

            I think the more important advancement has been all the other stuff on the blade head: the mounted springs, the lube strip, the rubber pre-cut strips that tension the skin, etc. I suspect all those contribute more to the newer blades being better.
              I think the more important advancement has been all the other stuff on the blade head: the mounted springs, the lube strip, the rubber pre-cut strips that tension the skin, etc. I suspect all those contribute more to the newer blades being better.

              I agree with that comment, but I understand the argument to sell lower quality blades in order to keep selling them (eg, they wear out), but why hasn't anyone made a good 2 blade razor with a push-to-clean bar, and ceramic (or Ginsu) blades that will last years? Seriously, I'd pay top dollar for such a razor (eg, 50 bucks or so).

              What I suspect is happening is that the first blade may dull, but it's making the rough cut anyway, then one of the other blades which is sharper follows up with a cleaner cut.

              Razor blades do not generally dull from use - they dull from rust. [lifehacker.com]

              You can dry them with a hair-dryer to increase longevity, or use shaving oil [allaboutshaving.com] in the shower, which tends to leave the blades with a thin, protective layer of oil - I typically get at least 3 months, generally 6, out of a single blade cartridge that way.

              Note - I have no financial interest in this particular shaving-oil vendor.

            • by Growlor ( 772763 )
              I find the best razors to be the Mach 3's (I used to be a huge fan of the diamond-coated double-bladed ones before they stopped making them a few years back). I found them to be the best because they cleaned the easiest - I usually throw away blades that are too clogged to clean long before the blades themselves wear out. I took a look at the 4- and 5-bladed ones that are available now, but the space between the blades is so small that I'm sure they would plug up and be impossible to clean in no time. The s
            • Everybody knows real linux hippies don't shave and look like RMS.
            • by jez9999 ( 618189 )
              Why don't you guys just use a friggin electric shaver, honestly? I've never understood why people still waste time and effort with manual razors. Electric shavers:
              - are quicker
              - don't cut you, ever
              - don't require putting dumbass shaving foam on beforehand
              - give just as smooth a shave, if not smoother, than most razors.

              Then I just take a shower afterwards and don't get annoying itching on my face, either.
              • I used to feel the same way, but about a year ago, I ditched my electric razor:

                - quicker? I'm not so sure about that. At first, yes, it took a lot longer to shave with a blade. Now that I've been doing it for a while, it honestly only takes 5 minutes.

                - don't cut you, ever? Probably wouldn't say "ever", but I agree, it's hard to cut yourself with an electric shaver. However, I always seemed to end up with nasty red splotches, almost like a burn or rash, all over my neck after shaving. Haven't had that
                • by jez9999 ( 618189 )
                  smooth shave? I have to solidly disagree with this one. My previous electric was one of the fancy Braun models with the self-cleaning base, which I purchased after hearing the "shaves as close as a blade" pitch. Having never used a blade before, I really didn't have anything to compare to, so I just assumed it was true. However, now that I've been using a blade for several months, I can tell you without a doubt, my electric never came CLOSE to the shave a blade can offer. At least not the 4-blade razors tha
          • by Raenex ( 947668 )

            I used to always get the cheap $1 razors when I was in college because I was cheap. But the Mach 3 and furthermore the Fusion make a great improvement.
            I used to be on the Gillette upgrade treadmill, but got off it years ago at level 3. I switched to disposable, single-blade Bic Sensitives. They cost less than a quarter apiece and, honestly, they work fine.
        • Re: (Score:2, Interesting)

          by Vectronic ( 1221470 )
          Thats true... "to a point"... but when that first point wears out, and starts just pushing the hair over, the second blade snags on, cuts the hair on an angle, the third cuts it almost straight (flush) and the forth cuts it flush...

          Also, how you roll the blade has some effect on which blade gets the first cut... similar to a surfboard: it's flat, but when you put your weight to the back, it's only the back of the board that's touching the water...
      • The Gillette Fusion ( http://en.wikipedia.org/wiki/Gillette_Fusion [wikipedia.org] ) is 5 blades with a 6th in the back: 5 blades close together for most of the shaving and a single blade for under the nose and other spots that are hard to reach with the 5 blades. If you happen to be using one of these 5-bladed razors and get bumped while shaving, you will get a nasty 5-way razor cut on your face. This has happened to me a few times. 5 cuts that close together are nasty. Nothing to do but let it bleed.
      • The Fusion has 5 blades.
    • by jellomizer ( 103300 ) on Thursday April 10, 2008 @03:12PM (#23028296)
      Well, to be fair, the new games do have better graphics. But the problem comes down to the fact that graphics are improving beyond the average person's ability to see and interpret them. Much like sound cards a decade ago: we had some major improvements from the Ad-Lib up to the SoundBlaster 16, but after that, even though the cards kept making massive improvements, the average Joe couldn't tell the difference. The same thing is happening to video cards now. The new games that fully utilize the card add effects that are very subtle, or effects the old systems cheated to achieve (fixed bitmapped background images, a semi-transparency layer to simulate haze). Now the background is getting nearly fully rendered, so if there is a mountain in the background and you have a powerful enough graphics card and a high enough resolution, you can see each blade of grass on that mountain, and the haze is like real haze, not as uniform as before, more like real life. But the average game player wouldn't really focus on these details if they are actually interested in playing the game.

      In the old days you could see a huge difference between CGA, EGA, VGA and Super VGA. Super VGA held on for a while, then the 3D cards started coming out and there were huge improvements, even up to now. But I think we are getting to a point again where the detail they can produce is beyond what is needed.
      • by Captain Splendid ( 673276 ) <capsplendid@@@gmail...com> on Thursday April 10, 2008 @03:16PM (#23028326) Homepage Journal
        But I think we are getting to a point again where the detail they can produce is beyond what is needed.

        Which is probably why we're getting a lot more chatter on the raytracing issue. I believe that'll be the next big step.
        • by jellomizer ( 103300 ) on Thursday April 10, 2008 @03:44PM (#23028680)
          While I am not sure why I got a Flamebate mod on my post. But Raytracing real time could have advantages on cleaning up issues that we do see. While now we can see each individual grass blade each grass blade looks and moves like a broken stick. But will it be good enough for a competive advantage. Or is it that the graphics artests are not good enough for realistice 3d images. Even the stuff that takes months to render for the movies still looks computer generated and seem unrealistic when things start moving. Much like the Final Fantasy movie (yea it is old but it is a great point) they showed adds before it was released with the people face as a solid image, and you couldn't tell if it was real or not. But when you see them moving and talking you knew it was fake, and lifeless. Pixar worked around these issues by not making people look realistic they use a cartoon aproach to make them seem like cartoon and have them different enough for us to connect with. But it is an issue is it performance and CPU or the fact that we don't have artests good enough for the work yet.
          • While I am not sure why I got a Flamebate mod on my post

            You probably pissed off somebody with mod points and they're taking it out on you, regardless of the post's contents. It happens. Maybe I'll catch it in M2.
          • by m50d ( 797211 )
            While I am not sure why I got a Flamebate mod on my post.

            Trollish of me, but maybe it's because you can't spell.

      • by default luser ( 529332 ) on Thursday April 10, 2008 @03:37PM (#23028592) Journal
        I have to agree, I'm seeing less and less use for a higher-resolution screen for home/gaming use.

        For games on the desktop, the maximum resolution you have to push (realistically) is 1920x1200 (really, anything larger at 2-3 feet away is overkill), and the maximum resolution you have to push on a television (if you're into that) is 1920x1080. Funny, midrange $150-200 cards can do that today, with high quality, in all games except Crysis.

        So yeah, I can see the slowdown in graphics tech coming. The fact that you can play any modern game at medium settings at 1280x1024 with a $75 add-in card shows exactly why we're hitting the developmental wall. Most people are happy with our current level of graphics, and the cost of new graphics architectures rises exponentially with every new revision; so, if you don't have the demand, you're not going to rush production on the next generation of GPU architectures.

        Unfortunately, this leaves the 1% of hardcore gamers bitching, and they tend to bitch the loudest, so Nvidia and ATI are trying to placate them with stop-gap SLI solutions.
        • Well, hardcore gamers won't be happy until they have something like the holodeck from ST:TNG, without the random failures that actually make what you're doing a real problem vs. having fun.
          • by jez9999 ( 618189 )
            Yeah, those holodecks always seem remarkably prone to sudden and catastrophic failure. "Oh no, we can't get out! Oh no, by an annoying coincidence, the safeties have also been disengaged at the same time!!"
        • Re: (Score:1, Interesting)

          by Anonymous Coward
          I'm sure somebody clever out there in the game design world will make some must-have title that has multiple monitor support. Put all the up-close action and such on one screen. The other screen will feature your stats, chat, mission map and player positions, how much ammo you have, etc. Maybe have some other thing where another player can show you what he's viewing, or something nifty like that which could help turn the tide of battle. If this is implemented well enough, the multi-monitor users would have
          • I'm sure somebody clever out there in the game design world will make some must-have title that has multiple monitor support.

            Yeah, they did this when Matrox released the G400 and brought multiple-monitor gaming to the world. No, it didn't catch on, except for flight simulators.

            The fact is, for most games, multiple monitors add very little immersion and questionable utility. Basically, unless your platform makes it standard (like the Nintendo DS), you're not going to see wide support.
      • His point is perfectly valid. I don't get nearly as excited about recent jumps in graphics technology because the difference in quality is not as stunning as it once was.

        The difference between CGA and my Amiga was immense. The difference between no anisotropic filtering and 16x anisotropic filtering is best left to those with 20/10 vision.

        Once the pixels got indistinguishably small, and the hues varied to the limit of human perception, we were left only with increasing art quality, animation, lighting and
      • by vux984 ( 928602 ) on Thursday April 10, 2008 @05:18PM (#23029804)
        Well, to be fair, the new games do have better graphics. But the problem comes down to the fact that graphics are improving beyond the average person's ability to see and interpret them. Much like sound cards a decade ago.

        I think the biggest irony is that in competitive multiplayer, people disable all these features anyway, because:

        1) framerate is king
        2) getting rid of advanced lighting, bump-mapped animated textures, smoke, fog, clouds, falling snow, rain, etc., makes your opponents easier to spot.

        • I would have thought that in a well-organised competition you would agree on settings before you started, rather than optimising your settings to try to gain an unfair advantage.
    • by jalefkowit ( 101585 ) <jasonNO@SPAMjasonlefkowitz.com> on Thursday April 10, 2008 @03:37PM (#23028604) Homepage
      No discussion of how many blades there are on razors these days is complete until somebody posts this [theonion.com]. So I figured I'd just get it out of the way :-)
    • by vimh42 ( 981236 )
      Sometimes I think that too, but then I look at some of the more graphics-intensive games and some of the most financially successful games, and come to the conclusion that few programmers deliberately hobble their games just to push the hardware vendors' sales.

      Let's take, for example, Doom 3 and John Carmack. Listen to the guy talk about game engines sometime. Performance is king. When it hit the market, the machine I played it on was fairly mid-range but performed very well at high resolutions at clos
    • Oblig link: Fuck Everything, We're Doing Five Blades [theonion.com]
    • by eison ( 56778 )
      I used to believe that graphics don't matter, but then I played the real Portal and the Shockwave Flash versions of Portal, and I had a lot more fun playing the full first-person, you-are-there, good-graphics Portal. Better graphics made the you-are-there experience better.
  • by Anonymous Coward on Thursday April 10, 2008 @02:59PM (#23028136)
    How many power supplies are required? Does it come with a 12 KV step-down transformer and 220V three-phase power hookup? Can I heat the basement with it?
    • How many power supplies are required? Does it come with a 12 KV step-down transformer and 220V three-phase power hookup? Can I heat the basement with it?

      Funny, yes. Has some truth, definitely. The watts that go into some of these GPUs are more than what goes into the processor, or so it seems.

      Me, I am opting for the ones without fans; they are quieter and there is less to go wrong.

  • that we never need more than 1 core-in-a-chip. 3? THIS IS HERESY!!! *KICK* when it reaches 128, it is the day of apocalypse!
  • It's not the GPU (Score:2, Interesting)

    The end of the article makes a good point. While we have dual- and tri-core graphics setups now, the programs are not designed to exploit them. This is the same issue that is being faced in the general CPU market as well. If you don't have a multi-threaded app for multiple CPUs, you only gain in multitasking, but not in a single program. A serial program can ONLY run serially. There's only so much parallelism that a CPU can infer. And at that point, you have out-of-order execution to make up for mutli
    • by Splab ( 574204 )
      Actually not a bad idea; quite a few objects in a game are stationary, so you can offload those to secondary GPUs - the problem is most games/engines haven't been designed with this in mind, and you can't magically create something that can do it for them.

      Over time GPUs will become more flexible, which means it will be easier to offload calculations through some common API, but I think it will be a few years yet before this potential can be realized.
  • by default luser ( 529332 ) on Thursday April 10, 2008 @03:04PM (#23028198) Journal
    It's not necessarily a limit of the board design, but a limit to what game engines can be optimized for. Most game engines do not scale well beyond two cards, as can be seen here:

    http://www.xbitlabs.com/articles/video/display/zotac-9800gx2.html [xbitlabs.com]

    While there are a few key games that get no boost out of 2-way SLI, the vast majority of games do see improvement. 3-way, on the other hand, can actually cause WORSE performance.

    It probably has to do with limitations on how the SLI/Crossfire drivers can fake out the game engine. There are probably limits to how many frames the game engine allows to be in-flight at once, limiting how much performance boost you can get from AFR SLI. And although you can get around game engine limitations with split-screen rendering, this mode needs specific game support and shows less potential performance increase. Plus, split-screen rendering has to be selected explicitly in Crossfire (AFR is the default).
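
The in-flight-frame limit mentioned above can be illustrated with a toy simulation. Everything here is an assumption for illustration - the round-robin AFR model, the 38ms GPU frame time, the 10ms CPU submit time and the cap of two queued frames are made-up values, not anything measured from real SLI/Crossfire drivers:

def afr_throughput(num_gpus, gpu_frame_ms, cpu_frame_ms, max_inflight, frames=1000):
    """Toy alternate-frame-rendering model: GPUs render whole frames round-robin,
    but the engine refuses to queue more than `max_inflight` unfinished frames."""
    gpu_free_at = [0.0] * num_gpus      # when each GPU can start its next frame
    inflight = []                        # completion times of queued frames
    t = 0.0                              # CPU-side submission clock
    last_done = 0.0
    for i in range(frames):
        inflight = [d for d in inflight if d > t]   # drop frames already finished
        if len(inflight) >= max_inflight:           # stall the CPU at the cap
            t = min(inflight)
            inflight = [d for d in inflight if d > t]
        gpu = i % num_gpus
        done = max(t, gpu_free_at[gpu]) + gpu_frame_ms
        gpu_free_at[gpu] = done
        inflight.append(done)
        last_done = max(last_done, done)
        t += cpu_frame_ms                # CPU prepares the next frame
    return frames / last_done * 1000.0   # frames per second

for gpus in (1, 2, 3):
    fps = afr_throughput(gpus, gpu_frame_ms=38.0, cpu_frame_ms=10.0, max_inflight=2)
    print(f"{gpus} GPU(s): ~{fps:.0f} fps")

With the cap at two queued frames, the third GPU sits idle and adds essentially nothing over two, which is the kind of flat 3-way scaling described above.
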
    • Re: (Score:2, Funny)

      by Anonymous Coward

      3-way, on the other hand, can actually cause WORSE performance.
      Please don't tell my girlfriend that, I've been working on her to do that with me for months now.
    • by Have Blue ( 616 )
      Split-screen SLI also causes problems with deferred rendering and other post-processing techniques that have become all the rage these days - the driver has to work around the fact that none of the chips has the complete framebuffer.
  • by Yvan256 ( 722131 ) on Thursday April 10, 2008 @03:31PM (#23028502) Homepage Journal
    Why can't we put our efforts toward more efficient GPUs? Just as most users won't ever be able to push their current CPUs to their maximum, most aren't even using the full power of their GPUs.

    I want a fanless, 5W GPU with the power of GPUs from about 3 years ago. Can the new smaller transistors allow for this or am I asking for too much?

    If ATI and nVidia keep pushing for raw power, they'll get beaten to the low-power finish line by the likes of Intel and VIA.
    • Re: (Score:3, Informative)

      You're not asking too much, you're simply over-valuing what 3-year-old tech is capable of. The GeForce 6800 Ultra was the best Nvidia card in existence 3 years ago, and soon, you will be able to purchase the 9500 GT, which will have more performance.

      The card features 20-30% more performance than the 8600 GT (plenty to top GPUs from 3 years ago), and with a 65nm process, it should consume around 30W or less at load.
      • And why wait for the 9500GT?

        You could get the HD3450 or HD3470, both of which do well enough (they are the descendants of the X1300 in terms of placement) and use a maximum of 20W-25W of power. The HD3650 does a good job too in terms of power; I think its idle use is around 10W and its maximum is 30W-40W. The HD3850 idles at ~10W too. (Don't quote me on these though, it's been a while since I looked this up.)
    • by Kjella ( 173770 )

      I want a fanless, 5W GPU with the power of GPUs from about 3 years ago. Can the new smaller transistors allow for this or am I asking for too much?

      Top of the line? Then no, they were eating ~100W then and would still be eating 25W even if we inverse-applied Moore's "law". However, non-gamers look to be in for a treat; for example, the Atom's chipset does HD decoding at 120mW. Yes, it's somewhat crippled, but within, say, 0.2W it should do full HD. Gamers are going to be pretty alone with their power rigs in not that many years...
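
Spelling out the parent's arithmetic, assuming an 18-month halving period for the "inverse" of Moore's law (the 100W starting point is the comment's own rough figure):

# Halve the top-end GPU's power once per assumed 18-month process generation.
top_end_watts_3_years_ago = 100.0   # "~100W then" (rough figure from the comment)
months_elapsed, months_per_generation = 36, 18
estimate_now = top_end_watts_3_years_ago / 2 ** (months_elapsed // months_per_generation)
print(f"{estimate_now:.0f}W")       # 25W - still a long way from a fanless 5W part
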

    • That's what integrated graphics is for. With a dedicated graphics card you are probably wasting 5-10W on interconnect, outputs, and RAM before you've calculated a single triangle.
  • by sanosuke001 ( 640243 ) on Thursday April 10, 2008 @03:33PM (#23028540)
    TFA states that multi-threaded gaming is fundamentally flawed. How is this a valid statement? They are testing a multi-GPU card using games that were most likely developed to use, at MOST, two cores. Regardless of how many cores you throw at it, such an application (i.e. your game) will NEVER use the extra ones.

    In fact, gaming and graphics scale amazingly well as multi-threaded applications. As many in the graphics/gaming community have been stating recently, ray tracing would benefit greatly from more GPUs: being able to trace multiple rays at a time would speed up rendering.

    They state that it is fundamentally flawed when they should have said that it would be ignorant to assume that an application designed to use a single-core or dual-core GPU would benefit from extra GPUs.
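
A minimal sketch of the "trace multiple rays at a time" point: primary rays are independent, so a frame partitions cleanly across workers. The scene (one hard-coded sphere), the resolution, and the three workers standing in for three GPUs are all made up for illustration:

from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 320, 240
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, 3.0), 1.0

def trace_row(y):
    """Trace one row of primary rays; mark 1 where the ray hits the sphere."""
    row = []
    for x in range(WIDTH):
        # Camera at the origin, rays through an image plane at z = 1.
        dx, dy, dz = (x / WIDTH) * 2 - 1, (y / HEIGHT) * 2 - 1, 1.0
        # Ray/sphere intersection via the quadratic discriminant test.
        ox, oy, oz = -SPHERE_CENTER[0], -SPHERE_CENTER[1], -SPHERE_CENTER[2]
        a = dx * dx + dy * dy + dz * dz
        b = 2 * (ox * dx + oy * dy + oz * dz)
        c = ox * ox + oy * oy + oz * oz - SPHERE_RADIUS ** 2
        row.append(1 if b * b - 4 * a * c >= 0 else 0)
    return row

if __name__ == "__main__":
    # Each worker gets a slice of rows; no worker needs anything from another.
    with ProcessPoolExecutor(max_workers=3) as pool:
        image = list(pool.map(trace_row, range(HEIGHT)))
    print("pixels hit:", sum(map(sum, image)))

Each row needs nothing from any other row, which is why this kind of workload splits across processors far more gracefully than today's rasterised game engines.
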
    • In this case the game itself could very well be single-threaded. The graphics rendering is a separate operation that can easily be abstracted away.

      Since the multi-coreness of the GPUs would be a "hardware thang", chances are that only the device drivers or the graphics API would even need to worry about it.

      Multithreaded graphics and multithreaded gaming are two very separate things that are not to be confused with each other. Both operate on separate domains of information, using different code, probably
        • Yes, but my point was that the idea that either one is fundamentally flawed in a multithreaded environment is blatantly wrong. The problem lies with the foundation not being written for the hardware they are trying to utilize.
        • I agree with you, just for entirely different reasons that are based on abstraction rather than concurrency.
    • by Ahnteis ( 746045 )
      They may be addressing the problem of gaming being a very synchronous activity, and thus very difficult to split across multiple threads. But if they didn't explain it, who knows what they meant. :)
    • by darien ( 180561 )
      Hello there - I wrote the article, so let me try to explain myself a little more clearly. I did say that multi-GPU gaming is fundamentally inefficient, but I meant that in the context of "current games and drivers". My point was that we can now run three and even four GPUs in parallel on a desktop PC, but, as you say, current games simply don't take advantage of all those cores.

      I do recognise that, done properly, multi-GPU rendering can be very effective. But when it comes to PC games and consumer graphics
  • Parallelism (Score:3, Insightful)

    by kvezach ( 1199717 ) on Thursday April 10, 2008 @03:35PM (#23028562)
    As opposed to raytracing, which is so extremely parallel [wikipedia.org] that it scales nearly linearly. In other words, your 3-way real-time raytracer graphics card (if/when such a beast is ever made) would perform at about 2.8x the one-GPU variant. And unless the Rapture^Wsingularity keeps you from getting a 192-GPU card, it'd render at 170x the reference or so.

    (Of course, there's the question of global illumination. I don't know if that can be parallelized as easily, but there was a story about distributed photon mapping here some time back, where they used Blue Gene.)
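
Taking the parent's figures at face value, here is what "nearly linear" works out to, with the X3's measured Crysis result from TFA thrown in for contrast (a sketch; only the numbers quoted in this thread are used):

# GPUs -> claimed speedup for a hypothetical real-time ray tracer (from the comment above)
claims = {3: 2.8, 192: 170.0}
for gpus, speedup in claims.items():
    print(f"{gpus:>3} ray-tracing GPUs: {speedup:5.1f}x, "
          f"{speedup / gpus:.0%} of ideal linear scaling")

# The Asus X3's measured rasterised result from TFA, for comparison
print(f"  3 rasterising GPUs (X3, Crysis): {29 / 26:.2f}x, {29 / 26 / 3:.0%} of ideal")
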
  • I wonder if their strategy discussions follow this tone:

    http://www.theonion.com/content/node/33930 [theonion.com]
  • Imagine a beowulf cluster of these!
    • Re: (Score:3, Funny)

      by cashman73 ( 855518 )
      Wow! A beowulf cluster of graphics cards! That certainly ought to be enough to run Duke Nukem Forever!
    • FINALLY SOMEONE SAID IT. I looked and looked and thought, IS this Slashdot? Where is the mention of the Beowulf cluster? But nope, someone said it; another day has been saved.
  • FYI (Score:3, Insightful)

    by geekoid ( 135745 ) <dadinportlandNO@SPAMyahoo.com> on Thursday April 10, 2008 @04:26PM (#23029248) Homepage Journal
    " so it's not going to see release at any point in the future,"

    The future is a really long time.
  • by British ( 51765 ) <british1500@gmail.com> on Thursday April 10, 2008 @04:52PM (#23029562) Homepage Journal
    "Fuck everything, We're doing three GPUs!"
  • Maybe it's time PCs got bigger. Heat dissipation would be a much easier problem to solve if the computer were the size of my parents' RCA console stereo. And the thing would look much nicer in the family room.
    • I would like varying PC sizes, but that brings with it compatibility issues. Do "big" computers require big cards? Will those big cards fit in a normal/small unit?

      As much as I'd love for Antec to make a bar-fridge sized chassis, conjugated with a monstrous Asus motherboard featuring a gazillion PCI-E channels, two dozen RAM slots and fifty SATA connectors, I don't expect to see any such orgiastic concoction, not ever.

      The trend is to minify, which is great for the Average Joes and Janes, but is completely
  • ...this is old news. They've already done five cores [imageshack.us] some time ago.
  • by SparkleMotion88 ( 1013083 ) on Thursday April 10, 2008 @06:42PM (#23030564)
    This is a terrific advance in the field of cramming. I'm looking forward to seeing their presentation at the Cramming and Stuffing conference later this year.
  • Why bother ? (Score:2, Interesting)

    by billcopc ( 196330 )
    I fail to see why they even bothered to build this thing. Anyone who's been paying attention would know that multi-GPU technology is a clusterfuck. Nobody has a stable, reliable implementation that actually yields respectable performance. A 20-30% increase for GPU-bound applications is simply pathetic... oftentimes that increase is 0%, if a game is not SLI-enabled by its developer or by the graphics drivers.

    When they come up with a multi-GPU system that appropriately virtualizes the whole thing, enablin
  • by seeker_1us ( 1203072 ) on Thursday April 10, 2008 @09:16PM (#23031724)
    http://en.wikipedia.org/wiki/Voodoo2 [wikipedia.org]

    The Voodoo2 was a set of three graphics processing units (GPU) on a single board, made by 3dfx. It was released in February 1998 as a replacement for the original Voodoo Graphics chipset.

    :)

    • Oh, the memories.

      Adding a Voodoo2 12MB to a system with an S3 Virge....talk about night and day :-)
  • Wouldn't it be easier to have a small "pseudo-card" go into the card slot, run a cable from it through the slot into a "graphics unit box" (with its own separate power and cooling), and attach the monitor to the graphics unit box?
