Gigabyte's Dual-GPU Graphics Card (252 comments)

kamerononfire writes "Tom's Hardware has an article on a new dual-GPU graphics card, to be released Friday, by Gigabyte: 'According to sources, the SLI card will lift current 3DMark2003 record revels by a significant margin while being priced lower than ATI's and Nvidia's single-GPU high-end cards.'"
This discussion has been archived. No new comments can be posted.

  • Drivers? (Score:5, Informative)

    by BWJones ( 18351 ) * on Thursday December 16, 2004 @04:27PM (#11108691) Homepage Journal

    So, the question will be: Can we get drivers for this card that will work in Linux or OS X? It is based on Nvidia technology, so presumably one could write drivers for this card unless Gigabyte is keeping their stuff proprietary...

    It looks interesting and I would certainly be more than interested in plugging one into my dual G5, but I don't have time (or the interest) to write my own drivers.

    • Re:Drivers? (Score:5, Insightful)

      by Ianoo ( 711633 ) on Thursday December 16, 2004 @04:33PM (#11108755) Journal
      It's almost certain that what Gigabyte have done is this:
      • Take the basic single GPU nVidia 6600 PCB
      • Lay down two on the same PCB with two GPUs
      • Link them together with a PCI Express switch
      • Reverse engineer the bridge card that nVidia is selling for SLI and connect whatever control signals are required as traces on the PCB.
      It seems they can do this for a significantly lower price than building two single cards.

      The point is that if nVidia SLI is working under Linux, then this should too.
      • Re:Drivers? (Score:3, Interesting)

        by hattig ( 47930 )
        I agree, but not about a PCI Express switch. Most likely 8 PCIe lanes go to one GPU, and the other 8 to the other.

        What I want on the card is TWO DVI outputs though. And possibly another two available on the other GPU via a cable when not in SLI mode.
        • Agreed on the dual-DVI thing. Who would make such a killer card and then cripple it by having only one DVI output? I mean, it isn't like a DVI-to-DB15 adapter is expensive enough to prohibit one being included in the box for people still on analog.
    • "So, the question will be: Can we get drivers for this card that will work in Linux or OS X?"

      Actually, to the vast majority of hardcore gamers (which this card is targeting) that won't matter. I have my Mac for desktop use, my Linux box for file serving and my Windows box for gaming. No need to get special drivers.
      • Well, I would not be considered a hardcore gamer per se, and I am absolutely not going to purchase another computer just for games, but I did help with the beta testing of Halo on OS X, and a fast graphics card was nice to have for that process. It also helped push all the pixels on my huge Cinema Displays and kept me from getting fragged by all the 12-year-old twitch meisters out there. Damn, some of those kids are monsters.

      • Although I'm not a hardcore gamer, it would be a real "no buy" for me if it doesn't work with the stock Nvidia drivers. Who knows when Gigabyte might decide to stop supporting the card.
    • Re:Drivers? (Score:2, Informative)

      Nvidia's current 6629 Linux drivers already support SLI.

      BBH
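
      If that driver support is real, enabling SLI under X is normally a one-line Device-section option. The snippet below is only a sketch: the option name follows later NVIDIA Linux driver READMEs, and whether the 6629 release accepts it (or treats a single dual-GPU board the same as two cards) is an assumption.

        Section "Device"
            Identifier "nvidia0"
            Driver     "nvidia"
            # "Auto" lets the driver choose AFR or SFR; "AFR"/"SFR" force a mode.
            Option     "SLI" "Auto"
        EndSection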
  • Uh oh... (Score:4, Funny)

    by koreth ( 409849 ) on Thursday December 16, 2004 @04:29PM (#11108720)
    record revels
    I guess now we know where Kim Jong Il's roach went.
  • Next year (Score:2, Funny)

    by Anonymous Coward
    They are coming out with a card that includes a GPU, CPU, hard drive, RAM, motherboard, Ethernet, and sound, AND it's nuclear powered. Plus it will fit in your back pocket and transmit the monitor image straight to your visual cortex, all the while making your breakfast and cleaning your basement.
  • Great idea (Score:4, Interesting)

    by Anonymous Coward on Thursday December 16, 2004 @04:30PM (#11108731)
    That makes a lot more sense: store the textures once in shared memory instead of storing them twice, as you would have to in a two-card solution.

    Makes me wonder if Nvidia will have dual-core GPUs in the future.
    • Not if each GPU gets only half the memory bandwidth it would have with a dedicated memory bank. The two GPUs will surely be accessing the same areas of texture memory for the most part.
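
      A back-of-the-envelope illustration of that bandwidth concern, using made-up numbers (the 3D1's actual memory clocks aren't given here):

        # Hypothetical figures, for illustration only.
        bus_width_bits = 128                 # dedicated per-GPU memory interface
        mem_clock_mhz  = 500                 # DDR: two transfers per clock
        per_gpu_gb_s   = bus_width_bits / 8 * mem_clock_mhz * 2 / 1000
        print(per_gpu_gb_s)                  # 16.0 GB/s with a dedicated bank

        # A single shared 256-bit bank would offer 32 GB/s in total, but two GPUs
        # contending for it average ~16 GB/s each, plus arbitration overhead.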
  • by Anonymous Coward on Thursday December 16, 2004 @04:31PM (#11108743)
    eom
    • by orthancstone ( 665890 ) on Thursday December 16, 2004 @04:36PM (#11108800)
      Yeah, apparently the author hasn't kept up with the graphics card industry. I would say perhaps he is only considering graphics cards that are realistically retail, but this one isn't on the market yet so I hardly feel that's applicable.
      • Actually, the V5 did no T&L, AFAIK. The first "GPU" was the GeForce 256, and Nvidia justified that by the fact that it had a (fixed-function) T&L pipeline, not just triangle setup and texturing. (Hey, a perfectly ordinary Voodoo 2 even had 3 chips: two texture units and one triangle-setup chip.) And to all of you talking about dual-core chips: forget it. The current chips are parallel in every relevant way already, and putting two of these highly parallel chips together on the same die wouldn't benefit compared
      • The Voodoo 5 5500, the version with two VSA-100 chips, was available; I actually almost got one but received a GF2 MX as a present. Damn.
    • Or the V5 5500, which had just two GPUs.

      Granted the thing was a POS (and is now in my junk box), it sure beat this new one to market.

      While not the same, I do recall an old Voodoo 2 setup that was nothing more than two cards stuck to each other in SLI mode.

    • Multi GPU cards have been around a long time. Back in the late 1990s, Sun had one with three or six graphics processors depending on the depth of your wallet. More recent cards like the XVR-1200 are being advertised as "dual pipe".

      The only news here is probably the price point.
  • Doom for Gigabyte! (Score:5, Insightful)

    by millisa ( 151093 ) on Thursday December 16, 2004 @04:34PM (#11108767)
    I bought my dual GPU 3DFx Voodoo5 around this time 4 years ago. . . and then the company was bought, support disappeared, and my fancy video card became worthless even quicker than it should have . . . I don't recollect seeing another 'dual gpu video card that will slay the market' announcement since . . .
    • 3DFx was already moribund at that point. Nothing could really save them.
    • I thought "GPU" wasn't widely used until nVidia introduced their GeForce cards, which, with the inclusion of transform and lighting processing on the graphics processor, they claimed to be the "first true GPU" or something like that. Apparently those bits had previously been handled by the CPU. Did 3Dfx ever make cards with T&L handled on-card? Seriously asking, as I don't recall. I remember 3Dfx in their final generation or two (wasn't Voodoo5 released before Voodoo4?) boasting about their new cinemati
      • Of course they didn't call their chips "GPUs", since nVidia's marketing team hadn't invented the name yet. OTOH I think it is a pretty good name, at least today. (I'm not quite convinced that the GeForce 1 should be called a GPU if you want a more technical term.)

        And since it stands for "Graphics Processing Unit", pretty much anything that uses special chips to do graphics calculations has one. IMHO, if it can't be used as a plain processor, then it doesn't deserve the term GPU.
  • Deja Voodoo (Score:5, Informative)

    by PurpleFloyd ( 149812 ) <zeno20@@@attbi...com> on Thursday December 16, 2004 @04:34PM (#11108770) Homepage
    As I recall, 3dfx used multi-GPU boards for its Voodoo 4 and 5 lines, and didn't do so well. Is there anything to indicate that this card will do better? After all, sticking with SLI and multicore technology after its prime was what killed 3dfx and allowed Nvidia to take its place; it'd be rather ironic to see Nvidia go down the same path.
    • Re:Deja Voodoo (Score:5, Interesting)

      by RealErmine ( 621439 ) <commerce@nOspaM.wordhole.net> on Thursday December 16, 2004 @04:53PM (#11109008)
      Is there anything to indicate that this card will do better?

      The Voodoo 4/5 were the most expensive cards on the market. This card is cheaper than a *SINGLE* Nv 6800 and outperforms it by a good margin.

      Why buy a 6800?
    • Re:Deja Voodoo (Score:3, Informative)

      by supabeast! ( 84658 )
      3DFX died not because of SLI, but because they put all the R&D funding toward anti-aliasing low resolution (640x480, 800x600) graphics. By the time they had it working well, Nvidia was producing chips that ran the same games just fine at 1024x768 and up with better texture filtering, which looked much better than anti-aliased low-res graphics.

      The idea of slapping multiple chips on a card, or using multiple cards is still a good one, as long as the cards come out before someone else does something bette
      • Re:Deja Voodoo (Score:3, Insightful)

        by suckmysav ( 763172 )
        "3DFX died not because of SLI, but because they put all the R&D funding toward anti-aliasing low resolution (640x480, 800x600) graphics."

        That was only one of their mistakes. The other two were:

        * Insistence that 16-bit graphics were "all games require" (640K RAM, anyone?) and their subsequent dogged refusal to offer 32-bit cards. This allowed nVidia to leapfrog them and take a huge market lead, from which they never recovered.

        * Attempted to force the market into adopting their own proprietary standar
        • Don't forget buying STB so they could make their own cards, screwing all their current partners who turned to ATi and Nvidia.
    • Re:Deja Voodoo (Score:5, Interesting)

      by Ianoo ( 711633 ) on Thursday December 16, 2004 @04:54PM (#11109023) Journal
      The point was, I think, that the Voodoo 4 and Voodoo 5 were last-ditch efforts for survival by 3DFX when faced with growing competition in a fast-moving 3D acceleration industry. IIRC, the performance of those cards was nearly matched by a single GPU from nVidia, so they weren't an attractive deal (being large, expensive, power-hungry beasts). This card, however, doesn't have any obvious competition yet, and by the time it does, I'm sure nVidia will have added SLI to their latest and greatest too. Additionally, PC buyers and makers now more readily accept large coolers, whereas in the days of the Voodoo 4, the cooling required by all those chips just seemed silly.
      • Actually, the Voodoo 2 was the first card capable of SLI and it sold like hotcakes. I remember everyone I knew had a Voodoo 2 SLI rig. By the time the Voodoo 4 and Voodoo 5 had hit the market, 3Dfx simply couldn't keep up with Nvidia and everyone that was serious about gaming had switched to Nvidia already.

        Also, I don't remember who made it, but there was an "Obsidian 3D" card that was dual Voodoo 2 chips on a single board. I know of one person who had one of these. The main problem with those cards wa

        • Voodoo Graphics was capable of SLI too. I have a workstation card with two Voodoo Graphics chipsets on it, made by Quantum 3D. I don't know of any consumer cards which were marketed as being SLI capable.
      • Re:Deja Voodoo (Score:2, Informative)

        by RipTides9x ( 804495 )
        Wrong wrong wrong wrong wrong. The Voodoo 4 and 5 series was NOT a last-ditch effort.

        The design that became the Voodoo 4 & 5 series was in development from the moment the Voodoo 2 series was released. The SLI design on one card was the promise. But the design became a victim of feature creep and got held up in development for well over 3 years. (T-Buffer, first card to do usable FSAA.)

        By the time it was released, the 3D market (Nvidia) had already leap-frogged them (transform and lighting on chip), mana
    • I assume you mean it would be ironic if Nvidia was killed because they didn't incorporate multicore GPUs... or that Nvidia was killed because they did incorporate multicore. Either way it isn't irony unless your last name is Morissette. Here is a definition of irony:

      Irony involves the perception that things are not what they are said to be or what they seem.
      That's from our sacred cow, Wikipedia.

      What you're looking for is poetic justice, which is defined as:
      Poetic justice refers to a person receiving pun
    • There were a whole lot of reasons why 3dfx went belly up, but SLI and multicore were not it. What I have always wondered and never heard was: What happened with the big Army contract that 3dfx got for running the new displays in helicopters? Did this project just go away or did another company step in? Nvidia?

  • by caerwyn ( 38056 ) on Thursday December 16, 2004 @04:35PM (#11108777)
    The article title at Tom's Hardware is a little misleading. This is certainly *not* the first graphics card with two chips on it: back in the days of the ATI Rage chips, ATI had a Rage Fury MAXX that used two chips to render alternate frames.
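
    For reference, the alternate-frame approach the MAXX used boils down to a round-robin dispatch loop. A minimal generic sketch in Python (the FakeGPU class and frame count are made up, not ATI's or Gigabyte's actual scheme):

      # Alternate-frame rendering (AFR): even frames go to one chip, odd frames to
      # the other, with results presented in submission order.
      class FakeGPU:
          """Stand-in for one graphics chip's command queue (illustration only)."""
          def __init__(self, name):
              self.name = name
          def render(self, frame_id):
              return f"{self.name} rendered frame {frame_id}"

      def render_afr(num_frames, gpus):
          for i in range(num_frames):
              yield gpus[i % len(gpus)].render(i)   # round-robin across chips

      for line in render_afr(4, [FakeGPU("gpu0"), FakeGPU("gpu1")]):
          print(line)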
    • I had a Rage Fury Maxx. I never had that issue with the black screen and no reboot. I had issues with the performance. I had a Rage Fury, and it performed as well as the Maxx did. I did get it to work once, but then I rebooted.

      Also, there are the 3dfx cards, voodoo 4 and 5 like everyone else has said.

  • Dubious Information (Score:5, Interesting)

    by webword ( 82711 ) on Thursday December 16, 2004 @04:35PM (#11108778) Homepage
    Not based on actual data. Tom's Hardware has NOT run any tests yet. Take what you read with a grain of salt.

    "Sources told Tom's Hardware Guide..."

    "Tom's Hardware Guide's test lab staff will run the 3D1 through its benchmark track, as soon as the card becomes available."

    IMHO, this is a PR coup by Gigabyte to get something into Tom's Hardware. But more importantly, why post this on Slashdot now? Let's see some data first. Let's see the results of the tests.
  • Doom4 (Score:2, Funny)

    by kompiluj ( 677438 )
    You will need two such cards to play Doom4 in 640x480 at 25 fps :)
  • This reminds me of the Voodoo2 cards. Clearly we have hit another speed bump in video technology development, and if history serves as a good model, we'll have to see a real revolution in architecture rather than speed before we can start moving away from brute-force improvement again.
    • Keep in mind that Gigabyte is just a company licensed to use nVidia's chipsets in their cards. From what I can tell, it's Gigabyte taking advantage of the SLI capabilities and putting two cards' worth of hardware on a single PCI-Express board. This isn't nVidia trying to squeeze the last bit of power for lack of something better. I'm sure they're hard at work on the GeForce 7x00 chipsets.
  • by Lethyos ( 408045 ) on Thursday December 16, 2004 @04:36PM (#11108810) Journal
    ...the SLI card will lift current 3DMark2003
    record revels by a significant margin...

    Unfortunately, it's only available in Asia.

  • by Anonymous Coward on Thursday December 16, 2004 @04:37PM (#11108820)
    ... the following Slashdot community concerns:

    1) Does it run under Linux?
    2) Even better, can I install Linux on it?
    3) Does it increase Firefox's market share?
    4) Does it make Bill Gates look bad?
    5) Is it in any way related to Star Wars?
    6) Will it make my porn look better?

    Prompt responses will be greatly appreciated.

    -Slashdot
    • You forgot the first one:

      0) does it play Ogg Vorbis files?
      • I'm curious about both this post and the GP. What exactly is wrong with wanting Linux drivers for something? If Gigabyte wants me to buy this card, I want Linux drivers, period. I run Windows as well, not to mention OpenBSD, FreeBSD, OS X, whatever works. I will not purchase a graphics card without at least a commitment from the company that they will provide Linux drivers. Case in point: I purchased a Centrino-based laptop even though Centrino was not supported on Linux by Intel. Intel, however, had a ni
        • What is wrong with Firefox?...

          Someone please answer this if you know a solution. Whenever I click on a picture link or thumbnail on a webpage, Firefox always wants to open the picture in another window. And it always asks permission, no matter what settings I have. Even worse, when right-clicking and selecting "open in new window", Firefox displays a blank window followed by the extra picture window.

          Am I missing something? Is there a way to set it so it functions like IE and displays
        • 1. It was a joke. Laugh, silly.

          2. The sum total of my post was "You forgot the first one: 0) does it play Ogg Vorbis files?" The rest of the shit you're talking about has nothing to do with me or anything I said, so why not address the person who you're clearly pissed off at instead of taking it out on my ass?
    • 6) Will it make my porn look better?

      7) Or download faster?
      8) Will it shield me from the **AA?
      9) Do I get Profit!?

  • And what are the chances of a dual-GPU PCI Express card coming out after this with the ability to run in dual SLI mode with a second dual-GPU card? ~CYD
  • Oh I see that these latest cards are finally taking the modder's advice and adding integrated blue LEDs, for that extra burst of raw rendering power.

    I know that people are cutting holes in their cases so people can admire their wiring, but I'd like to pay a bit less and save the R&D costs on the appearance-enhancing design. Plus, if this is a budget card, will appearance matter as much? It's like putting nice rims on a Yugo: I see the point, but you're not fooling anyone.
    • If they put the bling on their card, fewer modders will try doing the work themselves, and there will be fewer 'warranty' returns that wouldn't have failed without that extra little bit of help.

      Plus, consider your market. Many (if not most) gamers who will pay for high-end equipment want it to be easily distinguishable from low-end equipment so they can show off their stuff. How many gamers paying $600 for a video card can recite exactly what's on their system at any given time? A very high percentage.
    • Plus, if this is a budget card...
      I think you've got the wrong article. This is the one about the record-breaking-fast dual GPU card, not the one from earlier about the system-RAM-using value card.
  • by Anonymous Coward
    For the Bitboys card I pre-ordered.

  • I just spent a few buckazoids buying an Arctic Cooler for my 9800 Pro to quiet it down, and it had just one medium-speed fan. I can't imagine what this beast will sound like.

    Okay, well, I guess I can...

    - Leo
  • How the hell am I supposed to watercool that? My box got uber-hot once I stuck a GeForce 5900XT in, combined with my HDs and CPU. I got a water-cooling system to combat this problem. Do they even make graphics card water blocks that support multiple chips?

    • The best solution I can think of for your cooling problems is a <a href="http://www.apple.com/powermac">cheesegrater</a>. Go out to an Apple Store (the only computer shop where you risk bumping into hot chicks foolin' 'round with iPods) and stare in awe... at the girls! Oh man, not the bleedin' Mac... it's cool, yeah, but it won't get you laid...
  • So this is basically two 6600GT cards glued together on one PCB. That's all well and good, but it's barely faster than one 6800 Ultra. It would probably be slower when gobs of video memory are required, because the quoted 256MB on a 256-bit bus is really split half and half between the two GPUs, and texture data will have to be duplicated in both, so there's less usable video memory on one of these contraptions than on a single-GPU 256MB card.

    Two 6800s on a single card would be nice, though, as it would
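
    A quick way to see the capacity point above, using the board's quoted figures:

      quoted_total_mb = 256                 # marketing figure for the whole board
      per_gpu_mb = quoted_total_mb // 2     # each chip owns its own 128 MB bank
      # Textures must be duplicated into both banks, so the usable texture budget
      # is roughly per_gpu_mb, versus the full 256 MB on a single-GPU card.
      print(per_gpu_mb)                     # 128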
  • I didn't see anything in the article that specifies whether the card is PCI-E only. I suspect it is, but thought I'd ask if anyone has more details.
  • One wonders whether or not one could hear oneself think with one of these installed.

    Seriously though, three cheers to Gigabyte. They've outpaced their competition by thinking outside the box.

  • no.. (Score:2, Interesting)

    by destiney ( 149922 )

    I strongly advise you not to do business with Gigabyte Technology.

    Dealing with them over a bad (brand new) motherboard took me nearly 3 months. Meanwhile _none_ of my emails or phone calls were _ever_ returned. They only took action when I called in and waited on the line to speak with someone. The support person even hung up on me once when the conversation became heated over the long wait time. They refused to send me a replacement/loaner motherboard and had no other alternatives for me but
    • Boy if you think this is bad, you obviously haven't seen companies like Sapphire Tech. They respond quick, too quick. When I send them a broken RMA product, I swear they just resend me someone else's broken RMA product.

  • It's perfectly reasonable to have multiple graphics processors, but if they weren't designed to work together, don't expect a huge performance gain. Dynamic Pictures used to make boards with 1, 2, or 4 GPUs. But the GPU was designed for that. Display lists and textures were shared, and Z-buffer and screen buffer were not, which is what you want. If you have to duplicate everything for each GPU, as apparently is done here, it's much less of an improvement. Not only do you need more RAM, but the main CPU
    • I would expect it to work similarly to two separate cards in SLI, with maybe some extra efficiencies from being on the same card, but with the exception of having to split the memory... unless of course they have found a way for the chips to avoid redundant memory use.

      I would expect it to be at least as good as 2 128MB 6800s.
  • They didn't (sorry if this is a dup submit, winblows got jumpy on me); they were bought by Nvidia. Who says that the 6600 isn't designed for multi-core? They own the research to do it, thanks to 3dFX. Who wants to bet the SLI infrastructure is based off of that tech? And the first multi-chip card was way before this. I don't know if you want to count daisy-chained Voodoo IIs or not, but the two chips did outperform one. But I don't think that should count.
  • I have an Athlon XP, KT400 Chipset and a Radeon 9700Pro, which is a rig I've been running since Christmas 2002. I game quite a lot.

    I've been really annoyed by the dilemma of which way to go -
    SLI board with a 6800 GT/Ultra (a second one to be added a bit later when they're cheaper on eBay), or a gradual upgrade from my Athlon XP and KT400 board to an Athlon 64/VIA K8T890 Pro (which will have *both* PCIe x16 and AGP), allowing me to retain my Radeon and squeeze several more months, maybe a year, out of it,
  • Great availability (Score:3, Informative)

    by haraldm ( 643017 ) on Friday December 17, 2004 @04:12AM (#11113892)
    "The card is cooled by two on-board fans." Suuuper. Really cool. Statistically, one out of two fans will fail twice as often as a single fan. In other words, the MTBF is halved, while the noise is raised by 3 dB. And the assembly doesn't exactly look like you can easily replace the fans by aftermarket fans. I wonder how this spiffy card performs when one of the GPUs blows up. But maybe the PCB has some predetermined breaking points to punch out a blown GPU. This will also reduce the blue light by 3 dB. Bad for gamers.

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...