
Nvidia Reintroduces SLI with GeForce 6800 Series

An anonymous reader writes "It's 1998 all over again, gamers: a major release from id Software and an expensive hot-rod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom 3 and Nvidia SLI. Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show an increase closer to 90% over that of a single graphics card. There are a few things that need to be taken into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also require the more expensive Intel Xeon processors. Secondly, you'll need two identical, same brand and type, PCI-E GeForce 6800 graphics cards.'"
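Nvidia hasn't published how its dynamic load balancing works, but the idea behind load-balanced split-frame rendering can be sketched with a simple feedback loop: each frame, nudge the horizontal split line toward the GPU that finished sooner. Everything below (function name, gain constant, timings) is an illustrative assumption, not Nvidia's algorithm.

```python
# A minimal sketch (NOT Nvidia's actual algorithm) of dynamic load
# balancing for split-frame rendering across two GPUs.

def rebalance(split, t_top, t_bottom, gain=0.05):
    """split: fraction of the frame height given to the top GPU (0..1).
    t_top, t_bottom: measured render times (ms) of each half last frame."""
    total = t_top + t_bottom
    if total == 0:
        return split
    # If the top half took longer than the bottom, shrink the top region.
    imbalance = (t_top - t_bottom) / total
    return min(max(split - gain * imbalance, 0.1), 0.9)  # keep both GPUs busy

split = 0.5
for t_top, t_bottom in [(16.0, 12.0), (15.2, 12.9), (14.4, 13.6)]:
    split = rebalance(split, t_top, t_bottom)
    print(f"split line now at {split:.3f} of frame height")
```

A feedback loop like this is how an uneven scene (empty sky on top, dense geometry below) could still converge toward the quoted near-90% scaling.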
This discussion has been archived. No new comments can be posted.

  • For Rich Folks Only (Score:5, Interesting)

    by Brain Stew ( 225524 ) <zackwag@@@verizon...net> on Monday June 28, 2004 @08:55AM (#9549858) Homepage
    These cards are expensive enough on their own, and now they're suggesting we buy two!?

    I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you.

    As for myself, I'm a poor college student who can't even afford one of these cards, a situation I suspect a lot of other geeks/gamers share.

    Which raises the question: who is this aimed at?
  • Just a band aid.. (Score:5, Interesting)

    by eddy ( 18759 ) on Monday June 28, 2004 @08:57AM (#9549871) Homepage Journal

    ... till we have multi-core and/or multi-GPU consumer cards (they're already available [darkcrow.co.kr] at the high end).

    Questionmark.

  • by gl4ss ( 559668 ) on Monday June 28, 2004 @08:59AM (#9549883) Homepage Journal
    Like 3dfx, which they bought?

    Maybe they shouldn't have.. sure, they probably had some great people and so on, but ultimately "it didn't work out".

    "Hey, we can't keep up! Let's just use brute force to increase our cards' capabilities!!! That's cheap and economical in the long run, keeping our company afloat, right? Right??"

  • by FrO ( 209915 ) on Monday June 28, 2004 @08:59AM (#9549889)
    Can you hook up 4 monitors to this badass configuration?
  • Power Requirements (Score:5, Interesting)

    by Anonymous Coward on Monday June 28, 2004 @09:00AM (#9549895)
    So, one card that requires a 400 Watt power supply + another card that requires a 400 Watt power supply = the need for an 800 Watt power supply?!
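For what it's worth, the "400 Watt" figure on the box is a recommendation for the whole system, not the card's own draw, so requirements don't add linearly. A back-of-the-envelope budget, where every wattage below is an assumption for illustration:

```python
# All wattages below are rough assumptions for illustration.
cpu          = 90    # host CPU under load
board_drives = 60    # motherboard, RAM, drives, fans
one_card     = 110   # a single high-end graphics card

single_card_system = cpu + board_drives + one_card      # ~260 W
sli_system         = cpu + board_drives + 2 * one_card  # ~370 W

print(single_card_system, sli_system)
# Adding the second card adds ~110 W of draw, not another 400 W:
# the "400 W PSU" rating already budgets for the whole system.
```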
  • New Motherboards (Score:3, Interesting)

    by Anonymous Coward on Monday June 28, 2004 @09:00AM (#9549896)
    It's a bit presumptuous to assume that when these SLI cards come out, the only motherboards supporting multiple PCI-E x16 slots will be Intel Xeon based. As far as I know, AMD was planning Socket 939 motherboards with multiple PCI-E slots.

    At any rate, doesn't this sort of make the whole Alienware Video-Array seem like a bust?
  • by vi (editor) ( 791442 ) on Monday June 28, 2004 @09:01AM (#9549906)
    Why don't they make a graphics card with two GPUs and double the memory? Or wouldn't one of these buggers fit into a computer case? Yes, they exploit the dual PCI-E buses, but I doubt they really use the full bandwidth.
  • ALX (Score:3, Interesting)

    by paradesign ( 561561 ) on Monday June 28, 2004 @09:04AM (#9549931) Homepage
    How does this stack up against Alienware's ALX dual graphics card system? I remember reading an article where the Alienware guys bashed the SLI method. In theirs, each card renders half the screen, either top or bottom, not every other line.
  • by PIPBoy3000 ( 619296 ) on Monday June 28, 2004 @09:09AM (#9549979)
    I picked up a Voodoo 2 card way back when for the incredibly high price of $300 (which was a ton of money close to ten years ago, given what I was making). A couple years later, I picked up my second Voodoo 2 for $30.

    Think of it as a way to nearly double your video card's performance at a fairly cheap price while others are upgrading to the new version of the card that is only 40-50% faster (unlike SLI mode, which is rumored to be 75-90% faster); a quick cost-per-performance sketch follows this comment.

    The tricky part is that you need a motherboard that supports it, which for now means only the ones made for high-end workstations.
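To put numbers on the parent's upgrade math, here is a comparison of speedup gained per dollar for a new-generation card versus a discounted matching SLI card. The prices and speedups below are hypothetical placeholders in the spirit of the $300-then-$30 Voodoo 2 story, not real quotes:

```python
# Hypothetical prices and speedups for illustration only.
options = {
    "new-generation single card": (400.0, 1.45),   # price ($), speedup vs. today
    "discounted matching SLI card": (100.0, 1.80),
}

for name, (price, speedup) in options.items():
    gain = speedup - 1.0
    print(f"{name}: +{gain:.0%} for ${price:.0f}"
          f" ({gain / price:.4f} speedup per dollar)")
```

Under these made-up numbers, the cheap second card delivers several times more performance per dollar, which is the parent's point.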
  • Re:Reliability (Score:5, Interesting)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Monday June 28, 2004 @09:13AM (#9550020) Homepage
    Never mind how they are held together. The GeForce 6 already requires a shitload of power (two Molex connectors on the rear of it) and puts out a lot of heat. So you have two very hot cards right next to each other, one of them getting really bad airflow. If one of your $500 video cards doesn't die, your PSU surely will!
  • Only Nvidia? (Score:3, Interesting)

    by ViceClown ( 39698 ) * on Monday June 28, 2004 @09:24AM (#9550079) Homepage Journal
    This is the mobo design Alienware came up with, right? My understanding is that you can use ANY two video cards that are the same and are PCI-E. You could just as well do two ATI cards. Who submitted this? Nvidia marketing? :-)
  • ATI Rage Fury Maxx (Score:2, Interesting)

    by Xocet_00 ( 635069 ) on Monday June 28, 2004 @09:58AM (#9550348)
    ATI released a card with multiple GPUs on it a few years ago, putting a pair of Rage 128 chips onto a single card. It provided at best a marginal performance increase, but was still a neat idea at the time.

    More info here [firingsquad.com].
  • Re:Just a band aid.. (Score:2, Interesting)

    by eddy ( 18759 ) on Monday June 28, 2004 @10:16AM (#9550506) Homepage Journal

    >Are you suggesting that the average consumer go out and purchase a 6000q to play HalfLife2?

    No? I'm suggesting that connecting two cards via SLI is not very likely to become commonplace, because in the future we'll have multi-core or multi-GPU consumer cards instead. They'll perform "twice as fast" but won't use twice the power or twice the space, and they won't cost twice as much.

    The reasons for this belief, given the current implementation of SLI [0], are many:

    1. It requires two slots and lots of space.
    2. The cards are so expensive that buying two at the same time is not very tempting.
    3. Development is moving so fast that buying a matching card at a later date will mean a performance and feature loss compared with buying a single new card.
    4. Buying a matching card later is only really an option if the drivers for your old one are still being updated (this is somewhat better nowadays than back in the Voodoo 2 days). If not, you're stuck with a fast config that only runs well on older hardware and older games. ("Please downgrade to DirectX 9!")
    5. Do you think Nvidia and ATI would rather you buy a new card, or pick up an older one to run in SLI? [0]
    6. Multi-core is "the in thing".
    7. The new buses are so fast that putting multiple cores or GPUs on one card isn't going to starve them.

    >fact that you're comparing apples and oranges.

    Comparing?! I'm not comparing; I'm contrasting two solutions. The high end uses both interconnected and multi-GPU tech. I'm simply pointing out that I find it much more likely for consumer hardware to go multi-core instead of SLI.

    [0] Must have matching cards; you can't mix one generation with the next. If this changed, SLI might have a future.

  • by PixelSlut ( 620954 ) on Monday June 28, 2004 @10:41AM (#9550712)
    Which raises the question: who is this aimed at?
    I think that's one of the most interesting questions asked here today. Of course NVIDIA's biggest market right now is gaming, but that's not their only market. They've recently gotten into the business of film CG with their Gelato [nvidia.com] product, a hardware-accelerated final-frame renderer for high-end graphics like those in film. Multiple GeForce 6800 Ultra video cards don't sound very useful for gamers, but multiple next-gen Quadro cards sound extremely useful for a Gelato system, assuming that it will really take advantage of all the features of both boards.

    Something that I'm really interested in knowing is how it compares to Alienware's Video Array [alienware.com] technology. Video Array suffers from the problem of being totally useless, because it only increases fillrate but doesn't improve shader performance. Almost no game these days is fillrate-limited, but many games are limited by shader performance. I hope that NVIDIA's SLI solution is more interesting. (A toy model of that fillrate-versus-shader distinction is sketched after this comment.)

    Obviously, the other big thing here is just to have another one-up over ATI. :)
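A crude way to see the fillrate-limited versus shader-limited distinction the parent draws is to model frame time as the larger of fill cost and shader cost. The model and all its throughput numbers are assumptions for illustration only:

```python
# Toy model: per-frame cost is whichever resource is the bottleneck.
# All throughput figures below are made up for illustration.
def frame_time_ms(pixels, ops_per_pixel, fillrate, shader_ops_per_s):
    fill_ms  = pixels / fillrate * 1000.0
    shade_ms = pixels * ops_per_pixel / shader_ops_per_s * 1000.0
    return max(fill_ms, shade_ms)

pixels  = 1280 * 1024
base    = frame_time_ms(pixels, 40, 6.4e9, 6.0e9)
fill_x2 = frame_time_ms(pixels, 40, 12.8e9, 6.0e9)   # double fillrate only
both_x2 = frame_time_ms(pixels, 40, 12.8e9, 12.0e9)  # double fill and shader

print(f"base {base:.1f} ms, fill x2 {fill_x2:.1f} ms, both x2 {both_x2:.1f} ms")
# A shader-bound scene sees no gain from doubled fillrate alone, but
# roughly halves its frame time when shader throughput doubles too.
```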

  • by afidel ( 530433 ) on Monday June 28, 2004 @11:00AM (#9550878)
    Unless you are trying to do something really cool like use the GPU as a really high-performance SIMD coprocessor. With the advent of general-purpose languages for the vertex shaders, this is exactly what the GPU is, but you can't use it as such because of AGP's abysmal writeback performance. I know of quite a few high-performance computing people who are interested in what will be possible with multiple PCIe GPUs plus dual host CPUs. The performance per case might be high enough to justify the cost of the cards, since more communication can stay local, avoiding the need for more expensive interconnects. (A host-side sketch of the idea follows this comment.)
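The GPU-as-SIMD-coprocessor pattern the parent describes is one data-parallel kernel applied across a big array. As a rough host-side stand-in, NumPy plays the role of the GPU here; the kernel and sizes are illustrative assumptions:

```python
import numpy as np

# NumPy stands in for the GPU: one element-wise kernel applied across
# a large array, which is exactly the data-parallel shape of shader work.
def saxpy(a, x, y):
    return a * x + y  # one multiply-add per element, all "in parallel"

n = 1_000_000
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
result = saxpy(np.float32(2.0), x, y)

# The catch the parent post points at: on AGP, reading `result` back to
# the host is painfully slow; PCI Express's symmetric up/down bandwidth
# is what would make this round trip worthwhile.
print(result[:4])
```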
  • That's hilarious (Score:3, Interesting)

    by JMZero ( 449047 ) on Monday June 28, 2004 @11:00AM (#9550885) Homepage
    my Boxx FX53 + X800

    Their target market is apparently "you" - you're just in the wrong place in your cycle. Right now, you're in the sour grapes phase, denying the possibility that anyone could want a better computer than yours (they already do). Soon you'll be in the lust phase, then you'll be in the "MUST BUY SHINY THING! PLEASE TAKE CREDIT CARD!" phase.

    I remember a time when nobody could imagine who might need a 386.
  • by TubeSteak ( 669689 ) on Monday June 28, 2004 @11:18AM (#9551018) Journal
    Forget power requirements, what about the effin cooling? Does it strike anyone else as extremely stupid to put two scorching-hot graphics cards back to back? [hardwareanalysis.com] I mean... come on!

    Alienware took a very different tack with their solution [pcper.com], because it requires a third PCI slot AND it's analog (3rd & 4th pics). I guess it's a series of tradeoffs: space vs. flexibility, with Nvidia winning the battle for space but losing on flexibility.

    That aside, it's ridiculous that Nvidia expects their OEM cooling solutions to do any kind of justice to the heat from those cards. Alienware already expects water cooling to be part of the solution and has cases designed accordingly... couldn't NVIDIA have done it any other way? Do they absolutely have to have a hardware link between their cards?

    "A power draw of 250 Watts for the 6800 Ultra SLI solution is very realistic."
    Then explain how this will work [tomshardware.com].

  • by tstoneman ( 589372 ) on Monday June 28, 2004 @12:25PM (#9551617)
    Ahhh, what sweet memories.

    I bought a shitload of 3DFX stock back in the late 90s because they were the king of 3D. I remember walking into a computer store and seeing something on the screen... I thought it was a clip from a movie, but they told me it was MechWarrior 2 (I think it was 2) playing on a Voodoo card. My mind was blown. How they got movie-like graphics onto a computer was beyond my capacity to understand. I dropped the $350 and bought one immediately, played with it, and loved it.

    Then, after a while, I thought, 3DFX is the king and they will never die. I put my money where my mouth was and forked over my entire savings, around $15k, to buy 3DFX. Therein I learned a few great lessons:

    1) The best technology doesn't mean the best company. "Good enough" from a better-run company will usually blow you away. Ask Microsoft or nVidia (well, at the time nVidia wasn't the top runner that it is today).

    2) No matter how great an explanation you make, the stupidest things, like 16-bit color vs. 32-bit color, can kill you (22-bit color just doesn't cut it for dumb-ass consumers). It's better to cross your t's and dot your i's in the first place so that you don't have any such vulnerabilities.

    They went tits up, and I basically lost my money. nVidia bought the remaining pieces of 3DFX, including all their patents. I'm not surprised they went SLI, and for companies that use it, like 3D effects houses, it will probably save them bundles of time.
  • by LoocSiMit ( 760771 ) on Monday June 28, 2004 @12:34PM (#9551693)
    Sometimes turning up the resolution gives you an advantage, but often turning off some eye candy gives you an advantage too. Draw distances for leaves and trees shrink, and textures are less complex, making it easier to spot enemies...

    I play Wolf ET (and so should you) and I have the eye candy turned up. Fairly often I will get shot by someone I can't see behind the smoke billowing out of the top of a broken tank, because the person shooting at me doesn't have that bit of eye candy switched on (they still see smoke, but it's easier to see through).

    I play with the eye candy turned up as far as I can without getting a crappy frame rate because it gives a more immersive experience, not for a tactical advantage.

  • by angulion ( 132742 ) on Monday June 28, 2004 @05:34PM (#9554895)
    If you look closely at the picture of the nVidia cards, you'll notice that they are dual-slot designs - i.e., eating a total of four card slots.
