
Nvidia Reintroduces SLI with GeForce 6800 Series

An anonymous reader writes "It's 1998 all over again, gamers. A major release from id Software and an expensive hot-rod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom 3 and Nvidia SLI. Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card. There are a few things that need to be taken into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also use the more expensive Intel Xeon processors. Secondly, you'll need two identical, same brand and type, PCI-E GeForce 6800 graphics cards.'"
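The "dynamic load-balancing" the summary mentions can be pictured with a toy sketch. This is purely illustrative Python, not Nvidia's actual driver logic, and every name and number in it is made up: in split-frame rendering, each GPU draws part of the screen, and the split point drifts each frame toward whichever GPU finished its share faster.

```python
# Illustrative sketch of dynamic load balancing in split-frame SLI.
# NOT Nvidia's code -- a hypothetical model of the idea only.

def rebalance(split, time_gpu0, time_gpu1, step=0.02):
    """Nudge the screen split toward balance.

    split: fraction of the screen (0..1) rendered by GPU 0.
    time_gpu0 / time_gpu1: how long each GPU took on the last frame.
    """
    if time_gpu0 > time_gpu1:
        # GPU 0 is the bottleneck: shrink its share of the screen.
        split = max(0.1, split - step)
    elif time_gpu1 > time_gpu0:
        # GPU 1 is the bottleneck: give GPU 0 more of the screen.
        split = min(0.9, split + step)
    return split

# Simulate two mismatched GPUs: GPU 1 is 25% slower per pixel, so the
# balancer should settle with GPU 0 rendering a bit more than half.
split = 0.5
for _ in range(50):
    t0 = split * 10.0          # GPU 0 would take 10 ms for the full frame
    t1 = (1 - split) * 12.5    # GPU 1 would take 12.5 ms for the full frame
    split = rebalance(split, t0, t1)
```

With these made-up timings the split oscillates around roughly 0.55, i.e. the faster GPU ends up rendering slightly more than half the screen, which is the whole point of balancing dynamically instead of splitting 50/50.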
This discussion has been archived. No new comments can be posted.
  • Re:When (Score:5, Insightful)

    by dave420 ( 699308 ) on Monday June 28, 2004 @08:57AM (#9549870)
    So you don't want to hear about the cure for cancer until it's in your pharmacy? News is just that - new stuff. Just because you can't fork over some money for something doesn't mean it's not newsworthy or of interest to the /. community. Are you American? :-P
  • by King_of_Prussia ( 741355 ) on Monday June 28, 2004 @08:58AM (#9549876)
    14 year old 1337-sp33king white boys living with their rich parents. The same people who will use these computers to play counterstrike with hacks on.
  • Reliability (Score:5, Insightful)

    by lachlan76 ( 770870 ) on Monday June 28, 2004 @09:02AM (#9549912)
Am I the only person who thinks that holding the two cards together with a non-flexible bridge attached only by solder is a bit dangerous? Not that the solder would break, but when the bridge is removed it could be a bit tricky. Perhaps a cable would be safer.

    Other than that the only problem I can see is that you need about AU$2000 worth of video card, and at least AU$1000 worth of Xeon to use it. Maybe for engineers and artists, but will the average person have any use for it? I don't feel that an extra AU$3000 is worth it for the extra frame rate in games.

For the pros, though, it would be very good.
  • 4 slots (Score:5, Insightful)

    by MoreDruid ( 584251 ) <moredruid@gmail.com> on Monday June 28, 2004 @09:04AM (#9549933) Journal
    OK, I'm all for performance gain and pushing the limit, but geez, 2 of these cards take up 4 slots. How are you supposed to squeeze in your Audigy card with extra connectors and still put in your extra firewire/usb?

    And I'm also wondering how the heat is going to be transferred away from the cards. It looks like you need some serious cooling setup to keep those two babies running.

  • Bah... (Score:5, Insightful)

    by mikis ( 53466 ) on Monday June 28, 2004 @09:05AM (#9549947) Homepage
    Call me when they put two GPUs on one card... Or even better, when they put two cores on one chip. Soon enough motherboard will be an add-on to graphic card.

    Plus, many people were upset about power and cooling requirements. This monster would occupy FOUR slots and require, what, a 600W PSU? (ok, just kidding, "only" 460W should be enough)
  • by Chas ( 5144 ) on Monday June 28, 2004 @09:09AM (#9549981) Homepage Journal

    1: Resort to idiotic 3DFX-like measures to get high performance

    Note: A 77% increase in gaming performance isn't "high performance". Considering that the 6800 is ALREADY a massive leap forward over its predecessor, it's INSANE PERFORMANCE!

    How would something like 1600x1200 with maxed FSAA and maxed AF, while never dropping below 60fps, grab you by the short and curlies?

    2: Watch company slowly die.

    Nobody's suggesting that everyone and their brother run out and get SLI'd GeForces on a Xeon platform. (Those already spending 4-5000 dollars on such a platform aren't necessarily going to shrink from an additional $4-500, especially if it nearly doubles video performance.)

    This is going to probably be limited to those who'd normally use Quadro cards (productivity) and the elite few with more money than sense.

    Not that everyone won't WANT one...

  • NDA Leak. (Score:2, Insightful)

    by Anonymous Coward on Monday June 28, 2004 @09:11AM (#9549998)

    All great news.. but WHEN can I find it available in the stores, that would be NEWS.

    Dude, this is an NDA leak! If you're trying to imply nVidia is peddling vaporware, well, you might be right, but in this case they're actually not the ones doing the peddling, because their SLI setup is still under NDA.

  • Re:ALX (Score:3, Insightful)

    by kawaichan ( 527006 ) on Monday June 28, 2004 @09:12AM (#9550005) Homepage
    dude, alienware basically is using nvidia's SLI method for their alx boxes

    noticed that they were using two 6800s for their benchmarks?
  • by mustardayonnaise ( 685416 ) on Monday June 28, 2004 @09:14AM (#9550028)
    John Carmack said about a year and a half ago that Doom 3 would run 'well' on a top-end system of that time - which was a 3.06 GHz P4 equipped with a Radeon 9700 Pro. What's frightening/upsetting is that this SLI setup really isn't coming into play to satisfy the games of today like Doom 3 - it's coming into play for the games of next year and the year after. It's just a little off-putting that in order to play the newest games you need a SET of graphics cards with those kinds of power and space requirements.
  • by GuyinVA ( 707456 ) on Monday June 28, 2004 @09:14AM (#9550030)
    If you wait a couple of months after its release, you can probably save 50%. It's just another graphics card that will be outdated in a few months.
  • by Kjella ( 173770 ) on Monday June 28, 2004 @09:15AM (#9550035) Homepage
    ...priorities. If gaming is your life (or if you're a working man with a gaming fix), two of these aren't that "extreme". People easily spend $10k+ more on a car than on one that'd get them from A to B just as safely and easily, just for style and more luxury.

    If gaming is what you do for a considerable number of hours of your life, why not? Even as a student, a few weekends without getting completely wasted (and maybe an hour or two of work as a weekend extra), and you'd have it.

    All that being said, from what I saw with the last cards it looked to me like GPU speed was starting to go beyond what conventional monitors and CPUs could do. And those really huge monitors are usually far more expensive than the GFX cards, even two of them.

    2xGF6800 = 10000 NOK
    Sony 21" that can do 2048x1536 @ 86 Hz = 14000 NOK

    ...and that was the 3rd webshop I had to go to in order to actually find one of those - most now have some legacy 17" and 19" CRTs and the rest LCDs, which go no further than 1600x1200 (even at 21") and don't need an SLI solution.

    Personally, I'll probably stick to GF4600 until hell freezes over, I just don't manage to get hyped up on the FPS games anymore. I'd rather go with a HDTV + HD-DVDs, should they ever appear...

    Kjella
  • by Fizzleboink ( 700171 ) on Monday June 28, 2004 @09:26AM (#9550098)
    I seem to remember that one of these cards took up 2 slots, and needed a third just for good air flow. How much space are these going to take up? Also, just one of these bad boys needed something like 400-500W of power. What kind of power supply is needed for 2???
  • Xeons? (Score:5, Insightful)

    by ameoba ( 173803 ) on Monday June 28, 2004 @09:27AM (#9550105)
    Why would they design something like this and force it to use a Xeon?

    For starters, the Xeon is still stuck at a 533MHz FSB, limiting its performance. Add in the fact that they're ridiculously overpriced & most games show little to no performance improvement when running on an SMP system. A single P4 or Athlon64 will stomp the Xeon in almost all gaming situations.

    Of course, with this tech a ways away & there not really being any PCI-E motherboards on the market now that Intel's recalled them all, I guess they're betting on high-end enthusiast boards to ship with the second x16 slot by the time this thing is actually ready for market...

    Really, the biggest application for this kind of power that I can foresee would be game developers who want to see how well their games scale for next-gen video hardware...
  • by The Mad Duke ( 222354 ) on Monday June 28, 2004 @09:28AM (#9550110)
    Hanging in my closet is a "souvenir" from my last adventure with SLI: a Quantum3D Obsidian t-shirt. In my eagerness to own the latest and greatest graphics card, I paid 600 bucks up front to preorder this card, which was developed by a spin-off from 3dfx. The card shipped 6 weeks late, suffered from overheating since it crammed the components from 2 cards into a single PCI slot, and was soon equaled in performance by a simple pair of Voodoo 2 cards in adjacent slots. I expect a similar fate for this monstrosity, since the GeForce 6800 pulls what, 75 watts? I assume a 500 watt power supply will be required. Thanks, but no thanks.
  • by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Monday June 28, 2004 @09:31AM (#9550143) Homepage
    Which begs the question, who is this aimed at?

    Well, I bet the developers of the beautiful Unreal Engine 3 [unrealtechnology.com] are using this. Current hardware can't run it at very playable framerates. I remember them saying you'll need 2GiB of RAM to play it maxed out.
  • by flsquirrel ( 115463 ) on Monday June 28, 2004 @09:32AM (#9550146)
    Ok, I'm not too much older than you if you're still in college, but I'm going to play old curmudgeon anyway.

    You can put together a decent computing solution now for around a grand. Kick in another $250 for the good workstation board you need to get the right slots, and say $600 ($300 x 2) for the two cards, and you're still just under $2000. THIS IS CHEAP. I'm sorry. I know how many lawns I had to mow as a youngin to buy my first Pentium 60. That was $2k for JUST the computer and monitor. That included a baseline 1 MB video card, no CD-ROM and no sound. The CD-ROM and sound card cost me another $400 a couple months later.

    So cry me a freaking river. Get a weekend job. Stop spending so much money on booze. If this is a priority for you, then you'll find the money. If it's not a priority, then quit your pissing and moaning.
  • Slightly O/T (Score:2, Insightful)

    by baudilus ( 665036 ) on Monday June 28, 2004 @09:32AM (#9550150)
    Sony 21" that can do 2048 x 1536/86 Hz

    Every serious gamer knows that 86Hz is unacceptable. True gamers know: CRT > LCD / PLASMA. Until you can find me a plasma that can refresh at 125Hz or greater, I'll stick with my 80lb. CRT.

    Any gamer extreme enough to buy two of these cards plus the requisite hardware should be smart enough to know that a flat panel is a waste of money for games. Then again, they are gamers...
  • by real_smiff ( 611054 ) on Monday June 28, 2004 @09:33AM (#9550159)
    no you won't, by the time these games appear there'll be a single card with more power than these. that's how it usually goes anyway. i've been bitten by "2 of anything" (CPUs, video cards) bad value before.
  • by Doppler00 ( 534739 ) on Monday June 28, 2004 @09:39AM (#9550190) Homepage Journal
    Performance is not just about "screen resolution". Many people would like to turn on more details in games. In some cases, those details could give you a tactical advantage. For example, turning shadows on. Sure, it wastes some CPU time, but if you have it turned on, you might spot someone around the corner that you would not have otherwise.
  • Well, duh. (Score:1, Insightful)

    by Anonymous Coward on Monday June 28, 2004 @09:39AM (#9550192)

    Does using 2 video cards really help when we can't use what's already there?

    Ummm.. yes?!

    Video cards use bus bandwidth. If we, as you point out, have bus bandwidth to spare, then one solution to using it up would be to put in more cards.

    Come on, this isn't exactly rocket science at this'a'here news for nerds site!

  • Sounds good to me (Score:5, Insightful)

    by not_a_product_id ( 604278 ) on Monday June 28, 2004 @09:43AM (#9550215) Journal
    No point in complaining. Let the folk rich enough (stupid enough?) to afford it, buy it. Either it just won't take off (in which case you've saved yourself a load of cash) or it'll go great, the price will drop, the bugs will be ironed out and you'll get it at a price you can afford.
    What is there to complain about?
  • Re:When (Score:1, Insightful)

    by OptimoosePrime ( 612749 ) on Monday June 28, 2004 @09:48AM (#9550254)
    He didn't say he couldn't afford it. In fact, he said quite the opposite. He wanted to know *when* *he* *could* buy it. That suggests that he has the money and is only waiting for the goods. And do you think it wouldn't be newsworthy when cancer meds actually hit the pharmacy shelves?

    Cancer Patient at the Pharmacy: "What the....? When did you get cancer medicine? I've got like 2 weeks to live!"
    News Anchorman who also happens to be in the Pharmacy: "Well, we told you last year that they were *making* new medicines."

    Go have another 420 dave.
  • Cost? (Score:4, Insightful)

    by Watcher ( 15643 ) on Monday June 28, 2004 @09:51AM (#9550283)
    So, lemme get this straight: in order to get a 77% speed increase, I'm going to have to blow hundreds on a second card ($400), a Xeon processor, motherboard, memory, and a damned good cooling system so it all doesn't melt and I don't go deaf? Wouldn't it make more sense to buy a decent card now and wait two years for them to put out the single-GPU card that delivers the same performance for $200? Unless you're really worried about dropping under 100 frames, or you have a lot of high-end rendering to do, I can't imagine this really being worth it. At least with the Voodoo 2 SLI system you could buy a second card without having to invest in a huge honking system that makes a dual G5 look cheap.
  • by telstar ( 236404 ) on Monday June 28, 2004 @10:10AM (#9550457)
    "I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you."

    "On early driver revisions which only offered non-optimized dynamic load-balancing algorithms their SLI configuration performed 77% faster than a single graphics card. However Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card."
    • I wouldn't consider 77% or 90% to be marginal.
  • by Phrogger ( 230179 ) on Monday June 28, 2004 @10:14AM (#9550483) Homepage
    > Which begs the question, who is this aimed at?

    It will be aimed at the hardware reviewers. The resurrection of SLI will get back Nvidia's ranking as Number 1 for high performance video. I would imagine a few gamers with more money than they need will also setup a dual Nvidia system but the primary audience will be those who publicise performance ratings.
  • by Zebbers ( 134389 ) on Monday June 28, 2004 @11:10AM (#9550969)
    Please don't get your education on Slashdot.

    While in logic the phrase does indeed mean what you say, in everyday speech it is acceptable to beg an obvious question.

    This has been supported by Oxford's Guide to English Usage since the '80s.
  • by celerityfm ( 181760 ) on Monday June 28, 2004 @12:26PM (#9551627) Journal
    And believe it or not I wish I knew a more pithy way of saying that. Shi, is that even grammatically correct? Anyways-

    This entire thing is really an exercise in besting the competition - in this case, ATI. nVidia badly needs to be able to say they're #1, because ATI stole the crown and hasn't really given it back just yet. This whole mess is really just an exercise in brand building/marketing: getting people to think of nVidia as the best. It's aimed at those who can only afford one card, trying to influence their overall feeling for nVidia.

    nVidia's 3dfx property didn't have much competition back when it first introduced the Voodoo SLI... ah, how times have changed.
  • by quantax ( 12175 ) on Monday June 28, 2004 @12:34PM (#9551686) Homepage
    Given that this configuration requires a Xeon based system w/ the dual PCI-E slots, this seems geared more towards the 3D development end of things, with Maya, Softimage and such. I've yet to meet a gamer-only with a Xeon rig, so this would seem to be a boon for the new Gelato systems, allowing for more GPU power. I just hope Nvidia doesn't end up emulating 3DFx's later moves in which it decides raw speed > innovation, as that is not really a winning strategy, especially these days where we're on the brink of a new age of gaming graphics using advanced shading techniques previously only seen in pre-rendered footage.
  • by Have Blue ( 616 ) on Monday June 28, 2004 @12:49PM (#9551817) Homepage
    If you're getting 120fps and this rig pushes it up to 240fps, the increase in your score and enjoyment of the game is far smaller than if the jump was from, say, 20fps to 40fps. No matter how hardcore you are, there's always an upper limit beyond which the additional frames simply don't matter, and this combination will push any game on the market well beyond it.
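    A quick back-of-the-envelope calculation (my numbers, not from the comment above) makes the diminishing-returns point concrete: what actually matters is the time saved per frame, and a 2x speedup at 120 fps buys far less of it than the same 2x at 20 fps.

```python
# Latency saved per frame by doubling the frame rate, at high vs low fps.
# Frame time in milliseconds is simply 1000 / fps.

def frame_time_ms(fps):
    return 1000.0 / fps

# Doubling 120 fps -> 240 fps shaves ~4.2 ms off each frame...
saved_high = frame_time_ms(120) - frame_time_ms(240)

# ...while doubling 20 fps -> 40 fps shaves a full 25 ms off each frame.
saved_low = frame_time_ms(20) - frame_time_ms(40)
```

    The same nominal "2x" is a roughly 25 ms improvement at the low end but only about 4 ms at the high end, well below what anyone can perceive, which is the comment's point about an upper limit beyond which extra frames don't matter.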
  • by Have Blue ( 616 ) on Monday June 28, 2004 @12:55PM (#9551865) Homepage
    Yeah because cutting off a site's primary revenue stream is a great way to ensure they continue to provide free content for you.
  • by IncohereD ( 513627 ) <mmacleod@ieee.org> on Monday June 28, 2004 @01:51PM (#9552448) Homepage
    Yeah because cutting off a site's primary revenue stream is a great way to ensure they continue to provide free content for you.

    Just because you don't see them doesn't mean you don't download them. And if you never click on them anyway...there's no difference to their revenue.
  • by timeOday ( 582209 ) on Monday June 28, 2004 @02:28PM (#9552872)
    I doubt even NVidia expects to sell any of these. It's just the cheapest way to get the name "NVidia" to the top of the hardware review sites' benchmark bar charts.
  • by cr0sh ( 43134 ) on Monday June 28, 2004 @06:10PM (#9555292) Homepage
    External rendering boxes are nothing new - Evans and Sutherland have been doing this for years with their high-end OpenGL rendering hardware for simulation use. I don't know what their current hardware uses (or if it is still external), but they used to use a SCSI-based bus system...
  • by Fweeky ( 41046 ) on Monday June 28, 2004 @07:07PM (#9555788) Homepage
    Get two moderately priced cards, SLI them together, and get much better performance than a single high-end card for around the same price. Plenty of people pay that for a mere 10% performance difference; what makes you think they won't pay it for 70%+?

    Not all geeks are poor, and not all poor geeks are beyond saving up and spending a large amount of their income on what interests them.
