Nvidia Reintroduces SLI with GeForce 6800 Series
An anonymous reader writes "It's 1998 all over again, gamers: a major release from id Software and an expensive hotrod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom3 and Nvidia SLI.
Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show an increase closer to 90% over that of a single graphics card. There are a few things to take into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also mean the more expensive Intel Xeon processors. Secondly, you'll need two identical (same brand and type) PCI-E GeForce 6800 graphics cards.'"
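For the curious, here's a minimal sketch of what a dynamic load-balancing algorithm like the one described might look like, assuming a split-frame scheme where a horizontal divider shifts toward whichever card finished its portion faster. All names and numbers are illustrative guesses, not Nvidia's actual driver logic.

```python
# Hypothetical sketch of dynamic load-balanced split-frame rendering.
# GPU 0 renders the top portion of the frame, GPU 1 the bottom; the
# divider moves a little each frame toward the faster card.

def rebalance(split, t_gpu0, t_gpu1, step=0.02):
    """Shift the screen split toward the GPU that finished sooner."""
    if t_gpu0 > t_gpu1:
        split -= step          # top half took longer: shrink it
    elif t_gpu1 > t_gpu0:
        split += step          # bottom half took longer: shrink it
    return min(max(split, 0.1), 0.9)  # keep both cards busy

def simulated_timings(split):
    # Pretend GPU 0 is 20% slower, so its share should settle below 50%.
    return split * 1.2, (1.0 - split) * 1.0

split = 0.5                    # start with a 50/50 top/bottom split
for _ in range(30):
    t0, t1 = simulated_timings(split)
    split = rebalance(split, t0, t1)
print(f"converged split: {split:.2f}")   # settles near 0.45
```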
Damn (Score:4, Funny)
Re:Damn (Score:4, Funny)
Now the option actually exists for me to play Doom3 on one of those very high-res LCDs, if only I had the balls to mortgage the house for one of these setups.
Thank you, nvidia, now I can dream again of something I'll never touch! Bugatti Veyron, Liv Tyler, Fort Knox; what would life be like without the pleasure of the untouchable dream?
TW
Re:Damn (Score:4, Informative)
Don't worry about dual PCIe x16 motherboards... remember, nVidia makes chipsets as well. Expect an nForce4 chipset at the end of this year supporting their new SLI technology with two PCIe x16 slots [theinquirer.net].
Of course, when you are spending $400 apiece on graphics cards, will you really be skimping on the processor and motherboard?
Hmmm... (Score:4, Funny)
But perversely exhilarating to hold an SLI configuration in my hands instead...
Re:Hmmm... (Score:3, Funny)
Please daddy don't trade me for some extra FPS. Everyone knows the human eye can't tell the difference.
Shut up, you little brat. I can tell, and it ruins my game, man!
For Rich Folks Only (Score:5, Interesting)
I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you.
As for myself, I am a poor college student, not even able to afford one of these cards, a situation I think is shared by a lot of other geeks/gamers.
Which begs the question, who is this aimed at?
Re:For Rich Folks Only (Score:4, Insightful)
My Voodoo 2 SLI Story (Score:5, Interesting)
Think of it as a way to nearly double your video card's performance at a fairly cheap price, while others are upgrading to the new version of the card that is only 40-50% faster (unlike SLI mode, which is rumored to be 75-90% faster).
The tricky part is that you have to have a motherboard that supports it, which for now means only the ones made for high-end workstations.
Re:My Voodoo 2 SLI Story (Score:5, Informative)
There are very blatant reasons why SLI killed 3DFX as a company. And yes, their downfall began with the Voodoo 2.
The Voodoo 2 with SLI was so incredibly fast that they had no competitor; every graphics company in the game was making 3DFX-based cards, and they were all reference designs (with the exception of Canopus and Obsidian).
All those players and 3DFX themselves overestimated demand for their extremely high-priced product. Even worse, they overestimated demand for the SLI add-on at the $300 price point. 3DFX was losing a lot of sales because they didn't have a competitive low-end product until the Banshee, and by then Nvidia had made quite a dent in their market share.
All the vendors who used the reference design got bit in the ass once more when the market discovered you could mix Voodoo 2 cards in SLI, meaning you could buy ANY card (read: the cheapest), so all those late upgraders got a sweet deal.
While I don't see SLI destroying Nvidia (they have the diversified product line that 3DFX was lacking), I do expect it to blow up in their faces and lose them money in the long run. The market couldn't bear a $600 graphics solution in 1998; what makes them think it can handle a $900 solution six years later?
Re:For Rich Folks Only (Score:2, Insightful)
Never underestimate... (Score:5, Insightful)
If gaming is what you do for a considerable number of hours of your life, why not? Even as a student, it'd only take a few weekends of not getting completely wasted (and maybe working an hour or two as a weekend extra), and you'd have it.
All that being said, from what I saw with the last generation of cards, it looked to me like GPU speed was starting to outrun what conventional monitors and CPUs could keep up with. And those really huge monitors are usually far more expensive than the graphics cards, even two of them:
2xGF6800 = 10000 NOK
Sony 21" that can do 2048 x 1536/86 Hz = 14000 NOK
Personally, I'll probably stick with my GF4600 until hell freezes over; I just don't manage to get hyped up about FPS games anymore. I'd rather go with an HDTV + HD-DVDs, should they ever appear...
Kjella
Slightly O/T (Score:2, Insightful)
Every serious gamer knows that 86Hz is unacceptable. True gamers know: CRT > LCD / PLASMA. Until you can find me a plasma that can refresh at 125Hz or greater, I'll stick with my 80lb. CRT.
Any gamer extreme enough to buy two of these cards plus the requisite hardware should be smart enough to know that a flat panel is a waste of money for games. Then again, they are gamers...
Re:Slightly O/T (Score:3, Informative)
I think he was talking about a CRT. LCDs aren't capable of rendering even 86 frames per second.
However, if you want the absolute highest resolution, a 3840x2400 LCD may be the way to go.
More O/T (Score:3, Funny)
This equation just made me laugh. Like...if plasma is less than one, just think of how much smaller LCD has to be than CRT!
OK, I'm done.
Re:Never underestimate... (Score:5, Insightful)
Re:Never underestimate... (Score:3, Funny)
"begs the question" (Score:2, Informative)
I recently learned this here, so please don't take this as a criticism.
The phrase "begs the question" doesn't mean what you think it means. It does not mean, "this leads to the question."
Rather, it is a term from logic for a fallacy in which a statement assumes the truth of the very thing it is trying to prove. This is commonly known as circular reasoning. More here [nizkor.org].
I agree with you about wondering who the product is aimed at.
Re:For Rich Folks Only (Score:3, Insightful)
Well, I bet the developers of the beautiful Unreal Engine 3 [unrealtechnology.com] are using this. Current hardware can't run it at very playable frame rates. I remember them saying you'll need 2GiB of RAM to play it maxed out.
Re:For Rich Folks Only (Score:5, Insightful)
You can put together a decent computer now for around a grand. Kick in another $250 for the workstation board to get the right slots, and say $600 ($300 x 2) for the two cards, and you're still just under $2000. THIS IS CHEAP. I'm sorry. I know how many lawns I had to mow as a youngin to buy my first Pentium 60. That was $2k for JUST the computer and monitor, and it included a baseline 1 MB video card, no CD-ROM and no sound. The CD-ROM and sound card cost me another $400 a couple of months later.
So cry me a freaking river. Get a weekend job. Stop spending so much money on booze. If this is a priority for you, then you'll find the money. If it's not a priority, then quit your pissing and moaning.
Re:For Rich Folks Only (Score:5, Informative)
For reference (not just for you), PCI-X is PCI on steroids: a faster, wider (64-bit) edition of the PCI bus, used in high-end servers and the Apple Power Mac G5.
PCIe (aka PCI Express, aka 3GIO) is the brand-new multi-lane serial expansion bus that will be appearing on consumer-level motherboards in the next few months and will eventually replace both AGP and PCI.
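For rough context, here's the back-of-the-envelope bandwidth math for the buses involved (peak figures only; the PCIe numbers are per direction and ignore all protocol overhead except the 8b/10b encoding):

```python
# Peak-bandwidth napkin math for the buses being discussed.
pci       = 33e6 * 4              # 33 MHz x 32 bits (4 bytes)  ~ 133 MB/s
pcix_133  = 133e6 * 8             # 133 MHz x 64 bits (8 bytes) ~ 1.06 GB/s
pcie_lane = 2.5e9 * (8 / 10) / 8  # 2.5 Gb/s serial, 8b/10b coded ~ 250 MB/s
pcie_x16  = pcie_lane * 16        # ~ 4 GB/s in each direction

for name, bw in [("PCI 32/33", pci), ("PCI-X 133", pcix_133),
                 ("PCIe x16", pcie_x16)]:
    print(f"{name:>9}: {bw / 1e9:.2f} GB/s")
```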
Sounds good to me (Score:5, Insightful)
Re:Sounds good to me (Score:4, Insightful)
Re:For Rich Folks Only (marginal?) (Score:3, Insightful)
"On early driver revisions which only offered non-optimized dynamic load-balancing algorithms their SLI configuration performed 77% faster than a single graphics card. However Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card."
Re:For Rich Folks Only (Score:3, Insightful)
It will be aimed at the hardware reviewers. The resurrection of SLI will win back Nvidia's ranking as number one in high-performance video. I would imagine a few gamers with more money than they need will also set up a dual Nvidia system, but the primary audience will be those who publicize performance ratings.
Re:For Rich Folks Only (Score:3, Insightful)
Not all geeks are poor, and not all poor geeks are beyond saving up and spending a large amount of their income on what interests them.
ATI X800 advertisement (Score:3, Funny)
Re:ATI X800 advertisement (Score:2, Funny)
Re:ATI X800 advertisement (Score:5, Funny)
Yes, because you are the only one not using Flash Blocker or AdBlock in your Moz/Firefox. *rolls up newspaper and lightly smacks you on head* "Bad web surfer! Bad!"
Re:ATI X800 advertisement (Score:4, Insightful)
Re:ATI X800 advertisement (Score:3, Insightful)
Just because you don't see them doesn't mean you don't download them. And if you never click on them anyway...there's no difference to their revenue.
Just a band aid.. (Score:5, Interesting)
... till we have multi-core and/or multi-GPU consumer cards (they're already available [darkcrow.co.kr] at the high end).
Questionmark.
Re:Just a band aid.. (Score:2, Informative)
I have an Indigo2 with MaxIMPACT graphics. It has 2 Geometry Engines and 2 Raster Managers, and I believe each set handles a different scan line. Because it is done entirely in hardware, MaxIMPACT is twice as fast as a single GE/RM board like HighIMPACT.
I believe that ATI's modern GPUs have been designed to work in parallel (up to 32 chips?). It's very cool to see a card using 4 R300s.
SGI is starting...
Voodoo 5 had up to 4 (Score:3, Informative)
Time to... (Score:2, Funny)
is nvidia seeming more and more.. (Score:5, Interesting)
maybe they shouldn't have... sure, they probably had some great people and so on, but ultimately "it didn't work out".
"hey, we can't keep up! let's just use brute force on increasing our cards capabilities!!! that's cheap and economical in the long run keeping our company afloat, right? right??"
Re:is nvidia seeming more and more.. (Score:3, Informative)
If you remember the last days of 3dfx, what they were selling was more expensive, slower, had a lower resolution, and had a distinctly washed-out look compared to comparable Nvidia parts. In fact, I remember convincing several people at a LAN party to dump their Voodoo 2 cards for the TNT, because although the frame rate...
Now, the question becomes... (Score:2, Interesting)
Re:Now, the question becomes... (Score:5, Informative)
Yes, it's
Re:Now, the question becomes... (Score:5, Informative)
Ohh I see... (Score:2, Funny)
Oh wait...
Power Requirements (Score:5, Interesting)
Re:Power Requirements (Score:2, Informative)
Re:Power Requirements (Score:3, Informative)
Re:Power Requirements (Score:5, Informative)
Cooling Requirements? (Score:5, Interesting)
Alienware took a very different tack with their solution [pcper.com], because it requires a 3rd PCI slot AND it's analog (3rd & 4th pics). I guess it's a series of tradeoffs, space vs. flexibility, with Nvidia winning the battle for space but losing on flexibility.
That aside, it's ridiculous that nvidia is expecting their OEM cooling solutions to do any kind of justice to the heat from those cards. Alienware already expects water cooling to be part of the solution and has cases designed accordingly... couldn't NVIDIA have done it any other way? Do they absolutely have to have a hardware link between their cards?
"A power draw of 250 Watts for the 6800 Ultra SLI solution is very realistic."
Then explain how this will work [tomshardware.com].
Re:Power Requirements (Score:3, Funny)
Nah. The reason why the new graphics cards run so hot is that they're self-powered. Each carries its own RTG [doe.gov].
As long as you have a lead-lined case and follow local, state, and federal ordinances regarding disposal of nuclear materials--then you should be fine.
Glad I could clear that up.
New Motherboards (Score:3, Interesting)
At any rate, doesn't this sort of make the whole Alienware Video-Array seem like a bust?
This raises the question: (Score:2, Interesting)
Re:This raises the question: (Score:2)
a 16-lane PCIe slot is about 4GB/sec of bandwidth in each direction...
I bet a high-resolution FX card would use most of that. But then again, they probably use PCIe x16 because PCIe x4 would be far too little.
Re:This raises the question: (Score:4, Interesting)
Reliability (Score:5, Insightful)
Other than that, the only problem I can see is that you need about AU$2000 worth of video cards, and at least AU$1000 worth of Xeon, to use it. Maybe for engineers and artists, but will the average person have any use for it? I don't feel an extra AU$3000 is worth it for the extra frame rate in games.
For the pros, though, it would be very good.
Re:Reliability (Score:4, Funny)
Look on the bright side; most Xeon systems already have the second PSU that you are going to need to power the extra card and turbofan-based cooling system.
Re:Reliability (Score:5, Interesting)
Re:Reliability (Score:3, Funny)
Get out of here, you heathen...
ALX (Score:3, Interesting)
Re:ALX (Score:3, Insightful)
noticed that they were using two 6800s for their benchmarks?
Re:ALX (Score:2)
This, nvidia's own solution, doesn't really seem like Alienware's.
Re:ALX (Score:2)
4 slots (Score:5, Insightful)
And I'm also wondering how the heat is going to be transferred away from the cards. It looks like you'll need a serious cooling setup to keep those two babies running.
Bah... (Score:5, Insightful)
Plus, many people were upset about power and cooling requirements. This monster would occupy FOUR slots and require, what, a 600W PSU? (ok, just kidding, "only" 460W should be enough)
Re:Bah... (Score:2)
You can also look at it as a means of obtaining what was previously Quadro-only class capability, which is awesome for all of us Maya and 3ds Max hobbyists.
At any rate, if you don't like it, don't buy it! For the rest of us, it's xmas early.
Re:Bah... (Score:5, Informative)
So I wonder how you can calculate the rating needed for the PSU?
Wonder no longer! Power Supply Article [firingsquad.com]
Well, the power rating... (Score:2)
Hard disks are not a major draw of current anyway; I checked Seagate's website, and they draw 13-14W at max each. It's the 100W+ CPUs, the 100W+ GPUs, and the cooling systems that make up most of it. Each PCI slot also adds a lot to the requirement, since each can draw a lot (even if they fairly rarely do; it depends on the type).
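To make that concrete, here's a rough PSU budget using only the figures floating around this thread (the big numbers are the thread's estimates; the small items are my own guesses, not measured values):

```python
# Rough PSU budget from the numbers quoted in this discussion.
draws_watts = {
    "CPU": 100,               # "100W+ CPUs"
    "two GPUs (SLI)": 250,    # "250 Watts for the 6800 Ultra SLI solution"
    "two hard disks": 28,     # ~13-14 W each, per Seagate's specs
    "fans and cooling": 30,   # guess
    "board, RAM, misc": 60,   # guess
}
total = sum(draws_watts.values())
psu = total / 0.8             # don't load the PSU past ~80%
print(f"total draw ~{total} W -> shop for a ~{psu:.0f} W PSU")
```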
Buy all the hardware we want (Score:2)
If I remember correctly... (Score:3, Insightful)
Re:If I remember correctly... (Score:2, Insightful)
POWER requirements (Score:2)
Only Nvidia? (Score:3, Interesting)
Currently... (Score:2, Funny)
How many of you are there hitting refresh just to see the hit counter go up?
SLI (Score:5, Informative)
SLI stands for Scan Line Interleave [gamers.org].
The article says otherwise: (Score:4, Informative)
Indeed, it uses a top/bottom 50/50 split for rendering rather than per-line interleaving.
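The practical difference between the two schemes is just which scanlines each card owns. A toy illustration (hypothetical, for a 1200-line display):

```python
# Toy comparison of the two work-splitting schemes both named "SLI".
HEIGHT = 1200

def owner_interleave(line):
    """3dfx-style Scan Line Interleave: the cards alternate lines."""
    return line % 2

def owner_split(line, split_line=HEIGHT // 2):
    """Split-frame rendering: card 0 gets the top, card 1 the bottom.
    The split point can move between frames for load balancing."""
    return 0 if line < split_line else 1

print([owner_interleave(l) for l in range(6)])    # [0, 1, 0, 1, 0, 1]
print([owner_split(l) for l in (0, 599, 600)])    # [0, 0, 1]
```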
Insane Requirements? (Score:2, Insightful)
My question... (Score:3, Informative)
Xeons? (Score:5, Insightful)
For starters, the Xeon is still stuck at a 533MHz FSB, limiting its performance. Add in the fact that they're ridiculously overpriced and that most games show little to no performance improvement when running on an SMP system. A single P4 or Athlon64 will stomp the Xeon in almost all gaming situations.
Of course, with this tech a ways away and there not really being any PCI-E motherboards on the market now that Intel's recalled them all, I guess they're betting on high-end enthusiast boards shipping with the second x16 slot by the time this thing is actually ready for market...
Really, the biggest application for this kind of power that I can foresee would be game developers who want to see how well their games scale on next-gen video hardware...
Won't Get Fooled Again (Score:2, Insightful)
Screw Xeon, go for a G5! (Score:2)
Usually the cost of G5s is a bit too steep, but compared to Xeons they are an absolute steal!
And you get a cool machine/OS to boot!
Re:Screw Xeon, go for a G5! (Score:5, Informative)
PCI Express is denoted by some of the following: PCIe, PCI-E, PCI-Ex, PCI-Express
PCI-X is just PCI with higher throughput, thanks to a higher clock rate among other things. It kinda sucks that they ever settled on PCI-X as the name; it now causes confusion on a mass scale.
Flamebait, ATI loving, and nvidia bashing? (Score:5, Informative)
This is going to be great when it matures, and it is one of the huge advantages of PCI-Express when that becomes the standard over AGP on future motherboards. Yes, I know Intel is making motherboards with this, but who the hell wants to pay all that money for such a small jump?
Since people seem to be lost on the nvidia cards, here's a rundown of what they are releasing and the price range:
$300 - nvidia 6800
$400 - nvidia 6800 GT
$500 - nvidia 6800 Ultra
$600+ - nvidia 6850/6800 Ultra Extreme
The 6800 and GT are single-slot cards with a single Molex connector. Those can be used in an SLI configuration as well. Get the facts straight before you post flamebait and troll.
Cost? (Score:4, Insightful)
98 ? NOT ! (Score:3, Funny)
No, in '98 I had a great job and salary.
Alienware beat them to the framerate punch... (Score:3, Informative)
Alienware purchased a former 3dfx licensee who had outstanding patents on some of their own SLI tech. Alienware has wisely furthered the research and will be marketing it soon. And it doesn't require a Xeon processor...
Here's the press release:
http://www.alienware.com/press_release_pages/pr
Makes sense - they bought 3DFX's technology (Score:4, Interesting)
I bought a shitload of 3DFX stock back in the late 90s because they were the king of 3D. I remember walking into a computer store and seeing something on the screen... I thought it was a clip from a movie, but they told me it was Mechwarrior 2 (I think 2) playing on a Voodoo card. My mind was blown. How they got movie-like graphics onto a computer was beyond my capacity to understand. I dropped the $350 and bought one immediately, played with it, and loved it.
Then, after a while, I thought, 3DFX is the king and they will never die. I put my money where my mouth was and forked over my entire savings, around $15k, to buy 3DFX stock. Therein I learned a few great lessons:
1) The best technology doesn't mean the best company. "Good enough" with a better run company will usually blow you away. Ask Microsoft or nVidia (well, at the time nVidia wasn't the top runner that it is today).
2) No matter how great an explanation you make, the stupidest things, like 16-bit color vs. 32-bit color, can kill you (22-bit color just doesn't cut it for the dumb-ass consumers). It's better to just cross your t's and dot your i's in the first place so that you don't have any such vulnerabilities.
They went tits up, and I basically lost my money. nVidia bought the remaining pieces of 3DFX, and that includes all their patents. I'm not surprised they went SLI, and for companies that use it, like 3D effects houses, it will probably save bundles of time.
Obviously not a gamer's market (Score:3, Insightful)
Re:When (Score:5, Insightful)
NDA Leak. (Score:2, Insightful)
All great news... but WHEN can I find it available in stores? That would be NEWS.
Dude, this is an NDA leak! If you're trying to imply nVidia is peddling vaporware, well, you might be right, but in this case they're actually not the ones doing the peddling, because their SLI setup is still under NDA.
BUY THEM HERE (Score:2, Funny)
Re:SLI? (Score:2, Informative)
Scan Line Interleave: every other line of the screen is drawn by the other graphics card.
Re:SLI? (Score:5, Informative)
Re:SLI? (Score:3, Informative)
Both are specified in the article. They really are confusing the issue more than necessary.
from the article:
in something called an SLI, Scan Line Interleave, configuration.
and then:
Both 6800 series PCI-E cards are connected by means of a SLI, Scalable Link Interface, dubbed the MIO port, a high-speed digital interconnect
Removing the bumf, however, leaves the following definition of SLI:
"Buy 2 cards so you can do the same job as an ATI".
note: I'm only jealous, I made
Re:SLI? (Score:2)
The Scalable Link Interface is, according to the article, what the two cards use to communicate. nVidia calls their port MIO. I think the correct acronym was Scan Line Interleave mode, so the article might have gotten the acronym wrong. But I remember how 3dfx used SLI to connect two Voodoos together. It's also all there in the article...
Re:Math experts (Score:5, Informative)
The lower-than-100% increase reflects the fact that the cards aren't working together fully. As they said, it's still early days, and they expect to get that figure nearer to 90%.
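Put as plain arithmetic (using only the quoted figures): a 77% speedup means the pair delivers 1.77x a single card's throughput, against an ideal of 2.0x for two cards.

```python
# Scaling efficiency implied by the quoted SLI numbers.
sli_now    = 1.77    # "77% faster than a single graphics card"
sli_target = 1.90    # "closer to 90%" once the drivers mature
ideal      = 2.00    # perfect scaling with two cards

print(f"current efficiency: {sli_now / ideal:.1%}")    # 88.5% of ideal
print(f"target efficiency:  {sli_target / ideal:.1%}") # 95.0% of ideal
```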
Re:Math experts (Score:2)
77% more performance because the drivers are not yet optimized, and because I'd assume the tests were done on the same machine.
Not all of the processing is done on the video card; what makes you think that another one will take all the load off the CPU?
Re:Math experts (Score:2)
Re:Math experts (Score:3, Informative)
Of course, that is a terrible oversimplification. There are cases in which two CPUs are actually slower than one, notably SMP Pentium 1 chips that had the L2 cache on the motherboard.
Re:Math experts (Score:4, Funny)
200% more expense on said GPU.
And 200% more problems calculating percentages!
Re:nvidia is going under! (Score:5, Insightful)
1: Resort to idiotic 3DFX-like measures to get high performance
Note: A 77% increase in gaming performance isn't just "high performance". Considering that the 6800 is ALREADY a massive leap forward over its predecessor, it's INSANE PERFORMANCE!
How would something like 1600x1200 with maxed FSAA and maxed AF, while never dropping below 60fps, grab you by the short and curlies?
2: Watch company slowly die.
Nobody's suggesting that everyone and their brother run out and get SLI'd GeForces on a Xeon platform. (Those already spending $4000-5000 on such a platform aren't necessarily going to shrink from an additional $400-500, especially if it nearly doubles video performance.)
This is probably going to be limited to those who'd normally use Quadro cards (productivity) and the elite few with more money than sense.
Not that everyone won't WANT one...
Re:Not true SLI (Score:3, Informative)
For nVidia, it means "scalable link interface", according to this article.
It's not trying to be the same thing, but it is exploiting the brand/trademark nVidia acquired from 3dfx.
That's hilarious (Score:3, Interesting)
Their target market is apparently "you": you're just in the wrong place in your cycle. Right now, you're in the sour-grapes phase, denying the possibility that anyone could want a better computer than yours (they already do). Soon you'll be in the lust phase, then you'll be in the "MUST BUY SHINY THING! PLEASE TAKE CREDIT CARD!" phase.
I remember a time when it was hard to imagine who might need a 386.
Re:Does this really make much sense? (Score:3, Informative)
Aren't we to the point where CPU and (single) GPU power is high enough for just about any game without needing a SLI solution?
No, we aren't, and you're being a troll for suggesting that any advancement is "not needed." Maybe it's not desirable for you, but it is for someone.
Seems to me this SLI bit is only to induce a boner in the geekiest of geeks, and at a high price to boot.
And here y
Re:It's nice to be proven right (Score:3, Insightful)