Nvidia Reintroduces SLI with GeForce 6800 Series 432
An anonymous reader writes "It's 1998 all over again, gamers: a major release from id Software and an expensive hot-rod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom 3 and Nvidia SLI.
Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card. There are a few things that need to be taken into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also use the more expensive Intel Xeon processors. Secondly, you'll need two identical, same brand and type, PCI-E GeForce 6800 graphics cards.'"
Re:SLI? (Score:2, Informative)
Scan Line Interleave. Every other line of the screen is drawn by the other graphics card.
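A minimal sketch of that idea, with purely illustrative names (this isn't any real driver API):

```python
# Illustrative sketch of scan-line interleaving: two cards split the
# frame by alternating lines. Function and variable names are made up.
def assign_scanlines(height):
    """Map each scan line to the card (0 or 1) that draws it."""
    return [line % 2 for line in range(height)]

assignment = assign_scanlines(6)
print(assignment)  # card 0 draws even lines, card 1 draws odd lines
```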
Re:SLI? (Score:5, Informative)
Re:Math experts (Score:5, Informative)
The lower-than-100% increase reflects the fact that the cards aren't working together fully. As they said, it's still early days, and Nvidia expects to get that figure nearer to 90%.
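The scaling can be put in numbers; this is just arithmetic on the figures quoted in the article:

```python
# Two cards ideally double throughput, i.e. a 100% gain over one card.
# The quoted 77% and 90% gains correspond to these scaling efficiencies.
def scaling_efficiency(gain_pct, n_cards=2):
    ideal_gain = (n_cards - 1) * 100  # ideal gain over a single card
    return gain_pct / ideal_gain

print(scaling_efficiency(77))  # early, non-optimized drivers
print(scaling_efficiency(90))  # Nvidia's projected figure
```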
Re:SLI? (Score:3, Informative)
Both specified in the article. They really are confusing the issue more than required.
from the article:
in something called an SLI, Scan Line Interleave, configuration.
and then:
Both 6800 series PCI-E cards are connected by means of a SLI, Scalable Link Interface, dubbed the MIO port, a high-speed digital interconnect
removing the bumf however leaves the following definition of SLI:
"Buy 2 cards so you can do the same job as an ATI".
note: I'm only jealous, I made a booboo and bought an fx5900
Re:Math experts (Score:3, Informative)
Of course, that is a terrible oversimplification. There are cases in which two CPUs are actually slower than one, notably SMP P1 chips that had the L2 cache on the motherboard.
"begs the question" (Score:2, Informative)
I recently learned this here, so please don't take this as a criticism.
The phrase "begs the question" doesn't mean what you think it means. It does not mean, "this leads to the question."
Rather, it is a term used in logic to indicate a fallacy in which the question or statement itself tries to prove its truth by asserting its own truth. This is commonly known as circular reasoning. More here [nizkor.org].
I agree with you about wondering who the product is aimed at, though.
Re:Now, the question becomes... (Score:5, Informative)
Yes, it's
Re:ATI X800 advertisement (Score:1, Informative)
Actually I think you might be the only one not viewing the site in Mozilla / Firefox [mozilla.org] with adblock [mozdev.org] installed.
Look at all these great sites you could block:
I just wish adblock came with these as defaults :o)
Re:Power Requirements (Score:2, Informative)
Re:Bah... (Score:5, Informative)
So I wonder how you can calculate the rating needed for this PSU?
Wonder no longer! Power Supply Article [firingsquad.com]
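As a rough sketch of what such a calculation looks like, with purely illustrative wattages (see the linked article for measured numbers):

```python
# Back-of-the-envelope PSU sizing; every wattage here is a placeholder,
# not a measured figure.
components = {
    "cpu": 90,
    "motherboard_ram": 35,
    "gpu_1": 75,
    "gpu_2": 75,
    "drives_and_fans": 40,
}
total_draw = sum(components.values())  # worst-case simultaneous draw
recommended = int(total_draw * 1.3)    # ~30% headroom over peak draw
print(total_draw, recommended)
```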
SLI (Score:5, Informative)
SLI stands for Scan Line Interleave [gamers.org].
Re:is nvidia seeming more and more.. (Score:3, Informative)
If you remember the last days of 3dfx, what they were selling was more expensive, slower, lower-resolution, and had a distinctly washed-out look compared to comparable Nvidia parts. In fact, I remember convincing several people at a LAN party to dump their Voodoo 2 cards for the TNT: although the frame rate was much lower (sometimes by half), games were still playable, the performance hit for using higher resolutions was greatly reduced, and reaching a resolution like 1024x768 or even 1280x1024 didn't require an additional card. That meant I could snipe from a further distance with more precision. Not to mention the clarity of those 32-bit textures; my god, I shudder at the thought of going back to 16-bit banding hell.
My question... (Score:3, Informative)
Re:Just a band aid.. (Score:2, Informative)
I have an Indigo2 with MaxIMPACT graphics. It has two Geometry Engines and two Raster Managers; I believe each set handles a different scan line. Because it is done entirely in hardware, MaxIMPACT is twice as fast as a single GE/RM board like HighIMPACT.
I believe that ATI's modern GPUs have been designed to work in parallel (up to 32 chips?). It's very cool to see a card using four R300s.
SGI is starting to use ATI's chips in their own graphics boards, though I've not seen any multi-GPU boards from them yet. Of course no gamer would ever be able to afford an SGI graphics supercomputer, but it still makes one drool.
Re:Xeons? (Score:2, Informative)
Why would they design something like this and force it to use a Xeon?
No one. Who did you have in mind?
(Hint: Nforce 4 [theinquirer.net])
Now it's time for ATI to reintroduce MAXX (Score:2, Informative)
I suspect this dual-card thing will not be as short-lived as the old 3dfx SLI. It wasn't possible to use two AGP cards because we lacked the second slot, but with PCI-E, that problem is gone. Remember that all Voodoo2 cards had the SLI plug? I bet all the next-gen cards will have a dual-mode plug (it's already the case with the new GeForce).
The next step is to allow this kind of thing with non-identical cards. It would be nice to be able to keep your old card even after you've bought a brand new one. But it seems that synchronization is a bit of a problem.
The article says otherwise: (Score:4, Informative)
Indeed, it uses a top/bottom 50/50 split for rendering rather than per-line interleaving.
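A hypothetical sketch of that split-frame scheme (names invented for illustration); a dynamic load balancer would nudge the split line toward whichever card finished its region first:

```python
# Split-frame rendering sketch: divide the frame at a horizontal line and
# give each card one contiguous region. split_ratio=0.5 is the 50/50 case.
def split_frame(height, split_ratio=0.5):
    split = int(height * split_ratio)
    return range(0, split), range(split, height)  # card 0, card 1

top, bottom = split_frame(1024)
print(len(top), len(bottom))  # equal halves at the default ratio
```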
Re:Power Requirements (Score:3, Informative)
Flamebait, ATI loving, and nvidia bashing? (Score:5, Informative)
This is going to be great when it matures, and is one of the huge advantages to PCI-Express when that becomes the standard on future motherboards over AGP. Yes, I know Intel is making motherboards with this, but who the hell wants to pay all that money for such a small jump?
Since people seem to be lost on the nvidia cards, here's a rundown of what they are releasing and the price points:
$300 - nvidia 6800
$400 - nvidia 6800 GT
$500 - nvidia 6800 Ultra
$600+ - nvidia 6850/6800 Ultra Extreme
The 6800 and GT are single-slot cards with a single Molex connector. Those can be used in the SLI configuration as well. Get the facts straight before you post flamebait and troll.
Voodoo 5 had up to 4 (Score:3, Informative)
Re:Screw Xeon, go for a G5! (Score:5, Informative)
PCI Express is denoted by some of the following: PCIe, PCI-E, PCI-Ex, PCI-Express
PCI-X is just PCI with higher throughput, thanks to a higher clock rate among other things. It kinda sucks that they ever settled on PCI-X as the name; it now causes confusion on a mass scale.
Re:Now, the question becomes... (Score:5, Informative)
Re:Not true SLI (Score:3, Informative)
For nVidia, it means "scalable link interface", according to this article.
It's not trying to be the same thing, but it is exploiting the brand/trademark nVidia acquired from 3dfx.
Re:Power Requirements (Score:5, Informative)
Re:Slightly O/T (Score:3, Informative)
I think he was talking about a CRT. LCDs aren't capable of rendering even 86 frames per second.
However, if you want the absolute highest resolution, a 3840x2400 LCD may be the way to go.
Re:Does this really make much sense? (Score:3, Informative)
Aren't we to the point where CPU and (single) GPU power is high enough for just about any game without needing a SLI solution?
No, we aren't, and you're being a troll for suggesting that any advancement is "not needed." Maybe it's not desirable for you, but it is for someone.
Seems to me this SLI bit is only to induce a boner in the geekiest of geeks, and at a high price to boot.
And here you discover that, indeed, this is useful, even if only for the "geekiest of geeks." But then your lame boner reference and "high price to boot" jab reinforces the trollness of your comment. Note that no price is too high for some.
Just doesn't make much sense to me.
Classic troll hallmark. No one cares whether you like this or not. If you don't have something interesting, informative, or useful to say, then STFU.
If there's a game my Boxx FX53 + X800 won't play well, then it's probably not worth playing.
Ding! Trolling grand prize. That, I bet, is the asshat comment that sealed it.
Hope that helps!
yeah (Score:1, Informative)
They make them so people will think they've got the fastest product out, and so therefore are the best brand, and so people will then go out and buy a moderately-priced card of the said brand.
ATI has gained market share in leaps and bounds because their Radeon 9700/9800 Pro cards have been faster than nVidia's best. Demographically, hardly anyone buys these high-end cards, and only a couple percent of profits are made on them, but people think whoever has the best card out at the time is the company to buy from.
nVidia knows this release will make it to computer magazine reviews. Novices will read it, think nVidia is good, and go out and buy a 5200fx. This is how it works.
Alienware beat them to the framerate punch... (Score:3, Informative)
Alienware purchased a former 3dfx licensee who had outstanding patents on some of their own SLI tech. Alienware has wisely furthered the research and will be marketing it soon. And it doesn't require a Xeon processor...
Here's the press release:
http://www.alienware.com/press_release_pages/pr
Re:My Voodoo 2 SLI Story (Score:5, Informative)
There are very blatant reasons why SLI killed 3DFX as a company. And yes, their downfall began with the Voodoo 2.
The Voodoo 2 with SLI was so incredibly fast that they had no competitor, so every graphics company in the game was making 3DFX cards, and they were all reference designs (with the exception of Canopus and Obsidian).
All those players and 3DFX themselves overestimated demand for their extremely high-priced product. Even worse, they overestimated demand for the SLI add-on at the $300 price point. 3DFX was losing a lot of sales because they didn't have a competitive low-end product until the Banshee, and by then Nvidia had made quite a dent in their market share.
All the vendors who used the reference design got bit in the ass once more because the market discovered you could mix Voodoo 2 cards in SLI, so you could buy ANY card (read: cheapest), so all those late upgraders got a sweet deal.
While I don't see SLI destroying Nvidia (they have the diversified product line that 3DFX was lacking), I do expect it to blow up in their faces and lose them money in the long run. The market couldn't bear a $600 graphics solution in 1998; what makes them think it can handle a $900 solution 5 years later?
Re:For Rich Folks Only (Score:5, Informative)
For reference (not just for you), PCI-X is PCI on steroids, a faster, wider (64-bit) edition of the PCI bus which is used in high end servers and the Apple PowerMac G5.
PCIe (aka PCI Express aka 3GIO) is the brand new multi-channel serial expansion bus that will be appearing on consumer-level motherboards in the next few months and will eventually replace both AGP and PCI.
Re:Just a band aid.. (Score:2, Informative)
I have toyed with the idea of throwing the card into one of my older computers to play some of the Glide-only games that came with it or my Voodoo1 card. Whiplash 3D, while not stunning to look at by today's standards, was damn fun to play.
Re:Damn (Score:4, Informative)
Don't worry about dual PCIe x16 motherboards ... remember, nVidia makes chipsets as well. Expect an nForce4 chipset at the end of this year supporting their new SLI technology with two PCIe x16 slots [theinquirer.net]
Of course, when you are spending $400 apiece on graphics cards, will you really be skimping on the processor and motherboard?