Gigabyte's 3D1 brings SLI to a single card 153
An anonymous reader writes "Gigabyte have implemented nVidia's SLI on a single graphics board, dubbed the '3D1.' The card features two GeForce 6600GT cores (I would imagine two 6800 cores would draw too much power and create too much heat for a single PCB). Hexus.net have a review of the board, which in various tests was able to compete with a 6800GT, but will it be marketed at a favourable price? You may also want to read Hexus' article, 'An Introduction to SLI,' for a look at how SLI technology works."
Too Much power? (Score:2, Funny)
I don't know whether it would or not, but I will be willing to test that for you. Send me one, and I will fill out all the forms and keep track of heating and power levels.
Re:Too Much power? (Score:1)
Re:Too Much power? (Score:2)
So, (Score:1)
Re:So, (Score:3, Informative)
Re:So, (Score:4, Insightful)
Giving 8 lanes per slot, which is still more than enough bandwidth.
They exist (Score:2)
One of the new Tyan motherboards for dual Opterons has 32 PCI Express lanes. It has two nForce4 chipsets on board, giving it two PCI-E slots with the full 16 lanes each. Of course, the board costs something like $500-$700.
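For a rough sense of what splitting a x16 slot into two x8 slots costs in bandwidth, here's a back-of-the-envelope sketch using the commonly cited PCI Express 1.x figure of 250 MB/s per lane, per direction (the slot widths are just the common sizes, not tied to any particular board):

```python
# Peak one-way PCI Express 1.x bandwidth: ~250 MB/s per lane.
MB_PER_LANE = 250

def slot_bandwidth_mb(lanes):
    """Peak one-way bandwidth in MB/s for a slot of the given width."""
    return lanes * MB_PER_LANE

for lanes in (4, 8, 16):
    print(f"x{lanes}: {slot_bandwidth_mb(lanes)} MB/s")
```

Even at x8, that's 2 GB/s each way per slot, which is why splitting the lanes costs current cards essentially nothing.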
Re:So, (Score:4, Informative)
Re:So, (Score:2)
I don't see where it would go any faster, honestly.
Look at the charts: when they are not using AA/oversampling, the single 6600GT goes just as fast as the 3D1 and the 6800GT, at all resolutions. The applications are CPU-bound, and better video isn't going to change that.
Granted, there is always the 1337 crew pointing at the 8x oversampling / 4x AA numbers, but quite honestly the FPS hit going there on any card isn't worth the questionable increase in quality. Given the 60Hz an
PCBs (Score:4, Informative)
Re:PCBs (Score:5, Informative)
Re:PCBs (Score:3, Interesting)
Liquid cooling will do it easily, but it would be unusual and non-standard to require it.
Re:PCBs (Score:3, Interesting)
Re:PCBs (Score:1)
That is easy to solve. You can get power from one of those power connectors used for disk drives. Some single-GPU graphics cards already require that.
Re:PCBs (Score:2)
My 6600 GT AGP requires one (the PCI Express version may get more power from the slot and not need one).
The install guide for my card (which also covers the 6800) recommends using 2 separate power leads from the power supply if possible.
Re:PCBs (Score:2)
Re:PCBs (Score:3, Informative)
Voodoo 5500 (Score:2)
The card also basically took two slots, since both GPUs had big honkin' cooling fans.
Ah, the good old days. Quake 3 at decent FPS in 1600x1200 was sweet, but not as sweet as nethack on a vterm...
multicore GPU's (Score:4, Interesting)
Jerry
http://www.syslog.org/ [syslog.org]
Re:multicore GPU's (Score:4, Insightful)
~phil
Re:multicore GPU's (Score:2)
Oh. Wait. Nevermind.
hawk, back to his 8 bit memories
If I read the article right (Score:4, Interesting)
Am I missing something here?
And what's this all about [hexus.net]? Putting a video card on the carpet? Or a towel? Static electricity kills, people!
Re:If I read the article right (Score:2)
Re:If I read the article right (Score:2)
Re:If I read the article right (Score:2)
More importantly, about the performance, there are a massive number of 2 card SLI reviews for the 6600 series, and the general consensis is that they just can't keep up, especially when you start throwing more AA/AF and higher resolutions at them. While it may be more cost effective for some people to buy a 6800 in two pieces over a period of time, t
Re:If I read the article right (Score:2)
You're right, I probably haven't. The previous stories on Slashdot dealt with the technology in general and the Gigabyte card in particular, that is, the same card this story is on. (Dupe?) Sorry about that. [slashdot.org]
Re:If I read the article right (Score:2, Insightful)
Static (Score:2)
So the whole static thing is dependent on where you live, to a large degree.
Not really. I forget the exact number, but IIRC as little as 20V is enough to zap a component or cause a soft failure. Visible discharge is around 1000V or so. If you spend enough time working with hardware (eg, board design) you will learn the dangers of static.
Re:Static (Score:2)
If humidity is high enough, static discharge is virtually impossible.
Re:Static (Score:2)
Human skin may be able to detect a DC potential at 15VDC, but I have attended several training seminars that have stated that humans don't feel static discharges of less than 3500 volts (reference [esdsystems.com]). Most semiconductors fall into the Class 1 category in the chart on that page, so they can be damaged by a discharge that you can't feel.
As for humidity, check out this [esdsystems.com]. You will see that humidity helps, but it will not prevent damaging discharges.
If you ever visit a manufacturing facility, or a hardware lab
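The gap those comments describe can be sketched in a few lines. Both thresholds here are approximations: the ~3500 V "feel" figure comes from the seminar reference above, and any given part's withstand rating varies, so the example values are purely illustrative:

```python
# A static discharge below ~3500 V generally isn't felt by a person,
# yet it can still exceed a sensitive part's withstand rating.
FEEL_THRESHOLD_V = 3500

def damages_unfelt(discharge_v, part_rating_v):
    """True when a discharge can exceed a part's rating without being felt."""
    return part_rating_v < discharge_v < FEEL_THRESHOLD_V

print(damages_unfelt(1000, 500))   # a 500 V-rated part, zapped silently
print(damages_unfelt(5000, 500))   # you'd feel this one coming
```

The first case is the dangerous one: enough voltage to kill the part, not enough to notice.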
Re:If I read the article right (Score:1)
Anandtech review (Score:3, Informative)
Source: http://www.anandtech.com/video/showdoc.aspx?i=231
Re:Anandtech review (Score:2)
http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD05NjMmdXJsX3BhZ2U9OA==
example..
Top graph:
3D is blue
6600 is red
6800 is yellow
Next graph:
3D is red
6600 is green
6800 is yellow
Re:Anandtech review (Score:2)
No, Anandtech is not known for changing its colours
Re:Anandtech review (Score:2)
http://www.anandtech.com/video/showdoc.aspx?i=231
Power (Score:2)
I would imagine two 6800 cores would draw too much power and create too much heat for a single PCB.
The power issue doesn't really have to do with the PCB. It mainly has to do with the connector and the number of power pins. If a card draws too much power, the pins or fingers on the connector act as fuses and melt. (I have seen this happen on VME and cPCI boards.)
Re:Power (Score:1)
But splitting a heat source (the chip) into two or more separately packaged chunks can lower the separate die temperatures. This can be a good thing.
Refrigerate your PC? (Score:1)
Re:Refrigerate your PC? (Score:1)
Re:Refrigerate your PC? (Score:1)
Gigabyte's Designs (Score:2, Interesting)
I dunno what they have in mind, but they sure are stirring things up a bit. Aren't they risking alienating nVidia with these "almost" SLI competitor alternatives?
Anandtech not too impressed. (Score:5, Informative)
From Anandtech: [anandtech.com] Unfortunately, in light of the performance tests, there really isn't much remarkable to say about the 3D1. In fact, unless Gigabyte can become very price competitive, there isn't much reason to recommend the 3D1 over a 2-card SLI solution. Currently, buying all the parts separately would cost the same as what Gigabyte is planning to sell the bundle.
The drawbacks to the 3D1 are its limited application (it will only run on the GA-K8NXP-SLI), the fact that it doesn't perform any better than 2-card SLI, and the fact that the user loses a DVI and an HD-15 display connection when compared to the 2-card solution.
Something like this might be very cool for use in a SFF with a motherboard that has only one physical PCIe x16 connector with the NVIDIA SLI chipset. But until we see NVIDIA relax their driver restrictions, and unless Gigabyte can find a way to boot their card on non-Gigabyte boards, there aren't very many other "killer" apps for the 3D1
They pretty much say: stick with true SLI unless size constraints force you into a single-card solution.
Re:Anandtech not too impressed. (Score:2)
Of course, if it only works on Gigabyte boards as mentioned, it's near useless.
Re:Anandtech not too impressed. (Score:2)
Work on the Graphs (Score:2)
Re:Work on the Graphs (Score:2)
Don't laugh - we had a guy in our office who was color blind; none of us knew it until we let him set up Windows 3.0's color scheme to best suit his needs (OK, it was a long time ago). It made the Hotdog Stand color scheme look mild in comparison - it was frightful to us, but the contrast worked great for him.
Driver corruption caused it. (Score:1)
It's causing this color flickering on the graphs.
Because they can, should they? (Score:2)
But at what point do people say, "Gee, that's neat but call me later?" I'm not against the expansion of technology, but there becomes a point of diminishing returns for the price. Is this that point?
Also the article points out "...t
Why? (Score:3, Insightful)
Seriously, what modern PC game won't run well with just one card? I've got an FX 5900 non-ultra 128MB. Doom 3 and Half-Life 2 are both my bitch. And if I recall, there haven't been any other PC games this year worth mentioning. And if you're not using the extra power to play games and you're doing some serious 3D work, you should have some professional SGI-style equipment. The only reason I can really see to have this is if you were developing a PC game that is going to come out in a year or two and you need to have hardware as fast as what we will probably have then.
So um yeah. Who's wasting their moneys? In fact, with those moneys you can buy a better monitor. Which makes a much bigger difference if the monitor you have is not super awesome.
Re:Why? (Score:2)
But at what settings? Are they your bitch at full quality on all settings, with the resolution up as far as your monitor can display and your eyes can cope with?
Re:Why? (Score:5, Insightful)
Really? You can run both of those titles at maximum detail settings, at 1600x1200, with 16x oversampling and 8x full-screen anti-aliasing at 60fps+ on an FX5900? I've gotta get me one of those FX5900 cards, as my 6800GT basically turns into a thermonuclear device when I try those settings.
The point is that there are plenty of people out there who DO want to run their games at the maximum possible resolutions and image quality, and quite a few of those people are willing to spend the $500 plus necessary to get the performance they desire.
Re:Why? (Score:2)
Re:Why? (Score:2)
Are you serious?
Re:Why? (Score:3, Informative)
It is the rare 17" monitor that has a pixel density high enough to display 1280x1024 with no blurriness.
I have a Relisys 17" monitor that has a max resolution of 1600x1200, but it can only display without blurriness up to 1152x864.
A lot of recent 17" monitors have a max resolution of only 1280x1024. Running at 1600x1200 is a nice theory, but only those with 21" displays are likely
Re:Why? (Score:2)
Sorry? I was running 1600x1200 on a 17" CRT 4 years ago, and it looked fine. Now I'm just about to get an LCD which can finally go up to that same res. Hmm... progress...
Re:Why? (Score:2)
Re:Why? (Score:2)
Of course, the price of good gaming LCD's are still coming down...
Re:Why? (Score:3, Interesting)
Re:Why? (Score:5, Insightful)
The point is - most people do not have upper-tier graphics cards. Just like most people do not run the absolute top-of-the-line AMD & Intel processors. They are too expensive and all about marketing.
Myself, with a laptop, I currently only have a Radeon 7500 onboard. My previous desktop had a Radeon 8500. YOU do not need an SLI or next-gen top-tier card because you already have a last-gen top-tier card.
Those of us who don't - and there are many - do need an upgrade.
Why the SLI thing?
I buy one 6600GT for my motherboard. I'm happy, I like it. Two years later my games start to suffer, so I buy another one. Go look at the benchmarks comparing the single to the double: it's a 50-100% boost in performance depending on the application. That is really significant, and considering where the prices of those cards will be in a year or two, it has a lot of bang for the buck.
Your comment about buying a "monitor" is ridiculous. If you have a 17" and a crappy graphics card and then go buy a 19" and still have that crappy graphics card, you won't be able to take advantage of the higher resolutions available on that monitor. Yes, some monitors just have better picture quality - the Mitsubishi Diamondtron comes to mind - but again, your argument doesn't make sense.
Re:Why? (Score:2)
Far Cry was the only game which really stood out this year.
Offtopic (Score:1)
I've played and own all three of these games and I have to say HL2 > Farcry. They both have a good story and both have outstanding graphics. But HL2 was infinitely more enjoyable.
Peace
Re:Why? (Score:2)
Why you ask?
Haven't you heard the quote: "A fool and his money are easily parted"?
That's the only answer: some people will piss away a small fortune just to always have the latest and greatest hardware so they can play games with the graphics options all maxed out.
To each his own.
Re:Why? (Score:2, Insightful)
Most people can't understand why I'm willing to drop $500 on a new graphics card and $1k on a terabyte of storage.
It has less to do with 'a fool and his money' and more to do with 'different strokes for different folks.'
Someone dropping $500 on a graphics card just to play Solitaire would be a waste of money. But most people who drop that much money aren't into it for that. The same way that pimpin
Re:Why? (Score:2)
Last decade's technology. No one doing serious 3D work is using SGIs any more, at least not in the DoE. More precisely, SGI is in bed with nVidia and ATI at this stage of the game, so a good number of people are "rolling their own," as it were. Simple fact, a cluster of Linux nodes with nVidia 6800s can toast an SGI any day. And it's a lot cheaper. Check out this article [llnl.gov] for more information.
Cards like this ARE "SGI style equipment" (Score:2)
If you're using 3D Studio Max (which may displace Maya as the Gold Standard the way things are going -- sure flame me and say it never will) then you have no choice but to use PC hardware.
Huge texture memory sizes and the ability to manipulate large polycount scenes in real time are far more important to such folk than gamers.
Re:Why? (Score:2)
That being said, there is definitely such a thing as TOO fast. How fast is too fast? Too fast is when your framerate exceeds 1. the refresh rate of your monitor, 2. the refresh rate of your human visual system, or 3. both. Since both top out at 120, anything be
Re:Why? (Score:2)
I can go out and buy a 6600GT and it will run Everything at a very good frame rate.
In a year or two from now though it won't be as good, but I'll be able to go out and buy a second really cheap 6600GT and have a system that can run everything at a very good frame rate again.
That's what I see as the advantage of SLI. Whether I'll be able to do that I'll have to wait and see.
Re:Why? (Score:2)
It's insane, and no graphics card is powerful enough in my book.
I would agree with 1600x1800 as excessive, but id Software keeps rewriting the rules whenever a new game comes out.
Some numbers for you (Score:5, Informative)
The lowdown (using individual boards here but the dual is about the same):
Doom3 1600x1200:
6600GT SLI = 77.1fps, Cost = $376 (188x2)
6800Ultra = 73.9fps, Cost = $489
According to a great article on www.Anandtech.com it doesn't really outperform two individual boards though. It may be wiser to get a single 6600GT now and SLI later.
This board somewhat defeats one of the great features of SLI: future upgrades. The idea is you can buy a "good" card today and at some point when it gets a little bit dated you can add more performance at a lower future cost.
However, a single board SLI solution should help offset the nasty cost of an SLI motherboard right now. The NF4 SLI boards are running about $100-$150 over where they should be simply due to shortages (spanking new product overdemand).
$255, Gigabyte NF4 SLI mobo
$188, 6600GT today
$59, 6600GT 2 years from now (Based on the cost today of a $200 graphics card two years ago, the GF4 4200)
Total: $502
Or you can opt for 6600GT performance today and tomorrow without SLI in the picture:
$149, Gigabyte NF4 non-SLI mobo
$188, 6600GT today
$269, 6800Ultra 2 years from now (Based on the cost today of a $500 graphics card two years ago, the GF 5900 Ultra)
Total: $606
As you can see even with the badly overpriced SLI motherboards it's still a better deal in the long run. If SLI motherboards get back to reality you could see the savings increase from $104 to ~$200 as well but that's just speculation.
References:
All new prices are from www.newegg.com. For the older boards (4200 & 5900U) that are not available at Newegg I used pricegrabber. Anandtech was used for the benchmark and 2 year old reference articles.
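Tallying the two upgrade paths from the comment above makes the comparison easy to check. All dollar figures are the commenter's 2005-era estimates taken from the text, not real prices:

```python
# The two upgrade paths priced above, itemized and summed.
sli_path = {
    "Gigabyte NF4 SLI mobo": 255,
    "6600GT today": 188,
    "6600GT in two years": 59,
}
non_sli_path = {
    "Gigabyte NF4 non-SLI mobo": 149,
    "6600GT today": 188,
    "6800Ultra in two years": 269,
}

sli_total = sum(sli_path.values())          # $502
non_sli_total = sum(non_sli_path.values())  # $606
print(f"SLI path: ${sli_total}")
print(f"Non-SLI path: ${non_sli_total}")
print(f"Savings: ${non_sli_total - sli_total}")  # $104
```

The $104 difference matches the comment's bottom line, even with the SLI motherboard premium baked in.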
Re:Some numbers for you (Score:3, Insightful)
No! No! If you plan to SLI, buy two matching cards now. You'll pull your hair out trying to find the exact same model and revision to match the one you already have.
Same goes for multiple CPUs or dual channel RAM. Buy a matching set now, or you're in for a headache down the road.
PC Video cards have reached the point where, unless you're an "enthusiast" who likes to spend money, you don't need to spend more than 150-200 bucks.
Nowadays the race i
Re:Some numbers for you (Score:2)
But what detail can you get at 800x600?
That's the resolution I use in Counter-Strike, and on some maps you just can't see stuff, like the new map de_cpl_contra's fences. From far away, you just can't see them at all. If I turn on full AF and FSAA, I see through them fine. The pro
Re:Some numbers for you (Score:2, Interesting)
Re:Some numbers for you (Score:1, Informative)
Good coverage (Score:3, Funny)
http://slashdot.org/article.pl?sid=04/12/16/191
Re:Good coverage (Score:2)
No Video Standards (Score:2)
Changing from the 3 or 4 versions of AGP to PCI Express is going to create a large enough ripple when it comes to upgrading and purchasing new motherboards and video cards; do we really need PCI Express cards that work on one MB and not another?
--
So who is hotter? Ali or Ali's sister?
Um.... (Score:1)
And heat dissipation is a job best for a chip/set fan(s).
Email me to send me my check.
Kenny P.
Visualize Whirled P.'s
Hang on one sec.... (Score:2)
Something bugging me about SLI... (Score:2)
Re:Something bugging me about SLI... (Score:2, Funny)
Unfortunately, it won't become actually useable until version 3, and only stops being an almighty kludge at version 4. Presumably, they'll make it multi-user at 3.11, too.
Re:Something bugging me about SLI... (Score:3, Informative)
You don't need a full x16 slot. Tests have shown that you don't see any performance drop with a single card until you're in an x4 slot - two x8 slots are much more than enough for current-generation video cards.
steve
A Much Better Review Here (Score:4, Informative)
http://www.anandtech.com/video/showdoc.aspx?i=231
3d1x2? (Score:2)
Review by Hanners, same as the site crashing. :) (Score:2, Informative)
He used to love crashing Elite Bastards all the time, but this is his first official time crashing Hexus.
I'm so proud of him I could cheer, he's one of the good guys.
Heat... (Score:2)
Cheap DDL card for PC coming? (Score:2)
Nvidia has the 6800 DDL for the Mac (to drive the 30" cinema display) but nothing for the PC as of yet.
Pat
Wasted opportunity (Score:2)
Given that there are single-card solutions better than this dual-GPU single card, and that it only works on one motherboard are a rea
Re:Hello (Score:1, Offtopic)
Re:Hello (Score:1)
Quite simply, look at the benchmarks and decide for yourself.
Re:Hello (Score:2, Insightful)
Re:Hello (Score:1)
Re:Son of a bitch ! (Score:1)
At any rate, no real worry. You've still got real SLI, meaning you can use two 6800GT's, or two of whatever comes along in the future. As others are saying, this runs slightly less than a 6800GT and has disadvantages.
Re:Son of a bitch ! (Score:2)
Re:Son of a bitch ! (Score:1)
From the article: The drawbacks to the 3D1 are its limited application (it will only run on the GA-K8NXP-SLI), the fact that it doesn't perform any better than 2-card SLI, and the fact that the user loses a DVI and an HD-15 display connection when compared to the 2-card solution.
Re:Son of a bitch ! (Score:1)
Some people just have far too much disposable income IMHO. WTF do you need a dual 6600GT video card setup for anyway? The most demanding games out there (Doom 3 and Half-Life 2) work just fine with a "lowly" GeForce FX 5600. Sheesh.
Re:Son of a bitch ! (Score:1)
Four or five year old HW is still very usable, you know.
Re:Son of a bitch ! (Score:2)
I used to work at Liz Claiborne, and all the folks in the graphics department for retail design wanted them over the WildCat 1 cards.
So in essence, they do have commercial use and appeal. Lastly, I want to code again, and coding OpenGL and DirectX apps is more fun to me.
Maybe it's a guy thing? I dunno.
But writing a Mario Kart clone in my spare time sounds fun, and I have a right to spend money on personal things to make me happy just like s
Re:Tacky (Score:2)
If you are not the type to do that then you see the gold and chrome for the 5 minutes it takes to install the thing and you don't have to see it again until the next upgrade and probably won't care.
incorrect. (Score:2)
You are Wrong, Good Sir (Score:2)
The SLI drivers are optimized on a game-by-game basis, running in either half-screen render mode or alternate-frame render mode.
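The two modes named above can be sketched in a few lines. In split-frame rendering (SFR) each frame is divided between the GPUs; in alternate-frame rendering (AFR) whole frames alternate between them. The fixed 50/50 split and the GPU labels here are illustrative assumptions - real drivers rebalance the split based on scene load:

```python
# Toy work-assignment sketch for the two SLI render modes.

def sfr_assign(height):
    """Split one frame into a top band for GPU 0 and a bottom band for GPU 1."""
    mid = height // 2  # real drivers move this boundary to balance load
    return {"gpu0": (0, mid), "gpu1": (mid, height)}

def afr_assign(frame_index):
    """Hand even-numbered frames to GPU 0 and odd-numbered frames to GPU 1."""
    return "gpu0" if frame_index % 2 == 0 else "gpu1"

print(sfr_assign(1200))                   # {'gpu0': (0, 600), 'gpu1': (600, 1200)}
print([afr_assign(f) for f in range(4)])  # ['gpu0', 'gpu1', 'gpu0', 'gpu1']
```

Which mode wins depends on the game, which is why the drivers pick per-title.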