
Gigabyte's 3D1 brings SLI to a single card

An anonymous reader writes "Gigabyte have implemented nVidia's SLI on a single graphics board, dubbed the "3D1." The card features two GeForce 6600GT cores (I would imagine two 6800 cores would draw too much power and create too much heat for a single PCB.) Hexus.net have a review of the board, which in various tests was able to compete with a 6800GT, but will it be marketed at a favourable price? You may also want to read Hexus' article - 'An Introduction to SLI' - for a look at how SLI technology works."


  • by Kjuib ( 584451 )
    "I would imagine two 6800 cores would draw too much power and create too much heat for a single PCB."
    I don't know whether it would or not, but I would be willing to test that for you. Send me one, and I will fill out all the forms and keep track of heating and power levels.
    • When I see that the latest nVidia card occupies two PCI slots in the Power Macs, I'd suggest they engineered these in order to replace those noisy fans with some refrigerated packaging... I am sure the extra volume is no longer an issue, given how much video board size and power consumption have increased in the last few years.
  • Let me just be the first, of many, to ask: when can we SLI this card (or a revision of it) and have the power of four 6600 chips?
    • Re:So, (Score:3, Informative)

      by stupidfoo ( 836212 )
      This card requires 16 PCI-Express lanes. So, you would first need a motherboard that gives you 32 lanes. I don't believe one currently exists. Is there a limit in the PCI-Express spec?
      • Re:So, (Score:4, Insightful)

        by Gaima ( 174551 ) on Monday January 10, 2005 @11:12AM (#11310484)
        The current SLI solutions, from what little I've (admittedly) seen, actually split the 16 lanes between the two PCIe x16 slots,
        giving 8 lanes per slot, which is still more than enough bandwidth.
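        For a rough sense of why 8 lanes per slot is still plenty, here is a minimal sketch of per-slot bandwidth. It assumes first-generation PCI Express at roughly 250 MB/s usable per lane per direction; the numbers are illustrative, not measured.

```python
# Rough per-slot bandwidth when 16 PCI Express lanes are split 8/8 for SLI.
# Assumes first-generation PCIe at ~250 MB/s usable per lane per direction
# (after 8b/10b encoding); purely illustrative figures.
PER_LANE_MB_S = 250

def slot_bandwidth_mb_s(lanes: int) -> int:
    """Approximate one-direction bandwidth for a slot with the given lane count."""
    return lanes * PER_LANE_MB_S

for lanes in (16, 8):
    print(f"x{lanes}: ~{slot_bandwidth_mb_s(lanes)} MB/s per direction")
# x16 -> ~4000 MB/s, x8 -> ~2000 MB/s, which is roughly on par with AGP 8x.
```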
      • So, you would first need a motherboard that gives you 32 lanes. I don't believe one currently exists.

        One of the new Tyan motherboards for dual Opterons has 32 PCI Express lanes. It has two nForce4 chipsets on board, giving it two PCIe slots with the full 16 lanes each. Of course, the board costs something like $500-$700.

    • Re:So, (Score:4, Informative)

      by Peldor ( 639336 ) on Monday January 10, 2005 @10:45AM (#11310269)
      You can already do that in terms of "having the power of 4 6600 chips". Just SLI two 6800 Ultras. Same number of total pipes. Probably faster memory too.
    • when can we SLI this card

      I don't see where it would go any faster, honestly.

      Look at the charts - when they are not using AA/Oversampling the single 6600GT goes just as fast as the 3D1 and the 6800GT - at all resolutions. The applications are CPU bound and better video isn't going to change that.

      Granted there is always the 1337 crew pointing at the 8xOversampling / 4xAA numbers - but quite honestly the FPS hit going there on any card isn't worth the questionable increase in quality. Given the 60Hz an
  • PCBs (Score:4, Informative)

    by Anonymous Coward on Monday January 10, 2005 @10:40AM (#11310230)
    There are PCBs available with a thicker copper layer, so they can carry even higher currents. You can make PCBs rated for several hundred amperes.
    • Re:PCBs (Score:5, Informative)

      by GigsVT ( 208848 ) on Monday January 10, 2005 @10:42AM (#11310253) Journal
      The question is not moving the power around, it's dissipating that much power. It's useless to have 100 A traces if the load gets so hot the solder melts.
      • Re:PCBs (Score:3, Interesting)

        by hughk ( 248126 )
        Having worked with dinosaurs, well at least their later generations, I can assure you that modern PCs aren't that dense from a thermal viewpoint. The higher-end dinosaurs had the chips sitting in special modules that provided thermal coupling. Unfortunately, you are certainly going to break the ATX spec if you start generating that much heat and trying to get rid of it.

        Liquid cooling will do it easily, but it would be unusual and non-standard to require it.

    • Re:PCBs (Score:3, Interesting)

      by ppanon ( 16583 )
      Sure but the interface specs for AGP, PCI, et al. specify how much power a card can draw. If you try to draw too much power it may stress components on the motherboard and lead to failure. Perhaps the parent meant that putting two 6800 cores on a single board would draw power in excess of the maximum ratings for AGP or PCI-X?
      • Sure but the interface specs for AGP, PCI, et al. specify how much power a card can draw.

        That is easy to solve. You can get power from one of those power connectors used for disk drives. Some single-GPU graphics cards already require that.

        • Yes, but the 6800 (the Ultra, anyway) already requires two of them.

          My 6600 GT AGP requires one (the PCI Express version may get more power from the slot and not need one).

          The install guide for my card (which also covers the 6800) recommends using two separate power leads from the power supply if possible.
      • Which is why a lot of high-end AGP video cards have molex connectors, so they can draw power directly from the power supply, and *not* fry the AGP slot.
        • Re:PCBs (Score:3, Informative)

          by swv3752 ( 187722 )
          A regular 6800 sometimes requires two Molex connectors. An SLI'd 6800 board would need four Molex connectors and probably a 400W+ power supply just to provide the needed amps. That is a lot of power going into a single PCB. It would be difficult to dissipate that much heat, and you might need special bracing just to support the weight of the hefty heatsinks required.
      • My PCI-based Voodoo 5500 had an extra cable to hook into an unused hard drive power connector. I assume they could not pull enough power off the PCI bus.

        The card also basically took up two slots, since both GPUs had big honkin' cooling fans.

        Ah, the good old days. Quake 3 at decent FPS in 1600x1200 was sweet, but not as sweet as nethack on a vterm...

  • multicore GPU's (Score:4, Interesting)

    by confusion ( 14388 ) on Monday January 10, 2005 @10:41AM (#11310237) Homepage
    This seems like a lot to pack onto a single board - heat and power for sure. With all the talk from AMD & Intel about multi-core CPUs, a multi-core GPU seems like the best plan. Otherwise, we're going to be back to full-length PCI cards soon.

    Jerry
    http://www.syslog.org/ [syslog.org]

    • Re:multicore GPU's (Score:4, Insightful)

      by pmjordan ( 745016 ) on Monday January 10, 2005 @11:01AM (#11310402)
      Well, if you look at the specs of today's GPUs, they already feature 8-16 pixel processors and 2-8 vertex processors (numbers somewhere in that order of magnitude), so they already have multiple cores in a sense. What ATI and nVidia are doing is increasing the complexity of each pixel/vertex pipeline to add features, adding new pipelines, and widening the memory bus to increase speed. You'll notice that clock speeds are in the 300-500MHz range, presumably for the same reasons why dual-core CPUs (will) have lower clock speeds than their single-core counterparts.

      ~phil
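      A minimal sketch of the "more pipelines rather than more clock" point, using commonly quoted pipeline counts and core clocks as assumptions (theoretical fill rate only, not measured performance):

```python
# Back-of-the-envelope pixel fill rate: pipelines x core clock.
# The pipeline counts and clocks are the commonly quoted figures for these
# parts, used here as assumptions for illustration.
gpus = {
    "GeForce 6600GT":     {"pixel_pipes": 8,  "core_mhz": 500},
    "GeForce 6800 Ultra": {"pixel_pipes": 16, "core_mhz": 400},
}

for name, spec in gpus.items():
    fill_rate_mpix = spec["pixel_pipes"] * spec["core_mhz"]  # Mpixels/s
    print(f"{name}: ~{fill_rate_mpix / 1000:.1f} Gpixels/s theoretical fill rate")
# 6600GT: ~4.0 Gpixels/s; 6800 Ultra: ~6.4 Gpixels/s. The wider design wins
# on throughput even at a lower clock.
```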
    • It's about time we had a multicore 6800. Some of us have been waiting 30 years for this! With today's technology, it should be no problem to put a couple of thousand of them on the mask, each with 32k of ram for itself, 16k of rom, and a clever bank switching of the other 16k for oodles of memory.

      Oh. Wait. Nevermind.

      hawk, back to his 8 bit memories
  • by stupidfoo ( 836212 ) on Monday January 10, 2005 @10:41AM (#11310242)
    It's basically a little less than the performance of the 6800, at the cost of the 6800, but with more heat than the 6800. Didn't multiple 6600s perform better than this at a lower cost?

    Am I missing something here?

    And what's this all about [hexus.net]? Putting a video card on the carpet? Or a towel? Static electricity kills, people!
    • Maybe they have an anti-static carpet [gndzero.com]. Then again, maybe they shot that after they finished the review, and didn't really care anymore :-).
    • I haven't read this article, but this isn't the first dual-6600 card, and others claimed to have a higher performance than a single 6800. So more performance, roughly the same cost, more heat.
    • By the way... I never got the deal about static electricity destroying things, since I've always lived and worked in Florida. On a trip to New York it was suddenly all clear... In pitch blackness I ruffled my bed sheets to see a trail of blue lightning bolts in the darkness... I was like WOAH! So that's why there are all those warnings and wrist straps and such! So the whole static thing is dependent on where you live to a large degree.
      • So the whole static thing is dependent on where you live to a large degree.

        Not really. I forget the exact number, but IIRC as little as 20V is enough to zap a component or cause a soft failure. Visible discharge is around 1000V or so. If you spend enough time working with hardware (e.g., board design) you will learn the dangers of static.

        • Human skin has a threshold of greater than 15V, so at 20V you will feel a shock. If you are not feeling the shock, then it is not high enough to do damage.

          If humidity is high enough, static discharge is virtually impossible.
          • Human skin may be able to detect a DC potential at 15VDC, but I have attended several training seminars that have stated that humans don't feel static discharges of less than 3500 volts (reference [esdsystems.com]). Most semiconductors fall in the Class 1 category in the chart on that page, so they can be damaged by a discharge that you can't feel.

            As for humidity, check out this [esdsystems.com]. You will see that humidity helps, but it will not prevent damaging discharges.

            If you ever visit a manufacturing facility, or a hardware lab

      • heh, those lightning bolts in the blackness from the bed sheets are really cool to look at, at night.
  • Anandtech review (Score:3, Informative)

    by asliarun ( 636603 ) on Monday January 10, 2005 @10:46AM (#11310279)
    Anandtech has a review of the same card.
    Source: http://www.anandtech.com/video/showdoc.aspx?i=2315
  • I would imagine two 6800 cores would draw too much power and create too much heat for a single PCB.

    The power issue doesn't really have to do with the PCB. It mainly has to do with the connector and the number of power pins. If a card draws too much power, the pins or fingers on the connector act as fuses and melt. (I have seen this happen on VME and cPCI boards.)

    • Yes - and even before they fuse, they can start dropping non-trivial amounts of supply voltage, increasing voltage ripple at the point of use and lowering operating margins and reliability.

      But splitting a heat source (the chip) into two or more separately packaged chunks can lower the separate die temperatures. This can be a good thing.
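      As a back-of-the-envelope illustration of the voltage-drop point above, here is a minimal sketch of the IR drop and I²R heating across one connector pin; the resistance and current are assumed example values, not figures for any real connector or card.

```python
# Illustrative Ohm's-law numbers for the "pins drop supply voltage" point.
# The contact resistance and current are made-up example values.
pin_resistance_ohm = 0.010   # assumed ~10 milliohm contact resistance
current_a = 5.0              # assumed current through this one pin

voltage_drop_v = current_a * pin_resistance_ohm   # V = I * R
heat_w = current_a ** 2 * pin_resistance_ohm      # P = I^2 * R

print(f"Drop across the pin: {voltage_drop_v * 1000:.0f} mV")
print(f"Heat in the pin:     {heat_w:.2f} W")
# 5 A through 10 mOhm: ~50 mV lost and 0.25 W of heat in one small contact,
# before the pin ever gets anywhere near acting like a fuse.
```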
  • No, not yet. But it looks like in a not so distant future it might be cheaper to use one standard air-conditioning system for the whole case than one for each of the 10 processors on various video, audio and processing boards (most of which you will absolutely need in order to play the next [Doom|HL|Halo|...]).
    • I've been contemplating doing that. My computer room gets pretty hot during the summer anyways (upstairs in a townhouse), and it'd be nice to have a portable A/C unit in the room. I wonder how much it'd help the temps inside my computer if I took some ducting and pumped nice cold A/C air straight into the intake fans.
    • I suspect that components that dissipate a lot of heat (like the CPU and GPU) would need some sort of individual attention. For example if you had a single source of cold air you would probably need ducts that blow it on those heat sinks.
  • Gigabyte's Designs (Score:2, Interesting)

    by Hiigara ( 649950 )
    I can't figure out what Gigabyte's roadmap is. I mean, the dual 6600GTs on a single card came out of nowhere. Then there's the dual PCI Express board coming out that lets any two video cards run in parallel without SLI. Now they come out with this.

    I dunno what they have in mind, but they sure are stirring things up a bit. Aren't they risking alienating nVidia with these "almost SLI" alternatives?
  • by Deathlizard ( 115856 ) on Monday January 10, 2005 @10:49AM (#11310313) Homepage Journal
    Anandtech didn't seem to be too impressed by this solution.

    From Anandtech: [anandtech.com] Unfortunately, in light of the performance tests, there really isn't much remarkable to say about the 3D1. In fact, unless Gigabyte can become very price competitive, there isn't much reason to recommend the 3D1 over a 2-card SLI solution. Currently, buying all the parts separately would cost the same as what Gigabyte is planning to sell the bundle.

    The drawbacks to the 3D1 are its limited application (it will only run on the GA-K8NXP-SLI), the fact that it doesn't perform any better than 2-card SLI, and the fact that the user loses a DVI and an HD-15 display connection when compared to the 2-card solution.

    Something like this might be very cool for use in a SFF with a motherboard that has only one physical PCIe x16 connector with the NVIDIA SLI chipset. But until we see NVIDIA relax their driver restrictions, and unless Gigabyte can find a way to boot their card on non-Gigabyte boards, there aren't very many other "killer" apps for the 3D1


    They pretty much say: stick with true SLI unless size constraints force you into a single-card solution.
  • So when Hexus reviews the card, they seem to have problems keeping their graphs consistent, mainly the colours of the lines. The 6800 was yellow in all the tests, but the other two cards would switch colours from one set to the next, going from blue to red for the SLI 6600 and from orange to green for the single 6600. It would be more logical that if you wanted to switch colours, you'd go orange to red, and then blue to some other blue shade or something. And especially don't use similar colours (the or
    • Maybe the guy working on it is color-blind.
      Don't laugh - we had a guy in our office who was color-blind, and none of us knew it until we let him set up Windows 3.0's color scheme to best suit his needs (OK, it was a long time ago). It made the Hotdog Stand color scheme look mild in comparison - it was frightful to us, but the contrast worked great for him.
    • Of course! They're using a beta card and beta drivers!

      It's causing this color flickering on the graphs. ;-)
  • At what point is all this academic? I looked at TFA and didn't see a price mentioned anywhere for the 3D1, but I imagine it's not affordable. Let me qualify that: I don't expect cheap, because it's a new toy and a dual toy at that, which means there is a price premium.

    But at what point do people say, "Gee, that's neat, but call me later"? I'm not against the expansion of technology, but there comes a point of diminishing returns for the price. Is this that point?

    Also the article points out "...t

  • Why? (Score:3, Insightful)

    by Apreche ( 239272 ) on Monday January 10, 2005 @10:57AM (#11310371) Homepage Journal
    OK, I seriously don't get this SLI thing. I mean, sure, there are nuts out there who simply must get 10,000 fps in their favorite games at full resolution. You know, because it makes a difference. Just like those audiophiles who buy $5,000 power cables.

    Seriously, what modern PC game won't run well with just one card? I've got an FX 5900 non-ultra 128MB. Doom3 and Half-Life 2 are both my bitch. And if I recall, there haven't been any other PC games this year worth mentioning. And if you're not using the extra power to play games, and you're doing some serious 3D work, you should have some professional SGI-style equipment. The only reason I can really see to have this is if you were developing a PC game that is going to come out in a year or two and you need hardware as fast as what we will probably have then.

    So um yeah. Who's wasting their moneys? In fact, with those moneys you can buy a better monitor. Which makes a much bigger difference if the monitor you have is not super awesome.
    • I've got an FX 5900 non-ultra 128MB. Doom3 and Half-Life 2 are both my bitch.

      But at what settings? Are they your bitch at full quality on all settings, with the resolution up as far as your monitor can display and your eyes can cope with?

    • Re:Why? (Score:5, Insightful)

      by Mindwarp ( 15738 ) on Monday January 10, 2005 @11:24AM (#11310586) Homepage Journal
      I've got an FX 5900 non-ultra 128MB. Doom3 and Half-Life 2 are both my bitch.

      Really? You can run both of those titles at maximum detail settings, at 1600x1200, with 16x oversampling and 8x full-screen anti-aliasing at 60fps+ on an FX5900? I've gotta get me one of those FX5900 cards, as my 6800GT basically turns into a thermonuclear device when I try those settings.

      The point is that there are plenty of people out there who DO want to run their games at the maximum possible resolutions and image quality, and quite a few of those people are willing to spend the $500 plus necessary to get the performance they desire.
      • Well most 17" crt monitors can only display well up to 1024x768. Most LCD panels top at 1280x1024. While some monitor can display higher resolution, for it ot have the pixel density to do it well, it is gonna cost a bundle.

        • Well most 17" crt monitors can only display well up to 1024x768.

          Are you serious?
          • Re:Why? (Score:3, Informative)

            by swv3752 ( 187722 )
            I am not talking about max resolution, which varies from 1280x1024 up to 1600x1200, but the highest resolution that will display clearly.

            It is the rare 17" that can have a pixel density that is high enough to display 1280x1024 with no blurriness.

            I have a Relisys 17" monitor that has a max resolution of 1600x1200, but it can only display without blurriness up to 1152x864.

            A lot of recent 17" monitors had only a max res of 1280x1024. Running at 1600x1200 is nice theory but only those with 21" displays are likely
        • Well most 17" crt monitors can only display well up to 1024x768

          Sorry? I was running 1600x1200 on a 17" CRT four years ago, and it looked fine. Now I'm just about to get an LCD which can finally go up to that same res. Hmm... progress... :)
        • What's the fetish for 17" monitors? When was the last time you saw a gamer with a 17" monitor? All my gamer friends have at least 19" monitors, and a good 21" can be had these days for what you paid for a 19" only four or five years ago.
          • Hating to lug that 40 lb SOB up the stairs is why I like 17" CRTs. Best compromise between size and portability.

            Of course, the prices of good gaming LCDs are still coming down...
      • Re:Why? (Score:3, Interesting)

        by master_p ( 608214 )
        I bought a 6800 GT thinking that it just might be able to run Half-Life 2 and Doom 3 on my Athlon 3400+. Guess what? The games run at a very respectable frame rate, over 80 FPS, with all settings maxed out at 1280x1024, except for anti-aliasing, which I tried and saw no real difference (8x and 2x look the same; you have to have a very trained eye to spot the difference, and in the heat of the action it's not important). So I think the press overreacted a little for those two games...I think t
    • Re:Why? (Score:5, Insightful)

      by Vaystrem ( 761 ) on Monday January 10, 2005 @11:30AM (#11310640)
      This is an interesting post because a year or so ago most people would have been saying the same thing about your FX 5900.

      The point is - most people do not have upper-tier graphics cards. Just like most people do not run the absolute top-of-the-line AMD & Intel processors. They are too expensive and all about marketing.

      Myself, with a laptop, I currently only have an onboard Radeon 7500. My previous desktop had a Radeon 8500. YOU do not need an SLI or next-gen top-tier card because you already have a last-gen top-tier card.

      Those of us who don't - and there are many - do need an upgrade.

      Why the SLI thing?

      I buy one 6600GT for my motherboard. I'm happy, I like it. Two years later my games start to suffer, so I buy another one. Go look at the benchmarks comparing the single to the double... it's a 50-100% boost in performance depending on the application. That is really significant, and considering where the prices of those cards will be in a year or two, it has a lot of bang for the buck.

      Your comment about buying a "monitor" is ridiculous. If you have a 17" and a crappy graphics card and then go buy a 19" and still have that crappy graphics card, you won't be able to take advantage of the higher resolutions available on that monitor. Yes, some monitors just have better picture quality - the Mitsubishi Diamondtron comes to mind - but again, your argument doesn't make sense.
    • Uhhh.... D3 and HL-2 are maybe the worst let downs ever.

      Far Cry was the only game which really stood out this year.
      • As a gamer, I will have to agree wholeheartedly on Doom 3.

        I've played and own all three of these games, and I have to say HL2 > Far Cry. They both have a good story and both have outstanding graphics. But HL2 was infinitely more enjoyable.

        Peace
    • Why?
      Why do you ask?
      Haven't you heard the quote: "A fool and his money are easily parted"?

      That's the only answer: some people will piss away a small fortune just to always have the latest and greatest hardware, so they can play games with the graphics options all maxed out.

      To each his own.
      • Re:Why? (Score:2, Insightful)

        by wernercd ( 837757 )
        I personally couldn't see someone dropping $4k to raise a truck or put NOS in a car to drop its 0-60.

        Most people can't understand why I'm willing to drop $500 on a new graphics card and $1k on a terabyte of storage.

        It has less to do with 'a fool and his money' and more to do with 'different strokes for different folks.'

        Someone dropping $500 on a graphics card just to play Solitaire would be a waste of money. But most people who drop that much money aren't into it for that. The same way that pimpin
    • you're doing some serious 3d work you should have some professional SGI style equipment.

      Last decade's technology. No one doing serious 3D work is using SGIs any more, at least not in the DoE. More precisely, SGI is in bed with nVidia and ATI at this stage of the game, so a good number of people are "rolling their own," as it were. Simple fact, a cluster of Linux nodes with nVidia 6800s can toast an SGI any day. And it's a lot cheaper. Check out this article [llnl.gov] for more information.
    • Unless you simply must run IRIX, these days you're doing 3D on PC hardware. Probably studly PC hardware.

      If you're using 3D Studio Max (which may displace Maya as the Gold Standard the way things are going -- sure flame me and say it never will) then you have no choice but to use PC hardware.

      Huge texture memory sizes and the ability to manipulate large-polycount scenes in real time are far more important to such folk than to gamers.
    • You must not play first-person shooters, where fine detail (for distance shooting) AND a fast framerate (for twitch-reflex aiming and evading) are both mandatory. Otherwise, the machine quite obviously holds you back and affects your performance.

      That being said, there is definitely such a thing as TOO fast. How fast is too fast? Too fast is when your framerate exceeds 1. the refresh rate of your monitor, 2. the refresh rate of your human visual system, or 3. both. Since both top out at 120, anything be
    • The advantage of SLI isn't here right now.

      I can go out and buy a 6600GT and it will run Everything at a very good frame rate.

      In a year or two from now though it won't be as good, but I'll be able to go out and buy a second really cheap 6600GT and have a system that can run everything at a very good frame rate again.

      That's what I see as the advantage of SLI. Whether I'll be able to do that I'll have to wait and see.
    • From what I see at www.tomshardware.com, Doom 3 requires a 6800 just to run optimally at 1024x768.

      It's insane; no graphics card is powerful enough in my book.

      I would agree that 1600x1200 is excessive, but id Software keeps rewriting the rules whenever a new game comes out.

  • Some numbers for you (Score:5, Informative)

    by Smilin ( 840286 ) on Monday January 10, 2005 @10:59AM (#11310389)
    You should find this board outrunning the 6800Ultras. This is basically a $400 board outrunning a $500 board (that goes for as much as $600 depending on brand/features).

    The lowdown (using individual boards here but the dual is about the same):
    Doom3 1600x1200:
    6600GT SLI = 77.1fps, Cost = $376 (188x2)
    6800Ultra = 73.9fps, Cost = $489

    According to a great article on www.Anandtech.com it doesn't really outperform two individual boards though. It may be wiser to get a single 6600GT now and SLI later.

    This board somewhat defeats one of the great features of SLI: future upgrades. The idea is you can buy a "good" card today and at some point when it gets a little bit dated you can add more performance at a lower future cost.

    However, a single board SLI solution should help offset the nasty cost of an SLI motherboard right now. The NF4 SLI boards are running about $100-$150 over where they should be simply due to shortages (spanking new product overdemand).

    $255, Gigabyte NF4 SLI mobo
    $188, 6600GT today
    $59, 6600GT 2 years from now (Based on the cost today of a $200 graphics card two years ago, the GF4 4200)

    Total: $502

    Or you can opt for 6600GT performance today and tomorrow without SLI in the picture:
    $149, Gigabyte NF4 non-SLI mobo
    $188, 6600GT today
    $269, 6800Ultra 2 years from now (Based on the cost today of a $500 graphics card two years ago, the GF 5900 Ultra)

    Total: $606

    As you can see even with the badly overpriced SLI motherboards it's still a better deal in the long run. If SLI motherboards get back to reality you could see the savings increase from $104 to ~$200 as well but that's just speculation.

    References:
    All new prices are from www.newegg.com. For the older boards (4200 & 5900U) that are not available at Newegg I used pricegrabber. Anandtech was used for the benchmark and 2 year old reference articles.
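    If you want to re-run that comparison with different prices, here is a minimal sketch that just redoes the arithmetic; every figure is this post's own January 2005 price or estimate, not a verified or current number.

```python
# Re-running the upgrade-path arithmetic from the post above. All prices are
# the poster's January 2005 figures and estimates, not verified data.
sli_path = {
    "Gigabyte NF4 SLI mobo": 255,
    "6600GT today": 188,
    "6600GT in ~2 years (estimate)": 59,
}
non_sli_path = {
    "Gigabyte NF4 non-SLI mobo": 149,
    "6600GT today": 188,
    "6800Ultra in ~2 years (estimate)": 269,
}

sli_total = sum(sli_path.values())          # 502
non_sli_total = sum(non_sli_path.values())  # 606

print(f"SLI path total:     ${sli_total}")
print(f"Non-SLI path total: ${non_sli_total}")
print(f"Estimated savings:  ${non_sli_total - sli_total}")  # 104
```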
    • It may be wiser to get a single 6600GT now and SLI later

      No! No! If you plan to SLI, buy two matching cards now. You'll pull your hair out trying to find the exact same model and revision to match the one you already have.

      Same goes for multiple CPUs or dual channel RAM. Buy a matching set now, or you're in for a headache down the road.

      PC Video cards have reached the point where, unless you're an "enthusiast" who likes to spend money, you don't need to spend more than 150-200 bucks.

      Nowadays the race i
      • "Nowadays the race is who can run Doom 3 at 1600x1200 at 70 vs 72 FPS. If you consider the average home PC with a 17" monitor that can't even display 1600x1200, and most gaming is done at 1024x768 or 800x600, it seems kind of pointless."

        But what detail can you get at 800x600?

        That's the resolution I use in Counter-Strike, and in some maps you just can't see stuff, like the new map de_cpl_contra's fences. From far away, you just can't see it at all. If I turn on full AF and FSAA, I see through it fine. The pro
    • I wonder what the performance would be if you put two of these cards on the SLI board. That would be #$%@ing NUTS.
      • by Anonymous Coward
        It also wouldn't work. The nVidia GPUs aren't capable of more than 2-way SLI, and that capacity is already taken up by the two GPUs on the Gigabyte board. Notice the board doesn't have the plug for the SLI connector at the top.
  • by JDevers ( 83155 ) on Monday January 10, 2005 @11:06AM (#11310443)
    Man, talk about good coverage. A single board getting TWO Slashdot posts when new GPUs often don't merit one.

    http://slashdot.org/article.pl?sid=04/12/16/1916247&tid=152&tid=137
    • Slashdot's decided to implement their own dual-story SLI technology. However, since the technology is new, only a few articles currently show duplication. Eventually, all stories will be posted twice.
  • I hope that this does not mark the beginning of a trend to make video cards only work with specific motherboards. This card will only run with a Giga-Byte MB with the appropriate BIOS.

    Changing from the 3 or 4 versions of AGP to PCI Express is going to create a large enough ripple when it comes to upgrading and purchasing new motherboards and video cards; do we really need PCI Express cards that work on one MB and not another?

    --
    So who is hotter? Ali or Ali's sister?

  • Isn't this what the spare drive power connectors would be perfect for? Kinda like my PowerLeap Slocket adapter - don't want to crowbar the onboard CPU PS, add power with an onboard (or in my case - a cable) power connector.

    And heat dissipation is a job best left to a chip/chipset fan (or fans).

    Email me to send me my check. ;-)

    Kenny P.
    Visualize Whirled P.'s
  • I need to look up the melting temperature of FR-4.

  • There is something big that annoys me with 2-card SLI. With 1 card you use the x16 slot as an x16 slot. With 2 cards, though, they throttle back to two x8 slots instead of two x16 slots. Why do that? Maybe next-generation PCI Express will let you have two full-bandwidth x16 slots on the motherboard. Throttling back to x8 for 2 cards does impact performance. SLI is pretty awesome, but I view the current implementation as version 1.5 (version 1 being 3dfx's attempt).
  • by Hack Jandy ( 781503 ) on Monday January 10, 2005 @12:16PM (#11311056) Homepage
    A much better review is to be had here:
    http://www.anandtech.com/video/showdoc.aspx?i=2315 [anandtech.com]
  • So when are they coming out with a SLI version of this card? :-)
  • And just for the record, this is Hanners' (Andy Hanley's) first time breaking the new site he works at, Hexus. :)

    He used to love crashing Elite Bastards all the time, but this is his first official time crashing Hexus.

    I'm so proud of him I could cheer, he's one of the good guys. :D
  • Ah, heat dissipation and graphics engines. That brings back memories of working in the local University graphics lab one summer doing some project work. Alas, the HVAC vents for the room were stuck full-open; the place was freezing cold. To combat this, I'd make sure the old GE Graphicon graphics engine (at the time, very high-end) under the desk of my workstation was turned on... it was a _very_ effective space heater. 8-)

  • What I think a lot of us are waiting for is a cheap DDL-capable card for the PC to drive a larger monitor. PNY has the Quadro series, but they are all priced for the "workstation" market.

    Nvidia has the 6800 DDL for the Mac (to drive the 30" cinema display) but nothing for the PC as of yet.

    Pat
  • Here's what I see when I look at this card: two dual-6800 cards, each using SLI, together using Alienware's Video Array. Unfortunately, the problems are:
    • No Video Array yet
    • The only company even rumoured to be working on a motherboard with two x16 PCI Express slots (Tyan) won't even confirm it yet
    • The motherboard BIOS has to be tweaked to cope with the card
    • The card is only dual 6600.

    The fact that there are single-card solutions better than this dual-GPU single card, and that it only works on one motherboard, are a rea
