AMD's Dual GPU Monster, The Radeon HD 3870 X2

MojoKid writes "AMD officially launched its new high-end flagship graphics card today, and this one has a pair of graphics processors on a single PCB. The Radeon HD 3870 X2 was codenamed R680 throughout its development. Although that codename implies the card is powered by a new GPU, it is not. The Radeon HD 3870 X2 is instead powered by a pair of RV670 GPUs linked together on a single PCB by a PCI Express fan-out switch. In essence, the Radeon HD 3870 X2 is "CrossFire on a card," but with a small boost in clock speed for each GPU as well. As the benchmarks and testing show, the Radeon HD 3870 X2 is one of the fastest single cards around right now. NVIDIA is rumored to be readying a dual-GPU single-card beast as well."
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • by Ed Avis ( 5917 ) <ed@membled.com> on Monday January 28, 2008 @09:44AM (#22207812) Homepage
    No mention from the article summary of whether this is supported by ATI's recent decision to release driver source code. If you buy this card can you use it with free software?

    (Extra points if anyone pedantically takes the subject line and suggests targeting gcc to run the Linux kernel on your GPU... but you know what I mean...)
    • Was a Linux version of Crysis released that I didn't hear about?
      • Oh how I wait for this to be reality. My dual 8800GTX cards want to be in Linux all the time, but sadly there is no way to run it without Windows.
    • by habig ( 12787 ) on Monday January 28, 2008 @09:59AM (#22207986) Homepage
      No mention from the article summary of whether this is supported by ATI's recent decision to release driver source code. If you buy this card can you use it with free software?

      While AMD has done a good thing and released a lot of documentation for their cards, it has not been source code, and it has not yet included the necessary bits for acceleration (either 2D or 3D). That said, I'm watching what I'm typing right now courtesy of the surprisingly functional radeonhd driver [x.org] being developed by the SUSE folks for Xorg from this documentation release. While it lacks acceleration, it's already more stable and free of the numerous show-stopper bugs present in ATI's fglrx binary blob.

      Dunno yet if this latest greatest chunk of silicon is supported, but being open source and actively developed, I'm sure that support will arrive sooner rather than later.

      • Re: (Score:2, Informative)

        by GuidoW ( 844172 )

        Actually, what did they really release? I remember some time ago, there was a lot of excitement right here on /. about ATI releasing the first part of the documentation, which was basically a list of register names and addresses with little or no actual explanation. (Although I guess if you have programmed graphics drivers before, you'd be able to guess a lot from the names...)

        The point is, it was said that these particular docs were only barely sufficient to implement basic things like mode-set

        • Re: (Score:1, Informative)

          by Anonymous Coward
          Watch this space: http://airlied.livejournal.com/ [livejournal.com]
        • by habig ( 12787 )
          What's the real status of the released docs? Is there enough to do a real implementation with all the little things like RandR, dual head support, TV-Out and 3D-support, or is ati just stringing us along, pretending to be one of the good guys?

          RandR and dual head work, based on what's running on my desk right now. Better than fglrx.

          No idea about TV-out. Some 2D acceleration is in the works, but the 3D bits were not in the released docs (although rumors of people taking advantage of standardized calls

    • AMD/ATI still has issues delivering drivers on par with nVidia, depending on the application.

      But, yes it does run Linux.
  • It's time to change my aging Athlon 900 MHz then :-).
  • by AceJohnny ( 253840 ) <jlargentaye@gmail. c o m> on Monday January 28, 2008 @09:47AM (#22207852) Journal
    Can't make it faster? Make more. Another multiprocessing application. Can I haz multiprocessor network card plz?

    When can I have a quantum graphics card that displays all possible pictures at the same time?
    • by mwvdlee ( 775178 ) on Monday January 28, 2008 @09:58AM (#22207968) Homepage
      Here's your "all possible pictures at the same time" (using additive mixing), and it doesn't even require you to buy a new graphics card:














      Cool, eh?
    • by n3tcat ( 664243 )
      I'm pretty sure that "white" was available on even the first video accelerator cards...
    • by afidel ( 530433 )
      Um? Actually video cards are an inherently parallelizable problem set. You see this in every modern video card, where the difference between the top and bottom of a product line is often simply the number of parallel execution units that passed QC. All they are doing here is combining two of the largest economically producible dies into one superchip. Oh, and I already have multiprocessor network cards; they're called multiport TCP offload cards =)
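
      To make the "inherently parallelizable" point concrete, here's a toy Python sketch (purely illustrative; the row-by-row split, the shade_row function, and the two-process pool are stand-ins for shader units, not how any real GPU or driver divides the work):

      from concurrent.futures import ProcessPoolExecutor

      WIDTH, HEIGHT = 640, 480

      def shade_row(y):
          # Each pixel's colour depends only on its own coordinates, no shared state,
          # which is exactly why throughput scales with the number of execution units.
          return [((x * 255) // WIDTH, (y * 255) // HEIGHT, 128) for x in range(WIDTH)]

      if __name__ == "__main__":
          # Two workers standing in for "two GPUs": the rows are split between them
          # and only reassembled at the end.
          with ProcessPoolExecutor(max_workers=2) as pool:
              framebuffer = list(pool.map(shade_row, range(HEIGHT)))
          print(len(framebuffer), "rows shaded")
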
    • When can I have a quantum graphics card that displays all possible pictures at the same time?

      Quantum algorithm for finding properly rendered pictures:
      1. Randomly construct a picture, splitting the universe into as many possibilities as exist.
      2. Look at the picture.
      3. If it's incorrectly rendered, destroy the universe.

      But now, with Quantum Graphics, you don't have to destroy the unfit universes - the card will take care of it for you! Buy now!
    • by Yvanhoe ( 564877 )
      I have a quantum graphics card somewhere, standing still, but I can't locate it! Damn you, Heisenberg!
    • by TheLink ( 130905 )
      "displays all possible pictures at the same time?"

      Goatse, hot tub girl and "Can I haz cheeseburger" at the same time? No thanks.
    • by imgod2u ( 812837 )
      In the case of graphics processors, this has been the trend for quite some time (and in fact, has always been the trend). Each generation of graphics processors has been made more powerful than the previous one by adding more parallel pixel processors. The main clock for these chips has been kept steadily in the sub-GHz region. In fact, my old GeForce 4200 operates at 250 MHz.

      It makes sense since the processing of a pixel's shading and texture data is very parallel. In theory you could have up to your reso
  • Two GPUs on a single card? Who the hell needs that kind of power? Besides, don't modern graphics cards waste ridiculous amounts of energy even when they're simply drawing your desktop?

    For those who haven't been following the recent releases of ATI graphics cards, it's probably interesting to note that the ATI HD3850 and HD3870 use only 20 watts when idling (most low-end cards use at least 30W nowadays, and high-end cards are often closer to 100W).

    So that should mean that this new card should eat about 40W whe
    • by mwvdlee ( 775178 )
      Isn't AMD working on a system which switches back to a low-power on-board graphics chip when drawing the OS?
      • Isn't AMD working on a system which switches back to a low-power on-board graphics chip when drawing the OS?

        I don't know what AMD/ATI is currently working on, but you can not draw an Operating System. You can however draw a windowing system, for instance XOrg rendering KDE or Gnome. This is Slashdot, us nerds are pedantic.

        Perhaps you meant having a low-power chip which can take over for simple 2D graphics. I believe Aero (hopefully I got the name correct) uses 3D graphics now, and it's all the rage in

        • Actually, both nVidia and ATi are working on a system that allows a lower powered onboard GPU core to handle things like Veesta aero, then switch to the octal SLi Geforce 10000 GTX when rendering Crysis 2 or something. I believe it's a part of nVidia's hybrid SLi, and ATi's hybrid Crossfire. It's supposed to save a lot of power because not only does it divert light rendering load to a chip that can easily handle it, it suspends the main GPU, saving a lot on idle power draw (current cards, especially high-en
        • by Cecil ( 37810 )
          Don't get pedantic if you're not going to go all the way. An operating system operates all parts of the computer, including the video output parts. Even if it's running in text mode, something has to tell it what characters to draw where. Even DOS had to be "drawn" on the screen, so to speak. If you want to get even more pedantic, then yes, what you saw *most* of the time was command.com or some other program, but the OS itself had video output routines too, specifically, "Starting MS-DOS..." and the always
        • by Justus ( 18814 )

          This is Slashdot, us nerds are pedantic.

          In that case, allow me to give you a quick grammar lesson. If you're going to use a phrase like "us [sic] nerds are pedantic," there's a simple rule for determining whether to use "we" or "us." The sentence should be grammatically correct without the additional descriptive word you've added (nerds in this case). Following that rule, you would consider two possibilities: "we are pedantic" and "us are pedantic." Obviously, the latter is incorrect.

          I apologize for b

    • by Fross ( 83754 )
      This card is actually the most power-hungry of the lot.

      They only give power consumption for the whole system, 214W when idle, 374W when under load (!)

      Some basic math on their results suggests the 3870 consumes 50W when idle, and the X2 consumes 100W when idle and up to a massive 260W under full load.

      (System with a 3870 at idle = 164W, with the 3870 X2 at idle = 214W; the extra GPU accounts for the 50W difference, hence roughly 50W per GPU.)
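
      Spelling the arithmetic out as a tiny Python calculation (the 164W/214W numbers are the whole-system readings quoted above; the per-card split is the estimate being made here, not a measured figure):

      # Whole-system idle readings from the review, per the comment above.
      idle_with_one_3870 = 164   # W, system idling with a single HD 3870
      idle_with_x2 = 214         # W, system idling with the 3870 X2

      extra_gpu = idle_with_x2 - idle_with_one_3870   # ~50 W for the second GPU
      single_3870_idle = extra_gpu                    # so a lone 3870 idles near 50 W
      x2_idle = 2 * extra_gpu                         # and the X2's pair draws ~100 W

      print(single_3870_idle, "W per GPU at idle,", x2_idle, "W for the X2")
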
    • Who the hell needs that kind of power?

      Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.

      Who wants it? Gamers with more money than sense and a desire to always be as close to the cutting edge as possible, even if it only gains them a couple of frames and costs another £100 or more.

      • Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.

        Not necessarily. Most standard rendering engines eat system CPU far more than they ever do the GPU - especially when it comes to things like ray tracing, texture optimization, and the like.

        Most (even low-end) rendering packages do have "OpenGL Mode", which uses only the GPU, but the quality is usually nowhere near as good as you get with full-on CPU-based rendering. Things may catch up as graphics cards improve, but for the most part, render engines are hungry for time on that chip on your motherboard,

    • by balthan ( 130165 )
      Two GPUs on a single card? Who the hell needs that kind of power?

      Never played Crysis, huh?
    • by tyrione ( 134248 )

      Two GPUs on a single card? Who the hell needs that kind of power? Besides, don't modern graphics cards waste ridiculous amounts of energy even when they're simply drawing your desktop?

      For those who haven't been following the recent releases of ATI graphics cards, it's probably interesting to note that the ATI HD3850 and HD3870 use only 20 watts when idling (most low-end cards use at least 30W nowadays, and high-end cards are often closer to 100W).

      So that should mean that this new card should eat about 40W when idling, making this card not just the most powerful graphics card today, but also less wasteful than nVidia's 8800GT. Not a bad choice if you're in dire need of more graphics power. Although personally I'm planning to buy a simple 3850.

      Raises Hand. Who needs this kind of power? Ever done any Solid Modeling? Real-time rendering? Engineering computations that can be off-loaded onto a GPU that can do massive floating point calculations? As a Mechanical Engineer I want to be able to do this without buying a $3k FireGL card or competing card from nVidia and I also want to be able to deal with Multimedia compression and other aspects that those cards aren't designed to solve.

  • Don't bother (Score:2, Insightful)

    by BirdDoggy ( 886894 )
    Wait for the nVidia version. Based on their latest offerings, it'll probably be faster and have more stable drivers.
    • by Wicko ( 977078 )
      Or, pick up a pair of 8800GTs for roughly the same price as AMD's X2 and, most likely, more performance. This is assuming you have an SLI-capable board. An X2 from nvidia is going to cost an arm and a leg.
    • by Kris_J ( 10111 ) *
      Why wait? The Gigabyte 3D1 [gigabyte.com.tw] has been available for years. I'm still using mine.
  • Seriously? Yawn. (Score:4, Insightful)

    by esconsult1 ( 203878 ) on Monday January 28, 2008 @10:03AM (#22208020) Homepage Journal
    Am I the only one underwhelmed by almost every new graphics card announcement these days?

    Graphic cards have long since been really fast for 99.9999% of cases. Even gaming. These companies must be doing this for pissing contests, the few people who do super high end graphics work, or a few crazy pimply faced gamers with monitor tans

    • Re:Seriously? Yawn. (Score:5, Informative)

      by Cerberus7 ( 66071 ) on Monday January 28, 2008 @10:15AM (#22208168)
      Actually, graphics power isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200, and such), graphics cards don't yet have the ability to push that kind of resolution at good framerates (~60fps) on modern games. 20-ish FPS on Crysis at 1920x1200 is barely adequate. This tug-of-war that goes on between the software and hardware is going to continue nearly forever.

      Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
      • Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
        Which should be just in time for the "next big game" to come out. ;)
      • by jcnnghm ( 538570 )
        That's the biggest problem that I see with PC gaming. Last week, I went out and bought an Nvidia 8800 GTS for $300 so that I could play some of the more recent PC games at an acceptable frame rate at my primary monitor's native resolution (1680x1050). My computer is fairly modern, with a 2.66 GHz dual core processor and 2 GB of DDR2 800. The problem is, even with this upgrade, I could only play Crysis at medium settings.

        While it was definitely a performance improvement over my 6800 sli setup, the qualit
        • Re: (Score:3, Insightful)

          by stewbacca ( 1033764 )
          Play the game and enjoy it for the best settings you can get. I downloaded the Crysis demo last night for my 20" iMac booted into WinXP (2.33ghz c2d, 2gb ram, 256mb X1600 video card, hardly an ideal gaming platform, eh?). I read that I wouldn't be able to play it on very good settings, so I took the default settings for my native resolution and played through the entire demo level with no slowdowns. It looked great.

          The real problem here is people feeling like they are missing out because of the higher se

        • That's the biggest problem that I see with PC gaming.

          The problem is with your mindset, not with PC gaming.

          Crysis looks *beautiful* on medium settings. The fact that it will look even better on new hardware a year from now is an advantage for people who buy that hardware and completely irrelevant to anyone who doesn't. At least for people who don't have some sort of massive jealousy issue that makes it so they can't handle the idea that someone might, at some point in the future, have nicer toys than they

      • by zrq ( 794138 )

        I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps

        I recently bought a new 24" monitor (PLE2403WS [iiyama.com]) from Iiyama. Very nice monitor, but a few problems integrating it with my current video card.

        The monitor is 1920x1200 at ~60Hz. The manual for my graphics card (GeForce PCX 5300) claims it can handle 1920x1080 and 1920x1440, but not 1920x1200 :-(

        Ok, I kind of expected I would need to get a new graphics card, but I am finding it difficult to find out what

      • Re: (Score:2, Insightful)

        Actually, graphics power isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200, and such), graphics cards don't yet have the ability to push that kind of resolution at good framerates (~60fps) on modern games. 20-ish FPS on Crysis at 1920x1200 is barely adequate. This tug-of-war that goes on between the software and hardware is going to continue nearly forever.

        Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.

        But then you'd just be complaining that a resolution of Xres+1 x Yres+1 can't be pushed at N+1 FPS. Honestly, you only need 24 to 32 FPS, as that is pretty much where your eyes are at (unless you have managed to time travel and get ultra-cool ocular implants that can decode things faster). It's the never-ending b(#%*-fest of gamers - it's never fast enough - doesn't matter that you're using all the resources of the NCC-1701-J Enterprise to play your game.

        • Honestly, you only need 24 to 32 FPS as that is pretty much where your eyes are at

          Honestly, you don't play FPS games if you say that.

          Film has such a crappy frame rate (24fps) that most movies avoid fast camera pans.
          TV runs at 60 fields (480i60, 1080i60) or 60 frames (480p60, 720p60) per second, not 30 frames per second.

          30fps is acceptable for a game like WoW where you have a hardware cursor and you aren't using a cursor-controlled viewpoint. It's not as smooth, but it's playable.

          30fps isn't acceptable for a F

          • Perhaps, but 30 is adequate. If you don't own at 30 fps, you're not going to do much better at 60. I assume you're using a high-resolution mouse, btw, or your statement would barely even make sense.

            You may be too young to remember, but back in the day, we got 10 fps playing Q2, and that's the way we liked it! Ahh, the old days of not having a 3d card and going with full-software graphics...
        • Not at all (Score:5, Insightful)

          by Sycraft-fu ( 314770 ) on Monday January 28, 2008 @12:50PM (#22209908)
          There are many things you're wrong about there. The first is framerate. If you can't tell the difference between 24 and 60 FPS, well, you probably have something wrong. It is pretty obvious in computer graphics due to the lack of the motion blur present in film, and even on a film/video source you can see it. 24 FPS is not the maximum number of frames a person can perceive; rather, it is just an acceptable amount when used with film.

          So one goal in graphics is to be able to push a consistently high frame rate, probably somewhere in the 75fps range, as that is the area where people stop being able to perceive flicker. However, while the final output frequency will be fixed to something like that due to how display devices work, it would be useful to have a card that could render much faster. What you'd do is have the card render multiple sub-frames and combine them in an accumulation buffer before outputting them to screen. That would give nice, accurate motion blur and thus improve the fluidity of the image. So in reality we might want a card that can consistently render a few hundred frames per second, even though it doesn't display that many.

          There's also latency to consider. If you are rendering at 24fps that means you have a little over 40 milliseconds between frames. So if you see something happen on the screen and react, the computer won't get around to displaying the results of your reaction for 40 msec. Maybe that doesn't sound like a long time, but that has gone past the threshold where delays are perceptible. You notice when something is delayed that long.

          In terms of resolution, it is a similar thing. 1920x1200 is nice and all, and is about as high as monitors go these days, but let's not pretend it is all that high-res. For a 24" monitor (which is what you generally get it on) that works out to about 100 PPI. Print media is generally 300 DPI or more, so we are still a long way off there. I don't know how high-res monitors need to be, numbers-wise, but they need to be a lot higher to reach the point of a person not being able to perceive the individual pixels, which is the useful limit.

          Also, pixel oversampling is useful just like frame oversampling. You render multiple subpixels and combine them into a single final display pixel. It is called anti-aliasing and it is very desirable. Unfortunately, it does take more power since you have to do more rendering work, even when you use tricks to do it (and it really looks best when done as straight super-sampling, no tricks).

          So it isn't just gamers playing the ePenis game; there are real reasons to want a whole lot more graphics power. Until we have displays that are so high-res you can't see individual pixels, and we have cards that can produce high frame rates at full resolution with motion blur and FSAA, we haven't gotten to where we need to be. Until you can't tell it apart from reality, there's still room for improvement.
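
          To put rough numbers on two of those points, a small Python aside (the 24" diagonal is an assumption matching the monitor size mentioned above; the frame rates are just sample values):

          import math

          # Frame-to-frame latency at a few frame rates: the "little over 40 milliseconds" above.
          for fps in (24, 60, 75):
              print(f"{fps:>2} fps -> {1000 / fps:5.1f} ms between frames")

          # Pixel density of a 1920x1200 panel on an assumed 24-inch diagonal,
          # next to the ~300 DPI that print routinely manages.
          w, h, diagonal_inches = 1920, 1200, 24.0
          ppi = math.hypot(w, h) / diagonal_inches
          print(f"{ppi:.0f} PPI on the panel vs ~300 DPI in print")
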
        • Re: (Score:3, Insightful)

          Honestly, I doubt you play FPS games, because the difference between the 24-32fps range and the 50-60s is very noticeable. Forget the theoretical technicalities of human eye capabilities for one second; I'm sure that when the FPS of a game drops into the 30s, there are other factors that make it sluggish, and all of that together gives us the perception that the difference between the 30s and the 60s is an important one.
      • Re: (Score:1, Informative)

        by Anonymous Coward
        Judging from the comments here it seems that the market for this card is for Crysis players who want to play at max settings. That is a pretty narrow market.
      • by Splab ( 574204 )
        This is a big problem, since my 24" is running at 1920x1200 and Windows insists on running GL applications in windowed mode if they run at a resolution less than 1920x1200.

        Been looking into a new rig, but even with high-end everything it doesn't push fast enough for Crysis to run smoothly. I hope the Nvidia 9800 will do wonders.
    • Am I the only one underwhelmed by almost every new graphics card announcement these days?

      Absolutely not, and the reason these announcements are so 'boring' is that the cards are never That Much better than the previous generation.

      I expected to see Double the scores and Double the frame-rates from a Dual GPU card! But alas, steady incremental improvements that don't warrant the extreme cost of the device.

      Maybe now that I've made that realization, I won't overhype myself from now on.

    • by jez9999 ( 618189 )
      Graphic cards have long since been really fast for 99.9999% of cases.

      I think they're releasing a new Elder Scrolls soon.
    • I hear what you're saying, esconsult1, in that the top-of-the-range GPUs are capable of hoovering up the most demanding apps and games at ridiculous resolutions and so product announcements such as this are neither groundbreaking nor exciting.

      In terms of the progression of GPU technology as a whole, however, I for one shall be acquiring a new 'multimedia' laptop in about six months and I need a fairly high spec graphics card that will, for example, support game play of the latest titles but (1) will not d
    • Am I the only one underwhelmed by almost every new graphics card announcement these days? Graphic cards have long since been really fast for 99.9999% of cases. Even gaming.

      Do you play current games? They keep getting more demanding, and people who want to play those games also want hardware to match. If your current hardware suits your needs... Good for you. Please realize that others will have different needs.

      Today's "pissing match" card is tomorrow's budget gamer's choice. I LIKE progress.
    • by Fweeky ( 41046 )
      Pfft, I could just about run Crysis on medium at 1024*768 on my 7950GT; numerous other games need settings turning down and/or resolution decreasing to run smoothly, never mind 4xAA or 16xAS. I recently upgraded to a G92 8800GTS and it's great actually being able to run everything in my monitor's native resolution again, and remembering what "smooth" meant.

      Now I'm thinking about getting a 30" monitor; 2560x1600 -- ruh oh, now my card needs to be twice as powerful again to avoid having to run in non native
  • I am really not that impressed. It's not much faster than the 8800 GT, which is MUCH MUCH MUCH less expensive. I am sure you can pick up two 8800 GTs for the price of this card. Of course then you have to deal with the noise, but overall it looks to me like the price/performance ratio of this card is not that great.
  • No matter how well they designed the card, at the end of the day price/performance is what you are looking for in a graphics card. This card delivers performance that teeters around what the 8800 Ultra gives, at a much lower cost, with about the same noise and power figures.

    ATI announced that they won't sell cards for over 500 dollars and I think that gives them a good standing in the market place. If you are willing to spend 450 dollars http://www.newegg.com/Product/Product.aspx?I [newegg.com]
  • by Overzeetop ( 214511 ) on Monday January 28, 2008 @10:14AM (#22208154) Journal

    Ultimately though, the real long-term value of the Radeon HD 3870 X2 will be determined by AMD's driver team.
    That doesn't really bode well, given the clusterfuck that the CCC drivers tend to be.
    • by chromozone ( 847904 ) on Monday January 28, 2008 @10:24AM (#22208262)
      ATI/AMD's drivers can make you cry. But their Crossfire already scales much better than Nvidia's SLI, which is a comparative disaster. Most games use Nvidia's cards/drivers for development, so Nvidia cards hit the ground running more often. As manky as ATI drivers can be, when they say they will be getting better they tell the truth. ATI drivers tend to show substantial improvements after a card's release.
      • by ashpool7 ( 18172 )
        Hm, well if that's the case, then nobody should run out and buy this card.

        WRT Crossfire... I had a friend who wanted to buy Intel because they're "the fastest." Hence, he was stuck with ATi for video cards. Except the latest driver bugged Crossfire and he spent a couple hours uninstalling the driver to reinstall the older one. Doesn't that sound like fun?

        nVidia's drivers aren't better because they're used for development, they're better because nVidia knows "IT'S ALL ABOUT THE DRIVERS, STUPID". ATi stil
    • Six month old card, no working driver. ...and by that I mean a driver which doesn't say "Card not supported" when you try to install it.

      This month they released an unsupported "hotfix driver" which installs but puts garbage on screen when you try anything - even with obvious things like 3DMark.

  • Does it come with... (Score:4, Interesting)

    by Walles ( 99143 ) <johan.walles@noSpAm.gmail.com> on Monday January 28, 2008 @10:15AM (#22208170)
    ... specs or open source drivers?

    I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?

    • Re: (Score:3, Insightful)

      by Andy Dodd ( 701 )
      They probably are pulling a Matrox. Release partial specs, promise to release more, rake in $$$$$$$$$$ from gullible members of the Open Source community, fail to deliver on promises. Great short-term strategy but only works once before said community stops trusting you, especially those who were dumb enough to go for your promises like I was back in 1999.

      Ever since I made the mistake of buying a Matrox G200 (Partial specs - more complete than what ATI has released so far as I understand it, and a promise
    • Are you talking about the radeonhd [radeonhd.org] driver?

      Currently there's only 2D support, but a handful of developers from Novell seem to be consistently working on it.

      As for specs, they just released another batch back in early January [phoronix.com].
    • I keep repeating this: Buy vendors that do offer open source drivers.

      Typical Reply: Boo hoo, Intel is too slow, boo hoo.

      My reply: Intel's graphic cards won't get faster if no one buys them. Other companies won't open source their drivers if you keep buying them with closed source drivers. Other companies will only open their drivers if they see it works for Intel.
    • Re: (Score:3, Interesting)

      by Kjella ( 173770 )

      I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?

      They released another 900 pages of 2D docs around Christmas; 2D/3D acceleration is still coming "soon", but given their current pace it'll take a while to get full 3D acceleration. So far my experience with the nVidia closed source drivers has been rock stable: I have some funny issues getting the second screen of my dual screen setup working, but it has never crashed on me.

      Drivers are something for the here and now, they don't have any sort of long term implications like say what document format you use. The d

  • The summary failed to mention the most important factor: the new AMD card is actually much cheaper than the 8800 Ultra and at the same time a lot faster in many tests. In addition, it seems that the X2 equivalent of nVidia is delayed by one month or more, so AMD does have the lead for at least another month.
    • by beavis88 ( 25983 )
      AMD would have the lead for another month if they would ship actual product. But they haven't yet, in usual ATI form, and I wouldn't recommend holding your breath...I would not be at all surprised to see Nvidia's competitor, while delayed, in the hands of actual consumers around the same time as the 3870X2.
      • by beavis88 ( 25983 )
        Well shit, serves me right for not checking again - looks like Newegg has some of these in stock and available for purchase, right now. Go ATI!
  • Anyone remember the ATI Rage Fury MAXX [amd.com]? I've still got one in use. It was a monster in its day. Dual Rage 128 Pro GPUs and 64MB of RAM. But for some reason the way they jury-rigged them onto one board didn't work properly in XP, so it only uses one. Oh well, still a nifty conversation piece.
    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )
      The Rage 128 Pro was never close to the top of the line for a graphics accelerator (and doesn't really qualify as a GPU since it doesn't do transform or lighting calculations in hardware). It was around 50% faster than the Rage 128, which was about as fast as a VooDoo 2 (although it also did 2D). You had been able to buy Obsidian cards with two VooDoo 2 chips for years before the Maxx was released, and the later VooDoo chips were all designed around putting several on a single board.

  • Why only PCI-E 1.1? A 2.0 switch would better split the bus between the 2 GPUs.

    Also, there should be 2 CrossFire connectors, as each GPU has 2 links and 2 of the 4 are used to link the GPUs to each other.
    • Why only PCI-E 1.1? A 2.0 switch would better split the bus between the 2 GPUs.

      Because there simply aren't any 3-way PCI Express 2.0 switches available on the market yet; waiting for that would have delayed the product substantially for very little in the way of real-world gains.
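
      For a rough sense of the bandwidth at stake, a short Python sketch (the per-lane rates are the published PCIe figures; the x16 host link and the "split evenly between two GPUs" model are simplifying assumptions, not a description of this board's actual switch):

      # GB/s per lane, each direction, for the two PCIe generations discussed above.
      PER_LANE_GBPS = {"PCIe 1.1": 0.25, "PCIe 2.0": 0.5}
      HOST_LANES = 16   # assumed lanes from the on-card switch back to the slot

      for gen, rate in PER_LANE_GBPS.items():
          total = rate * HOST_LANES
          # Worst case: both GPUs want the host link at once and share it evenly.
          print(f"{gen}: {total:.0f} GB/s to the host, ~{total / 2:.0f} GB/s per GPU under contention")
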
  • Just take two of your cards that are getting beaten by NVIDIA and then combine them in the hopes that they'll beat NVIDIA! aaaaand go!
  • Next up... (Score:3, Funny)

    by jez9999 ( 618189 ) on Monday January 28, 2008 @10:57AM (#22208742) Homepage Journal
    Work is in the pipeline for a board which can house all your computer's necessary components, including a multiple core CPU that can handle graphics AND processing all-in-one! It will be the mother of all boards.
  • Whatever happened to the physics card that some company released a while back? It seemed like a pretty good idea, and I wonder if it could be modified to fit onto a graphics card as well. I just think that would be a nice coupling because I like small towers rather than the huge behemoth that I have in my Mom's basement (no, I don't live at home any more, wanna take my geek card back?). It's nice that they are putting an extra chip into their cards, I can definitely imagineer that as being pretty helpfu
  • So we have come full circle to the Voodoo 5 then?

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...