AMD's Dual GPU Monster, The Radeon HD 3870 X2

MojoKid writes "AMD officially launched its new high-end flagship graphics card today, and this one has a pair of graphics processors on a single PCB. The Radeon HD 3870 X2 was codenamed R680 throughout its development. Although that codename implies the card is powered by a new GPU, it is not. The Radeon HD 3870 X2 is instead powered by a pair of RV670 GPUs linked together on a single PCB by a PCI Express fan-out switch. In essence, the Radeon HD 3870 X2 is "CrossFire on a card," but with a small boost in clock speed for each GPU as well. As the benchmarks and testing show, the Radeon HD 3870 X2 is one of the fastest single cards around right now. NVIDIA is rumored to be readying a dual-GPU, single-card beast as well."
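
In practice, "CrossFire on a card" usually means alternate-frame rendering (AFR): the driver hands whole frames to the two GPUs in turn. Below is a minimal Python sketch of that scheduling idea; the Gpu class and its render() method are hypothetical stand-ins for illustration, not AMD's actual driver interface.

    # Toy sketch of alternate-frame rendering (AFR), the usual CrossFire mode:
    # even-numbered frames go to one GPU, odd-numbered frames to the other.
    # Gpu and render() are hypothetical stand-ins, not a real driver API.
    class Gpu:
        def __init__(self, name):
            self.name = name

        def render(self, frame_index):
            # A real GPU would rasterize the frame here; we just report it.
            return f"{self.name} rendered frame {frame_index}"

    # On the 3870 X2, the PCIe fan-out switch makes both RV670s visible on
    # one board; the driver then round-robins frames between them.
    gpus = [Gpu("RV670 #0"), Gpu("RV670 #1")]
    for frame in range(6):
        print(gpus[frame % len(gpus)].render(frame))
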
  • Don't bother (Score:2, Insightful)

    by BirdDoggy ( 886894 ) on Monday January 28, 2008 @10:59AM (#22207982)
    Wait for the nVidia version. Based on their latest offerings, it'll probably be faster and have more stable drivers.
  • Seriously? Yawn. (Score:4, Insightful)

    by esconsult1 ( 203878 ) on Monday January 28, 2008 @11:03AM (#22208020) Homepage Journal
    Am I the only one underwhelmed by almost every new graphics card announcement these days?

    Graphics cards have long been really fast for 99.9999% of use cases, even gaming. These companies must be doing this for pissing contests, the few people who do super-high-end graphics work, or a few crazy pimply-faced gamers with monitor tans.

  • by Overzeetop ( 214511 ) on Monday January 28, 2008 @11:14AM (#22208154) Journal

    Ultimately though, the real long-term value of the Radeon HD 3870 X2 will be determined by AMD's driver team.
    That doesn't really bode well, given the clusterfuck that the CCC drivers tend to be.
  • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Monday January 28, 2008 @11:44AM (#22208560) Homepage
    They're probably pulling a Matrox: release partial specs, promise to release more, rake in $$$$$$$$$$ from gullible members of the Open Source community, fail to deliver on promises. It's a great short-term strategy, but it only works once before said community stops trusting you, especially those who were dumb enough to go for your promises, like I was back in 1999.

    Ever since I made the mistake of buying a Matrox G200 (partial specs - more complete than what ATI has released so far, as I understand it, plus a promise of full specs that were never delivered), I've never made buying decisions based on promised specification/driver releases - only on what works NOW, whether binary or not. (Hence I've been happily buying NVidia for 6-7 years now.)
  • by TemporalBeing ( 803363 ) <bm_witness@BOYSENyahoo.com minus berry> on Monday January 28, 2008 @01:02PM (#22209450) Homepage Journal

    Actually, graphics hardware isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200 and such), graphics cards can't yet push that kind of resolution at good framerates (~60fps) in modern games. 20-ish FPS in Crysis at 1920x1200 is barely adequate. This tug-of-war between software and hardware is going to continue nearly forever.

    Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
    But then you'd just be complaining that resolution Xres+1 x Yres+1 can't be pushed at N+1 FPS. Honestly, you only need 24 to 32 FPS, as that is pretty much where your eyes top out (unless you have managed to time travel and get ultra-cool ocular implants that can decode things faster). It's the never-ending b(#%*-fest of gamers - it's never fast enough - doesn't matter that you're using all the resources of the NCC-1701-J Enterprise to play your game.
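
    To put rough numbers on the exchange above, here is a quick back-of-the-envelope sketch (illustrative only) of the raw pixel throughput the quoted targets demand; the pixels_per_second helper is made up for this example.

        # Raw pixel throughput for the resolution/framerate targets above.
        def pixels_per_second(width, height, fps):
            return width * height * fps

        for w, h, fps in [(1920, 1200, 20), (1920, 1200, 60)]:
            mpix = pixels_per_second(w, h, fps) / 1e6
            print(f"{w}x{h} @ {fps} fps -> {mpix:.1f} Mpixels/s")
        # 1920x1200 @ 20 fps -> 46.1 Mpixels/s (roughly the Crysis case above)
        # 1920x1200 @ 60 fps -> 138.2 Mpixels/s (3x the work, before any AA)
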
  • by seeker_1us ( 1203072 ) on Monday January 28, 2008 @01:37PM (#22209732)
    So we have come full circle to the Voodoo 5 then?
  • Not at all (Score:5, Insightful)

    by Sycraft-fu ( 314770 ) on Monday January 28, 2008 @01:50PM (#22209908)
    There are many things you're wrong about there. The first is framerate. If you can't tell the difference between 24 and 60 FPS, well, something is probably wrong. The difference is especially obvious in computer graphics, which lack the motion blur present in film, and even in a film/video source you can see it. 24 FPS is not the maximum number of frames a person can perceive; it is just an acceptable rate when used with film.

    So one goal in graphics is to be able to push a consistently high frame rate, probably somewhere in the 75fps range, as that is the point where people stop being able to perceive flicker. However, while the final output frequency will be fixed to something like that due to how display devices work, it would be useful to have a card that could render much faster. What you'd do is have the card render multiple sub-frames and combine them in an accumulation buffer before outputting them to screen. That would give nice, accurate motion blur and thus improve the fluidity of the image. So in reality we might want a card that can consistently render a few hundred frames per second, even though it doesn't display that many.
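
    To make the accumulation-buffer idea concrete, here is a minimal numpy sketch; render(t) is a hypothetical callback standing in for the card rendering one sub-frame at time t, and the averaging is the accumulation step.

        import numpy as np

        def accumulate_subframes(render, t0, t1, n):
            # Average n sub-frames rendered across the interval [t0, t1).
            # render(t) is a hypothetical callback returning an HxWx3 float
            # image; averaging the sub-frames approximates motion blur.
            times = np.linspace(t0, t1, n, endpoint=False)
            acc = np.zeros_like(render(times[0]))
            for t in times:
                acc += render(t)
            return acc / n

        # Toy usage: a bright dot moving horizontally smears into a streak.
        def render(t):
            img = np.zeros((4, 16, 3))
            img[2, int(t * 15) % 16] = 1.0
            return img

        blurred = accumulate_subframes(render, 0.0, 1.0, 8)
        print(blurred[2, :, 0])  # partially lit columns: the blur streak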

    There's also latency to consider. If you are rendering at 24fps, that means you have a little over 40 milliseconds between frames. So if you see something happen on the screen and react, the computer won't get around to displaying the result of your reaction for over 40 ms. Maybe that doesn't sound like a long time, but it is past the threshold where delays are perceptible. You notice when something is delayed that long.

    In terms of resolution, it is a similar thing. 1920x1200 is nice and all, and is about as high as monitors go these days, but let's not pretend it is all that high-rez. For a 24" monitor (which is what you generally get it on), that works out to about 100 PPI. Well, print media is generally 300 DPI or more, so we are still a long way off there. I don't know how high-rez monitors need to be numbers-wise, but they need to be a lot higher to reach the point where a person can't perceive the individual pixels, which is the useful limit.
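
    The ~100 PPI figure checks out; a quick sketch of the arithmetic (pixels along the diagonal divided by the diagonal length in inches):

        import math

        def ppi(width_px, height_px, diagonal_inches):
            # Pixel count along the diagonal over diagonal length in inches.
            return math.hypot(width_px, height_px) / diagonal_inches

        print(f"{ppi(1920, 1200, 24):.0f} PPI")  # ~94 PPI for a 24" panel
        # Print-like 300 PPI on the same panel would take about 3.2x the
        # linear resolution, i.e. on the order of 6100x3800 pixels.
        print(f"scale to 300 PPI: {300 / ppi(1920, 1200, 24):.1f}x")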

    Also, pixel oversampling is useful just like frame oversampling. You render multiple subpixels and combine them into a single final display pixel. It is called anti-aliasing, and it is very desirable. Unfortunately, it does take more power, since you have to do more rendering work even when you use tricks to do it (and it really looks best when done as straight super-sampling, no tricks).
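
    For the curious, straight super-sampling is just a box filter over blocks of sub-pixels; a minimal numpy sketch (illustrative, not how any particular driver implements FSAA):

        import numpy as np

        def supersample_downscale(hires, factor):
            # Box-filter an image rendered at factor-times resolution down to
            # display resolution: each factor x factor block of sub-pixels is
            # averaged into one final pixel (straight super-sampling).
            h, w = hires.shape[:2]
            assert h % factor == 0 and w % factor == 0
            return hires.reshape(h // factor, factor,
                                 w // factor, factor, -1).mean(axis=(1, 3))

        # Toy usage: a hard diagonal edge rendered at 4x resolution picks up
        # in-between gray values after downscaling -- that gray is the AA.
        hires = np.zeros((8, 8, 1))
        for y in range(8):
            hires[y, :y + 1] = 1.0  # filled triangle with a jagged edge
        print(supersample_downscale(hires, 4)[..., 0])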

    So it isn't just gamers playing the ePenis game; there are real reasons to want a whole lot more graphics power. Until we have displays so high-rez that you can't see individual pixels, and cards that can produce high frame rates at full resolution with motion blur and FSAA, we haven't gotten to where we need to be. Until you can't tell it apart from reality, there's still room for improvement.
  • by Man in Spandex ( 775950 ) <prsn DOT kev AT gmail DOT com> on Monday January 28, 2008 @02:19PM (#22210298)
    Honestly, I doubt you play FPS games, because the difference between the 24-32fps range and the 50-60s is way noticeable. Forget the theoretical technicalities of human eye capabilities for one second; I'm sure that when a game's FPS drops into the 30s, other factors make it sluggish too, and all of that together gives us the perception that the difference between the 30s and 60s is an important one.
  • by stewbacca ( 1033764 ) on Monday January 28, 2008 @02:21PM (#22210334)
    Play the game and enjoy it for the best settings you can get. I downloaded the Crysis demo last night for my 20" iMac booted into WinXP (2.33ghz c2d, 2gb ram, 256mb X1600 video card, hardly an ideal gaming platform, eh?). I read that I wouldn't be able to play it on very good settings, so I took the default settings for my native resolution and played through the entire demo level with no slowdowns. It looked great.


    The real problem here is people feeling like they are missing out because of the higher settings they can't use. Just play the game! Quit fueling the ridiculous ten-year-old trend of spending more on graphics cards than on the computers themselves! If the game were unplayable at medium settings, then yeah, I'd say the complaint was valid.

  • by MulluskO ( 305219 ) on Monday January 28, 2008 @04:05PM (#22211920) Journal
    Man is finite, and therefore cannot make infinite observations.
