AMD's Dual GPU Monster, The Radeon HD 3870 X2
MojoKid writes "AMD officially launched their new high-end flagship graphics card today and
this one has a pair of graphics processors on a single PCB.
The Radeon HD 3870 X2 was codenamed R680 throughout its development.
Although that codename implies the card is powered by a new GPU, it is not. The
Radeon HD 3870 X2 is instead powered by a pair of RV670 GPUs linked together on
a single PCB by a PCI Express fan-out switch. In essence, the Radeon HD 3870 X2
is "CrossFire on a card" but with a small boost in clock speed for each GPU as
well.
As the benchmarks and testing show, the Radeon HD 3870 X2 is one of the
fastest single cards around right now. NVIDIA is rumored to be readying a dual
GPU single card beast as well."
But does it run Linux? (Score:4, Interesting)
(Extra points if anyone pedantically takes the subject line and suggests targeting gcc to run the Linux kernel on your GPU... but you know what I mean...)
Re: (Score:2)
Re: (Score:2)
Re:But does it run Linux? (Score:5, Informative)
While AMD has done a good thing and released a lot of documentation for their cards, it has not been source code, and it has not yet included the necessary bits for acceleration (either 2D or 3D). That said, I'm watching what I'm typing right now courtesy of the surprisingly functional radeonhd driver [x.org] being developed by the SUSE folks for Xorg from this documentation release. Even without acceleration, it's already more stable and avoids the numerous show-stopper bugs present in ATI's fglrx binary blob.
Dunno yet if this latest greatest chunk of silicon is supported, but being open source and actively developed, I'm sure that support will arrive sooner rather than later.
Re: (Score:2, Informative)
Actually, what did they really release? I remember some time ago there was a lot of excitement right here on /. about ATI releasing the first part of the documentation, which was basically a list of register names and addresses with little or no actual explanation. (Although I guess if you have programmed graphics drivers before, you'd be able to guess a lot from the names...)
The point is, it was said that these particular docs were only barely sufficient to implement basic things like mode-set
Re: (Score:1, Informative)
Re: (Score:2)
RandR and dual head work, based on what's running on my desk right now. Better than fglrx.
No idea about TV-out. Some 2D acceleration is in the works, but the 3D bits were not in the released docs (although rumors of people taking advantage of standardized calls
Well, barely (Score:2)
But, yes it does run Linux.
Rename already (Score:1)
Re: (Score:2)
With AMD Open Source Linux Drivers (Score:4, Funny)
Multiprocessing everywhere! (Score:5, Funny)
When can I have a quantum graphics card that displays all possible pictures at the same time?
Re:Multiprocessing everywhere! (Score:5, Funny)
Cool éh?
Re: (Score:2, Funny)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
(A raw pixel dump, that is.)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Observing changes the outcome. By observing all outcomes, there is nothing left to change into. Ergo, no way to die?
Re: (Score:3, Insightful)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Re: (Score:1)
Quantum algorithm for finding properly rendered pictures:
1. Randomly construct a picture, splitting the universe into as many possibilities as exist.
2. Look at the picture.
3. If it's incorrectly rendered, destroy the universe.
But now, with Quantum Graphics, you don't have to destroy the unfit universes - the card will take care of it for you! Buy now!
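Purely for the pedants, here's that "algorithm" as a classical sketch in Python: branch a "universe" per random frame, look at it, and discard every branch that didn't render correctly. The frame size and loop count are made up for illustration; no universes are harmed.

```python
# Tongue-in-cheek classical sketch of the "quantum rendering" recipe above:
# step 1 branches a universe per random frame, step 2 looks at it, and
# step 3 discards (ahem, "destroys") every branch that rendered incorrectly.
import numpy as np

rng = np.random.default_rng(42)
HEIGHT, WIDTH = 4, 4                       # keep the toy universes tiny
correct_frame = rng.integers(0, 256, size=(HEIGHT, WIDTH), dtype=np.uint8)

surviving_universe = None
for universe in range(100_000):
    frame = rng.integers(0, 256, size=(HEIGHT, WIDTH), dtype=np.uint8)  # step 1
    if np.array_equal(frame, correct_frame):                            # step 2
        surviving_universe = universe
        break
    # step 3: this universe is quietly discarded

print("Surviving universe:", surviving_universe)  # almost certainly None:
# the odds of a random 4x4 8-bit frame matching are 1 in 256**16
```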
You have been deselected... (Score:1)
Re: (Score:2)
Re: (Score:2)
Goatse, hot tub girl and "Can I haz cheeseburger" at the same time? No thanks.
Re: (Score:2)
Re: (Score:2)
It makes sense since the processing of a pixel's shading and texture data is very parallel. In theory you could have up to your reso
Sounds wasteful, but isn't (Score:2, Redundant)
For those who haven't been following the recent releases of ATI graphics cards, it's probably interesting to note that the ATI HD 3850 and HD 3870 use only about 20 watts when idling (most low-end cards use at least 30W nowadays, and high-end cards are often closer to 100W).
So that should mean that this new card should eat about 40W whe
Re: (Score:2)
Re: (Score:2)
I don't know what AMD/ATI is currently working on, but you cannot draw an Operating System. You can however draw a windowing system, for instance XOrg rendering KDE or Gnome. This is Slashdot, us nerds are pedantic.
Perhaps you meant having a low power chip which can take over for simple 2D graphics. I believe Aero (hopefully I got the name correct) uses 3D graphics now, and it's all the rage in
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
In that case, allow me to give you a quick grammar lesson. If you're going to use a phrase like "us [sic] nerds are pedantic," there's a simple rule for determining whether to use "we" or "us." The sentence should be grammatically correct without the additional descriptive word you've added (nerds in this case). Following that rule, you would consider two possibilities: "we are pedantic" and "us are pedantic." Obviously, the latter is incorrect.
I apologize for b
RTFA (Score:2)
They only give power consumption for the whole system, 214W when idle, 374W when under load (!)
Some basic math on their results suggests the 3870 consumes around 50W when idle, while the X2 consumes around 100W when idle and up to a massive 260W under full load.
(The system with a single 3870 idles at 164W and with a 3870 X2 at 214W; the extra 50W is the second GPU, hence roughly 50W per 3870 at idle.)
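For anyone who wants to check that arithmetic, here's a quick sketch. The system-level wattages are the ones quoted above; the assumption that the idle-power difference between the two systems is entirely the extra GPU (and that both GPUs on the X2 draw the same) is the parent's, not the article's.

```python
# Back-of-the-envelope estimate using the whole-system numbers quoted above.
# Assumption (the parent's): the idle difference between the two test systems
# is entirely due to the second RV670, and both GPUs on the X2 draw the same.
system_idle_3870 = 164   # W, whole system with a single HD 3870
system_idle_x2   = 214   # W, whole system with the 3870 X2
system_load_x2   = 374   # W, whole system with the 3870 X2 under load

idle_per_gpu = system_idle_x2 - system_idle_3870             # ~50 W per GPU at idle
idle_x2      = 2 * idle_per_gpu                              # ~100 W X2 at idle
load_x2      = idle_x2 + (system_load_x2 - system_idle_x2)   # ~260 W X2 under load

print(idle_per_gpu, idle_x2, load_x2)   # 50 100 260
```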
Re: (Score:2)
Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.
Who wants it? Gamers with more money than sense and a desire to always be as close to the cutting edge as possible, even if it only gains them a couple of frames and costs another £100 or more.
Re: (Score:2)
Who needs it? Probably graphics artists who are rendering amazingly complex scenes. I can imagine it would help some game designers and potentially even CAD architecture-types. Probably not so much with films because I think they're rendered on some uber-servers.
Not necessarily. Most standard rendering engines eat system CPU far more heavily than they ever tax the GPU, especially when it comes to things like ray tracing, texture optimization, and the like.
Most (even low-end) rendering packages do have "OpenGL Mode", which uses only the GPU, but the quality is usually nowhere near as good as you get with full-on CPU-based rendering. Things may catch up as graphics cards improve, but for the most part, render engines are hungry for time on that chip on your motherboard,
Re: (Score:2)
Never played Crysis, huh?
Re: (Score:2)
Two GPUs on a single card? Who the hell needs that kind of power? Besides, don't modern graphics cards waste ridiculous amounts of energy even when they're simply drawing your desktop?
For those who haven't been following the recent releases of ATI graphics cards, it's probably interesting to note that the ATI HD 3850 and HD 3870 use only about 20 watts when idling (most low-end cards use at least 30W nowadays, and high-end cards are often closer to 100W).
So that should mean that this new card should eat about 40W when idling, making it not just the most powerful graphics card available today, but also less wasteful than nVidia's 8800GT. Not a bad choice if you're in dire need of more graphics power. Although personally I'm planning to buy a simple 3850.
Raises hand. Who needs this kind of power? Ever done any solid modeling? Real-time rendering? Engineering computations that can be off-loaded onto a GPU capable of massive floating-point throughput? As a mechanical engineer I want to be able to do this without buying a $3k FireGL card or a competing card from nVidia, and I also want to be able to handle multimedia compression and other workloads that those cards aren't designed to solve.
Don't bother (Score:2, Insightful)
Re: (Score:2)
Re: (Score:2)
Seriously? Yawn. (Score:4, Insightful)
Graphics cards have long since been fast enough for 99.9999% of cases, even gaming. These companies must be doing this for pissing contests, the few people who do super-high-end graphics work, or a few crazy pimply-faced gamers with monitor tans
Re:Seriously? Yawn. (Score:5, Informative)
Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
Re: (Score:2)
Re: (Score:2)
While it was definitely a performance improvement over my 6800 SLI setup, the qualit
Re: (Score:3, Insightful)
The real problem here is people feeling like they are missing out because of the higher se
Re: (Score:2)
Re: (Score:2)
I don't know why people try to argue that graphics don't matter; if they didn't, high-end graphics cards wouldn't sell and Crysis would look like Pong.
Crysis doesn't look like Pong, even on a crappy low-ish end X1600 video card. Unfortunately, I have an iMac, so newer cards pushing down prices are of no benefit to me ;-) Perhaps FEAR requires the most subtle of graphics capabilities, but not at the expense of a $500 video card. I'll just play FEAR next year when I can build an entire PC with a decent video card (that will be outdated, but cheap) for less than the cost of that same video card now. For the record, I've played the FEAR demo on my iMac a
Re: (Score:2)
Re: (Score:2)
The problem is with your mindset, not with PC gaming.
Crysis looks *beautiful* on medium settings. The fact that it will look even better on new hardware a year from now is an advantage for people who buy that hardware and completely irrelevant to anyone who doesn't. At least for people who don't have some sort of massive jealousy issue that makes it so they can't handle the idea that someone might, at some point in the future, have nicer toys than they
Re: (Score:2)
I recently bought a new 24" monitor (PLE2403WS [iiyama.com]) from Iiyama. Very nice monitor, but a few problems integrating it with my current video card.
The monitor is 1920x1200 at ~60Hz. The manual for my graphics card (GeForce PCX 5300) claims it can handle 1920x1080 and 1920x1440, but not 1920x1200 :-(
Ok, I kind of expected I would need to get a new graphics card, but I am finding it difficult to find out what
Re: (Score:2, Insightful)
Actually, graphics hardware isn't fast enough yet, and it will likely never be fast enough. With high-resolution monitors (1920x1200 and such), graphics cards still can't push that kind of resolution at good framerates (~60fps) in modern games. 20-ish FPS in Crysis at 1920x1200 is barely adequate. This tug-of-war between software and hardware is going to continue nearly forever.
Me, I'll be waiting for the card that can do Crysis set to 1920x1200, all the goodies on, and 50-60fps. Until then, my 7900GT SLI setup is going to have to be enough.
But then you'd just be complaining that resolution Xres+1 x Yres+1 can't be pushed at N+1 FPS. Honestly, you only need 24 to 32 FPS, as that is pretty much where your eyes top out (unless you have managed to time travel and get ultra-cool ocular implants that can decode things faster). It's the never-ending b(#%*-fest of gamers - it's never fast enough - doesn't matter that you're using all the resources of the NCC-1701-J Enterprise to play your game.
Re: (Score:2)
Honestly, you don't play FPS games if you say that.
Film has such a crappy frame rate (24fps) that most movies avoid fast camera pans.
TV runs at 60 fields (480i60, 1080i60) or 60 frames (480p60, 720p60) per second, not 30 frames per second.
30fps is acceptable for a game like WoW where you have hardware cursor and you aren't using a cursor-controlled viewpoint. It's not as smooth, but it's playable.
30fps isn't acceptable for a F
Re: (Score:2)
You may be too young to remember, but back in the day, we got 10 fps playing Q2, and that's the way we liked it! Ahh, the old days of not having a 3d card and going with full-software graphics...
Not at all (Score:5, Insightful)
So one goal in graphics is to be able to push a consistently high frame rate, probably somewhere in the 75fps range, as that is roughly where people stop being able to perceive flicker. However, while the final output frequency will be fixed to something like that due to how display devices work, it would still be useful to have a card that could render much faster. What you'd do is have the card render multiple sub-frames and combine them in an accumulation buffer before outputting them to the screen. That would give nice, accurate motion blur and thus improve the fluidity of the image. So in reality we might want a card that can consistently render a few hundred frames per second, even though it doesn't display that many.
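If you're curious what that sub-frame accumulation looks like in practice, here's a minimal numpy sketch; render_subframe() is a hypothetical stand-in for whatever the GPU would actually draw, and the numbers (320x240, four sub-frames per refresh) are arbitrary.

```python
# Minimal sketch of the accumulation-buffer idea: render several sub-frames
# within one display refresh and average them, which approximates motion blur.
import numpy as np

WIDTH, HEIGHT = 320, 240
SUBFRAMES_PER_DISPLAY_FRAME = 4          # e.g. render at ~300fps, display at ~75fps

def render_subframe(t):
    """Pretend renderer: a bright vertical bar whose position depends on time t."""
    frame = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
    x = int(t * 200) % WIDTH             # object moving 200 px per second
    frame[:, x] = 1.0
    return frame

def display_frame(t_start, refresh_hz=75):
    """Average several sub-frames spread across one refresh interval."""
    accum = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
    for i in range(SUBFRAMES_PER_DISPLAY_FRAME):
        t = t_start + i / (refresh_hz * SUBFRAMES_PER_DISPLAY_FRAME)
        accum += render_subframe(t)
    return accum / SUBFRAMES_PER_DISPLAY_FRAME   # motion-blurred output

blurred = display_frame(0.0)
print("non-zero columns (the blur trail):", np.flatnonzero(blurred.sum(axis=0)))
```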
There's also latency to consider. If you are rendering at 24fps that means you have a little over 40 milliseconds between frames. So if you see something happen on the screen and react, the computer won't get around to displaying the results of your reaction for 40 msec. Maybe that doesn't sound like a long time, but that has gone past the threshold where delays are perceptible. You notice when something is delayed that long.
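The frame-interval arithmetic is easy to check:

```python
# Time between frames (the minimum added display latency the parent describes).
for fps in (24, 30, 60, 75, 120):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms between frames")
# 24 fps leaves ~41.7 ms between frames; 75 fps cuts that to ~13.3 ms.
```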
In terms of resolution, it is a similar thing. 1920x1200 is nice and all, and is about as high as monitors go these days, but let's not pretend it is all that high-rez. For a 24" monitor (which is what you generally get it on) that works out to about 100 PPI. Well, print media is generally 300 DPI or more, so we are still a long way off there. I don't know how high-rez monitors need to be, numbers-wise, but they need to be a lot higher to reach the point where a person can't perceive the individual pixels, which is the useful limit.
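For the record, the ~100 PPI figure falls straight out of the panel geometry (diagonal resolution divided by diagonal size):

```python
# Pixel density of a 24" 1920x1200 panel, the figure quoted above.
from math import hypot

ppi = hypot(1920, 1200) / 24     # diagonal in pixels / diagonal in inches
print(f"{ppi:.0f} PPI")          # ~94 PPI, versus 300+ DPI for print
```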
Also, pixel oversampling is useful, just like frame oversampling. You render multiple sub-pixels and combine them into a single final display pixel. It is called anti-aliasing and it is very desirable. Unfortunately, it does take more power, since you have to do more rendering work even when you use tricks to do it (and it really looks best when done as straight super-sampling, no tricks).
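Here's a minimal numpy sketch of the straight super-sampling case (render at N times the resolution, then box-filter each NxN block down to one display pixel); the function name and toy image are made up for illustration.

```python
# Straight super-sampling anti-aliasing: render big, then average each
# factor x factor block of sub-pixels down into a single display pixel.
import numpy as np

def supersample_aa(high_res, factor):
    """Downsample a (H*factor, W*factor) image to (H, W) by box-averaging."""
    h, w = high_res.shape[0] // factor, high_res.shape[1] // factor
    return high_res.reshape(h, factor, w, factor).mean(axis=(1, 3))

# Toy scene: a hard diagonal edge rendered at 4x the display resolution...
factor, size = 4, 8
hi = np.fromfunction(lambda y, x: (x > y).astype(np.float32),
                     (size * factor, size * factor))
out = supersample_aa(hi, factor)
print(out)   # ...edge pixels come out with fractional coverage, i.e. anti-aliased
```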
So it isn't just gamers playing the ePenis game; there are real reasons to want a whole lot more graphics power. Until we have displays so high-rez that you can't see individual pixels, and cards that can produce high frame rates at full resolution with motion blur and FSAA, we haven't gotten to where we need to be. Until you can't tell it apart from reality, there's still room for improvement.
Re: (Score:3, Insightful)
Re: (Score:1, Informative)
Re: (Score:2)
Been looking into a new rig, but even high-end everything doesn't push fast enough for Crysis to run smoothly. I hope the Nvidia 9800 will do wonders.
Re: (Score:1)
Absolutely not, and the reason these announcements are so 'boring' is the fact that the cards are never That Much better than the previous generation.
I expected to see Double the scores and Double the frame rates from a dual-GPU card! But alas, steady incremental improvements that don't warrant the extreme cost of the device.
Maybe now that I've made that realization, I won't overhype myself in the future.
Re: (Score:2)
I think they're releasing a new Elder Scrolls soon.
Re: (Score:1)
In terms of the progression of GPU technology as a whole, however, I for one shall be acquiring a new 'multimedia' laptop in about six months and I need a fairly high spec graphics card that will, for example, support game play of the latest titles but (1) will not d
Re: (Score:2)
Do you play current games? They keep getting more demanding, and people who want to play those games also want hardware to match. If your current hardware suits your needs... Good for you. Please realize that others will have different needs.
Today's "pissing match" card is tomorrow's budget gamer's choice. I LIKE progress.
Re: (Score:2)
Now I'm thinking about getting a 30" monitor; 2560x1600 -- ruh oh, now my card needs to be twice as powerful again to avoid having to run in non native
price/performance (Score:1)
Still a good product (Score:1)
ATI announced that they won't sell cards for over 500 dollars, and I think that gives them a good standing in the marketplace. If you are willing to spend 450 dollars http://www.newegg.com/Product/Product.aspx?I [newegg.com]
Driver dependent performance (Score:3, Insightful)
Re:Driver dependent performance (Score:5, Informative)
Re: (Score:2)
WRT Crossfire... I had a friend who wanted to buy Intel because they're "the fastest." Hence, he was stuck with ATi for video cards. Except the latest driver bugged Crossfire and he spent a couple hours uninstalling the driver to reinstall the older one. Doesn't that sound like fun?
nVidia's drivers aren't better because they're used for development, they're better because nVidia knows "IT'S ALL ABOUT THE DRIVERS, STUPID". ATi stil
I'm still waiting for a working driver for my 2600 (Score:2)
This month they released an unsupported "hotfix driver" which installs but puts garbage on the screen when you try to do anything, even obvious things like 3DMark.
Does it come with... (Score:4, Interesting)
I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?
Re: (Score:3, Insightful)
Ever since I made the mistake of buying a Matrox G200 (Partial specs - more complete than what ATI has released so far as I understand it, and a promise
Re: (Score:1)
Currently there's only 2D support, but a handful of developers from Novell seem to be consistently working on it.
As for specs, they just released another batch back in early January [phoronix.com].
Buy Intel (Score:2)
Typical reply: Boo hoo, Intel is too slow, boo hoo.
My reply: Intel's graphics cards won't get faster if no one buys them. Other companies won't open source their drivers if you keep buying their cards with closed-source drivers. They will only open their drivers if they see that it works for Intel.
Re: (Score:3, Interesting)
I haven't heard anything about any specs for 3d operations being released from AMD. I know they were talking about it, but what happened then? Did they release anything while I wasn't looking?
They released another 900 pages of 2D docs around Christmas; 2D/3D acceleration is still coming "soon", but given their current pace it'll take a while to get full 3D acceleration. So far my experience with the nVidia closed-source drivers has been rock stable; I have some funny issues getting the second screen of my dual-screen setup working, but it has never crashed on me.
Drivers are something for the here and now; they don't have any sort of long-term implications like, say, what document format you use. The d
Price point (Score:1)
Re: (Score:2)
Re: (Score:2)
Anyone remember . . . (Score:1, Interesting)
Re: (Score:3, Informative)
it's missing stuff (Score:2)
Also, there should be two CrossFire connectors, since each GPU has two links and two of the four are used to link the GPUs to each other.
Re: (Score:1)
Because there simply aren't any 3-way PCI Express 2.0 switches available on the market yet. Waiting for one would have delayed the product substantially for very little in the way of real-world gains.
If you can't beat 'em... (Score:2)
Next up... (Score:3, Funny)
Offtopic, but... (Score:1)
Newegg lists them... (Score:2)
Newegg has a category for them [the "AGEIA PhysX Card"]:
Crossfire on a Card. (Score:2, Insightful)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Re: (Score:3, Interesting)
Take a read through hardocp's review [hardocp.com] for an example.
As to why AMD released it now? Well, my understanding is that NVidia is looking to release their own 2-GPU card (9800 GX2) in Feb/March. Given the benchmarks of the current cards, I can't see the 3870 X2 holding up well... so... beat 'em to market. Although when you factor price in, I'd imagine it'll still be competitive; just not anywhere near
Re: (Score:1)
Re: (Score:3, Funny)
Pretty cool if you ask me.
Re: (Score:2)
Re: (Score:1, Offtopic)
In the 7th Century what we know as France today, along with the low countries and some of western Germany, was known as Francia [wikipedia.org] and was ruled, at least in theory, by the Merovingian [wikipedia.org] line of Frankish kings. This century saw the rise of the Carolingian [wikipedia.org] dynasty within Francia, which reached their height in the late 8th and early 9th Centuries with the reign of Charlemagne [wikipedia.org].
Germany wasn't a single political entity until the 19th Century, and the Franks were Germanic [wikipedia.org], which is more of a group of identities, but
Re: (Score:1)