AMD's New Flagship HD 6990 Tested
I.M.O.G. writes "Today AMD officially introduces their newest flagship GPU, the Radeon HD 6990. Targeted to counter Nvidia's current-generation flagship, the GTX 580, this is AMD's follow-up to their previous-generation dual-GPU, single-PCB design, the Radeon HD 5970. It represents the strongest entry AMD will release within their 6000 series graphics lineup. Initial testing and overclocking results are being published at first-tier review sites now. As eloquently stated by Tweaktown's Anthony Garreffa, the 6990 'punches all other GPUs in the nuts.'"
punches all other GPUs in the nuts (Score:2, Informative)
and your wallet too!
$700. ouch.
Hey, if you've got the money to play, lucky you. I'm envious.
Re: (Score:2)
It's about exactly twice as much as the budget I've set aside for building my next PC...
The last computers I built were budgeted at $500 apiece... Whole new systems - motherboard, CPU, RAM, HDD, power supply, LCD monitor, keyboard, mouse, all of it. Came in just slightly over after I was done with shipping and handling and whatnot...
We're still using those computers, too. And we do a good amount of gaming. Obviously I can't crank all the settings up as far as they'll go... But I have yet to see a game that didn't play just fine on this machine.
Re: (Score:3)
Or, he'll still have it in 5 years because his needs didn't require a $700 video card in the first place.
Re: (Score:3)
Or that $700 video card comes down to something more reasonable, like $10 or $60, and you just upgrade that bit in two years.
Re: (Score:2)
I would say that the sentence is awkward, but accurate. The real question is whether it should be "about exactly" or "exactly about".
Re: (Score:1)
I've got the money and am in the market for a new graphics card, but I just don't see the point in a card like this. 450W draw is fucking retarded and currently there just aren't games out there that will legitimately make use of a card like this. And in a year or two when the games actually exist, you'll be able to buy a card that can keep up with this one for significantly less money and probably a lower TDP.
Re: (Score:2)
I admit I haven't kept up to date on the latest PC games. I was a WoWaholic for a long time, and I've been spending most of my gaming time with my PS3 recently. My GTX 470 has had a nice break the past few months. That said, in just about every generation of games and hardware, there's at least one game that still runs like crap even on the best current hardware when you push all the settings to ultra and run it on a 30-inch monitor, be it Mass Effect, Crysis, or what have you. Don't we have one of those thi
Re: (Score:2)
That's called a 'bad port,' and game makers are /SLOWLY/ changing to actually focus on PC development.
I remember when GTA IV came out for PC and crippled everything by being the worst-optimized game I've ever seen.
Re: (Score:2)
Pretty sure Crysis was ported -to- consoles, not from them, although I could be wrong. Mass Effect was released on the Xbox first, but since the 360 is running DirectX 9, I believe, it can't have been that much of a port job.
As a side point, PC ports of console games often get a bad rap for performance compared to their console brethren. The PC version usually has a lot more eye candy in my experience. Aside from the convenience of using a mouse, I can't play the 360 version of Mass Effect nor the PS3 version
Re: (Score:2)
Because we all know that console gamers never pirate games...
Re: (Score:2)
Not only that, it will probably get a die shrink and only require 250w
Re: (Score:2)
and your wallet too!
$700. ouch.
Hey, if you've got the money to play, lucky you. I'm envious.
Yep.
I was always kind of amazed at these prices... I'd build an entire computer for $700, and then somebody would come along and tell me how they had two of these $700 video cards in their machine.
I mean... If you've got the money, go for it. But I just can't see justifying $1400 in video cards alone. Especially when we're talking about the consumer-grade gaming cards. Are a few more frames per second in Crysis really worth $1400 to you?
Re: (Score:2)
true that.
Had I that kind of money to waste on a computer, I'm pretty sure I'd be much better off in the long run spending that money on something like a single upper-mainstream (say, GTX 570) card, a 30" monitor, and two solid state drives in RAID 0, but that's just me.
Come to think of it, anybody who spent $1400 on two of these probably has the 30" monitor and the two SSDs as well.
Re: (Score:3)
Much more fun to interrupt their proxy-penis waving with a few well placed headshots.
"Dual $700 cards, huh? How come you still suck?" *BOOM* headshot
Re: (Score:2)
For home use, I'd keep using the GeForce 8800 GTX, but I'
Punches your power supply in the nuts, too (Score:4, Interesting)
375+ watts. That's more than my whole computer. Oddly enough, I have plenty of headroom in my power supply, and it only requires a single slot, so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.
Re: (Score:2, Informative)
I was ATI-only from 2000-2009. Thought the same thing about their drivers. Then I went Nvidia again because of my dislike and... nope, no difference.
If you're planning on Linux though then yeah, Nvidia is obviously much better. ATI-on-Linux will make you want to hang yourself with a sock.
Re: (Score:1)
Yeah, NVidia's driver quality has taken a nose-dive in recent years. At least ATI is no better or worse than they've ever been.
Re: (Score:2)
And they're releasing source as their legal department signs off on it. So at least in theory they should be better integrated at some point in the future.
Re: (Score:2)
This has not been my experience. I bought an HD4350 for an HTPC build. The open source drivers flicker every 10-20 seconds; that's unwatchable. The closed source drivers are another story entirely. They installed fine, and judging from the Xorg logs they were working. Except that *nothing* was displayed on the screen. No errors, no warnings, and X correctly determined which displays were connected. Yet all I got was empty blackness.
The card is completely useless in Linux. I can't see myself ever buying an A
Re: (Score:2)
Another possible related thing: "Empty blackness" would be what you get with X nowadays before any window manager or similar is started (did you try clicking somewhere if you were using twm?). I can see why they removed the pretty horrible patterned background that was the default earlier, but there are
Re: (Score:2)
Here [archlinux.org] is the cry for help I issued on the Arch forums. I didn't really expect any help, since the Catalyst drivers were exiled to AUR. For good reason it seems.
Anyways, if you see any clues I missed, or if you know of a better place to ask for help, please let me know.
Re: (Score:2)
Since AMD took over ATI, their drivers have massively improved, even in Linux.
This has not been my experience. For instance, they have already abandoned the R690 chipset. No fully working graphics drivers newer than Vista. I think my next CPU will be Intel. My current video card is from nVidia and is everything I hoped it would be. I've owned several Radeons and all were total nightmares. I've never, repeat NEVER, had a problem with an nVidia card that wasn't solved with a driver update. That's just because I'm lucky with hardware, I guess, but the ATI problems weren't hardware ones.
When my R
Re: (Score:2)
Not that I don't sympathize but Crossfire and SLI are one of those areas where driver bugs are to be expected.
I don't agree, except perhaps in the sense that I do expect bugs in ATI's drivers; nVidia seems to have been able to figure it out.
Re: (Score:2)
I recently bought a new motherboard, and the Nvidia drivers wouldn't work in Linux. Not at all. It would practically lock the system with interrupts (it would pin one CPU core dealing with them). I even tried a completely different Nvidia card, one that works in another system. Same thing.
I then bought an AMD card. And it just worked.
AMD/ATI is worth looking at now.
Re: (Score:2)
375+ watts. That's more than my whole computer. Oddly enough, I have plenty of headroom in my power supply, and it only requires a single slot, so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.
Holy hell. I've only got a 500w PSU in my box... I don't think I could even run one of those.
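For a rough sense of the headroom question, here's a back-of-the-envelope sketch in Python. Every number in it is an assumption for illustration (check the actual ratings of your own parts), but it shows why a 500W unit is not a comfortable fit for a 375W card:

    # Back-of-the-envelope PSU budget. All component draws are rough
    # assumptions for illustration; check your own hardware's ratings.
    card_w   = 375    # HD 6990 board power at stock settings
    cpu_w    = 125    # a typical quad-core under load
    rest_w   = 75     # motherboard, RAM, drives, fans
    headroom = 0.80   # don't plan on running a PSU at 100% of its label

    needed = (card_w + cpu_w + rest_w) / headroom
    print(f"Comfortable PSU size: ~{needed:.0f} W")   # ~719 W with these guesses

On numbers like these you'd want something in the 700W-and-up range before even thinking about overclocking, so the instinct about the 500W unit being short is right.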
feels hollow (Score:3)
I don't know - the card is certainly fast, but when all you can do to beat your competition's single-GPU card is to stick two of your slower GPUs on it, it just feels hollow to me. All Nvidia has to do is come back with an $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?
Re: (Score:2)
You say that as if it's a trivial thing to do.
Re:feels hollow (Score:5, Insightful)
It always seems trivial to do something from the point of view of someone who isn't doing it and has no idea what's involved in doing it.
Are you a manager, perchance?
Re: (Score:2)
Indeed, anyone who doesn't respect someone else's work and assumes it is trivial, because they don't understand it, is inferior.
Re: (Score:2)
All Nvidia has to do is come back with a $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?
That's the 590. It's out in a week or two.
It makes me laugh that most sites reviewed it on a single-screen system, most at 1080p. Most of the current top-end cards can easily handle modern games at maximum detail, even on 30" screens. These kinds of cards are only really worth it for multi-monitor gaming. The problem is that 3 x 30" screens start to fill that 2GB of video memory quite quickly.
I hope the 7990 has better memory use. Use HyperTransport or some kind of NUMA setup and let the GPUs access all the memory.
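To put a rough number on that, here's a back-of-the-envelope sketch in Python. The buffer mix and sizes are assumptions for illustration (real usage varies by game, AA mode, and driver), but it gives a feel for where the memory goes before a single texture is loaded:

    # Rough VRAM estimate for render targets at a given resolution.
    # The buffer mix below is an illustrative assumption, not a real
    # game's allocation; actual usage varies by title and AA mode.
    def buffer_mb(width, height, bytes_per_pixel=4):
        return width * height * bytes_per_pixel / (1024 * 1024)

    setups = [("single 30-inch", 2560, 1600),
              ("triple 30-inch", 3 * 2560, 1600)]

    for name, w, h in setups:
        color   = buffer_mb(w, h)        # 32-bit color buffer
        depth   = buffer_mb(w, h)        # 32-bit depth/stencil
        msaa4   = buffer_mb(w, h) * 4    # 4x MSAA color samples
        gbuffer = buffer_mb(w, h) * 3    # a few deferred-shading targets
        total = color + depth + msaa4 + gbuffer
        print(f"{name}: {w} x {h} -> ~{total:.0f} MB before any textures")

That comes to roughly 140 MB for one 30" panel and around 420 MB for three, and the high-detail textures scaled for that resolution still have to fit on top, which is why 2GB stops looking generous in multi-monitor setups.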
Re: (Score:1)
The memory on a GPU card is typically MUCH faster than the system memory.
Re: (Score:2)
I wouldn't buy a monitor bigger than 24" that only supported 1080p.
TV sure.
Computer monitor no.
Re: (Score:2)
I can only guess at what something like that would cost and where you'd buy it.
I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.
Re: (Score:2)
I can only guess at what something like that would cost and where you'd buy it.
I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.
Exotic? Really?
On the consumer/normal workstation end of things off the top of my head you've got the Dell U2711, IPS, res. 2560x1440 [dell.com] (list 1k, but frequently on sale for ~$700) plus Apple's *only* display, in the same price range with essentially the same panel (glossy though, and LED backlight).
On the true high end Eizo, NEC, and others make even better displays. Not to mention that with slightly lower DPI you can get the same 2560x1440 resolution on nearly every 30" computer monitor made in the la
Re: (Score:2)
You're frequently better off getting a second screen than going larger than 24". Hell even with a 20" screen it's likely better to get a second one. Unfortunately, most systems don't seem to handle multiple monitors very well. Meaning that if I'm playing a game on my primary screen, the OS doesn't know to put a screen saver on the other and restrict the mouse to the game screen. I haven't seen any WMs that handle that well, regardless of OS. And don't get me started with times when the screens aren't the sa
Re: (Score:2)
You're frequently better off getting a second screen than going larger than 24". Hell even with a 20" screen it's likely better to get a second one.
Better in what sense? I've never liked working with multiple monitors, but I really like working with my 24" widescreen. No annoying gap in the middle, fewer wires, more desktop space, and no interface issues like you described.
Re: (Score:2)
I've got an i-inc (rebranded Hanns-G) 28" 1920x1200 on my desk that cost me $250 from compusa/tiger direct?
I really do prefer the 16:10 ratio for computer monitors. The thing about 1080p is that the vertical resolution is really about the same as the 19" CRT I had on my desk 10 years ago; it's just wider.
I will agree that getting to 2560x1600 does seem to take a big paycheck though.
Re: (Score:2)
Jesus, they're down to $210. Maybe I should get another one.
http://www.mwave.com/mwave/SKUSearch.asp?px=NT&scriteria=AA98799 [mwave.com]
Re: (Score:1)
http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=20&PropertyCodeValue=1099%3A25153 [newegg.com]
Re: (Score:2)
Don't know about you, but I consider a "$1-3k" monitor to be exotic and pricey.
Sure, they sound absolutely awesome, but you're talking about more money than I'd be willing to spend on a computer.
Definitely 'niche' market kinda stuff.
Re: (Score:1)
I assume he was referring to the 30" LCDs that run at 2560x1600 resolution... which are awesome, for the record
Re: (Score:2)
*drools on keyboard*
*drools on keyboard*
Wow! Seriously, wow! How much does something like that cost? This seems like you're way beyond a gaming rig here -- and, if you're really talking about running 2 or 3 of these for a gaming machine (like some people are), well, then I strongly suspect you don't really care what your video card(s) cost. You've already spent a small fortune on monitors.
Re: (Score:1)
Yeah, I can't imagine having 3 of these... 1 takes up a good amount of desk real estate as it is, I got my Dell 3008 refurbished for about $1200 I think a year and a half ago with full warranty, etc.
Re: (Score:2)
Except that monitors tend to last a lot longer than video cards do. Even during the 90s, the monitors would typically outlast several generations worth of videocards. It was a bit less one sided when the LCDs first came out, but at this point there's little point for most people to upgrade again if they buy a quality monitor.
Re: (Score:2)
If they're both at 1080p, then the size of the screen doesn't matter, does it? It doesn't take more memory if the pixels are bigger but the same in number.
Or, are you talking about running at resolutions higher than 1920x1080? I didn't think you could easily get monitors at much higher resolution.
Re: (Score:2)
This is a handy chart for figuring out the number of pixels. When you start getting into the larger 16:10 monitors, you really need a lot of horsepower. Add in three large monitors...
2400 x 600  = 1,440,000 pixels | Triple 4:3
1680 x 1050 = 1,764,000 pixels | Single 16:10
1600 x 1200 = 1,920,000 pixels | Single 4:3
1920 x 1080 = 2,073,600 pixels | Single 16:9
1920 x 1200 = 2,304,000 pixels | Single 16:10
3072 x 768  = 2,359,296 pixels | Triple 4:3
3840 x 720  = 2,764,800 pixels | Triple 16:9
3840 x 800  = 3,072,000 pixels | Triple 16:10
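If you want to extend the chart to the multi-monitor setups mentioned elsewhere in the thread, the arithmetic is just width times height; here is a quick Python sketch (the list of layouts is illustrative and can be swapped for whatever you actually run):

    # Pixel counts for a few single- and multi-monitor layouts.
    # The entries are illustrative; add whatever resolutions you care about.
    layouts = [
        ("Single 16:9",  1920, 1080),
        ("Single 16:10", 1920, 1200),
        ("Single 30\"",  2560, 1600),
        ("Triple 16:10", 3840,  800),   # 3 x 1280x800
        ("Triple 24\"",  5760, 1200),   # 3 x 1920x1200
        ("Triple 30\"",  7680, 1600),   # 3 x 2560x1600
    ]

    for name, w, h in layouts:
        print(f"{w} x {h} = {w * h:>10,} pixels | {name}")

Triple 30" panels come to over 12 million pixels, roughly six times a single 1080p screen, which is the regime where a card like this starts to earn its price.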
Re: (Score:2)
I have one at 1920x1200 which was on sale for 159 €. A 24" at 1920x1200 currently sells for 225 € here around.
It's not easy, though, to get a monitor with a decent ratio (4:3 or 5:4). Ironically, it's cheaper to buy a 24" at 1920x1200 than a 20" at 1600x1200, even though the latter has about the same DPI and fewer pixels.
Re: (Score:3)
AMD's offerings usually have lower power consumption and heat generation. While I'm sure nVidia could come up with something, they'd probably have a hard time using the 580 as a basis, because it runs so hot already. I mean, the 6970 consumes a whole ~140W less than the 580 (!), yet they still had to notch it down so it fit in the standard and add that clever switch. AMD's current offerings are just far more power efficient than nVidia's, which means they'd need to underclock their dual-GPU card more than A
Re: (Score:2)
The big problem with dual 580s is peak power draw would be around 750w, just for the GPUs. They would have to make certain sacrifices to fit any reasonable power envelope.
If these GPUs keep sucking more and more power, they will have to start seriously considering making them external. You'll have your PC, a GPU box beside it with its own kilowatt power supply, and just an interface board and cable between the two. There is simply no sense in cramming more heat and power into the PC chassis, just to play
why? (Score:3, Insightful)
Re: (Score:1)
Your opinion needs a little more humble in it. The only use you can see and the "only sensible use" have absolutely nothing in common. You just aren't that important.
Re: (Score:2)
That's exactly the use for my GTX 460 and the Radeon 6850 that I have in my BOINC computer. This machine alone does several times the computations I used to do on three desktops and one laptop before.
Re: (Score:2)
It isn't about some specific need per se; overclocking and tuning is a hobby, an expensive and not always such a smart hobby, but nevertheless there are some even worse hobbies in the world.
As an example from another of my hobbies, the price of this card would get you halfway towards a top-of-the-line set of headphones... not counting of course the top-of-the-line amp to go with it... which put together are WAY cheaper than an equivalent speaker setup... which in turn is WAY cheaper than an offshore boat... which is of course way cheaper than manipulating the world financial markets for shits and giggles.
That last hobby scares me.
Re: (Score:2)
And, of course, the rest of us are convinced you're daft to spend that much money on a set of headphones.
You may actually be able to hear the difference, or at least believe you can. To most of us, it seems like you're spending several times more
Re: (Score:3)
I can hear the difference between my old Sennheiser HD650 headphones and my new Beyerdynamic DT880s. I can hear the difference between my wife's HD580 headphones and my HD650s. I don't see any reason to suspect that I wouldn't be able to hear a difference between the HD650s / DT880s and a set of HD800s or Tesla T1s.
That said, you are correct that there are diminishing returns for your money, but that's true of any hobby.
Say you've got an old 1991 5.0 Mustang. Completely stock, it's probably going to run a h
Re: (Score:2)
I believe, intellectually, that someone might be able to distinguish one of those alphabet-soup things from another, which might impress me if I knew (or cared) what it was.
However, as a practical matter, I just find it unlikely that everybody who claims to have such golden ears actually does. It's just hard not to believe that there's a bunch of people who have shelled out crazy amou
Re: (Score:2)
Oh I certainly don't have golden ears. That's why I'm confused that people can't hear the difference between quality headphones and not-so-quality.
I'm honest and right up front:
I can't hear the difference between LAME 160kbps and FLAC.
I can't hear the difference between audio cables (I really want to, but I can't)
I can't hear the difference between 24-bit audio and 16-bit audio
I can easily hear the difference between different headphones. I've got 3 sets of headphones right now, Sennheiser HD650s, Beyerdynami
Re: (Score:2)
Well, I would imagine you are not running your $150 card at 5760 x 1200 (across three 24" monitors) with 4x AA and 16x AF now, are you?
There IS a market for this performance, and granted it may not include you, but some people are more than able to bring cards with these specs to their knees.
As for console ports, granted there are quite a few, but I seriously doubt my GeForce 3 Ti500 (2001) could have run any of today's games.
Re: (Score:1)
IMHO, it's not worth it, or a "need" since it's far, far outside my normal usage, or most of the people I'm familiar with.
Unfortunately, this is a consumer electronics component, so they need more than a very few fanatical people with 3-monitor setups to sell these cards. However, in recent years the number of titles that support these extremes has shrunk, and with it the pool of people who could potentially be interested. We've g
Re: (Score:2)
There's a difference between being able to play a game and running the game on ultra-high settings. My laptop can run any game on the market right now, but I wouldn't say it would be the most pleasant experience and it certainly wouldn't be at anything more than medium to medium-low settings. Some people like the new shiny that PC games offer. While I (and apparently you) don't think it's worth the extra money just to be able to run the latest Crysis expansion across three monitors with the graphics up t
Re: (Score:2)
You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250. Those cards being the AMD Radeon HD 6950 [newegg.com] and nVidia GTX 560 Ti [newegg.com].
Re: (Score:1)
You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250.
I was referring to the mythical three monitors + video card setup. As I stated earlier, I've got a $200 video card that I'm very happy with.
Re: (Score:2)
This is a terrible card for future-proofing. In a few years, it'll perform no better than the $100 cards, but will still be the same huge, hot, loud, power-sucking beast it's always been. Cards like this one are really only good for a couple of years; after that they're generally more trouble than they're worth. Generally, it's the low-end video cards I have that I end up using the longest - obviously not in my main PC, but they're the ones that get a second life in some secondary role.
Re: (Score:2)
My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?
Hell, the $150 card I bought about two years ago still works fine.
Obviously I can't crank all the settings up as high as they'll go... But I have yet to run into a game that doesn't run well.
Just finished playing through Dead Space 2 - it ran fine and looked great.
Re: (Score:1)
{{cite}}
I want to know what $150 card from two years ago plays your current games at max settings.
Re: (Score:2)
Wait, I think I heard the exact same thing yesterday in the Intel Extreme CPU comments. Why? Because you can. This is luxury, like drinking a $70 wine over a $15 wine; nobody needs to do it, but it's a way to spoil yourself. It's not necessary to be able to crank the quality settings all the way up to enjoy a game, but if you can afford it, it's that little extra.
Re: (Score:1)
Because mine does it with higher quality settings and with more than enough FPS to not have any slowdowns during high intensity moments, while yours doesn't.
Re: (Score:2)
You don't. It also won't help you with the kids on your lawn, understanding "rock and roll" or with your shouting at a cloud.
p.s. I read your post in Grandpa Simpson's voice.
Re: (Score:2)
Why do I need a $700 card?
For the same reason that they're selling it: It'll arm you for your next dick-wagging contest.
OTOH, as the price drops the mid-range cards will get cheaper too, so I can't complain...
Re: (Score:2, Funny)
....if you can't answer, then you're not the target audience either.
Ouch... (Score:2)
What ever happened to VR? (Score:5, Interesting)
Whatever happened to VR? (Virtual Reality) A decade or two ago, it seemed to be (short of direct neural interfaces) where user interfaces were heading. I even remember going to a Disney mini-theme park where they had some true VR rides (you wore a tracking headset) so that you could ride Aladdin's carpet.
Back then it seemed as if the main thing holding this technology back was the room-sized SGI supercomputer required to render a reasonable scene in real time. I remember a presentation by the CEO of SGI saying that all they needed to get to was 60M triangles/sec, and then VR would achieve mass appeal. (Then again, he also dismissed delivering video from computers by saying computers wouldn't become video "jukeboxes," so maybe he wasn't so good at predicting the future.) Anyway, I don't know the latest specs, but I'm sure a modern video card could blow away one of those old SGI "Reality Engines".
So why aren't we all wearing goggles (and wearing spandex) and looking like the characters in "The Lawnmower Man"? Is it because micro-displays never got good enough? Or something else?
Re: (Score:3)
Apple ships millions of phones with 3.5", 326 ppi screens that iSuppli estimates to cost $28.50 each. Maybe they underestimated, so let's say a pair would cost $80, which is still in the price range of a cool video game add-on like the Kinect.
Re: (Score:3)
Quoth: "So why aren't we all wearing goggles (and wearing spandex) and looking like the characters in "The Lawnmower Man"? Is it because micro-displays never got good enough? Or something else?"
Because Apple hasn't released a product with it, causing all competitors to shit themselves?
Re: (Score:2)
ATI and Nvidia have been doing this dance since 1998 I believe. I don't think they're going to change anything soon.
Also, I'm a bit confused by your logic, as GPU clock speeds haven't advanced anything like CPU clock speeds in the same time period. GPUs have mostly been going for the massively parallel multi-core architecture design ever since the Voodoo2, which CPUs have only really started doing in the past few years. Hell, I think my GTX478 has something like 448 cores or something like that, clocked -lo
Re: (Score:2)
Honestly.... what games are even going to stress this card in the foreseeable future?
The obvious joke is Crysis 2...
But, seriously, something like this is pure overkill.
I've got a two-year-old video card that I bought for $150 at the time, and it still plays everything just fine.
Re: (Score:3)
There are people that buy stuff like this just to say they have it, so they can go around on interwebforums posting their synthetic benchmark results and bragging about it.
Some of them will probably never actually play a game with it.