Retail Radeon R9 290X Graphics Cards Slower Than AMD's Press Samples
crookedvulture writes "AMD's recently introduced Radeon R9 290X is one of the fastest graphics cards around. However, the cards sent to reviewers differ somewhat from the retail units available for purchase. The press samples run at higher clock speeds and deliver better performance as a result. There's some variance in clock speeds between different press and retail cards, too. Part of the problem appears to be AMD's PowerTune mechanism, which dynamically adjusts GPU frequencies in response to temperature and power limits. AMD doesn't guarantee a base clock speed, saying only that the 290X runs at 'up to 1GHz.' Real-world clock speeds are a fair bit lower than that, and the retail cards suffer more than the press samples. Cooling seems to be a contributing factor. AMD issued a driver update that raises fan speeds, and that helps the performance of some retail cards. Retail units remain slower than the cards seeded to the press, though. Flashing retail cards with the press firmware raises clock speeds slightly, but it doesn't entirely close the gap, either. AMD hasn't explained why the retail cards are slower than expected, and it's possible the company cherry-picked the samples sent to the press. At the very least, it's clear that the 290X exhibits more card-to-card variance than we're used to seeing in a PC graphics product."
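For readers unfamiliar with PowerTune, the behavior described above amounts to a feedback loop on temperature and board power. Below is a minimal Python sketch of such a governor; only the 95°C temperature target and the "up to 1GHz" ceiling come from AMD's published behavior, while the step size, power limit, and clock floor are made-up illustrative values:

    # Hedged sketch of a PowerTune-style clock governor. Only the 95C
    # target and the 1 GHz "up to" ceiling reflect AMD's published
    # behavior; the other constants are illustrative assumptions.
    TEMP_TARGET_C = 95.0    # the 290X throttles to hold this temperature
    CLOCK_MAX_MHZ = 1000    # "up to 1 GHz" -- a ceiling, not a guarantee
    STEP_MHZ = 13           # assumed adjustment granularity

    def next_clock(clock_mhz, temp_c, power_w, power_limit_w=250.0):
        """One governor tick: back off when over the thermal or power
        limit; otherwise creep back up toward the ceiling."""
        if temp_c >= TEMP_TARGET_C or power_w >= power_limit_w:
            return max(clock_mhz - STEP_MHZ, 300)   # arbitrary floor
        return min(clock_mhz + STEP_MHZ, CLOCK_MAX_MHZ)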
the cards run at higher temps by default (Score:5, Informative)
This has been discussed in many places, like Tom's Hardware. Essentially, they found that the card's voltage is determined by the BIOS and that the fan speeds can be altered. Change the BIOS (many modified versions have been released) and the undervolting that occurs at lower temperatures is solved. Sapphire has already released a new BIOS for the card that makes these changes, keeping performance consistent while still keeping the card from going above 95 degrees.
Re: (Score:2)
There seem to be some problems with the thermal paste applied to a lot of cards as well, causing them to overheat and throttle themselves down.
Re: (Score:2)
liq N2 option (Score:5, Funny)
You jest (Score:2)
You're joking, but that's more or less what's happening.
Re:the cards run at higher temps by default (Score:5, Informative)
At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.
Re: (Score:2)
I wouldn't be surprised if nVidia/Intel/anyone else is doing the same.
Kinda like the picture on the menu versus what you get on your plate.
Re: (Score:2)
It's surely common in any industry, but the performance difference in this case (assuming nothing weird is going on) is larger than what's collectively considered reasonable.
The problem is probably the lack of a specific lower threshold to which the cards are held (which would also help explain the aggressive pricing).
Re: (Score:2)
Re: (Score:2)
Actually, NVidia have been doing the same thing for a couple of generations of GPUs, as far as anyone's been able to tell; the press are just a lot less willing to kick up a fuss about anything they do than about anything AMD does. (And I mean literally the same thing: designing their cards so the actual out-of-the-box clock speeds and performance vary from card to card, then cherry-picking the best ones to send to reviewers.)
Re: (Score:2)
Not that FIX / Thermal performance (Score:5, Informative)
If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst-performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD.
Not that BIOS. As others have pointed out in the thread, the variation in performance is more or less linked to variance in thermal management.
Not all PWM fans behave the same. There's a *newer* BIOS version (not as in "use the one that came with the sample" but as in "download the latest version that was made available on the manufacturer's website between when you bought it and now").
This BIOS version is better at computing what signal it should send to the fan to achieve better cooling.
And once the cooling is improved, the card will automatically scale its clock speed back up.
Also, there can be differences in thermal grease, etc.
At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.
Or, instead of cherry-picking, maybe there's some build-quality difference between the first engineering samples sent by AMD and the mass-produced cards from some NONAME Asian manufacturer? (Or even mass-produced cards from very popular brands that have to fill lots of orders?)
Differences in the quality of the fans (NONAME will pick whatever is currently cheapest, and even with popular big names there's going to be some variance, depending on where the current batch was sourced).
Differences in the thermal conduction of the interface. Differences in the quality of the thermal grease (NONAME will pick the cheapest; a big name might have variation between batches, especially if they source batches from several manufacturers to keep up with the pace). Differences in the quality of the work (NONAME might even do a sloppy job of applying the thermal medium to the heatsink).
You end up with exactly the same chip produced by AMD but vastly different thermal conditions, all of this with a firmware and a driver that aren't yet very good at fan throttling, and you end up with some measurable differences in output. ...BUT...
Pick up Nvidia cards, and you're going to see exactly the same effect:
either cards that vary in their performance, or cards that show big variation in temperature, depending on how the current firmware throttles the card.
Re: (Score:2)
Not all PWM fans behave the same
That doesn't matter if you're competent, because fans have a tachometer lead. You don't just send a PWM signal and then trust that the fan is spinning at the speed you want.
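In code, the closed loop described above might look like this minimal Python sketch; read_tach_rpm and set_pwm_duty are hypothetical stand-ins for whatever the fan controller hardware actually exposes, and the gain is an arbitrary illustration:

    # Hedged sketch of closed-loop fan control: command a duty cycle,
    # read the tachometer back, and correct -- rather than assuming
    # that 40% PWM duty equals 40% of maximum fan speed.
    def fan_step(target_rpm, duty, read_tach_rpm, set_pwm_duty, gain=2e-5):
        """One step of a proportional controller on measured fan RPM.
        read_tach_rpm/set_pwm_duty are hypothetical hardware hooks."""
        error = target_rpm - read_tach_rpm()    # the tach lead tells the truth
        duty = min(max(duty + gain * error, 0.0), 1.0)
        set_pwm_duty(duty)                      # hardware-specific write
        return duty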
Exactly (Score:2)
And according to Tom's, that's exactly what the latest BIOS update was all about: taking the tach feedback into better account.
Re: (Score:2)
And according to Tom's, that's exactly what the latest BIOS update was all about: taking the tach feedback into better account.
But this isn't exactly ATI's first time around the bases... how did they forget how PWM control works?
Re: (Score:1)
(Assuming the waterblock gets any closer than Neufahrn, which is where it's currently languishing on its journey)
Mendacity or incompetence? (Score:5, Interesting)
...add variation in source. (Score:2)
but if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.
Now add into the mix that some parts, like fans, might be sourced from several different manufacturers (and, according to sources, the BIOS wasn't very good at operating them until the latest update), add also that there might be variation in quality between the different batches of thermal paste (which very probably got sourced from several producers), and the output variation is clearly expected.
But it's also fixable (newest BIOS to compensate for fan variation with better throttling; manually replace the thermal paste). Now cards work as go
Re: (Score:2)
What I'd want to know (but am unlikely to find out) is whether AMD was actively misleading reviewers by sending them hand-picked especially good cards, or whether review cards come from the initial run, probably the one where AMD's people
Re: (Score:1)
The issue is the thermal paste [tomshardware.com], as well as the fact that chip manufacturers lower clock speeds until yields improve.
It is a standard process for all chip makers. What AMD did was pick the best of the best, because yields would not produce enough defect-free chips at that speed for the demo.
As chip makers increase production and yield quality improves, the speed goes up as well. Notice that Tom's got close to, if not matching, the press demo with these tricks.
So these were not botched demos at all! However, you do need to void t
Hanlon's razor / Napoleon Bonaparte (Score:2)
or whether review cards come from the initial run, probably the one where AMD's people are most tightly and nervously observing the process, rather than the potentially more variable just-another-day-slapping-chips-on-cards production that follows.
I would indeed agree with your first post and this part. To me, a big conspiracy to manipulate results is far less likely than simple sloppiness in a mass-produced good, where production speed counts, in order to quickly meet demand.
To quote a variant of Hanlon's Razor (often attributed without sources to Napoleon Bonaparte):
"Never ascribe to malice what can adequately be explained by incompetence."
Mere variation is only inconvenient, and may well mean that the usual 3rd-party overkill heatsinks actually help significantly.
Yup, very probably. Especially with modern cards that try to go as fast as they can, while still within tar
Typical (Score:3, Insightful)
Re:Nvidia has worse drivers today (Score:2)
The latest ones crash A LOT, if you read maximumpc.com or tomshardware.com, or ask the Nvidia fanboys I raid with.
AMD has better-quality hardware with less flaky voltage regulators. I went through 2 Nvidia cards over 8 years that failed, and switched to all AMD/ATI in 2010 with the Phenom II (better than Steamroller per clock tick, sadly) and an ATI 5750.
Had one bizarre issue with black edges on the screen when switching to HDMI. That problem went away after I went into the drivers and configured my screen to not do u
FPS TOO LOW!! (Score:4, Funny)
Re: (Score:2)
When you pay for discrete graphics you're usually making purchase decisions based on performance/$. When the reviews all say you'll achieve a certain trade-off, but with the retail product you don't, then the market has been deceived - perhaps you would have been better off buying something else.
No comment as to what's actually going on in this case, if anything, since I haven't been following it...
Re: (Score:1)
Oh gosh, I hope this doesn't result in some poor sap attempting to drive his car while thinking they should achieve a pure 40mpg, only to hit a measly 20mpg, their lives ruined forever. The consequences will never be the same.
See how stupid you sound? Please redeem yourself by thinking carefully about the situation (potentially mislabeled product) and ideally apologizing for an utterly useless comment.
Re: (Score:2)
Nobody even brought up the monitor refresh rate.
Re: (Score:2)
Gee, you're a pedant. Am I going to have to start opening Windows calculator every time I want to point something out?
Honestly, it doesn't sound like a joke; more like you're making fun of people who got less than they hoped for when purchasing a product. It doesn't matter if it's a graphics card or a car; false advertising is false advertising (assuming this is all more than a few misbehaving cards, of course).
Say you buy 1kg of *insert favorite nourishment here*. Wouldn't you get pissed if you found out i
Re: (Score:2)
Besides, the whole point of burning $550 on a top-of-the-line graphics card
Re: (Score:2)
I am. I liked Quake 1, 2, and even Arena. I also liked Doom and Doom 2. And Duke Nukem 3d. And Command and Conquer. And Starcraft/Brood War. Hell, I'd love to be able to play Carmageddon again. What's wrong with having fun with a game vs being able to count the chest hairs on Duke as they wave in the gentle breeze?
And get off my lawn. It's dying under the snow there.
[John]
Re: (Score:2)
I still play Quake1 from time to time. It's still my favourite FPS of all time.
As for Carmageddon, if you have a decent Android or iPhone, you'll definitely want to pick this up: http://www.carmageddon.com/funsize/carmageddon [carmageddon.com]
Re: (Score:2)
Re: (Score:2)
You just made my year. Thank you very much.
(Off to buy Carmageddon...)
[John]
Re: (Score:2)
Not uncommon... (Score:2)
It's known. (Score:2)
due to the fact that the software/driver adjusts the frequency dynamically instead of holding a static clock speed (something they should have disclosed to the reviewers).
It's well known that these cards operate at a fixed temperature and push the clock and voltage as high as they can within these thermal limits.
It's so well known among professionals that some, like Tom's, are giving advice about BIOS replacement (newer BIOSes have better and more consistent fan throttling across all the varied parts) or thermal paste replacement (to improve cooling and thus performance).
Re: (Score:2)
This probably happens more often than we think.
For a long time we've known that demonstrations lie.
Any demo indistinguishable from magic is insufficiently rigged.
I fail to see why anyone is surprised by this. Nvidia do it, Intel do it, Apple do it; everyone lies with demonstrators, and all demonstration machines are rigged up to their eyeballs. IBM spends money making sure the lights on their demonstration machines blink in unison, for crying out loud.
Which is why anyone with half a brain does not base purchasing decisions on a vendor's product dem
Yeah (Score:2)
I'm a huge fan of market competition and AMD. If this is not a slashvertisement, then I'm a noob.
Re: (Score:1)
Re: (Score:2)
I'm a huge fan of market competition, which is why I can't wait for a capable third manufacturer to compete against the big two.
Re: (Score:2)
Intel can compete just fine in the low-end market, and is starting to poke its nose into the mid-range. You can play Battlefield 4 with the integrated graphics of an i5/i7 desktop Haswell chip at 720p and low settings at 30fps.
Desktop processors running Iris Pro graphics may let Intel start stealing the spotlight from AMD's APUs. I wouldn't count on Intel diving into the high-end market, so hopefully Nvidia won't be killed off. I also believe Nvidia has foreseen the end of themselves being anyt
Same old story (Score:1)
Re: (Score:2)
As far as I remember, those were plainly slower than the ones they were to replace, let alone Intel's products, even running at significantly higher clocks.
They performed to spec, but the spec wasn't what AMD had originally hoped for.
Throttling + OEM fan speed and grease variance (Score:4, Informative)
Toms Hardware covered this pretty extensively [tomshardware.com] a month ago.
The short story is that AMD is throttling clock speeds to hold within a temperature limit. They learned the hard way that 40% PWM does not equal 40% fan speed, especially across all the fans the OEMs used. There's now a driver fix that measures fan speed and adjusts accordingly when in quiet mode, which eliminates most of this performance discrepancy (retail cards can now see higher performance, in line with review samples).
Remaining differences between cards may be due to different heatsink grease, which Tom's also examined by replacing the grease on a retail card [tomshardware.com] for a significant performance gain.
Re: (Score:3)
I know some guy making $0.30/day in China isn't going to take a credit card and ensure a perfectly smooth and even coating of thermal paste before carefully ap
Re: (Score:3)
How come applying thermal grease is still such a big problem in the semiconductor industry?
They've been doing it for decades, but still haven't figured out how to get it right every time.
Even Apple, who are renowned for their design and manufacturing prowess, keep hiring companies that screw it up.
Re: (Score:2)
I think there's too much variation in the way heatsinks attach to the chip. Ideally it would be something like how heatsinks on Intel do it (there may be others, but Intel is what I have experience with) - you put on
Re: (Score:2)
Probably because there's no test for it. If they put the CPU on the board backwards, they'll notice when they try to turn the system on. Too much paste (or too little), and things will work just fine as far as any test is concerned.
When I did computer repair, I once encountered a PowerMac where the heatsink had a manufacturing defect; one of the posts that fit in a hole on the CPU card had a large extra blob of aluminum on it. It was impossible to seat the heatsink on the CPU, though it could still be strap
They're just rebranded last year's model cards.... (Score:1)
I believe the entire R9 series of cards are little more than rebranded versions of older cards AMD just discontinued. The R9 280X for example? It's just a 7970 card. Sure, it may have a few BIOS tweaks and such, but you can even peel the label off the circuit board of many of them and find 7970 etched on the board underneath the R9 280X label.
Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 se
Re: (Score:3)
Nope. The 290/290X is a much larger chip - similar architecture, but bigger (and mildly improved).
More or less (Score:2)
Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 series.
The problem is that AMD got too busy doing legitimate improvements under contract for the coming generation of consoles (Xbox One, PlayStation 4, and some of the upcoming Steam Boxes, which all run the same combo of AMD CPUs and GPUs).
With that work, there was going to be some delay for their PC Radeon HD 8xxx series.
So it was either:
- have absolutely nothing to sell.
- do some small upgrades on the older boards (R9 270/280 are simply older boards slightly upgraded) and older chips (R9 290(X) are GCN 1.1 chips, slight
Re: (Score:2)
Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, as it removes the excuses for why a port to PCs can't happen, but both consoles are pretty sad compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p, while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the performance gap will only widen.
It's good that AMD was able to get the contracts to get income as they hav
Scale! (Score:2)
Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, as it removes the excuses for why a port to PCs can't happen, but both consoles are pretty sad compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p, while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the performance gap will only widen.
...and all of this is done with an integrated GPU. That's the key point. This performance is achieved using just an embedded chip that draws a minimal amount of power.
Now scale this thing up, move one generation ahead (to GCN 2.x), and the discrete cards we'll be getting next year from AMD are going to be quite interesting.
Architecture (Score:2)
I'm talking about all the above being GCN 1.x chips.
The GCN 2.x chips (which were initially what was going to be inside the HD 8000, before they delayed everything) will be here early next year.
Re: (Score:2)
do people really need a $500 video card?
People running 4k displays or multi-monitor displays with the latest games do [legitreviews.com].
old news (Score:2)
I listened to a Maximum PC podcast about this a couple of weeks ago. Is this a slow news day?
How about R9 290 then? (Score:2)
I've just ordered an R9 290X, before I saw this (oh no!). If this effect is reducing 290X performance, and the 290X is just a higher-clocked 290, could I expect similar performance from a 290 as from the retail 290Xs? If so, I just want to return the 290X and recoup 25% of the cost... Advice greatly appreciated.
Re: (Score:2)
Looking at some pictures of the stock 290X heatsink, you would be well served by removing the heatsink, lapping it, and replacing the thermal grease with some decent stuff. Even more so if you decide to spring for an aftermarket heatsink, but you may have to do a little legwork to figure out which ones will work and which ones won't.
Re: (Score:2)
that's an option that I didn't consider, cheers! (probably won't check back on this thread any more and will keep the 290X)
Re: (Score:1)
Re: I never got the fascination with AMD/ATI (Score:2)
Re: (Score:2)
Perhaps, but this is typical of AMD and ATI products ranging all the way back to my experiences with Athlons and the ATI Rage 128 graphics card.
Their hardware just isn't as stable as Intel's or Nvidia's. And that's my experience with 4 different AMD systems over the years. When you put an Intel computer on the same power supply, stability just happens.
Re: (Score:2)
Re: (Score:2)
While few will need to buy a graphics card for Blender 3D in particular, it does show that you have to consider what you wish to use the card for when making a purchase, so you get the performance you need out of it.
Re: (Score:2)
Also AMD has better free software drivers.
Yeah, but their proprietary Linux drivers are simply atrocious. Performance isn't so bad but stability is awful.
Re: I never got the fascination with AMD/ATI (Score:1)
And you are like the people that always eat the same pizza, while complaining that there's no variation.
Re: (Score:2, Insightful)
I use Intel/Nvidia with FreeBSD because they have better driver support than AMD/ATI.
Re: (Score:3)
I used to be an Nvidia guy until last week. Just hopped onto the crypto coin mining bandwagon and my new 7950 card should pay for itself in ~2-3 weeks.
Nvidia are junk for this, unfortunately.
And it shows, you try to find a new or 2nd hand 79xx card anywhere for a reasonable price. They have all been snapped up by the miners :(
Re: (Score:3, Informative)
Lots of coins (Score:5, Interesting)
There are lots of alt-coins in addition to Bitcoin.
Some are easier to mine on a GPU than others.
You need to pick a coin that plays well with it.
Re: (Score:2)
Who buys those altcoins? Speculators? Do they have valid exchanges?
Re: (Score:2)
Indeed (Score:2)
Yeah, and that's the main exchange where I trade mine.
A few other things:
- Much smaller scale, but cryptsy [cryptsy.com] has almost the whole zoo.
- Lots of traders are playing around and speculating with minor coins.
- That means there's a lot of exchange traffic between major and minor alt-coins.
- That means it's easier to exchange whatever you mined for whatever is more useful for your transactions.
- LTC is starting to grow big enough to get some independence from BTC. ...thus it's starting to get more acceptance (
-
Re: (Score:2)
Yeah, I thought I missed the bandwagon, but then I discovered scrypt-based coins, and it turns out I might just be able to catch the tail end of this ride. The first card I just bought should pay for itself in ~16 days (rough math sketched below). *IF* that happens, I will be upgrading to a bigger rig.
It's not too late.
Mine for WorldCoin by the way. It's still one of the best long-term options for mining.
I will either be wrong, in which case I have a perfectly sellable graphics card I can move on (or game with), or I am right, and BWUAAHAHA
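For anyone wondering how a payback figure like the "~16 days" above gets estimated, here is a trivial Python sketch; every number in it is an illustrative assumption, not the poster's actual hashrate, coin price, or power bill:

    # Back-of-the-envelope mining payback estimate. All values are
    # made-up placeholders, not figures from the thread.
    card_cost = 200.0       # USD paid for the card
    coins_per_day = 2.0     # coins mined per day at some assumed hashrate
    coin_price = 8.0        # USD per coin at some assumed exchange rate
    watts = 250.0           # card's power draw under mining load
    kwh_price = 0.12        # USD per kWh

    daily_net = coins_per_day * coin_price - (watts / 1000.0) * 24 * kwh_price
    print("payback in %.1f days" % (card_cost / daily_net))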
Re: (Score:2)
bwahahaha. If you were a serious miner, you'd have an ASIC rig.
GPU miners don't cover the cost of their electrical bills.
Re: (Score:2)
Just for your info, I covered the extra electricity charges for the coming month in under 24 hours.
And there are no ASIC chips for scrypt based mining yet so it's currently GPU mining all the way (to tha moon, lol).
I'm not 100% sure yet, but I also think that, due to the very nature of the scrypt algorithm, ASIC mining is unlikely to come soon, as it's really reliant on having a lot of fast memory (basically, high-end video cards). There is one company that claims it's close to making one, but they have not a
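For context on the memory point: Litecoin-style scrypt proof-of-work uses the parameters N=1024, r=1, p=1, which need roughly 128*r*N = 128 KB of fast scratch memory per hash. A minimal sketch using Python's standard hashlib.scrypt (available since Python 3.6); hashing the 80-byte block header with itself as the salt matches Litecoin's scheme:

    # Minimal sketch of a Litecoin-style scrypt proof-of-work hash.
    # N=1024, r=1, p=1 needs ~128 KB of fast memory per hash, which
    # is the property that kept scrypt GPU-friendly and ASIC-hostile
    # at the time of this thread.
    import hashlib

    def scrypt_pow_hash(block_header: bytes) -> bytes:
        # Litecoin hashes the 80-byte header with itself as the salt.
        return hashlib.scrypt(block_header, salt=block_header,
                              n=1024, r=1, p=1, dklen=32)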
Re: (Score:2)
And me :)
Re: (Score:1)
My cheap AMD/ATI system is the most stable computer I've ever owned.
The only BSOD was from an improper shutdown and file corruption that I caused.
Re: (Score:2)
And cherry picks the best units to send out to reviewers.
Can't seem to care too hard. Unless this wasn't an industry standard way of doing things... But it is.
Not just in computer related things either. The stuff sent to be reviewed is always the absolute best and pre-tested before the reviewers get it.
Video card manufacturers have been cheating for more than 20 years, sending reviewers hardware and drivers that have been tweaked and optimized far beyond what is available at retail.