Retail Radeon R9 290X Graphics Cards Slower Than AMD's Press Samples

crookedvulture writes "AMD's recently introduced Radeon R9 290X is one of the fastest graphics cards around. However, the cards sent to reviewers differ somewhat from the retail units available for purchase. The press samples run at higher clock speeds and deliver better performance as a result. There's some variance in clock speeds between different press and retail cards, too. Part of the problem appears to be AMD's PowerTune mechanism, which dynamically adjusts GPU frequencies in response to temperature and power limits. AMD doesn't guarantee a base clock speed, saying only that the 290X runs at 'up to 1GHz.' Real-world clock speeds are a fair bit lower than that, and the retail cards suffer more than the press samples. Cooling seems to be a contributing factor. AMD issued a driver update that raises fan speeds, and that helps the performance of some retail cards. Retail units remain slower than the cards seeded to the press, though. Flashing retail cards with the press firmware raises clock speeds slightly, but it doesn't entirely close the gap, either. AMD hasn't explained why the retail cards are slower than expected, and it's possible the company cherry-picked the samples sent to the press. At the very least, it's clear that the 290X exhibits more card-to-card variance than we're used to seeing in a PC graphics product."
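For a sense of how a PowerTune-style limiter behaves, here is a rough sketch of a temperature-driven clock control loop. This is an illustration only, not AMD's actual algorithm; the temperature target, step size, and the read_gpu_temp()/set_core_clock() helpers are hypothetical placeholders.

    # Rough sketch of a PowerTune-style limiter: the GPU runs as close to its
    # maximum clock as the thermal budget allows, so cooling quality directly
    # determines the clock speed a given card actually sustains.
    # read_gpu_temp() and set_core_clock() are hypothetical stand-ins, and the
    # numbers are made up for illustration.

    MAX_CLOCK_MHZ = 1000   # the advertised "up to 1GHz"
    MIN_CLOCK_MHZ = 300    # floor the limiter never drops below
    TEMP_TARGET_C = 95     # example temperature target
    STEP_MHZ = 13          # example adjustment step

    def powertune_step(clock_mhz, read_gpu_temp, set_core_clock):
        """Run one iteration of the control loop and return the new clock."""
        if read_gpu_temp() > TEMP_TARGET_C:
            # Too hot: back the clock off.
            clock_mhz = max(MIN_CLOCK_MHZ, clock_mhz - STEP_MHZ)
        else:
            # Thermal headroom available: creep back toward the maximum.
            clock_mhz = min(MAX_CLOCK_MHZ, clock_mhz + STEP_MHZ)
        set_core_clock(clock_mhz)
        return clock_mhz

A card with worse cooling spends more time above the target and settles at a lower average clock, which is consistent with the card-to-card variance described above.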


  • by spacepimp ( 664856 ) on Wednesday December 04, 2013 @03:33PM (#45600059)

    This has been discussed in many places, like Tom's Hardware. Essentially, they found that the card's voltage is determined in the BIOS and that the fan speeds can be altered. Change the BIOS (several updated versions have already been released) and the undervolting that occurs at lower temperatures is solved. Sapphire has already released a new BIOS for the card that makes these changes, keeping the cards consistent while still preventing them from going above 95 degrees.

    • by Sir_Sri ( 199544 )

      There seem to be some problems with the cooling paste installed on a lot of cards as well, causing them to overheat and ramp themselves down.

    • by harvey the nerd ( 582806 ) on Wednesday December 04, 2013 @04:00PM (#45600503)
      Someone obviously didn't buy the turbo liquid nitrogen supply option.
    • by Nemyst ( 1383049 ) on Wednesday December 04, 2013 @04:08PM (#45600625) Homepage
      If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD. It also made the retail card crash because the voltage was too low. The sample card managed to give better performance at lower fan speeds and voltages.

      At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.
      • I wouldn't be surprised if nVidia/Intel/anyone else is doing the same.

        Kinda like the difference between the picture on the menu and what you get on your plate.

        • It's surely common in any industry, but the performance difference in this case (assuming no weird stuff is going on) is larger than what the collective considers reasonable.

          The problem is probably the lack of a specific lower threshold to which the cards are held (would also help explain the aggressive pricing).

        • Of course they are. But they're also not delivering one product for review and a completely different product to sell. What AMD's done here is little more than a bait-and-switch that's coming back to bite them in the ass. Of course there's going to be some variance from card to card, but what's happening here goes far beyond what's expected.
          • by makomk ( 752139 )

            Actually, Nvidia has been doing the same thing for a couple of generations of GPUs, as far as anyone's been able to tell; the press is just a lot less willing to kick up a fuss about anything Nvidia does than about what AMD does. (And I mean literally the same thing: designing their cards so that the actual out-of-the-box clock speeds and performance vary from card to card, then cherry-picking the best ones to send to reviewers.)

        • Nvidia does it too, but since they HAVE a base clock on their cards, the boost clock is just whatever extra each card can manage. You know that all the cards, running at base clock, will be within a few percent of each other, not the 15-20% spread that some sites have seen with AMD cards. Press Nvidia cards do seem to overclock like beasts, which shows what you could do if you were lucky enough to get one of those. Sadly, the "up to" wording AMD used makes them sound like an ISP.
      • by DrYak ( 748999 ) on Wednesday December 04, 2013 @05:12PM (#45601777) Homepage

        If you bothered to RTFA (I know!), you'd see that they indeed checked this out. They flashed the BIOS of their sample card onto their worst performing retail card. There was a small difference, but far from enough to make up for the gap between that card and the sample unit they received from AMD

        Not that BIOS. As others have pointed out in this thread, the variation in performance is more or less linked to variance in thermal management.
        Not all PWM fans behave the same. There's a *newer* BIOS version (not as in "use the one that came with the sample," but as in "download the latest version made available on the manufacturer's website between when you bought the card and now").
        That newer BIOS is better at computing what signal it should send to the fan to get better cooling.
        And once the cooling improves, the card automatically scales its clock speed back up.

        Also, there can be differences in thermal grease, etc.

        At this point it's reasonable to assume that AMD cherry-picked the cards they sent reviewers to make sure they were as good as they could be.

        Or, instead of cherry-picking, maybe there's a build-quality difference between the first engineering samples sent by AMD and the mass-produced cards from some NONAME Asian manufacturer? (Or even mass-produced cards from very popular brands that have to fulfill lots of orders?)

        Differences in fan quality (NONAME will pick whatever is currently cheapest, and even with popular big names there's going to be some variance depending on where the current batch was sourced).
        Differences in the thermal conduction of the interface. Differences in thermal grease quality (NONAME will pick the cheapest; a big name might see batch-to-batch variation, especially if it sources batches from several manufacturers to keep up with the pace). Differences in workmanship (NONAME might even do a sloppy job of applying the thermal compound to the heatsink).

        You end up with exactly the same chip produced by AMD but vastly different thermal conditions, all with a firmware and a driver that aren't yet great at fan throttling, and you get a measurable difference in output. ...BUT...

        Pick up Nvidia cards and you're going to see exactly the same effect:

        cards that vary in their performance (or that show big variations in temperature, depending on how the current firmware throttles the card).

        • Not all PWM fans behave the same

          That doesn't matter if you are competent, because the fans have a tachometer lead. You don't just send a PWM signal and then trust that the fan is spinning at the speed you want; you measure what the fan is actually doing and correct for it (rough sketch below).
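          Roughly, the loop looks like this (a toy Python sketch only; read_tach_rpm() and set_pwm_duty() are hypothetical hardware hooks, not any real fan-controller API):

          def hold_fan_speed(target_rpm, read_tach_rpm, set_pwm_duty,
                             duty=40.0, gain=0.02, steps=200):
              """Nudge the PWM duty cycle until the measured RPM matches the target.

              A real controller runs this continuously (and usually adds integral
              and derivative terms); the point is that the duty cycle is corrected
              from the tachometer reading instead of being trusted blindly.
              """
              for _ in range(steps):
                  error = target_rpm - read_tach_rpm()   # how far off the fan really is
                  duty = min(100.0, max(0.0, duty + gain * error))
                  set_pwm_duty(duty)
              return duty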

          • And according to Tom's, that's exactly what the last BIOS update was all about: taking the tacho feedback into account properly.

              And according to Tom's, that's exactly what the last BIOS update was all about: taking the tacho feedback into account properly.

              But this isn't exactly ATI's first time around the bases... how did they forget how PWM control works?

    • And this is why I'm going to be watercooling mine :)
      (Assuming the waterblock gets any closer than Neufahrn, which is where it's currently languishing on its journey.)
  • by fuzzyfuzzyfungus ( 1223518 ) on Wednesday December 04, 2013 @03:39PM (#45600135) Journal
    Cherry-picking would be a bad thing; but if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.
    • but if it turns out that the junior thermal paste application technicians get less attentive once the first production batch is finished and the people who've been babying the project leave, that wouldn't be a total surprise.

      Now add into the mix that some parts, like fans, might be sourced from several different manufacturers (and according to sources, the BIOS wasn't very good at operating them until the latest update), and also that there might be variation in quality between the different batches of thermal paste (which very probably got sourced from several producers), and the output variation is clearly expected.

      But it's also fixable (newest BIOS to compensate with better fan throttling, manually replace the thermal paste). Now cards work as go

      • There are a variety of possible sources of variation. And, since the part's (not illogical, if somewhat unhelpful for making firm predictions) thermal management strategy seems to be 'run as fast as possible; but don't die', those variations would show up in performance.

        What I'd want to know (but am unlikely to find out) is whether AMD was actively misleading reviewers by sending them hand-picked especially good cards, or whether review cards come from the initial run, probably the one where AMD's people
        • The issue is the thermal paste [tomshardware.com], as well as the fact that chip manufacturers ship lower clock speeds until yields improve.

          It is a standard process for all chip makers. What AMD did was pick the best of the best, where yields would not produce enough defect-free chips at that speed for the demo.

          As chip makers increase production and yield quality improves, the speed goes up as well. Notice that Tom's got close to, if not matching, the real demo with these tricks.

          So these were not botched demos at all! However, you do need to void t

        • or whether review cards come from the initial run, probably the one where AMD's people are mostly tightly and nervously observing the process, rather than the potentially more variable just-another-day-slapping-chips-on-cards production that follows.

          I would indeed agree with your first post and this part. To me, a big conspiracy to manipulate results is far less likely than simple sloppiness in a mass-produced good, where speed of production counts in order to quickly meet demand.

          To quote a variant of Hanlon's Razor (often attributed without sources to Napoleon Bonaparte):
          "Never ascribe to malice what can adequately be explained by incompetence."

          Merely variation is only inconvenient, and may well mean that the usual 3rd-party overkill heatsinks actually help significantly.

          Yup, very probably. Especially with modern cards that try to go as fast as they can, while still within tar

  • Typical (Score:3, Insightful)

    by bit trollent ( 824666 ) on Wednesday December 04, 2013 @03:41PM (#45600175) Homepage
    This is becoming a habit for AMD. You can't even trust their FRAPS measured frame-rates. Seriously. I will never ever own another AMD card. Their graphics cards are nothing but empty promises and bad drivers.
    • The latest ones crash A LOT, if you read maximumpc.com or tomshardware.com, or listen to the Nvidia fanboys I raid with.

      AMD has better-quality hardware with less flaky voltage regulators. I went through 2 Nvidia cards that failed over 8 years, and switched to all AMD/ATI in 2010 with a Phenom II (better than Steamroller per clock tick, sadly) and an ATI 5750.

      Had one bizarre issue with black edges on the screen when switching to HDMI. That problem went away after I went into the drivers and configured my screen to not do u

  • by SpaceManFlip ( 2720507 ) on Wednesday December 04, 2013 @03:46PM (#45600277)
    Oh gosh I hope this doesn't result in some poor sap attempting to play Battlefield 4.7 and while thinking they should achieve a pure 115fps they only hit a measly 92fps and their lives are ruined forever. The consequences will never be the same.
    • When you pay for discrete graphics you're usually making purchase decisions based on performance/$. When the reviews all say you'll achieve a certain trade-off, but with the retail product you don't, then the market has been deceived - perhaps you would have been better off buying something else.

      No comment as to what's actually going on in this case, if anything, since I haven't been following it...

    • Oh gosh I hope this doesn't result in some poor sap attempting to drive his car and while thinking they should achieve a pure 40mpg they only hit a measly 20mpg and their lives are ruined forever. The consequences will never be the same.

      See how stupid you sound? Please redeem yourself by thinking carefully about the situation (potentially mislabeled product) and ideally apologizing for an utterly useless comment.

      • 92/115 = 0.8 ratio != 20/40 = 0.5 ratio. See how stupid you sound? Also, trolling an obvious joke with insults....

        Nobody even brought up the monitor refresh rate.

        • Gee, you're a pedant. Am I going to have to start opening Windows calculator every time I want to point something out?
          Honestly, it doesn't sound like a joke, more like you're making fun of people who got less than they hoped for when purchasing a product. It doesn't matter if it's a graphics card or a car, false advertising is false advertising (assuming this is all more than a few misbehaving cards, of course).

          Say you buy 1kg of *insert favorite nourishment here*. Wouldn't you get pissed if you found out i

    • I get your point, but nobody's interested in playing Quake 1 any more :). Generally speaking, with the recent/upcoming graphics-intensive games, when you're set to a reasonably high resolution and the pretties up, you'll have a hard time maintaining a steady 50+ FPS in the middle of an intense firefight. Which is exactly the moment when you need it the most, even if anywhere else you get 100 FPS. The lows are what you want to avoid.

      Besides, the whole point of burning $550 on a top-of-the-line graphics card
      • by Bigbutt ( 65939 )

        I am. I liked Quake 1, 2, and even Arena. I also liked Doom and Doom 2. And Duke Nukem 3d. And Command and Conquer. And Starcraft/Brood War. Hell, I'd love to be able to play Carmageddon again. What's wrong with having fun with a game vs being able to count the chest hairs on Duke as they wave in the gentle breeze?

        And get off my lawn. It's dying under the snow there.

        [John]

    • If I buy a Ford Fiesta with a 1.2l engine, and $RoadTestReview gets 45MPG over a week, three tank refills, and a good mixture of motorway and city driving, I don't expect my (supposedly) identical 1.2l Ford Fiesta to get 36MPG in the same period under the same conditions.
  • This probably happens more often than we think. Frankly, it makes sense to validate that a card is going to run solidly for someone before you send it to them, if they're going to be blabbering all over the interwebs about it. It's just that in this case AMD got burned (and justifiably so), because the software/driver adjusts the frequency on its own instead of holding a static clock speed (something they should have disclosed to the reviewers).
    • because the software/driver adjusts the frequency on its own instead of holding a static clock speed (something they should have disclosed to the reviewers).

      It's well known that these cards operate at a fixed temperature target and push the clock and voltage as high as they can within those thermal limits.
      It's so well known among professionals that some, like Tom's, are giving advice about BIOS replacement (newer BIOSes have better and more consistent fan throttling across all the varied parts) or thermal paste replacement (to improve cooling and thus performance).

    • by mjwx ( 966435 )

      This probably happens more often than we think.

      For a long time we've known that demonstrations lie.

      Any demo indistinguishable from magic is insufficiently rigged.

      I fail to see why anyone is surprised by this. Nvidia does it, Intel does it, Apple does it; everyone lies with demonstrators, and all demonstration machines are rigged up to their eyeballs. IBM spends money making sure the lights on its demonstration machines blink in unison, for crying out loud.

      Which is why anyone with half a brain does not base purchasing decisions on a vendor's product dem

  • I'm a huge fan of market competition and AMD. If this is not a slashvertisement then I'm a noob.

    • Hello noob.
    • I'm a huge fan of market competition, which is why I can't wait for a capable third manufacturer to compete against the big two.

      • Intel can compete just fine in the low-end market, and is starting to poke its nose into the mid-range. You can play Battlefield 4 on the integrated graphics of an i5/i7 desktop Haswell chip at 720p and low settings at 30fps.

        Desktop processors running Iris Pro graphics may let Intel start stealing the spotlight from AMD's APUs. I wouldn't count on Intel diving into the high-end market, so hopefully Nvidia won't be killed off. I also believe Nvidia has foreseen the end of itself being anyt

  • It seems like this is a recurring problem for AMD. I recall a similar issue with the Bulldozer CPUs.
    • As far as I remember, those were plainly slower than the ones they were to replace, let alone Intel's products, even running at significantly higher clocks.
      They performed to spec, but the spec wasn't what AMD had originally hoped for.

  • by Anonymous Coward on Wednesday December 04, 2013 @03:59PM (#45600485)

    Toms Hardware covered this pretty extensively [tomshardware.com] a month ago.

    The short story is that AMD throttles clock speeds to stay within a temperature limit. They learned the hard way that 40% PWM does not equal 40% fan speed, especially across all the fans the OEMs used. There's now a driver fix that measures fan speed and adjusts accordingly when in quiet mode, which eliminates most of this performance discrepancy (retail cards can now see higher performance in line with review samples).

    Remaining differences between cards may be due to different heatsink grease, which has also been examined already: replacing the grease on a retail card [tomshardware.com] yielded a significant performance gain.
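    To illustrate why a fixed duty cycle isn't enough, here's a small, made-up comparison of two hypothetical fan models with different duty-to-RPM responses (the numbers are invented; only the shape of the problem matches what Tom's describes):

    # Two hypothetical fans with different duty-cycle-to-RPM curves. The same
    # 40% duty command yields very different airflow, so a card built with
    # "fan B" runs hotter and throttles harder than one built with "fan A".
    FAN_CURVES = {
        "fan A": {20: 1400, 40: 2300, 60: 3100, 80: 3800, 100: 4200},
        "fan B": {20: 900, 40: 1500, 60: 2400, 80: 3300, 100: 4100},
    }

    def rpm_at_duty(curve, duty):
        """Linearly interpolate the RPM a fan reaches at a given duty cycle."""
        points = sorted(curve.items())
        if duty <= points[0][0]:
            return points[0][1]
        for (d0, r0), (d1, r1) in zip(points, points[1:]):
            if d0 <= duty <= d1:
                return r0 + (r1 - r0) * (duty - d0) / (d1 - d0)
        return points[-1][1]

    for name, curve in FAN_CURVES.items():
        print(name, "at 40% duty:", rpm_at_duty(curve, 40), "RPM")

    Measuring the actual fan speed and adjusting the duty cycle, as the driver fix now does, removes this dependence on which fan a particular card happened to ship with.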

    • by jandrese ( 485 )
      I've always found it shocking that, after reading about how important it is to apply paste properly and take the time to do it right when building my own machine, every time I open up an OEM box I discover the paste just globbed on there willy-nilly with a caulking gun. Nvidia had a huge problem with this back with the 8xxx-series GPUs in laptops.

      I know some guy making $0.30/day in China isn't going to take a credit card and ensure a perfectly smooth and even coating of thermal paste before carefully ap
    • How come applying thermal grease is still such a big problem in the semiconductor industry?
      They've been doing it for decades, but still haven't figured out how to get it right every time.

      Even Apple, who are renowned for their design and manufacturing prowess, keeps hiring companies that screw it up.

      • by tlhIngan ( 30335 )

        How come applying thermal grease is still such a big problem in the semiconductor industry?
        They've been doing it for decades, but still haven't figured out how to get it right every time.

        Even Apple, who are renowned for their design and manufacturing prowess, keeps hiring companies that screw it up.

        I think it's too much variation in the way heatsinks attach to the chip. Ideally it would be something like how heatsinks on Intel do it (there may be others, but Intel is what I have experience on) - you put on

      • Probably because there's no test for it. If they put the CPU on the board backwards, they'll notice when they try to turn the system on. Too much paste (or too little), and things will work just fine as far as any test is concerned.

        When I did computer repair, I once encountered a PowerMac where the heatsink had a manufacturing defect; one of the posts that fit in a hole on the CPU card had a large extra blob of aluminum on it. It was impossible to seat the heatsink on the CPU, though it could still be strap

  • I believe the entire R9 series of cards is little more than rebranded versions of older cards AMD just discontinued. The R9 280X, for example? It's just a 7970 card. Sure, it may have a few BIOS tweaks and such, but you can even peel the label off the circuit board of many of them and find 7970 etched on the board underneath the R9 280X label.

    Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 se

    • Nope. The 290/290X is a much larger chip - similar architecture, but bigger (and mildly improved).

    • Personally, I think AMD should have waited until it had legitimate improvements to offer before releasing anything, rather than trying to fool people with the R9 series.

      The problem is that AMD got too busy doing legitimate improvements under contract for the coming generation of consoles (the Xbox One, the PlayStation 4, and some of the upcoming Steam Boxes all run the same combo of AMD CPUs and GPUs).
      With that work, there was going to be some delay for their PC Radeon HD 8xxx series.

      So it was either:
      - have absolutely nothing to sell, or
      - do some small upgrades to the older boards (the R9 270/280 are simply older boards slightly upgraded) and older chips (the R9 290(X) are GCN 1.1 chips, slight

      • Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, since it removes the excuses for why a port to PCs can't happen, but both consoles are pretty sad compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p, while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the gap will only widen.

        It's good that AMD was able to get the contracts to get income as they hav

          Will gamers see that much of an improvement? The PS4 and Xbone being x86 hardware is nice, since it removes the excuses for why a port to PCs can't happen, but both consoles are pretty sad compared to current mid-range PCs, let alone a high-end rig. The Xbone is struggling to hit 1080p, while the PS4 is hitting it, but at 30fps. That matches or falls below the performance of a current mid-range PC, and the gap will only widen.

          ...and all of this is done with an integrated GPU. That's the key point. That performance is achieved using just an embedded chip that draws a minimal amount of power.

          Now scale this thing up, move one generation ahead (to GCN 2.x), and the discrete cards we'll be getting next year from AMD are going to be quite interesting.

  • I listened to a Maximum PC podcast about this a couple of weeks ago. Is this a slow news day?

  • I've just ordered an R9 290X, before I saw this (oh no!). If this effect is reducing 290X performance, and the 290X is just a higher-clocked 290, could I expect similar performance from a 290 as from the retail 290Xs? If so, I'd want to return the 290X and recoup 25% of the cost... Advice greatly appreciated.

    • Looking at some pictures of the stock 290X heatsink, you would be well served by removing it, lapping it, and replacing the thermal grease with some decent stuff. Even more so if you decide to spring for an aftermarket heatsink, but you may have to do a little legwork to figure out which ones will work and which ones won't.

      • by fa2k ( 881632 )

        that's an option that I didn't consider, cheers! (probably won't check back on this thread any more and will keep the 290X)

"The voters have spoken, the bastards..." -- unknown

Working...