AMD Reveals Radeon R9 Fury X Specs and Preliminary Benchmark Performance Results
MojoKid writes: AMD announced its new Radeon R9 and R7 300 series graphics cards earlier this week, and while they are interesting, they're not nearly as impressive as AMD's upcoming flagship GPU, code-named Fiji. Fiji will find its way into three products this summer: the Radeon R9 Nano, the Radeon R9 Fury, and the range-topping (and water-cooled) Radeon R9 Fury X. Other upcoming variants, like AMD's dual-Fiji board, were teased at E3 but are still under wraps. While full reviews are still under embargo, the official specifications of the Radeon R9 Fury X have been revealed, along with an array of benchmark scores comparing the GPU to NVIDIA's GeForce GTX 980 Ti. Should the numbers AMD has released jibe with independent testing, the Radeon R9 Fury X looks strong, and possibly faster than NVIDIA's GeForce GTX 980 Ti.
Re: (Score:1)
So you are saying MojoKid is just another shill who pays Dice to get articles on the Slashdot front page?
I'm not saying anything, but some variation on that theme would neatly explain why my submission on this subject hasn't been accepted.
Re: (Score:3, Insightful)
They're not even embarrassed. They know their hardware is junk, and they're relying on the emperor's new clothes model to keep people excited about it.
Considering that AMD has long experience being the target of anti-AMD shilling, I can understand them not wanting their hardware reviewed by those sites. After all, Tom's Hardware did the same thing, going so far as to blame AMD for the poor performance. That was until someone on their side let it slip that TH knew the poor hardware performance was because Intel had been allowed to put vendor-specific optimizations into the benchmark code. Thus the benchmarks' real-world performance claims were flawed.
Re:AMD Refuses Review Hardware over Negative Revie (Score:4, Insightful)
That was until someone on their side let it slip that TH knew the poor hardware performance was because Intel had been allowed to put vendor-specific optimizations into the benchmark code. Thus the benchmarks' real-world performance claims were flawed.
The problem with this argument is that all the benchmarks are flawed for this reason. I never trust a synthetic benchmark to tell me how hardware is going to behave in the real world.
Re: (Score:2, Informative)
It goes beyond the synthetics, though. I've seen benchmarks that choose an odd set of game video quality settings, such as cranking everything to the max but disabling MSAA (the excuse being that the "mid-range cards get a chance to shine"), then showing that the game gets 49 FPS on the AMD 390X but 58 FPS or so on the Nvidia GTX 980, and claiming the "390X can't compete". Of course, the clueless readers go "wtf, amd fail!!11!", ignoring the fact that the 980 is something like $150 CAD more expensive.
Re: (Score:2)
They're the reason you only see maybe one AMD notebook in any store, surrounded by 20+ Intel powered notebooks.
No, AMD's shitty fucking netbook chips are the reason for that. I bought one AMD netbook, and it was a festering pile of shit because of shitty chips which pretend to be old video cards but don't emulate them faithfully, and a total shitfest when it comes to drivers. And of course, AMD won't let you download these drivers directly; there were a few months when they permitted it, but then they revamped their site into the new form where you can't find anything, and at that time they removed the mobile drivers.
Re: (Score:2)
Re: (Score:2)
Funny how any APU will kick the ass of an i3...
I wouldn't buy an i3, either. And I don't even know intel's naming scheme well enough for a really good rant about how shit it is. Believe me, I'm not in love with intel. I am quite fond of nVidia, but I've had the occasional minor problem there, too. And again, the problem with that G71 die bonding... They're not perfect. However, nVidia wasn't the one I had to deal with on that, HP was. They strung me along for over 24 hours of phone time altogether before I got a replacement machine. Too little too late.
Re: (Score:2)
Re: (Score:2)
Meandering off to ramblemutterstan is a great counter-argument.
I responded to your comment, then I rambled. If you missed my response, pay closer attention next time. I'm not responsible for your reading comprehension skill level. My counter-argument in more detail, for the clue-impaired, is that "intel makes slow chips too" is not an argument. It's an uninteresting fact. It's also an irrelevant fact, because I wouldn't buy an AMD IGP, because of the IG part. I have an AMD CPU in my PC right now. In the box with it is an nVidia GPU. And way back in the way back, I had
a source (Score:2)
That was until someone on their side let it slip that TH knew the poor hardware performance was because Intel had been allowed to put vendor-specific optimizations into the benchmark code. Thus the benchmarks' real-world performance claims were flawed.
The problem with this argument is that all the benchmarks are flawed for this reason. I never trust a synthetic benchmark to tell me how hardware is going to behave in the real world.
For anyone who was looking for more to read on this matter, Ars Technica looked into this with PCMark2005 [arstechnica.com]. I don't care that it was a 10-year-old benchmark; 10 years is not a long time, IMO. I suppose AMD could be blamed a little, though, for not supporting a writable CPUID vendor string on their chips the way VIA did...
Nvidia does something sorta-similar with their GameWorks/HairWorks features in Witcher 3, or by pushing ridiculously heavy anti-aliasing modes because their architecture runs them better than AMD's.
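To make the Ars finding concrete: the trick amounts to vendor-string dispatch, where a benchmark picks its code path from the CPUID vendor ID rather than from what the CPU can actually do. A toy Python sketch of that behavior (not PCMark's actual code; the vendor strings are real CPUID values, everything else here is made up for illustration):

    # Toy model of CPUID-vendor dispatch, the behavior described in the
    # Ars Technica PCMark2005 story. Illustrative only, not real benchmark code.
    def pick_code_path(cpu_vendor):
        """Choose an implementation by vendor ID instead of by capability."""
        if cpu_vendor == "GenuineIntel":    # Intel's real CPUID vendor string
            return "hand-tuned SIMD path"
        return "generic fallback path"      # everyone else, however capable

    # VIA's writable vendor string is what exposed the trick: same silicon,
    # different score, depending only on the reported vendor ID.
    for vendor in ("GenuineIntel", "AuthenticAMD", "CentaurHauls"):
        print(vendor, "->", pick_code_path(vendor))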
Re: (Score:1)
Well, this is my anti-anti-AMD shill post. Of the two Nvidia cards I have owned, both incinerated themselves, while ALL my AMD/ATI cards were in perfect health when I upgraded. If I had the choice between an AMD/ATI GPU and an Nvidia one, I would always choose AMD/ATI. I would rather have a GPU that will still function after a month and not self-destruct as my previous Nvidia cards did.
Besides, I never read either TH or KG reviews anyway (or at least I take their assessments with a grain of salt). The same thing happens w
Re: (Score:2)
I had many cards. More ATI cards than Nvidia ones. Only one broke, and it was an Nvidia card.
I've had one Nvidia chip fail, and I've had one ATI chip fail, and I mean actual chip failure in each case. Where ATI fails over and over again is in their drivers. Their drivers have always been bad, literally always. I usually have to find hacked drivers to get them to even kind of work. I have a machine with the AMD R690M chipset which has never been correctly supported on any platform but Windows Vista; I am not making this up even a little bit. Vista is the only place it behaves. On any other OS, it's horrible.
Re: AMD Refuses Review Hardware over Negative Revi (Score:2)
Re: (Score:3)
Re: (Score:2)
And yet again you bring up the drivers, despite the fact that multiple people on multiple occasions have given you the correct drivers, which work perfectly well on anything from Windows 7 through Windows 10.
No, no they don't. Power management is unreliable, and the system becomes very flaky. If I load the DnA hacked drivers, then power management is the only problem; I don't get the free reboots. I told you this before.
Re: (Score:2)
Re: (Score:2)
Screencaps? In fact ANY evidence?
Fuck you, shitbag. I'm not lying, but that's what you're accusing me of right now. Well, you're a well-known pro-AMD shill.
So don't blame the company when it's a case of "ur doing it wrong" or, as the FOSSie faction is happy to parrot... Works For Me!(TM)
So you're going to use an argument that you think is shit? That's because you're shit. I'm tired of your bullshit. You really think I'm lying about this? What's the angle? I'm clearly not getting paid by intel, who I bad-mouth regularly. I'm obviously not sponsored by nVidia, or I'd have more than a 750Ti. So why would you think I'm lying? Just because it's what you would do in my position?
Re: (Score:2)
The old ATI All-in-Wonder 8500 that I put in my daughter's computer finally died last year.
Re: AMD Refuses Review Hardware over Negative Revi (Score:2)
Re:AMD Refuses Review Hardware over Negative Revie (Score:4, Informative)
Former ATI (and then AMD) engineer here... Now work at NVIDIA. My take is that, generally speaking, the quality of the chips from either company is pretty much on par. I'm not talking performance; that's a separate issue. I'm talking the quality of work that went into design, implementation, and manufacturing. Neither company's chips/boards are going to be any more reliable than the other's, on the whole. Similar MTBF and whatnot, and as these are consumer parts, there will necessarily be folks who unfortunately get a bad part or two. It's just probability.
Re: (Score:2)
Former ATI (and then AMD) engineer here... Now work at NVIDIA. My take is that, generally speaking, the quality of the chips from either company is pretty much on par. I'm not talking performance; that's a separate issue. I'm talking the quality of work that went into design, implementation, and manufacturing. Neither company's chips/boards are going to be any more reliable than the other's, on the whole. Similar MTBF and whatnot, and as these are consumer parts, there will necessarily be folks who unfortunately get a bad part or two. It's just probability.
AMD's drivers have historically been a little rougher around the edges for special features (alt-tabbing to multitask with something on the right HDMI monitor of a dual-monitor setup while gaming on the left DVI one -- this was a problem 3-4 years ago, but not now), but lately (the year that I've had my 7850) I've been impressed with their driver stability, and it reminds me of how well my 670 worked 4 years ago.
Nvidia still has the edge on CPU efficiency with their drivers currently, but that'll be changing.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
This is my first and last post to slashdot. Goodbye.
You didn't even tell us why you were leaving; although leaving your comment here under my comment is suggestive, it doesn't explain anything. What a waste of time.
Re: (Score:2)
That's actually your third post, you dirty liar. :)
Re:AMD Refuses Review Hardware over Negative Revie (Score:5, Informative)
here's why it was pulled:
https://www.youtube.com/watch?... [youtube.com]
The video was released before any official info about Fiji was out, so it was full of speculation, had inaccuracies on topics that were already publicly known, and called AMD a bunch of con artists for using previous-gen cards in the non-top-end, non-Fiji part of their lineup.
Re: (Score:1)
The video was released before any official info about Fiji was out, so it was full of speculation,
Yeah, he says that a lot. I'm probably not going to watch the whole video, but he's said it a bunch of times already: that it's speculation. And as AMD didn't permit the press to see the product, even though they were showing it off to some people, he had only speculation to offer.
had inaccuracies on topics that were already publicly known,
That's a pity. What were they?
and called AMD a bunch of con artists for using previous-gen cards in the non-top-end, non-Fiji part of their lineup.
Well, I'd say that about both AMD and nVidia. They both do that, and it's sleazy in both cases.
I gave the video thumbs down, because comments are disabled, and I always do that. But I don't see what t
hahahaha flamebait (Score:2)
Butt-hurt AMD fanboys with modpoints, or AMD employees with modpoints? You decide.
No HDMI 2.0 support, not even in Fiji (Score:3, Interesting)
A word of warning for everybody: neither the new R9 3xx series GPUs nor the Fiji (Fury) parts support HDMI 2.0. That means you will not be able to output a 4K60 picture to your brand-new 4K TV. That's fairly silly for a graphics card that is advertised as making 4K gaming a reality.
Re: (Score:3)
Re: (Score:2)
DisplayPort to HDMI 2.0 cables are all of 11 bucks on Amazon. I'd think OEMs will be capable of throwing whatever chip is in those onto an output.
Every decent TV I've had has been able to output its sound via optical/coax to the receiver; hell, HDMI has a standard for doing just that.
Re: (Score:3)
DisplayPort to DVI/HDMI can be done with a passive adapter. Basically, DP can output a DVI/HDMI signal once it detects that kind of monitor. I'm not sure if this applies to newer versions of HDMI, though.
The whole separation between the AV and PC worlds seems silly anyway. For example, the first time I connected my AMD GPU to my "computer" monitor via HDMI, the computer was detecting/sending the correct resolution, but the output was shrunk, leaving black bands on all sides and the image ugly and blurred. It t
Re: (Score:1)
Yes, there are adapters, but the video signal will still only negotiate 4K30 if both ends do not speak HDMI 2.0. The adapters are *passive*.
Re: (Score:2)
In theory an active Displayport to HDMI adapter is still possible. My guess is such a device will be made at some point.
Re: (Score:2)
Are you sure? Because you only need 10.2Gbps for 4K60 with 4:2:0 chroma subsampling, which is the same bandwidth as HDMI 1.4. It's how current HDMI 2.0 devices support 4K60 right now.
The video card might not have a proper 18Gbps HDMI port, which will
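For anyone checking the arithmetic behind those figures, here is a back-of-the-envelope Python sketch, assuming the standard CEA-861 4K60 timing (4400x2250 total pixels including blanking, i.e. a 594 MHz pixel clock); the helper name is just for illustration:

    # Rough HDMI link-rate math for 4K60. TMDS encodes 8 data bits into
    # 10 line bits, so the wire rate is 10/8 of the raw pixel data rate.
    pixel_clock = 4400 * 2250 * 60          # ~594 MHz, blanking included

    def link_gbps(bits_per_pixel):
        return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

    print(link_gbps(24))   # RGB 4:4:4, 8bpc: ~17.8 Gbps, needs HDMI 2.0's 18 Gbps
    print(link_gbps(12))   # 4:2:0, 8bpc: ~8.9 Gbps, fits HDMI 1.4's 10.2 Gbps

So 4K60 does fit in 10.2Gbps, but only in the subsampled 4:2:0 mode; full 4:4:4 desktop output needs the 18Gbps a real HDMI 2.0 port provides.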
What's with the stupid Slashdot icons over the title (Score:3, Informative)
I can't even read the end of the friggin' titles. Come on, Slashdot, get those STUPID icons out of the way of the article titles.
Re: (Score:2)
Slashdot, on top of the gripe that I am replying to, please remove that chat bubble from the story title line and restore the "Read More" link to its proper place. I am not saying you can't have a prominent social networking link; I am saying put more thought into where it should go. Seriously, please backtrack on this one.
Re: (Score:2)
Same problem here, can't read titles. The font setting used by Slashdot means I always need to +1 the zooming, which means icons overlapping the titles. A rollback on both the icons and the "read more (x comments)" would also be appreciated.
Re: (Score:2)
I can't even read the end of the friggin' titles. Come on, Slashdot, get those STUPID icons out of the way of the article titles.
You just need to stretch out your browser window so that it is wide enough to accommodate the headline layout as envisaged by the mighty /. design team.
Or in other words, the idiots at /. designed a layout that looks good on their monitors and never tested it beyond that. Why should they, when it works on their systems!
Re: (Score:2)
You just need to stretch out your browser window so that it is wide enough to accommodate the headline layout as envisaged by the mighty /. design team.
Or in other words, the idiots at /. designed a layout that looks good on their monitors and never tested it beyond that. Why should they, when it works on their systems!
The /. designers must now be wondering why newspaper text is laid out in narrow columns. After all, a newspaper has a lot of horizontal space, so why don't they just print long lines of text...
Re: (Score:2)
Still using a non-widescreen monitor here, browser is always full-width, can't make it any wider.
Re: (Score:3)
Give up 4:3 TV? Sure.
Give up 4:3/5:4 monitor? There's no reason if it still works.
Re: (Score:3)
Give up 4:3/5:4 monitor? There's no reason if it still works.
IMHO, 4:3 or 5:4 (which I'm using right now) is much better for most of the computing tasks I do -- see my cousin post on text line widths. I'd gladly get a new monitor with such dimensions (and a modern resolution, naturally).
For the inevitable comments about turning a widescreen monitor to a vertical orientation, please go and educate yourselves on subpixel font rendering. Of course, as videos don't use that, an ideal monitor might have its subpixels oriented along the long direction, so it could serve both purposes.
Re: (Score:2)
Re: (Score:1)
They had to push out the Omega Driver Pack for the release of the R9 290 and R9 290X, so sites would stop using pre-beta drivers for benchmarking the cards.
So in all honesty? It's more or less the same as Nvidia, with different problems.
Paid Nvidia Reputation Managers (Score:5, Interesting)
Every year Nvidia spends TENS OF MILLIONS of dollars on what the PR industry calls REPUTATION MANAGERS. These are people who blitz the tech sites and their forums with anti-AMD FUD and pro-Nvidia boosting.
That person on a forum who SWEARS he owned AMD card after AMD card, and they all went wrong, forcing him to go Nvidia? A paid Nvidia shill. That person who obsesses over any omission in a new AMD product (see the HDMI FUD in this forum, for instance) and swears it is an unforgivable and fatal flaw? A paid Nvidia shill.
Nvidia once made the hot and power-hungry high-end graphics cards, and Nvidia paid off every tech site to say that REAL PC gamers had massive cases filled with fans and used the biggest PSU imaginable. Then Nvidia moved to higher performance per watt than AMD, and suddenly every tech site was saying that hot and power-hungry graphics cards were the worst thing ever- and every REAL PC gamer wanted the smallest case with the tiniest PSU.
For years, only AMD implemented digital sound over the cables used to drive home cinema devices, and Nvidia paid off tech sites to say that REAL home cinema enthusiasts used separate amps driven from an independent audio output on the back of the PC.
Using two graphics cards together is an UTTER JOKE, apparently, every time AMD's solution is better than Nvidia's, but the best thing since sliced bread every time Nvidia's solution is better than AMD's.
Nvidia plays dirty- very, very, very dirty. And sometimes Nvidia has the better products as well (if you don't take price into account). For whatever reason ATI/AMD has never played this dirty propaganda game, and without doubt it has suffered horribly because of this. At the notorious Anandtech, for instance, the forums have 'moderators' who are actually also employees of Nvidia and Intel. The trolling of AMD on the graphics and CPU forums there is unprecedented as a result.
Fiji is a BETTER GPU than the Titan X (so-called Big Maxwell), and Nvidia knows this. But Fiji is a terribly expensive product for AMD, while the 980 Ti is a much cheaper build for Nvidia. On each Fiji card AMD takes a significant loss, while on each slower but similarly priced 980 Ti card Nvidia makes an Apple-like profit.
Both have chips of around the same size, but AMD has higher transistor density and lower yields. AMD's tech is much more advanced, which is why the anti-AMD FUD is turned up to 11 at this time.
Next year everything changes. Years late, Nvidia launches its first true DX12 architecture (AMD has been fully DX12 compatible since GCN first appeared with the 7970). Nvidia's PASCAL architecture is expected to crucify AMD (and sadly there is no reason to expect otherwise).
here's a good example you can expect (Score:2)
A good example of what you can expect: claims that 4GB isn't enough memory for future-proofing, when 3 months ago everyone was saying that having only 3.5GB usable on the 970 [digitaltrends.com] wasn't a problem.
Re: (Score:2)
Where do I apply for this job? I could get on sites and talk about my history with formerly-ATI, now-AMD products and why I only buy nVidia now, and have a clear conscience while doing it.
I'll review it in a positive way... (Score:1)