AMD Releases New Tonga GPU, Lowers 8-core CPU To $229
Vigile (99919) writes: AMD looks to continue addressing the mainstream PC enthusiast and gamer with a set of releases in two different component categories. First, today marks the launch of the Radeon R9 285 graphics card, a $250 option based on a brand new piece of silicon dubbed Tonga. This GPU has nearly identical performance to the R9 280 that came before it, but adds support for XDMA PCIe CrossFire and the TrueAudio DSP, and is FreeSync capable (AMD's response to NVIDIA G-Sync). On the CPU side, AMD has refreshed its FX product line with three new models (FX-8370, FX-8370e and FX-8320e) with lower TDPs and supposedly better efficiency. The problem, of course, is that while Intel is already sampling 14nm parts, these Vishera-based CPUs continue to be manufactured on GlobalFoundries' 32nm process. The result is smaller-than-expected performance boosts and efficiency gains. For a similar review of the new card, see Hot Hardware's page-by-page unpacking.
This urge I get sometimes (Score:5, Funny)
Sometimes I want to send headlines of this sort back to 1998 and see how the people of that era would react.
Re: (Score:2)
I PC game, and have zero reason to upgrade (Score:5, Informative)
I just can't justify upgrading everything for a measly 10% gain. As such, both Intel and AMD have to work harder on backwards compatibility. I might buy a new CPU when it goes on sale if I also don't have to upgrade the motherboard and RAM.
Re: (Score:2)
Re: (Score:2)
For gaming, right? I imagine things might speed up a little now that we have a new generation of consoles. On the other hand, as graphics get better and better, game dev costs skyrocket, so perhaps we really are seeing a ceiling.
(No matter how many times I encounter it, the flagrant mis-attribution in your sig still annoys me.)
Re: (Score:2)
Re: (Score:2)
How can I tolerate you?
(NOTE: This is a Tool reference, I'm not just being a random jerk).
Re: (Score:1)
Re:I PC game, and have zero reason to upgrade (Score:4, Insightful)
2 years old puts you on par with the latest generation of console hardware, which is what AAA developers target, and indie devs tend to focus more on whatever idea/style they're trying to show than pushing polygons.
In a year or two, when it becomes clear that there are certain kinds of things that can only be done on that year's hardware (maybe something physics related, or AI, or, as a pipe dream, ray tracing), then your 2-year-old rig might start to have some trouble.
Re: (Score:1)
Maybe we've arrived at a situation where the technology to do anything you could reasonably want is simply here, and gaming is going back to providing a unique experience and captivating story lines.
Re: (Score:1)
Physics can already be done on the GPU very well; the development we're waiting for is getting data back off the GPU and into main system memory fast enough for the CPU to be able to use it (i.e., this stuff being used for gameplay, not just eye candy). That won't happen until there's a rethink on how GPUs are connected to the mainboard.
This isn't the turn of the century with your newfangled AGP 4x graphics card. PCI Express is symmetric: you can pull data in from peripherals just as fast as you can push it out. If there is a bottleneck in pulling computed physics data from modern graphics cards, it's entirely the fault of the internal design of those cards.
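A back-of-the-envelope check of that symmetry claim: the PCIe spec quotes the same per-lane transfer rate in each direction, so the readback path has the same headline bandwidth as the upload path. This sketch uses the published PCIe 2.0/3.0 line rates and encoding overheads:

```python
# PCIe bandwidth per direction. The spec gives one per-lane rate that
# applies both upstream and downstream, so readback is not inherently
# slower than upload. Figures: PCIe 2.0 = 5 GT/s with 8b/10b encoding,
# PCIe 3.0 = 8 GT/s with 128b/130b encoding.

def pcie_bandwidth_gbps(gt_per_s, encoding_efficiency, lanes):
    """Usable bandwidth in GB/s, per direction."""
    return gt_per_s * encoding_efficiency * lanes / 8  # bits -> bytes

gen2_x16 = pcie_bandwidth_gbps(5.0, 8 / 10, 16)     # ~8 GB/s each way
gen3_x16 = pcie_bandwidth_gbps(8.0, 128 / 130, 16)  # ~15.75 GB/s each way

print(f"PCIe 2.0 x16: {gen2_x16:.2f} GB/s per direction")
print(f"PCIe 3.0 x16: {gen3_x16:.2f} GB/s per direction")
```

These are headline figures; real readback throughput also depends on transfer sizes and driver behavior, which is where the "internal design" complaint above comes in.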
Re: (Score:2)
Not to mention, the entire point of the AMD APUs (including the Jaguar-core ones the GP disparages) is that the GPU and CPU are the same damn chip, so they use the same damn memory. At this point, if it's slow then it's not even the fault of the hardware; it's the fault of the driver or API. If you're trying to get data from host to device using an APU and it's actually moving bits around, then you're doing it wrong [amd.com].
Re: (Score:2)
Re: (Score:2)
> In a year or two, when it becomes clear that there are certain kinds of things that can only be done on that years' hardware
Rather than argue speculatively like this, why not argue more concretely, with a case where what we have today is not possible with 3- or 4-year-old hardware? I can't think of anything off the top of my head. Even if there is some technique like that, how widespread is its use in today's content? And how much would a person miss by not having that itty-bitty feature?
PC gaming h
Re: (Score:2)
Because speculation doesn't require me to have a detailed and complex understanding of the particulars of CPU/GPU limitations. As a developer I've only ever run up against the "I'm rendering way too many polygons" problem.
Which is the kinda thing cleaned up through optimization.
Re: (Score:2)
From the-not-actually-true-also-we're-discussing-approximations-file: I'd be happy to care and update my opinions if you could objectively outline your case with numbers.
Re: (Score:2)
Many games on the Xbone, and a smaller but still non-zero number on the PS4, don't even run at 1080p@30Hz natively.
This may be rectified as the dev tools improve, but since the hardware is so close to PC-based I doubt we will see as large of an in-gen improvement as we did with older custom hardware consoles.
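The sub-1080p numbers come down to simple pixel-throughput arithmetic: fragment-shading cost scales roughly with pixels rendered per second, so a modest resolution drop buys a large cut in GPU load. A rough sketch (the resolutions below are common console render targets; the linear-scaling assumption ignores vertex and CPU costs):

```python
# Why consoles render below 1080p: dropping to 900p cuts the per-second
# pixel load by about 31%, and 720p cuts it by more than half.

def pixels_per_second(width, height, fps):
    return width * height * fps

p1080 = pixels_per_second(1920, 1080, 30)  # native 1080p @ 30 fps
p900 = pixels_per_second(1600, 900, 30)    # a common Xbox One target
p720 = pixels_per_second(1280, 720, 30)

print(f"1080p30: {p1080 / 1e6:.1f} Mpix/s")
print(f" 900p30: {p900 / 1e6:.1f} Mpix/s ({p900 / p1080:.0%} of 1080p)")
print(f" 720p30: {p720 / 1e6:.1f} Mpix/s ({p720 / p1080:.0%} of 1080p)")
```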
Re: (Score:2)
I'm not sure what "many games" have to do with hardware specs, and I'm already familiar with the fact you brought up. So... I think you'll have to clarify your case further, if indeed, you're trying to explain AC's case.
Re: (Score:2)
Re: (Score:1)
By "current gen console" you probably mean the PS4 or Xbox One, as they are already available?
Then said mid-level gaming PC might be roughly equivalent. Maybe a bit better, but not greatly superior. On the other hand, since the PS4 / Xbox One are fairly new, they might be the "standard" for the next five years or so.
But when the PS5 comes out, whenever that happens, all bets are off.
Re: (Score:2)
I PC game, and for the first time in decades have zero reasons to upgrade. My rig is now about 2 years old and runs every title at max settings. Unless I upgrade to a 4K monitor (and I see no reason to), my PC should last me another 3-4 years before I get bumped to medium settings.
Why not? Games can actually render 4K detail, unlike 4K TVs, whose real problem is that there's almost zero native content. I did manage to play a bit at full 2160p and it was beautiful, but it was also totally choking my GTX 670, so I'm currently waiting for a next-gen flagship model (GTX 880/390X, probably) for an SLI/CF setup. CPU/RAM don't seem to be holding it back much, though maybe they would at 4K, so I may upgrade those too.
Re: (Score:2)
Re: (Score:2)
Plus, in-browser game launcher. Enough said.
Re: (Score:2)
Re: (Score:2)
You can thank consoles becoming popular for that. Given how little money AAA titles make on PC (it generally covers the cost of the port), and yes, I mean money made, not copies actually in use (the only number that matters is "how m
Re: (Score:3)
Re: (Score:3)
Re: (Score:2)
What neither chip maker wants to admit is that from 1993 to 2006 what we had was a BUBBLE, no different than the real estate or dotbomb bubbles.
That's because it's total horseshit.
In 1993 we had, what, a 486 at 60MHz or something? In 2006 we were up to the Core 2 processors, which were several thousand times faster. It's not a bubble because it never burst. We still get to keep our Core 2 Duo processors and they're every bit as fast. And the newer processors have been faster or cheaper or lower power and frequ
Re: (Score:2)
We had a growth bubble. Most corporations depend on endless growth to be healthy. When they stop growing, they start dying. When the PC market maxed out, both AMD and Intel suddenly had no idea where they were going next.
When the new Intel processors come out on the new process and we get to see how low they can get power consumption, we'll see if Intel is going to continue to kick ass in the next iteration, which is going to have to be mobile.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
As such, both Intel and AMD have to work harder on backwards compatibility. I might buy a new CPU when it goes on sale if I also don't have to upgrade the motherboard and RAM.
Intel, OK, but AMD? AMD doesn't make breaking changes to their sockets unless they're needed to support newer memory. They released AM3 to support DDR3 in 2009, and AM3+ is backward compatible. (The FM sockets are for APUs only and therefore not relevant.)
In the same period, Intel has had 4 desktop sockets (twice as many as AMD), none of which is backward compatible, AFAICT.
Source: http://en.wikipedia.org/wiki/C... [wikipedia.org]
Time to cut prices (Score:2)
I suspect my next CPU will be ARM (MIPS). I am astonished to see a CPU costing as much as several 1080p tablets. I am a little tired of all the posters saying "my computer does everything"... I would love a faster machine, but at these prices they can whistle, and that is without the escalating cost of RAM... and Microsoft bleeding its monopoly to those tied into it.
Re: (Score:2)
and Microsoft bleeding its monopoly to those tied into it.
What?
Re: (Score:2)
Re: (Score:2)
To my knowledge, the best current ARM tablets have maybe 4GB of RAM. So if you want something that offers superior performance: an x86_64 dual-core processor (which matches or beats any current 8-core ARM chip; correct me if I'm wrong) plus a minimal motherboard, plus 4GB of RAM, plus a 32GB USB flash drive, plus a cheap case and power supply, not including monitor, keyboard, and mouse, will probably
Re: (Score:2)
$230 is... OK. Performance is roughly comparable to Intel chips at the same price point, but with significantly higher power draw. There may be potential for savings in platform costs; I've not looked into it.
FYI (Score:2)
ARM and MIPS are different things. The companies are different, instruction sets are different, cores are different, etc. They've nothing to do with each other.
Re: (Score:2)
> I suspect my next CPU will be ARM (MIPS). I am astonished to see a CPU costing as much as several 1080p tablets.
Yes. And you will have to "outsource" any interesting computational task, even something as simple as voice recognition. ARM-based devices are good enough only so long as your use fits narrowly defined parameters driven by whatever specialty silicon is on your particular SoC. Even that is limited.
ARM lags behind even ancient and discontinued x86 processors. PCs also have more interesting "spec
Re: (Score:2)
Sigh. (Score:3, Interesting)
This GPU has nearly identical performance to the R9 280 that came before it
Which had nearly identical performance to the 7950 that came before it. Which came out nearly three years ago.
Meanwhile, this says it all [pcper.com] about the CPU. Sure, the AMD might save you $100 over the (faster) Intel, but you'll pay that back in a beefier PSU, a beefier cooler, and the electricity bills to support the beast.
What happened, AMD? I loved you back in the Athlon64 era...
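The running-cost side of that claim can be put in rough numbers. The 100 W power-draw delta, 4 hours/day of gaming load, and $0.12/kWh rate below are illustrative assumptions for the sketch, not measured figures:

```python
# Back-of-the-envelope yearly electricity cost of a higher-TDP CPU.
# All inputs are assumptions chosen for illustration.

def yearly_cost_usd(extra_watts, hours_per_day, usd_per_kwh):
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

cost = yearly_cost_usd(extra_watts=100, hours_per_day=4, usd_per_kwh=0.12)
print(f"Extra electricity: ~${cost:.2f}/year")
```

At those assumptions the $100 price gap takes several years to erode through electricity alone, so the beefier PSU and cooler carry most of the cost argument.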
Re: (Score:1)
It has much lower power consumption than the R9 280, though. It would be more interesting as a laptop part.
Re: (Score:1)
Re: (Score:1)
What happened? They got a jackass MBA "business" CEO who decided it was a good idea to cut R&D and go to automated layout (instead of hand layout like Intel does). Real fuckwit. The sort of guy who thinks everything besides sales is a cost center to be axed.
Predictably, AMD's products suffered.
Tonga (Score:5, Funny)
King Tupou VI wants royalties
Re: (Score:1)
King Tupou VI wants royalties
Why? He is already royalty.
still a problem (Score:1)
Re: (Score:1)
What AMD needs to invent is hardware core multiplexing. In other words, have 8 cores but represent them to the system as 1 core and handle the distributed processing in the firmware. That would crush Intel.
It's been invented, it's called superscalar processing, and almost everything uses it. Good branch prediction is hard, as Intel learned with the Itanium.
What AMD needs to do is admit that "modules" and "cores" are the same thing, and that they have a quite decent quad core processor on their hands. This silly doubled-up ALU architecture is strikingly similar to what Intel did with the Pentium 4, and it's having the same results - inferior performance, with deep pipelines and extremely high clocks.
Here's the
Re: (Score:1)
It's a little more complex than that - Bulldozer would be fine if it weren't a two-wide design*. Haswell, by comparison, is four-wide, which is what makes it about 50% faster. The module architecture would be fine, if a little irregular - faster in some workloads and slower in others as a result of the shared resources - if not for that deficiency. It made sense back when Bulldozer was originally designed, and was apparently roughly equivalent to the Steamroller iteration, but that version was cancelled
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
Would you be interested in a spell-checker?
gaming rig (Score:2)
OK, since Slashdot is running these weekly Tom's Hardware-type posts, lemme axe you something:
I've got a system I put together maybe three years ago. I used a good motherboard, bought a good case, good RAM, and a very good PSU. It was when the first i5s were coming out, so it's an i5-750 (2.7GHz, I think). I didn't spend a lot of dough on a GPU, but I've been able to play everything up to and including Watch Dogs on this setup.
I want to be ready for the fall games (The Crew, GTA V, Dragon Age Whatever, Witcher
Re: (Score:3)
Why not actually try the games that you want and then decide if things are too slow at all, rather than listening to people who will evangelize how cool the new stuff is with impunity, since it is not their money they are justifying spending...
Also, my wife thinks a grown man playing computer games is a little bit pathetic, and I can't really argue with her,
What could be pathetic is neglecting responsibilities or pissing away family savings on superfluous stuff. If one takes care of one's responsibilities appropriately and is prudent in spending, it doesn't really matter if a grown man plays computer games or watch
Re: (Score:2)
You're not married, are you?
Thanks for the advice, though. Right now, "Can I Run It" shows that most of the games that have published requirements will run on my machine. I'll save the dough and wait and see. It's not like I can't get a new video card in a d
Re: (Score:2)
Indeed, even after installing the software, upgrading is easy, excepting some DRM crap that could fire if you change too much; but that is BS.
I am actually married and a father too. I can't disappear into a 'mancave' every day for hours on end or spend all our money on high end gaming equipment, but I don't catch flak for spending my time gaming for a short while many days and the occasional 'bender' of gaming. If I covered the house in gaming paraphernalia or something maybe, but as long as I don't go o
Re: (Score:2)
Re: (Score:2)
I know. Some expect you to get a job.
I'm pretty lucky all in all. I was able to retire on my 50th birthday and except for the occasional request to not throw another controller through the window because I'm frustrated with Dark Souls' horrible PC port, she doesn't mind me gaming. Occasionally, when company comes to the house, she'll ask me to put some pants on, though. I like to g
Re: (Score:2)
Occasionally, when company comes to the house, she'll ask me to put some pants on, though. I like to game au natural. She made me a nice little pad to sit on
Yes., way too much information. But funny as hell.
Re: (Score:2)
I'm married, and I have no problems with my wife's opinion about pretty much any decision I make. If you can't do thing
Re: (Score:2)
I'd say just a new GPU would be fine. I use an Asus GTX 770 and can play everything I've tried on max settings @1440p, so you should be fine @1080p.
The 770 doesn't take advantage of the higher power efficiency of the newest Nvidia generation, but the price on some of the variants is quite good. Newegg has a Zotac version for $275: http://www.newegg.com/Product/... [newegg.com]
The 280X can be picked up for a little less, but it uses more power and is louder, from what I have read.
Re: (Score:2)
That's good advice. Do you happen to know if new cards like the 770 are backwards compatible with motherboards that don't have the latest PCI-e 3.0? My motherboard has PCI-e 2.0.
Oh, I guess I can go look it up. Thanks for the good advice.
Re: (Score:2)
The 6850, while not a bad card now, may struggle to play "next generation" games. That being said, AMD cards like the 270x are dead cheap on eBay, or get a card in the $250-$300 range and that should work for a while.
Curious what your wife does for fun?
Re: (Score:2)
She makes fun of grown-ass men who put on helmets with horns on them and play computer games in their underwear.
Personally, I think I look pretty cool in the helmet with the horns, and playing in just my underwear makes me feel more like a level 50 battlemage.
Re: (Score:2)
Not really 8 cores... (Score:3)
If IBM had made the processor, they would have called it 4 cores with SMT2. Basically you have 4 modules, each with two copies of many of the components but a lot of shared components. Notably, each of the 4 modules has a single FPU (so it's more like IBM's SMT8 versus SMT4 mode, if you talk about their current stuff).
So it's more substantial than hyperthreading, but at the same time not reasonable to call each chunk a 'core'. I think it behaves better than Bulldozer did at launch *if* you have the right platform updates to make the underlying OS schedule workload correctly, but it's still not going to work well (and some workloads work better if you mask one 'core' per module entirely).
Basically, it's actually pretty analogous to NetBurst. NetBurst came along to deliver higher clock speeds, since that was the focus of marketing, with some hope of significant workloads behaving a certain way to smooth over the compromises NetBurst made to get there. The workloads didn't evolve that way, and NetBurst was a power-hungry beast that gave AMD a huge opportunity. Now replace high clock speed with high core count and you basically have Bulldozer/Piledriver in a nutshell. I'm hoping AMD comes back with an architecture that challenges Intel again, just like Intel came back from NetBurst.
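The "mask one 'core' per module" trick mentioned above can be sketched as follows. It assumes module-mates are enumerated as adjacent logical CPU pairs (0,1), (2,3), ..., which is the usual Linux layout for these chips (verify with /proc/cpuinfo or lstopo on a real system before relying on it):

```python
# Build an affinity mask that keeps one logical CPU per Bulldozer module,
# so no two threads share a module's front end and FPU.
# Assumption: module-mates are adjacent logical CPU pairs (0,1), (2,3), ...

def one_per_module(logical_cpus):
    """Keep the even member of each adjacent (even, odd) module pair."""
    return {cpu for cpu in logical_cpus if cpu % 2 == 0}

all_cpus = set(range(8))         # an FX-8350 exposes 8 logical CPUs
mask = one_per_module(all_cpus)
print(sorted(mask))

# On Linux you could then pin the current process with:
#   import os; os.sched_setaffinity(0, mask)
# (left commented out, since the machine running this may have fewer CPUs)
```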
Re: Not really 8 cores... (Score:2)
Somewhat analogous to P4 but not quite in that Bulldozer IPC is about at Phenom II levels. See here [anandtech.com]: fully loaded, IPC is equivalent to Sandy Bridge/Ivy Bridge but single-threaded it's about at Phenom II IPC.
AMD's original goal was to get Bulldozer to have similar IPC as Phenom II. Basically, Piledriver is what Bulldozer should have been.
Re: (Score:2)
I'm not saying the IPC is NetBurst-like, but that the overall performance characteristic is low performance relative to what the competition *would* be at "8 cores". Just like a NetBurst at 3.0GHz would have been trounced by the contemporary AMD at 3.0GHz (and even much lower), an "8 core" is bested by something with a much lower core count. For example, in the URL you cite, they effectively consider the FX-8350 a quad-core rather than an 8-core solution for the performance to be comparable. This is with a
Re: (Score:2)
Re: (Score:2)
Hence why I compared it very carefully to IBM's SMT rather than Hyper-Threading. IBM's SMT has components to handle each "thread" while sharing common components (including the FPU in SMT8 mode, though it isn't shared in SMT4). It isn't 8 threads in the Hyper-Threading sense, but neither is it 8 "cores" with respect to how any other CPU vendor defines cores. IBM is the only other microprocessor vendor with something that resembles the AMD design, and they do not refer to the components as "cores".
I haven't seen any
Re: (Score:2)
Re: (Score:2)
Your results called out the Piledriver explicitly as 'per module' rather than 'per core' to make the numbers match. It basically validates the point you respond to. In practice, it's more complicated and can outshine hyperthreading in some cases, but in your specific citation the processor measures up if you pretend module==core rather than saying it is an 8 core system.
Re: (Score:1)
Basically, it's actually pretty analogous to NetBurst. NetBurst came along to deliver higher clock speeds, since that was the focus of marketing, with some hope of significant workloads behaving a certain way to smooth over the compromises NetBurst made to get there. The workloads didn't evolve that way, and NetBurst was a power-hungry beast that gave AMD a huge opportunity. Now replace high clock speed with high core count and you basically have Bulldozer/Piledriver in a nutshell. I'm hoping AMD comes back with an architecture that challenges Intel again, just like Intel came back from NetBurst.
I don't think it is as easy as that. If there were any major architectural changes that could wring more large performance gains from x86, Intel/AMD would already have implemented them. But all the low-hanging fruit has already been picked. The days of huge performance increases due to architectural changes are long gone. That's why we went multicore, but beyond 8 cores, the gains diminish rapidly. These days we focus on power efficiency, but that only gets you so far.
I think the next big gains won't come
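The diminishing returns beyond 8 cores follow directly from Amdahl's law. A quick sketch, using an assumed 90% parallel fraction (chosen purely for illustration):

```python
# Amdahl's law: speedup on n cores for a workload whose parallel
# fraction is p. Shows why core counts past 8 buy little for typical
# desktop workloads.

def amdahl_speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.90  # assumed parallel fraction, for illustration only
for n in (2, 4, 8, 16, 64):
    print(f"{n:3d} cores: {amdahl_speedup(p, n):.2f}x")
```

Even with infinitely many cores, the speedup here caps at 1/(1-p) = 10x, and going from 8 to 16 cores buys only about 36% more.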
Come on, shake your PC, baby (Score:1)