Graphics AMD Upgrades

AMD Releases New Tonga GPU, Lowers 8-core CPU To $229 98

Vigile (99919) writes: AMD looks to continue addressing the mainstream PC enthusiast and gamer with releases in two different component categories. First, today marks the launch of the Radeon R9 285 graphics card, a $250 option based on a brand new piece of silicon dubbed Tonga. This GPU has nearly identical performance to the R9 280 that came before it, but includes support for XDMA PCIe CrossFire and TrueAudio DSP technology, and is FreeSync capable (AMD's response to NVIDIA G-Sync). On the CPU side, AMD has refreshed its FX product line with three new models (FX-8370, FX-8370e and FX-8320e) with lower TDPs and supposedly better efficiency. The problem, of course, is that while Intel is already sampling 14nm parts, these Vishera-based CPUs continue to be manufactured on GlobalFoundries' 32nm process. The result is less-than-expected performance boosts and efficiency gains. For a similar review of the new card, see Hot Hardware's page-by-page unpacking.
This discussion has been archived. No new comments can be posted.

  • by i kan reed ( 749298 ) on Tuesday September 02, 2014 @08:22AM (#47806135) Homepage Journal

    Sometimes I want to send headlines of this sort back to 1998 and see how the people of that era would react.

    • by sinij ( 911942 )
      Maybe FrozenPiss (or whatever that troll's name is) is a time traveler from the future? That would explain some things.
  • by sinij ( 911942 ) on Tuesday September 02, 2014 @08:26AM (#47806159)
    I PC game, and for the first time in decades have zero reasons to upgrade. My rig is now about 2 years old and runs every title at max settings. Unless I upgrade to a 4K monitor (and I see no reason to), my PC should last me another 3-4 years before I get bumped to medium settings.

    I just can't justify upgrading everything for a measly 10% gain. As such, both Intel and AMD have to work harder on backwards compatibility. I might buy a new CPU when it goes on sale if I don't also have to upgrade the motherboard and RAM.
    • Couldn't agree more - gone are the days when you needed to upgrade every six months; now I only feel the need to upgrade every ~3 years. RAM is so cheap I stuck another 16GB in my machine just for the hell of it. I don't bother writing anything to DVD either, I just buy another external HDD - in fact I am using the DVD drive's SATA port for my main SSD, and the DVD drive is not even plugged in anymore.
      • For gaming, right? I imagine things might speed up a little now we have a new generation of consoles. On the other hand, as graphics get better and better, game dev costs skyrocket, so perhaps we really are seeing a ceiling.

        (No matter how many times I encounter it, the flagrant mis-attribution in your sig still annoys me.)

        • Back in the day, my parents had a board game called "Lie, Cheat & Steal" [amazon.com]. A pretty fun game!
          • by Kargan ( 250092 )

            How can I tolerate you?

            (NOTE: This is a Tool reference, I'm not just being a random jerk).

        • Fixed that for you - although I was not attributing it to anyone, I was just saying who I stole it from. Considering it's generally attributed to Mark Twain, who himself attributed it to Benjamin Disraeli, no one really knows - does it really matter?
    • by i kan reed ( 749298 ) on Tuesday September 02, 2014 @08:41AM (#47806289) Homepage Journal

      2 years old puts you on par with the latest generation of console hardware, which is what AAA developers target, and indie devs tend to focus more on whatever idea/style they're trying to show than pushing polygons.

      In a year or two, when it becomes clear that there are certain kinds of things that can only be done on that year's hardware (maybe something physics-related, or AI, or, as a pipe dream, ray tracing), then your 2-year-old rig might start to have some trouble.

      • by HetMes ( 1074585 )
        Why in a year or two, and not a decade? What you suggest sounds like the technology push of TV manufacturers with their 3D, curved screens, 4k resolution TVs that nobody seems to be waiting for.
        Maybe we've arrived at a situation where the technology to do anything you could reasonably want is simply here, and gaming is going back to providing a unique experience and captivating story lines.
      • by jma05 ( 897351 )

        > In a year or two, when it becomes clear that there are certain kinds of things that can only be done on that year's hardware

        Rather than argue speculatively like this, why not argue more concretely with a case where what we have today is not possible with 3- or 4-year-old hardware? I can't think of anything off the top of my head. Even if there is some technique like that, how widespread is its use in today's content? And how much would a person miss by not having that itty bitty feature?

        PC gaming h

        • Because speculation doesn't require me to have a detailed and complex understanding of the particulars of CPU/GPU limitations. As a developer I've only ever run up against the "I'm rendering way too many polygons" problem.

          Which is the kind of thing that gets cleaned up through optimization.

      • No it won't. My 570 GTX is still going strong; my 770 will outlive this console generation in raw usability. The only reason I don't even run the 570 anymore is that it sucks power like crazy, but it's still a totally viable part.
    • by Kjella ( 173770 )

      I PC game, and for the first time in decades have zero reasons to upgrade. My rig is now about 2 years old and runs every title at max settings. Unless I upgrade to a 4K monitor (and I see no reason to), my PC should last me another 3-4 years before I get bumped to medium settings.

      Why not? Games can actually render 4K detail, unlike 4K TVs, where the real problem is that there's almost zero native content. I did manage to play a bit at full 2160p and it was beautiful, but it also totally choked my GTX 670, so I'm currently waiting for a next-gen flagship model (GTX 880/390X, probably) for an SLI/CF setup. CPU/RAM don't seem to be holding it back much, though maybe they will at 4K, so I might upgrade those too.

      • by jandrese ( 485 )
        But even at 4K resolution you're still using textures designed for a 720p display, because it's a port of a game that was optimized to fit on a single disc and run at 720p.
    • by tlhIngan ( 30335 )

      I PC game, and for the first time in decades have zero reasons to upgrade. My rig is now about 2 years old and runs every title at max settings. Unless I upgrade to a 4K monitor (and I see no reason to), my PC should last me another 3-4 years before I get bumped to medium settings.

      You can thank consoles becoming popular for that. Given how little money AAA titles make on PC (it generally covers the cost of the port), and yes, I mean money made, not copies actually in use (the only number that matters is "how m

    • Comment removed based on user account deletion
      • by PRMan ( 959735 )
        Just this weekend I realized that my motherboard's copyright date was 2009 (AMD Phenom X2 unlocked to 4 cores). It's 5 years old already (hard to believe) and I can play all the latest games on the top settings for the price of a $150 graphics card. I'm sure the 16GB RAM and the SSD help, but there is seemingly no reason to upgrade other than 4K, if you want to do that.
      • What neither chip maker wants to admit is that from 1993 to 2006 what we had was a BUBBLE, no different than the real estate or dotbomb bubbles.

        That's because it's total horseshit.

        In 1993 we had what, a 486 at 60MHz or something? In 2006 we were up to the Core 2 processors, which were several thousand times faster. It's not a bubble because it never burst. We still get to keep our Core 2 Duo processors and they're every bit as fast. And the newer processors have been faster or cheaper or lower power and frequ

        • We had a growth bubble. Most corporations depend on endless growth to be healthy. When they stop growing, they start dying. When the PC market maxed out, both AMD and Intel suddenly had no idea where they were going next.

          When the new Intel processors come out on the new process and we get to see how low they can get power consumption, we'll see if Intel is going to continue to kick ass in the next iteration, which is going to have to be mobile.

      • ...All I did was slap in a $100 HD7750 to replace my aging HD4850 (which frankly still played the newer games just fine, it was just a heat monster) and everything plays great, with more bling than I can pay attention to in the heat of battle.

        ...I ran a log for a couple weeks on his home and office systems just to see how hard they were being slammed...the result? That Phenom I quad was maxing out at 35% and the Pentium Dual at work was maxing out at just 45%!

        So there really isn't any reason to upgrade any
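For anyone who wants to collect the same kind of utilization log, here is a minimal sketch in Python. It assumes the third-party psutil package is installed; the one-minute sampling interval and the cpu_usage.csv output path are arbitrary illustrative choices, not anything taken from the comment above.

```python
# Minimal CPU-utilization logger: samples overall CPU usage once a minute
# and appends timestamped readings to a CSV file. Requires psutil.
import csv
import time

import psutil

SAMPLE_SECONDS = 60          # sampling interval (arbitrary choice)
LOGFILE = "cpu_usage.csv"    # hypothetical output path

with open(LOGFILE, "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        # cpu_percent() blocks for the interval and returns the average usage
        usage = psutil.cpu_percent(interval=SAMPLE_SECONDS)
        writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), usage])
        f.flush()
```

Left running for a couple of weeks, the resulting CSV makes it easy to eyeball peak utilization the way the comment above describes.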

    • by rdnetto ( 955205 )

      As such, both Intel and AMD have to work harder on backwards compatibility. I might buy a new CPU when it goes on sale if I don't also have to upgrade the motherboard and RAM.

      Intel, ok, but AMD? AMD doesn't make breaking changes to their sockets unless they're needed to support newer memory. They released AM3 to support DDR3 in 2009, and AM3+ is backward compatible. (The FM sockets are for APUs only and therefore not relevant.)
      In the same period, Intel has had 4 desktop sockets (twice as many as AMD), none of which are backward compatible, AFAICT.

      Source: http://en.wikipedia.org/wiki/C... [wikipedia.org]

  • I suspect my next CPU will be ARM (MIPS). I am astonished to see a CPU cost as much as several 1080p tablets. I am a little tired of all the posters saying "my computer does everything"... I would love a faster machine, but at these prices they can whistle, and that is without the escalating cost of RAM... and Microsoft bleeding its monopoly to those tied into it.

    • and Microsoft bleeding its monopoly to those tied into it.

      What?

    • Comment removed based on user account deletion
    • For AMD, these are top end processors. Their very best desktop CPU is now $230. That's pretty good.

      To my knowledge, the best current ARM tablets have maybe 4GB of RAM. So if you want something that offers superior performance: an x86_64 dual-core processor that matches or beats any current 8-core ARM chip (correct me if I'm wrong), plus a minimal motherboard, 4GB of RAM, a 32GB USB flash drive, and a cheap case and power supply (not including monitor, keyboard, and mouse) will probably
      • by Guspaz ( 556486 )

        $230 is... OK. Performance is roughly comparable to Intel chips at the same price point, but with significantly higher power draw. There may be potential for savings in platform costs; I've not looked into it.

    • ARM and MIPS are different things. The companies are different, instruction sets are different, cores are different, etc. They've nothing to do with each other.

    • by jedidiah ( 1196 )

      > I suspect my next CPU will be ARM (MIPS). I am astonished to see a CPU cost as much as several 1080p tablets.

      Yes. And you will have to "outsource" any interesting computational tasks, even something as simple as voice recognition. ARM-based devices are good enough only so long as your use of them fits narrowly defined parameters driven by what specialty silicon is on your particular SoC. Even that is limited.

      ARM lags behind even ancient and discontinued x86 processors. PCs also have more interesting "spec

    • They do. AMD will sell you just a core for only $28.625. Isn't it a ripoff?!?
  • Sigh. (Score:3, Interesting)

    by pushing-robot ( 1037830 ) on Tuesday September 02, 2014 @08:44AM (#47806319)

    This GPU has nearly identical performance to the R9 280 that came before it

    Which had nearly identical performance to the 7950 that came before it. Which came out nearly three years ago.

    Meanwhile, this says it all [pcper.com] about the CPU. Sure, the AMD might save you $100 over the (faster) Intel, but you'll pay that back in a beefier PSU and cooler and in electricity bills to support the beast.

    What happened, AMD? I loved you back in the Athlon64 era...

    • It draws much less power than the R9 280, though. It would be more interesting as a laptop version.

    • In the Athlon64 era Intel started negotiating with PC makers under terms like "we'll charge you 50% less per CPU if you sell zero AMD processors, 40% less per CPU if you sell less than 10% of your total sales volume as AMD processors, 30% less per CPU if you sell less than 20% of your total sales volume as AMD processors, and full price otherwise." AMD hemorrhaged cash, and could no longer afford the research investment they needed to make the Steamroller/Bulldozer chip family competitive with Intel's i-li
    • by Anonymous Coward

      What happened? They got a jackass MBA "business" CEO who decided it was a good idea to cut R&D and go to automated layout (instead of hand layout like Intel does). Real fuckwit. The sort of guy who thinks everything besides sales is a cost center to be axed.

      Predictably, AMD's products suffered.

  • Tonga (Score:5, Funny)

    by rossdee ( 243626 ) on Tuesday September 02, 2014 @08:45AM (#47806341)

    King Tupou VI wants royalties

  • That's nice. Too bad the single-core performance on the 8-core still SUCKS. There's a reason it beats the i5...because it has 2x the cores. Back in Windows 7, where almost all tasks are single-threaded, even those in the OS itself, I need fast single-core performance. What AMD needs to invent is hardware core multiplexing. In other words, have 8 cores but represent them to the system as 1 core and handle the distributed processing in the firmware. That would crush Intel.
    • by Anonymous Coward

      What AMD needs to invent is hardware core multiplexing. In other words, have 8 cores but represent them to the system as 1 core and handle the distributed processing in the firmware. That would crush Intel.

      It's been invented, it's called superscalar processing, and almost everything uses it. Good branch prediction is hard, as Intel learned with the Itanium.

      What AMD needs to do is admit that "modules" and "cores" are the same thing, and that they have a quite decent quad core processor on their hands. This silly doubled-up ALU architecture is strikingly similar to what Intel did with the Pentium 4, and it's having the same results - inferior performance, with deep pipelines and extremely high clocks.

      Here's the

      • by Anonymous Coward

        It's a little more complex than that - Bulldozer would be fine if it weren't a two-wide design*. Haswell, by comparison, is four-wide, which is what makes it about 50% faster. The module architecture would be fine, if a little irregular - faster in some workloads and slower in others as a result of the shared resources - if not for that deficiency. It made sense back when Bulldozer was originally designed, and was apparently roughly equivalent to the Steamroller iteration, but that version was cancelled

  • OK, since Slashdot is running these weekly Tom's Hardware-type posts, lemme axe you something:

    I've got a system I put together maybe three years ago. I used a good motherboard, bought a good case, good RAM, a very good PSU. It was when the first i5s were coming out, so it's an i5-750 (2.7GHz, I think). I didn't spend a lot of dough on a GPU, but I've been able to play everything up to and including Watch Dogs on this setup.

    I want to be ready for the fall games (The Crew, GTA V, Dragon Age Whatever, Witcher

    • by Junta ( 36770 )

      Why not actually try the games that you want and then decide if things are too slow at all, rather than listen to people who will evangelize how cool the new stuff is with impunity, since it is not their money they are justifying spending on it...

      Also, my wife thinks a grown man playing computer games is a little bit pathetic, and I can't really argue with her,

      What could be pathetic is neglecting responsibilities or pissing away family savings on superfluous stuff. If one takes care of their responsibilities appropriately and is prudent in their spending, it doesn't really matter if a grown man plays computer games or watches

      • If one takes care of their responsibilities appropriately and is prudent in their spending, it doesn't really matter if a grown man plays computer games or watches telly tubbies or whatever they like so long as it doesn't screw up other people's lives.

        You're not married, are you?

        Thanks for the advice, though. Right now, "Can I Run It" shows that most of the games that have published requirements will run on my machine. I'll save the dough and wait and see. It's not like I can't get a new video card in a d

        • by Junta ( 36770 )

          Indeed, even after installing the software, upgrading is easy - except for some DRM crap that could fire if you change too much, but that is BS.

          I am actually married and a father too. I can't disappear into a 'mancave' every day for hours on end or spend all our money on high-end gaming equipment, but I don't catch flak for spending a short while gaming most days, plus the occasional gaming 'bender'. If I covered the house in gaming paraphernalia or something, maybe, but as long as I don't go o

        • I'm married, and if my wife looked down upon video games I wouldn't have married her. Just saying, not all wives are like yours.
          • I'm married, and if my wife looked down upon video games I wouldn't have married her. Just saying, not all wives are like yours.

            I know. Some expect you to get a job.

            I'm pretty lucky all in all. I was able to retire on my 50th birthday and except for the occasional request to not throw another controller through the window because I'm frustrated with Dark Souls' horrible PC port, she doesn't mind me gaming. Occasionally, when company comes to the house, she'll ask me to put some pants on, though. I like to g

            • Occasionally, when company comes to the house, she'll ask me to put some pants on, though. I like to game au natural. She made me a nice little pad to sit on

              Yes, way too much information. But funny as hell.

        • Also, my wife thinks a grown man playing computer games is a little bit pathetic, and I can't really argue with her

          If one takes care of their responsibilities appropriately and is prudent in their spending, it doesn't really matter if a grown man plays computer games or watches telly tubbies or whatever they like so long as it doesn't screw up other people's lives.

          You're not married, are you?

          I'm married, and I have no problems with my wife's opinion about pretty much any decision I make. If you can't do thing

    • I'd say just a new GPU would be fine. I use an Asus 770GTX and can play everything I've tried on max settings @1440p, so you should be fine @1080p.

      The 770 doesn't take advantage of the power-efficiency improvements in the newest Nvidia generation, but the price on some of the variants is quite good. Newegg has a Zotac version for $275: http://www.newegg.com/Product/... [newegg.com]

      The 280X can be picked up for a little less, but it uses more power and is louder, from what I have read.

      • I'd say just a new GPU would be fine. I use an Asus 770GTX and can play everything I've tried on max settings @1440p, so you should be fine @1080p.

        That's good advice. Do you happen to know if new cards like the 770 are backwards compatible with motherboards that don't have the latest PCI-e 3.0? My motherboard has PCI-e 2.0.

        Oh, I guess I can go look it up. Thanks for the good advice.

    • The 6850, while not a bad card now, may struggle to play "next generation" games. That being said, AMD cards like the 270X are dead cheap on eBay, or get a card in the $250-$300 range and that should work for a while.

      Curious what your wife does for fun?

      • Curious what your wife does for fun?

        She makes fun of grown-ass men who put on helmets with horns on them and play computer games in their underwear.

        Personally, I think I look pretty cool in the helmet with the horns, and playing in just my underwear makes me feel more like a level 50 battlemage.

    • Your CPU is one generation short of still being viable, imho. Anything before Sandy Bridge should be decommissioned. This is my personal rule.
  • by Junta ( 36770 ) on Tuesday September 02, 2014 @10:10AM (#47807127)

    If IBM had done the processor, they would have called it 4-core with SMT2. Basically you have 4 modules, each with 2 of many of the components but a lot of shared components. Notably, each of the 4 modules has a single FPU (so it's more like IBM's SMT8 versus SMT4 mode, if you talk about their current stuff).

    So it's more substantial than hyperthreading, but at the same time not reasonable to call each chunk a 'core'. I think it behaves better than Bulldozer did at launch *if* you have the right platform updates to make the underlying OS schedule workloads correctly, but it's still not going to work well (and some workloads work better if you mask one 'core' per module entirely).

    Basically, it's actually pretty analogous to NetBurst. NetBurst came along to deliver higher clock speeds since that was the focus of marketing, with some hope of significant workloads behaving a certain way to smooth over the compromises NetBurst made to get there. The workloads didn't evolve that way, and NetBurst was a power-hungry beast that gave AMD a huge opportunity. Now replace high clock speed with high core count and you basically have Bulldozer/Piledriver in a nutshell. I'm hoping AMD comes back with an architecture that challenges Intel again, just like Intel came back from NetBurst.
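The "mask one core per module" experiment mentioned above can be tried on Linux with a few lines of Python; this is only a sketch, and it assumes the two integer cores of each module appear as adjacent logical CPUs (typical for FX parts, but worth confirming under /sys/devices/system/cpu/cpu*/topology/ on your own machine).

```python
# Sketch: confine the current process to one logical CPU per module,
# assuming module siblings are numbered adjacently (0/1, 2/3, ...).
# Linux-only: uses os.sched_setaffinity / os.sched_getaffinity.
import os

logical_cpus = os.cpu_count()                    # e.g. 8 on an FX-8350
one_per_module = set(range(0, logical_cpus, 2))  # CPUs 0, 2, 4, 6

os.sched_setaffinity(0, one_per_module)          # pid 0 = current process
print("Now restricted to CPUs:", sorted(os.sched_getaffinity(0)))
```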

    • Somewhat analogous to P4, but not quite, in that Bulldozer IPC is about at Phenom II levels. See here [anandtech.com]: fully loaded, IPC is equivalent to Sandy Bridge/Ivy Bridge, but single-threaded it's about at Phenom II IPC.

      AMD's original goal was to get Bulldozer to have IPC similar to Phenom II. Basically, Piledriver is what Bulldozer should have been.

      • by Junta ( 36770 )

        I'm not saying the IPC is NetBurst-like, but that the overall performance characteristic is low performance relative to what the competition *would* be at '8 cores'. Just like a 3.0GHz NetBurst would have been trounced by a contemporary AMD at 3.0GHz (and even much lower), an '8-core' is bested by something with a much lower core count. For example, in the URL you cite, they effectively consider the FX-8350 a quad-core rather than an 8-core solution for the performance to be comparable. This is with a

    • by dshk ( 838175 )
      Huh? The two cores of a Bulldozer module do indeed share one FPU, but it is a 256-bit unit that can be divided into two 128-bit units (or even into four 64-bit units!). I did test FPU performance, and in the worst case it was 25% slower and in the best case actually faster when I ran two threads on a single module versus on two modules. Usually the difference is very small. Please do not compare the AMD Bulldozer architecture to Intel Hyper-Threading; the two technologies have very different purposes.
      • by Junta ( 36770 )

        Hence why I compared it very carefully to IBM's SMT rather than Hyper-Threading. IBM SMT has components to handle each 'thread' while sharing common components (including the FPU in SMT8, though it isn't shared in SMT4). It isn't 8 threads in the hyperthreading sense, but neither is it 8 'cores' with respect to how any other CPU vendor uses the word. IBM is the only other microprocessor vendor with something that resembles the AMD design, and they do not refer to those components as 'cores'.

        I haven't seen any
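A rough way to repeat dshk's same-module versus different-module comparison on Linux, sketched in Python under the same adjacent-sibling numbering assumption as the sketch above. Interpreter overhead will hide much of the FPU contention, so treat this as the shape of the experiment rather than a faithful benchmark; a compiled workload would show the effect far more clearly.

```python
# Sketch: time the same floating-point loop in two processes pinned either
# to the same module (CPUs 0 and 1) or to different modules (CPUs 0 and 2).
import os
import time
from multiprocessing import Process

def fpu_work(cpu, iterations=20_000_000):
    os.sched_setaffinity(0, {cpu})  # pin this worker to one logical CPU
    x = 0.0
    for i in range(iterations):
        x += (i * 0.5) ** 0.5       # keep the FPU busy
    return x

def run_pair(cpus):
    procs = [Process(target=fpu_work, args=(c,)) for c in cpus]
    start = time.perf_counter()
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return time.perf_counter() - start

if __name__ == "__main__":
    print("same module:       %.2fs" % run_pair((0, 1)))
    print("different modules: %.2fs" % run_pair((0, 2)))
```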

    • Basically, it's actually pretty analogous to NetBurst. NetBurst came along to deliver higher clock speeds since that was the focus of marketing, with some hope of significant workloads behaving a certain way to smooth over the compromises NetBurst made to get there. The workloads didn't evolve that way, and NetBurst was a power-hungry beast that gave AMD a huge opportunity. Now replace high clock speed with high core count and you basically have Bulldozer/Piledriver in a nutshell. I'm hoping AMD comes back with an architecture that challenges Intel again, just like Intel came back from NetBurst.

      I don't think it is as easy as that. If there were any major architectural changes that could wring more large performance gains from x86, Intel/AMD would already have implemented them. But all the low-hanging fruit has already been picked. The days of huge performance increases due to architectural changes are long gone. That's why we went multicore, but beyond 8 cores the gains diminish rapidly. These days we focus on power efficiency, but that only gets you so far.

      I think the next big gains won't come
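The "beyond 8 cores the gains diminish" point is essentially Amdahl's law. A quick back-of-the-envelope calculation in Python, with the 90% parallel fraction chosen purely for illustration:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the workload and n is the number of cores.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

p = 0.90  # assumed: 90% of the work parallelizes
for cores in (1, 2, 4, 8, 16, 32):
    print(f"{cores:2d} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
```

With those numbers, going from 8 to 16 cores buys only about a 1.4x improvement, which is the flattening the comment describes.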

"Confound these ancestors.... They've stolen our best ideas!" - Ben Jonson

Working...