Graphics AMD Upgrades Hardware

AMD Confirms Kaveri APU Is a 512-GPU Core Integrated Processor 130

Posted by timothy
from the can't-even-count-that-high dept.
MojoKid writes "At APU13 today, AMD announced a full suite of new products and development tools as part of its push to improve HSA development. One of the most significant announcements to come out of the sessions today, albeit in a tacit, indirect fashion, is that Kaveri is going to pack a full 512 GPU cores. There's not much new to see on the CPU side of things — like Richland/Trinity, Steamroller is a pair of CPU modules with two cores per module. AMD also isn't talking about clock speeds yet, but the estimated 862 GFLOPS that the company is claiming for Kaveri points to GPU clock speeds between 700 and 800 MHz. With 512 cores, Kaveri picks up a 33% boost over its predecessors, but memory bandwidth will be essential for the GPU to reach peak performance. For performance, AMD showed Kaveri up against the Intel 4770K running a low-end GeForce GT 630. In the intro scene to BF4's single-player campaign (1920x1080, Medium Details), the AMD Kaveri system (with no discrete GPU) consistently pushed frame rates in the 28-40 FPS range. The Intel system, in contrast, couldn't manage 15 FPS. Performance on that system was solidly in the 12-14 FPS range — meaning AMD is pulling 2x the frame rate, if not more."
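The 862 GFLOPS claim can be sanity-checked with back-of-envelope arithmetic. A minimal sketch (assuming each shader core retires 2 FLOPs per clock via fused multiply-add, which is typical for GCN-class ALUs but is an assumption here, not an AMD statement):

```python
# Back-of-envelope: what GPU clock does 862 GFLOPS imply for 512 cores?
# Assumes 2 FLOPs per core per clock (one fused multiply-add), which is
# standard for GCN-style shader ALUs but is an assumption, not an AMD figure.

CORES = 512
FLOPS_PER_CORE_PER_CLOCK = 2  # an FMA counts as 2 floating-point operations

def gflops(clock_mhz: float) -> float:
    """Peak single-precision GFLOPS at a given GPU clock in MHz."""
    return CORES * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000.0

print(gflops(720))  # ~737 GFLOPS at a 720 MHz GPU clock
print(gflops(800))  # ~819 GFLOPS at 800 MHz

# Inverting: the clock implied by 862 GFLOPS, if that figure were GPU-only
implied_mhz = 862 * 1000.0 / (CORES * FLOPS_PER_CORE_PER_CLOCK)
print(round(implied_mhz))  # ~842 MHz; if the 862 figure also counts CPU
                           # FLOPs, the GPU clock lands in the 700-800 MHz
                           # range the summary estimates
```

Note that a GPU-only reading of 862 GFLOPS implies a clock slightly above the summary's 700-800 MHz window, which suggests the quoted number may include the CPU cores' contribution.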
  • nVidia has at least three [geforce.com] versions of the GT630, each fairly different from one another. None of these would be an amazing accomplishment to beat, although they are more powerful than Intel's normal integrated offerings.
    • by timeOday (582209)
      The Intel Iris Pro 5200 is 28% faster than the Geforce GT 630 on the PassMark G3D [videocardbenchmark.net] benchmark. (I don't know how much the variants you linked differ in performance?)
      • by Anonymous Coward

        The Intel Iris Pro 5200 is 28% faster than the Geforce GT 630 on the PassMark G3D benchmark. (I don't know how much the variants you linked differ in performance?)

        Except the Iris Pro variant is found in shiny i7 Haswells that start at about $470 in bulk for the cheapest part. In which case the price/performance ratio is clearly against Intel.

    • by Anonymous Coward

      Actually, the GT 630 scores 720 on passmark, while the Iris Pro 5200 scores 922. So not only did AMD choose a remarkably shitty graphics card to test against, they also chose one that's slower even than the integrated chip on the Intel CPU.

      • by Rockoon (1252108)
        AMD's last generation APU scores 865 on that benchmark, so not sure what point you are trying to make here.

I expect the top end Kaveri to score ~1150 on PassMark's G3D, and it won't cost $450 or $650 like the two Intel chips that actually have the 128 MB of L4 cache that distinguishes the Iris Pro 5200 from the Intel HD 4600 (which only scores 598 on G3D).
  • Cool (Score:1, Troll)

    by 0123456 (636235)

    So if I buy an AMD CPU, I can play games with low frame-rates at low detail settings (yeah, I know it says 'medium', but when almost all games now go at least up to 'ultra', 'medium' is the new 'low').

    Or I could just buy a better CPU and a decent graphics card and play them properly.

    • Re:Cool (Score:4, Insightful)

      by Fwipp (1473271) on Tuesday November 12, 2013 @01:58PM (#45403771)

      Yes, if you spend more money you can get more performance. The whole point of the APU is that you can spend less on a single piece of silicon than you would for "a better CPU and a decent graphics card."

    • Re:Cool (Score:5, Insightful)

      by asliarun (636603) on Tuesday November 12, 2013 @02:06PM (#45403865)

      So if I buy an AMD CPU, I can play games with low frame-rates at low detail settings (yeah, I know it says 'medium', but when almost all games now go at least up to 'ultra', 'medium' is the new 'low').

      Or I could just buy a better CPU and a decent graphics card and play them properly.

      Yes, but could you do that in a compact HTPC cabinet (breadbox sized or smaller) and have your total system draw less than 100W or so?

      I'm really excited by this news - because it allows traditional desktops to reinvent themselves.

      Think Steam Machines, think HTPC that lets you do full HD and 4k in the future, think HTPC that also lets you do light-weight or mid-level gaming.
      Think of a replacement to consoles - a computing device that gives you 90% of the convenience of a dedicated console, but gives you full freedom to download and play from the app store of your choice (Steam or anything else), gives you better control of your hardware, and lets you mix and match controllers (Steam Controller, keyboard and mouse, or something else that someone invents a year down the line).

      I'm long on AMD for this reason. Maybe I'm a sucker. But there is a chance that desktops can find a place in your living room instead of your basement. And I'm quite excited about that.

      • by Nemyst (1383049)
        Why the needlessly stringent power draw? You can get passively cooled discrete GPUs or low-noise active cooling which would give you a major bump in performance. APUs won't be able to do 4K for a loooong time for anything but video.
        • by Rockoon (1252108)

          APUs won't be able to do 4K for a loooong time for anything but video.

          ..if a "looooong time" means as soon as AMD and Intel support DDR4, which is in 2014... sure.

          The main bottleneck for on-die GPU's is memory bandwidth. Intel "solved" the bandwidth problem in Iris Pro by including a massive L4 cache that cannot be manufactured cheaply. AMD hasn't solved the problem, but is far enough ahead in GPU design that the Iris Pro is only on par with AMD's 6800K APU.

          • by Kjella (173770)

            APUs won't be able to do 4K for a loooong time for anything but video.

            ..if a "looooong time" means as soon as AMD and Intel support DDR4, which is in 2014... sure.

            I think by "anything but video" he was referring to gaming, even the 780 Ti and R9 290X struggle with 4K. What do you think DDR4 would change? As far as I know they already support the 4K resolution but it'll play like a slide show.

            • by Rockoon (1252108)

              even the 780 Ti and R9 290X struggle with 4K.

30+ FPS in Far Cry 3 with Ultra settings at 4K resolution doesn't sound to me like "struggling"
49 FPS in Grid 2 with Ultra settings at 4K resolution doesn't sound to me like "struggling"
45+ FPS in Just Cause 2 with Very High settings at 4K resolution doesn't sound like "struggling"
30+ FPS in Total War: Rome II with HQ settings at 4K resolution doesn't sound like "struggling"

If you aren't right about existing hardware, how could you possibly be a good judge of future hardware? Are you unaware that people are

          • If expensive solutions count, why not a PC version of the PS4 board?
8 GByte would be enough for an average gaming PC, and with GDDR5 the bandwidth would be decent too :-)

            • by Rockoon (1252108)

              If expensive solutions count, why not a PC version of the PS4 board?

              Maybe because nobody is selling such a thing?

The deal here is that memory controllers are now integrated into CPUs (for performance reasons), so you can really only effectively use memory that the CPU was specifically designed for. Even if the mobo provided some emulation so that you could drop GDDR5 into it instead of DDR3, your throughput would still be limited to 64 times the CPU's base memory clock, which is already attainable with DDR3.
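The bandwidth gap this sub-thread is arguing about comes straight out of bus-width-times-transfer-rate arithmetic. A rough sketch (the channel counts and transfer rates below are illustrative examples, not the spec of any particular board):

```python
# Rough peak-bandwidth arithmetic: bus width (bits) x transfer rate (MT/s).
# The two configurations below are illustrative assumptions, not the spec
# of any specific product.

def peak_gb_s(bus_bits: int, mt_per_s: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_bits / 8 * mt_per_s / 1000.0

# Dual-channel DDR3-2133: two 64-bit channels, a typical desktop APU setup
ddr3 = peak_gb_s(128, 2133)   # ~34.1 GB/s

# PS4-style 256-bit GDDR5 at 5500 MT/s
gddr5 = peak_gb_s(256, 5500)  # 176.0 GB/s

print(f"DDR3-2133 dual channel: {ddr3:.1f} GB/s")
print(f"256-bit GDDR5:          {gddr5:.1f} GB/s")
```

Under these assumptions the wide GDDR5 setup has roughly 5x the peak bandwidth, which is why on-die GPUs fed by commodity DDR3 hit a wall well before their shader count does.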

        • by asliarun (636603)

          Why the needlessly stringent power draw? You can get passively cooled discrete GPUs or low-noise active cooling which would give you a major bump in performance. APUs won't be able to do 4K for a loooong time for anything but video.

          You make a valid point - and I don't know *all* the options that exist.
          It would actually be a very interesting exercise to do this kind of a comparison. Say, take some HTPC like constraints such as space and heat, identify the options available - both CPU+discrete graphics and CPU+GPU integrated, and compare the options using price and performance.

          Back to your point, it is not just power draw - space and cooling are also factors. A reasonably strong integrated CPU+GPU system lets you build a cabinet that ca

Based on the requirements I'm guessing this is for HTPC purposes.

In that case the 'needlessly stringent' power draw is because

A. The case is probably tiny; it may not even have space for a discrete GPU. Less power input = less heat to dissipate.

B. For a completely fanless solution you want a picoPSU. These max out at around 160 watts.

C. Most people looking for a quiet HTPC couldn't care less if you can run Gears of Warcraft 5 on it.
        • My HTPC has a passively cooled Geforce 210. Works like a charm, quiet enough that the wife hasn't thrown it out of the living room. Sure, it's slower than the Intel HD 4000, but it cost me $30 when the old GPU died.
      • by StikyPad (445176)

        Won't happen. Integrated devices like "smart" TVs and dedicated media streaming hardware have already obsoleted HTPCs, and as much as I like to play some PC games on a big screen, the market is clearly satisfied with the walled gardens known as game consoles, most of which can serve as media streaming clients as well.

    • by cloud.pt (3412475)
The real deal here is you are purchasing an APU for roughly half the price of a mid-range Intel & discrete graphics solution, and getting double the performance. Apple knows this is the way to go for price/performance, and that is why the new entry-level 15" MBPs lost their discrete GPU. OEMs like Apple are forcing Intel to catch up with integrated GPU technology. It's all about trade-offs: you place a lower-performing GPU on the same die as the CPU in order to get the best possible me
    • So I _COULD_ buy a car for under $20k that does 0-60 in a modest amount of time...
      Or I could just buy a Bugatti Veyron with a better engine and drive properly.

      Is that the argument you're making?

  • Yes, but... (Score:5, Funny)

    by mythosaz (572040) on Tuesday November 12, 2013 @01:52PM (#45403695)

    ...how much faster does it mine Bitcoins?

    I need to mine some so I can put them in a totally safe online wallet.

    • by PRMan (959735)
      It won't. There are custom chips for that now. And you're right, online wallets aren't safe.
What market is AMD shooting for?
Haswell with Iris Pro will probably beat out AMD for integrated graphics performance and will have better battery life.
On the top end, desktop users will always go for a dedicated graphics card.
On the mobile end, these things will eat up battery and have no reason to be on a tablet.
All that's left is the cheap OEM side of things. Haswell is still fairly expensive on the low end. If Intel can bring down the price a bit and make it competitive they will beat out AMD in every

    • by amiga3D (567632)

      The cheap OEM side is a huge market. This is good because now the cheap OEM side is decent instead of shitty in terms of performance. Hardcore gamers with money to blow are not the market for this.

Sorry, I should have said the cheap OEM side with mid-range 3D graphics. The market doesn't exist for that. The typical non-gaming user wouldn't care if the 3D graphics capability was from 3 years ago. If it can play video and run business apps, that's all they care about.

Old Sandy Bridge/Ivy core chips already fit that market perfectly, and the price will/has come down for those chips already. Or throw in an old Richland/Trinity and they wouldn't know the difference.

    • by symbolset (646467) *
      Small form factor business PCs, Media center PCs, low-end Steambox, emerging economies desktop. Strangely enough, servers. Integrating the GPU into the CPU gets the BOM cost down and raises the minimum performance standard. They are now approaching a teraflop on an APU. That is amazing.
      • Re: (Score:2, Insightful)

        by 0123456 (636235)

        Small form factor business PCs,

        Don't need 3D performance. Don't need GPGPU performance in 99% of cases.

        Media center PCs

        Plenty fast enough already to play video at 1920x1080.

        low-end Steambox

        If you want your games to look like crap.

        Integrating the GPU into the CPU gets the BOM cost down and raises the minimum performance standard.

        Because lots of people run 3D games on servers.

        Certainly we do use GPUs for some floating-point intensive tasks on servers, but this is nowhere near fast enough to be useful.

        • by SirSlud (67381)

          Because lots of people run 3D games on servers.

          Certainly we do use GPUs for some floating-point intensive tasks on servers, but this is nowhere near fast enough to be useful.

          We're not that far off thin client gaming. So suggesting that lots of companies won't be running 3D games server-side in the near future is disingenuous.

          • by 0123456 (636235)

            We're not that far off thin client gaming. So suggesting that lots of companies won't be running 3D games server-side in the near future is disingenuous.

            If you're buying a server to run 'thin client gaming', you sure as heck won't be using integrated graphics to do so.

Why fucking not? A single 100W server that streams the games to 2W terminals and old computers and laptops that can't fucking run the game in the first place, because of wrong OS, wrong hardware and wrong software. But only the server needs to be maintained, and the games work everywhere. Sign me up, even if for four 1024x768 instances of an old networked game; that feels valuable to me.
That beats running flash games slowed down by VNC or X11 streaming on the thin clients.

        • Re: (Score:2, Troll)

          by jkflying (2190798)

          Small form factor business PCs,

          Don't need 3D performance. Don't need GPGPU performance in 99% of cases.

          Doesn't matter, because it's cheap. Also, CAD and Photoshop *do* use GPGPU these days.

          Media center PCs

          Plenty fast enough already to play video at 1920x1080.

          This should handle 4k video decoding.

          low-end Steambox

          If you want your games to look like crap.

          I think you missed the "low end" part of that quote. Also, it will be really, really cheap compared to something with an additional dGPU. You don't even need PCIe on the motherboard. Not everybody can afford to game at 3x 1080p on high. These should handle 1080p on medium just fine.

          Integrating the GPU into the CPU gets the BOM cost down and raises the minimum performance standard.

          Because lots of people run 3D games on servers.

          Certainly we do use GPUs for some floating-point intensive tasks on servers, but this is nowhere near fast enough to be useful.

          These have HUMA. GPGPU-CPU interactions will be much faster than on any previous architecture because not

    • by Rockoon (1252108)
Intel's problem is that the method they used to get Iris Pro to perform so well for them (which is actually only about equal to the existing Radeon HD 8670D in the 6800K APU) is expensive, and the method isn't going to get any cheaper any time soon.

The method is simply to add another cache level to their cache hierarchy, and to make it a massive and ridiculously expensive 128MB.

If cache was cheap, all their processors would have a 128MB L4 cache. Cache is quite expensive though, which is why their budget Has
    • by edxwelch (600979)

      The eDRAM in the Iris pro is quite expensive to manufacture and hence, Intel charges a premium for it. The Core i7-4770R for instance, is listed at $392.00.

    • by Aighearach (97333)

      This is huge because it means low-end systems having strong performance with HTML5 apps, WebGL, and casual/educational 3d rendering. It also means that gaming on low-end systems will be vastly improved.

      I can't imagine how you can claim how the market will respond to this vs. Intel's offering without some prices and delivery dates. Historically, the AMD offering will have a more favorable price/performance ratio, and Intel will sell more in the high end based on their brand.

      And these are low power. An APU us

    • by Anonymous Coward

      What market is amd shooting for?

      Clearly the Finnish one - 'kaveri' means a friend in Finnish.

    • by Iniamyen (2440798)

      All that's left is the cheap oem side of things.

      Isn't that pretty much the biggest market out there??

  • by Anonymous Coward

    We need a CPU/GPU SoC based on the tech that's going in to the xbone and ps4. They both have multicore procs with a built in GPU that's capable of pushing next gen games at HD resolution.

    We need that. A cheap PC built on a powerful single-chip solution. Not this wimpy shit.

Personally, I'd go for the PS4 solution. 8 gigs of high-speed GDDR5 that's both the main memory and graphics memory? Fuck yes. Give me that. I'd forgo the ability to expand memory if I could have GDDR5 as main memory. (The DDR3+128meg

    • by dc29A (636871) *

      They both have multicore procs with a built in GPU that's capable of pushing next gen games at HD resolution.

      Not the Xbox One, it will render 720p and 900p and upscale it to 1080p. PS4 can render 1080p without upscaling.

AMD should at least have tested against Intel's Iris Pro, which is their highest-end GPU. The GT 630 is an OK low-end GPU, depending on which version they used.

  • by Ecuador (740021) on Tuesday November 12, 2013 @02:28PM (#45404145) Homepage

    Actually the clock speed for the 862GFLOPS figure is in the footnotes, see here: http://images.anandtech.com/doci/7507/amd_kaveri_specs-100068009-orig.png [anandtech.com]
    So, even unintentionally, they are talking about clock speeds...

  • by Anonymous Coward on Tuesday November 12, 2013 @02:42PM (#45404315)

    These machines share the memory between CPUs and GPUs, and that's the advantage:
You can use the GPU cores to do otherwise forbiddingly expensive operations (such as detailed
physics, particle simulations, etc) very easily. Traditional systems need to copy data between VRAM and main memory over the system bus, which takes time.

Programming languages are already starting to support mixed CPU/GPU programming through new language constructs. At the moment, mainly rendering and physics are done on the GPU; soon it will be easy to do anything that can be efficiently parallelized.
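The copy cost described above is easy to estimate. A hedged sketch (the 8 GB/s figure is a rough approximation of PCIe 2.0 x16 effective throughput, and the buffer size is an arbitrary example, not a measured workload):

```python
# Estimate the per-frame cost of shuttling a working set between main memory
# and VRAM over PCIe, which a shared-memory (HSA/hUMA) design avoids entirely.
# Both numbers below are illustrative assumptions, not measurements.

PCIE_GB_S = 8.0    # assumed effective throughput of a PCIe 2.0 x16 link
BUFFER_MB = 256    # example size of a physics/particle working set

def transfer_ms(megabytes: float, gb_per_s: float = PCIE_GB_S) -> float:
    """Milliseconds to move a buffer one way across the bus."""
    return megabytes / 1024 / gb_per_s * 1000.0

one_way = transfer_ms(BUFFER_MB)
print(f"{one_way:.2f} ms one way")  # 31.25 ms: nearly two whole frames
                                    # at 60 FPS spent just moving data
```

Under these assumptions, a single round trip of the buffer eats several 60 FPS frame budgets, which is the argument for letting CPU and GPU cores address the same memory instead of copying.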

    • Just like the CBM Amiga
      • by idontgno (624372)

        Yup.

        It's the wheel of reincarnation: [catb.org]

        ...a well-known effect whereby function in a computing system family is migrated out to special-purpose peripheral hardware for speed, then the peripheral evolves toward more computing power as it does its job, then somebody notices that it is inefficient to support two asymmetrical processors in the architecture and folds the function back into the main CPU, at which point the cycle begins again.

    • by Alomex (148003)

      soon it will be easy to do anything that can be efficiently parallelized.

...on a SIMD architecture. GPUs still run the same program (or at best a few programs) over groups of cores. GPUs are quite a ways away from supporting multicore-style MIMD parallelism.

  • Brand new AMD APU with 512 GPU cores beats Discrete NVidia card with 128 cores that's more than a year old now.

    Hang on, was this supposed to be impressive?

    • by idontgno (624372)

      Of course it was.

      It's benchmarketing. You're not supposed to pay attention to the unbalanced comparison behind the curtain. You're supposed to suspend all critical thought and begin Pavlovian salivation. Otherwise, you're not fanboi enough and need some re-education. Or something.

      Meh. The way you can tell a marketer isn't lying is when he's not breathing.

    • It's impressive because it all sits in one and the same chip, is technically innovative with its new HSA, and is dead cheap. The AMD A-series cover 4/5 desktop users needs at bargain prices. This sort of integration has a huge market potential, and AMD is leading the development.
You could have replaced the Intel 4770K with a hamster in a wheel and it would have got the same FPS. AMD is trying to imply their APU's CPU is better at gaming than a reasonably high-end Intel CPU.

    • by tibman (623933)

      lol, i hope your cellphone isn't still the size of a brick. What's the point of smaller and cheaper, right?

      • What I'm saying is it's par for the course. Nothing to be impressed about or news worthy.

You're saying the latest and greatest 5"+ screen cellphone isn't approaching brick size?
        Cell phones have been getting bigger since the 3.5" screen iPhone came out ~5 years ago.

        • by tibman (623933)

          I think it's pretty normal for new CPUs and GPUs to be news here though. You're right about phones getting bigger though.

As SoCs become more capable, there's less need for a dedicated GPU. Lower cost and smaller devices will drive this. If Kaveri plays BF4, it could take a big chunk of video card sales. AMD is going to catch up with Intel by offloading parallel work to the GPU. Nvidia GPUs will die out since they do not have a CPU to pair with them. It would have been good for Intel to buy Nvidia. I don't think Intel will now that Nvidia is making Tegra ARM chips. Tegra is an uphill battle against Qualcomm and a dozen other ARM companies.
