
AMD FirePro W9100 16GB Workstation GPU Put To the Test

Posted by Unknown Lamer
from the more-power dept.
Dputiger (561114) writes "It has been almost two years since AMD launched the FirePro W9000 and kicked off a heated battle in the workstation GPU wars with NVIDIA. AMD has now released the powerful FirePro W9100, a new card based on the same Hawaii-class GPU as the desktop R9 290X but aimed at the professional workstation market. The W9100's GPU features 2,816 stream processors, and the card boasts 320GB/s of memory bandwidth and six mini-DisplayPorts, all of which support DP1.2 and 4K output. The W9100 also carries more RAM than any other AMD GPU: a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which means AMD sits in a class by itself in this area. In terms of performance, this review shows that the FirePro W9100 doesn't always outshine its competition, but its price/performance ratio keeps it firmly in the running. If AMD continues to improve its product mix and overall software support, it should close the gap even more in the pro GPU market in the next 18-24 months."
  • But how many KH/s does it do?
    • any way to use some of that ram as a ram disk?

      • by asmkm22 (1902712)
        Trust me, the people using these cards already have workstations with a shit-ton of RAM anyway, not to mention that SSDs are becoming more common for those use cases.
      • by Anubis350 (772791)
        RAM is cheap; most people planning to toss this card into their workstation can also max out its system memory, to the point where a 16GB ramdisk wouldn't make much of a difference if they needed one. For the few remaining people there are better solutions.

        Add to that that you'd then be competing for bandwidth on the PCIe lanes with whatever you're actually using the card for, and it makes no sense to bother.
      • Those people have enough money for DIMMs. Here you pay a premium for GPU-to-VRAM bandwidth, and you certainly wouldn't be able to use that properly over PCIe. So you pay much more money, but at least it's way slower... is that what you wanted? ;-)
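
To put rough numbers on the PCIe objection above, here is a quick back-of-envelope sketch (the PCIe 3.0 x16 figure and the resulting ratio are the editor's approximations, not anything stated in the thread):

```python
# Why a VRAM ramdisk over PCIe throws away the bandwidth you paid for.
# Figures are approximate and assume a PCIe 3.0 x16 link.
GDDR5_BANDWIDTH_GBS = 320.0   # W9100's quoted on-card memory bandwidth
PCIE3_X16_GBS = 15.75         # usable PCIe 3.0 x16 bandwidth (~985 MB/s per lane)

slowdown = GDDR5_BANDWIDTH_GBS / PCIE3_X16_GBS
print(f"VRAM reached over PCIe is roughly {slowdown:.0f}x slower than on-card access")
```

In other words, the host would see only a small fraction of the bandwidth the GPU itself enjoys, which is the commenters' point about a ramdisk being a poor use of that 16GB.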
  • by Anonymous Coward

    Is this a new Slashdot advertising site? Or is it the Slashdot review site?

  • what? (Score:1, Troll)

    by Charliemopps (1157495)

    A $3500 video card is news? wtf...

    Other than some crazy, very specific applications, this card is worthless.

    • by asmkm22 (1902712)
      This isn't "Slashdot: News for Charliemopps, Stuff that Matters"
    • Re:what? (Score:4, Informative)

      by arielCo (995647) on Monday July 14, 2014 @08:38PM (#47452953)

      Understanding the Workstation Market:

      The first thing we need to talk about is the difference between workstation and consumer GPUs. The GPUs themselves are essentially identical -- NVIDIA's Quadro K6000 is based on GK104 (Kepler), the older Quadro 6000 is a GF100 (Fermi)-based chip, the W9000 uses the same GCN core that powers the HD 7970/R9 280X, and today's W9100 is essentially identical to the Hawaii XT core inside the R9 290X. What sets these workstation cards apart are the amount of RAM they carry (typically 2-3x as much as a consumer card), their validation cycles (workstation GPU cores are hammered on far more than the consumer equivalents), and the amount of backend vendor support and optimization that AMD and NVIDIA both perform.

      This optimization process and long-term vendor partnership is what distinguishes the workstation market from the consumer space and the need to pay for some of those development costs is part of why workstation cards tend to cost so much more than their consumer equivalents.

      From TFA [hothardware.com].

      • I think they also have ECC.
        Exactly. Also, double precision is a requirement for scientific modeling and calculations. Do you really want a 1-bit error in a simulation that took a month to run?

          • Even with 16 gigs of state, one should still be able to do some checkpointing. But it's a nuisance anyway, and ECC at least warns you that your RAM may be on its way to failing for good. Without it, one is almost clueless.
          • by SJester (1676058)
            This. I'm (90%) a scientist and I model some pretty nifty stuff. Our lab desktops have consumer GPUs. We write our code, run it for a bit, and if it looks good we send it over to the supercomputing center where it's run on Tesla systems. Beats the hell out of the days (before my time) when you'd have to stick a Post-It on your monitor that says "Do Not Turn Off - Working" and then come back three weeks later to find that it's crashed.
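
The checkpointing idea raised above can be sketched minimally. This is a generic illustration only; the file name, interval, and state layout are invented for the example, not anything from the thread:

```python
# Minimal checkpointing sketch: periodically persist simulation state so a
# crash or memory error only costs the work done since the last checkpoint.
import os
import pickle

CHECKPOINT = "sim_state.pkl"   # hypothetical checkpoint path

def save_checkpoint(state):
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CHECKPOINT)      # atomic swap: never leave a torn file

def load_checkpoint():
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {"step": 0, "data": []}   # fresh start when no checkpoint exists

state = load_checkpoint()
for step in range(state["step"], 10):
    state["data"].append(step * step)   # stand-in for real simulation work
    state["step"] = step + 1
    if state["step"] % 5 == 0:          # checkpoint every 5 steps
        save_checkpoint(state)
```

The write-to-temp-then-rename dance matters: it means a crash mid-save leaves the previous checkpoint intact rather than a half-written file.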
      • by Kjella (173770)

        Except that HotHardware, being tech geeks, confuse cause and effect. The estimated cost of the 8GB of GDDR5 in the PS4 is ~$100, the hardware costs are almost the same, and the "extra validation" mainly involves staying a little conservative with clock speeds and code optimizations. The real reason is market differentiation: if there is none, you create one, like student and senior-citizen discounts even though they all take up one seat. That lets you set entirely different prices based on willingness to pay.

    • by Anubis350 (772791)
      GPGPU is an important segment of the high end market these days...

      /The machine I'm typing this on right now has an nvidia Tesla in it
    • Crazy and very specific applications like CAD, video editing, video transcoding, and stuff like that, you mean? Yeah, that's what they benchmarked.

  • by arielCo (995647) on Monday July 14, 2014 @08:42PM (#47452979)

    From TFA:

    Understanding the Workstation Market:

    The first thing we need to talk about is the difference between workstation and consumer GPUs. The GPUs themselves are essentially identical -- NVIDIA's Quadro K6000 is based on GK104 (Kepler), the older Quadro 6000 is a GF100 (Fermi)-based chip, the W9000 uses the same GCN core that powers the HD 7970/R9 280X, and today's W9100 is essentially identical to the Hawaii XT core inside the R9 290X. What sets these workstation cards apart are the amount of RAM they carry (typically 2-3x as much as a consumer card), their validation cycles (workstation GPU cores are hammered on far more than the consumer equivalents), and the amount of backend vendor support and optimization that AMD and NVIDIA both perform.

    This optimization process and long-term vendor partnership is what distinguishes the workstation market from the consumer space and the need to pay for some of those development costs is part of why workstation cards tend to cost so much more than their consumer equivalents.

  • As I sit here at my power-hungry but fairly slow FX-4170, budgeting out a new i7 machine, I have to wonder if they noticed the very large pile of old video chipset makers sitting out in the street?

    The R series is not that impressive; hell, a few of them are outrun by their own cheaper, older cards. Workstation cards are a niche market; even most 3D CAD tools like SolidWorks run reasonably on integrated Intel graphics.

    It makes me wonder what they are thinking. There was a time when AMD made a huge variety of i

    • by Anonymous Coward

      What they 'forgot' how to do is something that is arguably untenable given the transistor complexity of modern architectures: design a high-end CPU with a hand-tuned layout.

      They are instead suffering with an architecture that is logical on paper but hasn't panned out in the real world, and they're pretty much stuck with it until their next one is ready for release (in something like 12 to 30 months, if I recall the speculation correctly).

  • by Virtucon (127420) on Monday July 14, 2014 @09:46PM (#47453329)

    Nice writeup, but I wish the folks writing these kinds of articles would at least take a look at the graphics they're putting out there. The graphs are senseless drivel; you can't tell whether the new card is better or worse. It's still a nice card, but here's a better write-up at Tom's Hardware. [is.gd]

  • Wow, TFS is copied almost verbatim from the first two paragraphs of TFA. The only difference is the awkward cutting and editing that ultimately contorts and tortures the prose of TFA.

    Way to add value, editors!

    How well did they do? You decide! From TFA:

    It has been almost two years since AMD launched the FirePro W9000 and kicked off a new battle in the workstation GPU wars. Today, we're reviewing the company's FirePro W9100 -- a new card based on the same Hawaii-class GPU as the desktop R9 290 and R9 290X, but aimed at the workstation market and professional consumers. Does AMD's new card have what it takes to seize the professional performance crown?

    The W9100 is a full Hawaii GPU with 2,816 stream processors, 320GB/s of memory bandwidth, and six mini-DisplayPorts, all of which support DP1.2 and 4K output. It carries more RAM than any other AMD GPU -- a whopping 16GB of GDDR5 on a single card. Even NVIDIA's top-end Quadro K6000 tops out at 12GB, which means AMD sits in a class by itself in this area. The W9000 and W9100 have one other major point of differentiation -- each offers support for up to six 4K displays using DisplayPort 1.2. NVIDIA's top-end Quadro K6000 tops out at just two DP 1.2 ports. You can still theoretically run more DisplayPort 1.2 displays if you use a hub, but if you want to hook everything up through the video card, AMD has a distinct advantage here.

  • Seriously, the release of the W9100 means the 2013 Mac Pro no longer has the latest FirePro chipset in it, making it a great opportunity to see whether Apple will actually release video card upgrades for this machine, or whether owners will just be stuck with whatever was ordered with it.

    • They might do a refresh with the new part, but an upgrade kit? Not likely, not first-party anyway. I wouldn't be shocked if OWC figures out the secret sauce to make it work.

  • I'm always so impressed when AMD manages to compete against NVIDIA while being only one-tenth the size. They have far fewer resources to invest back into R&D, yet every generation they show up to the ring with a worthy entry, ready for another fight. Both my GPU and CPU are AMD; I love supporting the underdog.
