AMD Launches Radeon R9 380X, Fastest GPU Under $250 (hothardware.com)

MojoKid writes: Although AMD's mid-range GPU line-up has been relatively strong for a while now, the company is launching the new Radeon R9 380X today with the goal of taking down competing graphics cards like NVIDIA's popular GeForce GTX 960. The Radeon R9 380X has a fully functional AMD Tonga GPU with all 32 compute units / 2048 shader processors enabled. AMD's reference specifications call for a 970MHz+ engine clock with 4GB of 1425MHz GDDR5 memory (5.7 Gbps effective). Typical board power is 190W, and cards require a pair of supplemental 6-pin power feeds. The vast majority of the Radeon R9 380X cards that hit the market, however, will likely be custom models that are factory overclocked and look nothing like AMD's reference design. The Radeon R9 380X, or more specifically the factory-overclocked Sapphire Nitro R9 380X tested, performed significantly better than AMD's Radeon R9 285 or NVIDIA's GeForce GTX 960 across the board. The 380X, however, could not catch more powerful and more expensive cards like the GeForce GTX 970. Regardless, the Radeon R9 380X is easily the fastest graphics card under $250 on the market right now.
  • by Anonymous Coward

    They're really drawing a fine line on this one.

    The 380X, however, could not catch more powerful and more expensive cards like the GeForce GTX 970. Regardless, the Radeon R9 380X is easily the fastest graphics card on the market right now, under $250.

    Seeing as how the GTX 970 has broken under the $300 mark in the last few weeks, they're not doing much to sell me on the R9 380X.

    • by PIBM ( 588930 )

      Yeah, for $50 more, which is usually much less than 5% of your computer's total price (I include peripherals like monitors when I evaluate the effectiveness of an upgrade), you are getting 25%+ better performance with the GTX 970 (Shadow of Mordor, at the relatively low resolution of 2560x1440). Why would you want this card, exactly?

      Even if you only take the video card's own price into account, 300/250 still equals only 20% more. The price point should have been under $200 to be in a different range altogether for it to be somewhat interesting.

      • by Anonymous Coward

        Yeah, for $50 more, which is usually much less than 5% of your computer's total price (I include peripherals like monitors when I evaluate the effectiveness of an upgrade), you are getting 25%+ better performance with the GTX 970 (Shadow of Mordor, at the relatively low resolution of 2560x1440). Why would you want this card, exactly?

        Even if you only take the video card's own price into account, 300/250 still equals only 20% more. The price point should have been under $200 to be in a different range altogether for it to be somewhat interesting. Oh well, maybe someday AMD will bring something interesting again :)

        On what planet is 2560x1440 considered 'low resolution'?

        • by Anonymous Coward

          > On what planet is 2560x1440 considered 'low resolution'?

          Planet Fourkay?

      • by Anonymous Coward

        Plus AMD still can't write drivers for shit.

    • On Amazon, I see GTX 970s for $290 - http://amzn.to/1PPGikI [amzn.to] - that's a full $50 - $60 more than the 380X, a 20% premium or so, for about 10 - 15% more performance. This card is 25 - 30% faster than a 4GB GTX 960, for about a 10% premium. So I'm not sure how fine a line it is, but it's definitely an incremental step up from a 960 for a good chunk less than a 970, with the caveat that this assumes AMD's AIB partners actually hit the $229 MSRP AMD is claiming.
    • And the GTX 960 has been available since the beginning of the year... too little, too late...
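For a rough sense of the price/performance numbers being tossed around in this thread, here is a sketch. All figures are the commenters' estimates, not benchmarks: a ~$229 380X MSRP, a ~$290 GTX 970, a 960 assumed about 10% cheaper than the 380X, the 380X ~1.28x a 960, and the 970 ~1.125x a 380X.

```python
# Perf-per-dollar from the thread's rough numbers (all hypothetical).
cards = {
    "GTX 960 4GB": (210, 1.00),  # price USD, relative performance
    "R9 380X":     (229, 1.28),
    "GTX 970":     (290, 1.44),  # 1.28 * ~1.125
}
for name, (price_usd, rel_perf) in cards.items():
    value = rel_perf / price_usd * 100  # perf units per $100 spent
    print(f"{name}: {value:.2f} perf per $100")
```

Under these assumptions the 380X comes out slightly ahead on value, which is roughly the argument the Amazon comment above is making.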
  • The cheapest GTX 970 was $285 on Newegg (and out of stock). Call that $35 more, about 15%, and it averages a good bit more FPS; on some games, as much as 50% more than this card's numbers.

    • https://slickdeals.net/f/8262137-zotac-geforce-gtx-970-4gb-256-bit-video-card-250-free-shipping?src=SiteSearch

      That's actually $249.99 for a GTX 970, including an AAA game. Free shipping.
    • When they say 190W "board power", I'm thinking: holy cow, that's about $40 a month in electricity (here in SoCal, in the high tier).

      But that's assuming 190W draw, 24/7... So how much power do video cards really use? Assuming in typical use; mostly normal apps, some gaming, a lot of screen asleep time... does anyone have an idea?
      • Re: (Score:2, Interesting)

        by Anonymous Coward

        But that's assuming 190W draw, 24/7... So how much power do video cards really use? Assuming in typical use; mostly normal apps, some gaming, a lot of screen asleep time... does anyone have an idea?

        Typical idle draw is around 20W. Much like a CPU, the GPU underclocks and undervolts when idle so as to reduce heat and power requirements.
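A back-of-the-envelope cost sketch for those two figures. The ~$0.29/kWh high-tier SoCal rate is an assumption, as are the 24/7 worst case and the constant-idle case:

```python
# Monthly electricity cost for a GPU at a constant power draw.
# Rate is a hypothetical high-tier SoCal figure (~$0.29/kWh).
RATE_USD_PER_KWH = 0.29

def monthly_cost_usd(watts, hours_per_day=24.0, days=30):
    kwh = watts * hours_per_day * days / 1000.0
    return kwh * RATE_USD_PER_KWH

print(f"190 W, 24/7: ${monthly_cost_usd(190):.2f}/month")  # worst case
print(f" 20 W idle:  ${monthly_cost_usd(20):.2f}/month")   # typical idle
```

So the ~$40/month figure only holds if the card runs flat out around the clock; at the ~20W idle draw mentioned above it is closer to $4.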

      • How often do you game? Not much else pushes a modern card.

        • How often do you game? Not much else pushes a modern card.

          Someone is running a bitcoin miner (or altcoin) screen saver. Someone who doesn't pay their own electricity bill. :-)

          • by jandrese ( 485 )
            Someone who is mining bitcoins on a GPU is either losing money or stealing electricity off of someone. Bitcoin is way past the point where it's economically feasible to mine them on a GPU.
            • Someone who is mining bitcoins on a GPU is either losing money or stealing electricity off of someone. Bitcoin is way past the point where it's economically feasible to mine them on a GPU.

              As I said, it's not people who pay their own electricity bill. And it's not just Bitcoin and the other SHA-256 altcoins; it's also all the scrypt-based altcoins. It's an ASIC world now, and only fairly recent ASICs at that.

            • Someone who is mining bitcoins on a GPU is either losing money or stealing electricity off of someone. Bitcoin is way past the point where it's economically feasible to mine them on a GPU.

              Unless the person is cold and would have been running a space heater otherwise. One might think of a GPU as a subsidized space heater. :-)

  • by jbssm ( 961115 ) on Thursday November 19, 2015 @10:02AM (#50962157)

    The performance per energy consumption still lags greatly behind NVIDIA offerings.

    Besides that, there is CUDA. Yes, I know it's a closed standard, but there is a reason most GPU computing libraries, especially in deep learning, prefer CUDA: it's just easier to get more performance out of it with less hassle.

    If you just want to play games and electricity costs are not a concern to you (so, most teenagers, I suppose), Radeon is OK; but if you are not in that category, I find it hard not to choose a GeForce.

    • by Anonymous Coward

      Heterogeneous-compute Interface for Portability (HIP) – CUDA Compilation For AMD GPUs

      some time.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      I'm always surprised to see this mentality in tech-oriented groups. AMD's contributions to GPU tech are huge and very community-friendly. They use and produce open source software, and actively support Khronos Group efforts. Their tech is always non-proprietary and works across even non-AMD devices. For development of any kind, the debugging information provided by AMD gear is just plain more useful. As for CUDA - it is almost directly inferior to OpenCL. CUDA's prevalence is largely due to NVIDIA's attempts to jam it down every available throat.

      • As for CUDA - it is almost directly inferior to OpenCL. CUDA's prevalence is largely due to NVIDIA's attempts to jam it down every available throat.

        Not even close. CUDA came out well before OpenCL (CUDA in June 2007, OpenCL 1.0 in August 2009), and has remained ahead in features, tools, and stability ever since (yes, I have used both). I would really like for AMD + OpenCL to be better than NVIDIA + CUDA, but I've been wishing for that for the last 6 years and it has yet to happen.

  • by Galaga88 ( 148206 ) on Thursday November 19, 2015 @10:10AM (#50962219)

    It's been a long time (relatively speaking) since I've played the graphics card game. I remember that AMD's cards were technically solid, but often plagued with driver issues. Even now I'm reading about performance issues with Fallout 4 (which is probably Bethesda's fault because it's an unpatched Bethesda game.)

    Has the situation improved? Am I holding onto old biases?

    (Alas, for the heady days of my Voodoo2.)

    • by netsavior ( 627338 ) on Thursday November 19, 2015 @10:23AM (#50962345)
      AMD has better hardware; NVIDIA actually writes good drivers. Look at Fallout 4 and the minimum system requirements...

      Minimum card to play Fallout 4: Radeon HD 7870 or GeForce GTX 550 Ti.
      When you compare the two cards, the gap is insane; the difference the driver makes is not to be taken lightly.
      Radeon HD 7870 vs GeForce GTX 550 Ti:
      2,560 GFLOPS vs 691.2 GFLOPS
      23,592 vs 9,923 3DMark Vantage score
      80 GTexel/s vs 28.8 GTexel/s

      Basically, at this point the general advice is: if you want to play games, buy NVIDIA; if you want to mine cryptocurrency, buy Radeon.
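To quantify the gap that comparison describes, here are the ratios between the two "minimum spec" cards, using only the figures quoted in the comment above (not fresh benchmarks):

```python
# Ratios between the two cards listed as equivalent minimum specs.
# All numbers come from the comment above.
specs = {
    "GFLOPS": (2560.0, 691.2),
    "3DMark Vantage": (23592.0, 9923.0),
    "GTexel/s": (80.0, 28.8),
}
for name, (hd7870, gtx550ti) in specs.items():
    ratio = hd7870 / gtx550ti
    print(f"{name}: Radeon HD 7870 is {ratio:.1f}x the GTX 550 Ti")
```

A 2.4x to 3.7x raw-spec gap between two cards listed as the *same* minimum requirement is the commenter's point about how much the driver matters.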
      • Re: (Score:2, Interesting)

        by Anonymous Coward

        You're missing the entire point here.

        nVidia intentionally segments THEIR customers into consumer and professional grade, by INTENTIONALLY LIMITING the compute functionality of consumer-grade products.

        ATI (or AMD, if you want to maintain the fiction) segments them in a similar fashion as well; however, they do not limit compute functionality on consumer-grade cards.

        In either case you're missing out on ECC memory, so I guess it's down to how important it is that your results are as accurate as possible. So,

      • I find AMD has much, much better image quality, though. Even with my crappy color vision I can see it. OTOH, their drivers have been notoriously unstable since the 9800 line. I miss my die-shrunk 9800... Time passed it by when it couldn't run Street Fighter IV...
      • This gets modded +5 Informative despite lacking crucial information. Fallout 4 is one game, and it's an NVIDIA GameWorks game. Read about GameWorks if you want to hear about shady practices in tech.
    • Fallout 4 runs terribly on AMD cards right now (although a recent update bumps performance quite a bit), but it doesn't do that much better on NVIDIA hardware either. The biggest culprit seems to be the god rays that are part of GameWorks (a proprietary NVIDIA set of effects that developers can use in their games), where the visual difference between the Ultra and Low settings is practically non-existent to most people, but the performance penalty (even on NVIDIA cards) is huge.

      The new Star Wa [guru3d.com]
      • by kage.j ( 721084 )

          I have a Radeon 280X, which I guess is pretty good (though lower than Fallout 4's recommended system requirements); the game automatically set everything to Ultra and it runs perfectly fine for me.

        Just my personal experience.

      • I've got an AMD HD 7700, below Fallout 4's minimum specs. It runs the game just fine on medium settings.

    • I have a Sapphire Radeon HD 7970, and Fallout 4 runs smoothly even on Ultra. AMD's drivers are not nearly as bad as they used to be; sometimes I think people just repeat what they hear, or otherwise don't look at things objectively to see how much has improved.
    • by Ramze ( 640788 )

      I used to love AMD (even used to own shares in the company at one time)... and their graphics cards always had better specs for the price, but... no. Their drivers are crap.

      More importantly, AMD and NVIDIA typically don't make their own graphics cards -- they just sell the chips and give a reference spec to others. Then they release regular driver updates against the reference spec, but caution that your card manufacturer may have better drivers and/or may not meet the spec, so the generic drivers may not work right. Most card

    • I have been playing Fallout 4 since release. I have an AMD card and have had few problems. I was a bit worried, as my particular card model (7850) was technically below the "minimum" specs, which call for at least a 7870. That doesn't make much sense, as the 7850 would beat the pants off the NVIDIA card they listed, and several others above it.

      When I first tried to play the game, I had to swallow hard, as it initially refused to load, and dumped me a me

  • by Jodka ( 520060 ) on Thursday November 19, 2015 @10:14AM (#50962265)

    A good deal except for that AMD's Linux drivers are pretty bad. Link [pcworld.com].

  • by Anonymous Coward

    1- Buy 100 of these and set up a sweet Bitcoin mining rig.
    2- Mine billions of dollars worth of bitcoins.
    3- Use some of the money to design and Kickstart an ASIC Beowulf cluster of Bitcoin miners that does terahashes per second.
    4- Take orders for the ASICs, then use the money to build them and mine bitcoins with them before shipping them really, really late.
    5- Ship the ASICs to the customers after the difficulty rating is significantly above the point where any profit could be made off of them.
    6- ???
    7- Profit!

  • Is it good enough for the Oculus Rift? That's all I care about now, as I look to replace my 5 year old laptop with something Rift capable when it comes out next year.

  • If a video card needs a fan it's wasting too much power. This high end gaming stuff is idiotic.
    • by jandrese ( 485 )
      I can run Minesweeper just fine on my 8 year old Intel graphics card. Buying a video card for games is stupid!
