AMD Launches Radeon R9 380X, Fastest GPU Under $250 (hothardware.com)
MojoKid writes: Although AMD's mid-range GPU line-up has been relatively strong for a while now, the company is launching the new Radeon R9 380X today with the goal of taking down competing graphics cards like NVIDIA's popular GeForce GTX 960. The Radeon R9 380X has a fully functional AMD Tonga GPU with all 32 compute units / 2048 shader processors enabled. AMD's reference specifications call for a 970MHz+ engine clock with 4GB of 1425MHz GDDR5 memory (5.7 Gbps effective). Typical board power is 190W, and cards require a pair of supplemental 6-pin power feeds. The vast majority of the Radeon R9 380X cards that hit the market, however, will likely be custom models that are factory overclocked and look nothing like AMD's reference design. The Radeon R9 380X, or more specifically the factory-overclocked Sapphire Nitro R9 380X tested, performed significantly better than AMD's Radeon R9 285 or NVIDIA's GeForce GTX 960 across the board. The 380X, however, could not catch more powerful and more expensive cards like the GeForce GTX 970. Regardless, the Radeon R9 380X is easily the fastest graphics card under $250 on the market right now.
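Those reference numbers are easy to sanity-check. Here is a minimal Python sketch; the 256-bit bus width is an assumption (Tonga's published width), since the summary doesn't state it:

    # Back-of-the-envelope math for the R9 380X reference specs above.
    shader_processors = 2048
    engine_clock_ghz = 0.970      # 970MHz reference engine clock
    memory_clock_mhz = 1425       # GDDR5 base clock
    gddr5_multiplier = 4          # GDDR5 transfers 4x per base clock
    bus_width_bits = 256          # assumed Tonga bus width (not in the summary)

    # Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per shader per clock.
    gflops = shader_processors * 2 * engine_clock_ghz
    print(f"Peak FP32: {gflops:.0f} GFLOPS")                     # ~3973

    effective_gbps = memory_clock_mhz * gddr5_multiplier / 1000  # 5.7 Gbps, as quoted
    bandwidth_gbs = effective_gbps * bus_width_bits / 8
    print(f"Memory bandwidth: {bandwidth_gbs:.1f} GB/s")         # ~182.4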
Talk about drawing a fine line... (Score:2, Insightful)
They're really drawing a fine line on this one.
Seeing as how the GTX 970 has broken under the $300 mark in the last few weeks, they're not doing much to sell me on the R9 380X.
Re: (Score:2)
Yeah, for $50 more, which is usually much less than 5% of your computer's price (I include peripherals like monitors when I evaluate the effectiveness of an upgrade), you are getting 25%+ better performance with the GTX 970 (Shadow of Mordor, at the low resolution of 2560x1440). Why would you want this card exactly?
Even if you only took the video card's value into account, 300/250 still equals only 20% more. The price point should have been less than $200, in a different range altogether, for it to be somewhat interesting. Oh well, maybe someday AMD will bring something interesting again :)
Re: (Score:1)
Yeah, for $50 more, which is usually much less than 5% of your computer's price (I include peripherals like monitors when I evaluate the effectiveness of an upgrade), you are getting 25%+ better performance with the GTX 970 (Shadow of Mordor, at the low resolution of 2560x1440). Why would you want this card exactly?
Even if you only took the video card's value into account, 300/250 still equals only 20% more. The price point should have been less than $200, in a different range altogether, for it to be somewhat interesting. Oh well, maybe someday AMD will bring something interesting again :)
On what planet is 2560x1440 considered 'low resolution'?
Re: (Score:1)
> On what planet is 2560x1440 considered 'low resolution'?
Planet Fourkay?
Re:Talk about drawing a fine line... (Score:5, Insightful)
2560 x 1440 is not "low resolution" regardless of how many thousands of dollars you spent 8 years ago on the tech. Age is irrelevant. Third-generation 13" Retina MacBook Pro displays are 2560 x 1600. Those are being sold THIS YEAR as high-end displays.
Blu-ray is 1920 x 1080.
4K is 3840 x 2160, but 4K has not made it out of the showroom yet for TVs or most monitors.
IBM came out with some spiffy T220/T221 LCD monitors that you could buy way back in 2003 for about $8,500 each that had 3840 x 2400, but that doesn't mean that 3840 x 2400 is "outdated low resolution" simply because one could buy it 12 years ago.
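For perspective, the raw pixel counts behind the resolutions being argued about take only a few lines to compare; a quick sketch:

    # Pixel counts relative to 1080p for the resolutions in this thread.
    resolutions = {
        "1080p (Blu-ray)": (1920, 1080),
        "1440p":           (2560, 1440),
        "4K UHD":          (3840, 2160),
        "IBM T220/T221":   (3840, 2400),
    }
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name:16} {pixels / 1e6:5.2f} MP ({pixels / base:.2f}x 1080p)")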
Re: (Score:1)
Well, looking at the ads I've received for this incoming Black Friday, I could only spot a few 1080p TVs among the tons of low-priced 4K TVs. Also, the 2560x1440 monitor you are referring to is a 13-inch display... who wants to play on such a small screen? Most of my friends have left their 24" 1080p monitors and gone the 40"+ TV route for their new monitors. Here's a list of great monitors if you don't want a TV or want something smaller:
http://4k.com/monitor/ [4k.com]
And they are all priced lower than a 19"
Re: (Score:3)
All I want is a display with a 16:10 ratio that doesn't cost triple what 16:9 displays do.
Re: (Score:2)
Re: (Score:2)
I just remember how, 15 years ago, something like a high-end 21" Trinitron cost over $1,000. Monitors (and TVs) nowadays are incredibly cheap, even without factoring in inflation.
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
>Yes, but you said they stopped making them. Clearly not the case
I said they pretty much stopped making them, not that they all stopped making them. The searches I did with your examples support this. 16:10 monitor choices are slim unless you want to pay out the ass or stick with a 24in monitor.
>I'm gonna call bullshit on that one, unless you can point to an example?
Maybe I was a bit overzealous with sub-$200. How about sub-$300... I had in mind a Hanns-G monitor a friend of mine had purchased. This i
Re: (Score:1)
Plus AMD still can't write drivers for shit.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Same power use but not as fast as a 970 (Score:2)
The cheapest 970 was $285 on Newegg (and out of stock). Call that $35 more, about 15%, and the 970 averages a good bit more FPS, on some games as much as 50% more than this card's numbers.
So much for your headline (Score:2)
That's actually $249.99 for a GTX 970, including an AAA game. Free shipping.
OT: power use question (Score:2)
But that's assuming a 190W draw, 24/7... So how much power do video cards really use in typical use (mostly normal apps, some gaming, a lot of screen-asleep time)? Does anyone have an idea?
Re: (Score:2, Interesting)
Typical idle draw is around 20W. Much like a CPU, the GPU underclocks and undervolts when idle so as to reduce heat and power requirements.
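To put a dollar figure on that, a rough sketch; the usage mix, the 5W sleep draw, and the $0.12/kWh rate are all assumptions to swap for your own numbers:

    # Rough monthly electricity cost for a graphics card under an assumed usage mix.
    draw_watts = {"gaming": 190, "desktop_idle": 20, "display_sleep": 5}   # assumptions
    hours_per_day = {"gaming": 2, "desktop_idle": 6, "display_sleep": 16}  # assumptions
    rate_per_kwh = 0.12  # dollars; varies a lot by region

    kwh_per_month = sum(
        draw_watts[s] * hours_per_day[s] for s in draw_watts
    ) / 1000 * 30
    print(f"{kwh_per_month:.1f} kWh/month, about ${kwh_per_month * rate_per_kwh:.2f}/month")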
Re: (Score:2)
How often do you game? Not much else pushes a modern card.
Re: (Score:2)
How often do you game? Not much else pushes a modern card.
Someone is running a bitcoin (or altcoin) miner screen saver. Someone who doesn't pay their own electricity bill. :-)
Re: (Score:2)
Re: (Score:2)
Someone who is mining bitcoins on a GPU is either losing money or stealing electricity off of someone. Bitcoin is way past the point where it's economically feasible to mine them on a GPU.
As I said, it's not people who pay their own electricity bill. And it's not just bitcoin and the other SHA-256 altcoins; it's also all the scrypt-based altcoins. It's an ASIC world now, and only fairly recent ASICs at that.
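The arithmetic behind that is simple enough to sketch; every figure below is a placeholder, since network hashrate, block reward, and price move constantly:

    # Why GPU bitcoin mining loses money: expected revenue vs. electricity cost.
    gpu_hashrate_mhs = 700        # rough SHA-256 rate for a strong GPU (placeholder)
    gpu_watts = 190
    network_hashrate_mhs = 5e11   # ~500 PH/s network rate (placeholder)
    block_reward_btc = 25
    blocks_per_day = 144
    btc_price_usd = 350           # placeholder
    rate_per_kwh = 0.12           # placeholder

    share = gpu_hashrate_mhs / network_hashrate_mhs
    revenue_per_day = share * block_reward_btc * blocks_per_day * btc_price_usd
    cost_per_day = gpu_watts / 1000 * 24 * rate_per_kwh
    print(f"Revenue ${revenue_per_day:.4f}/day vs electricity ${cost_per_day:.2f}/day")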
Re: (Score:2)
Someone who is mining bitcoins on a GPU is either losing money or stealing electricity off of someone. Bitcoin is way past the point where it's economically feasible to mine them on a GPU.
Unless the person is cold and would have been running a space heater otherwise. One might think of a GPU as a subsidized space heater. :-)
Re: (Score:2)
Re: (Score:2)
FPS per watt (Score:3)
The performance per unit of energy consumed still lags greatly behind NVIDIA's offerings.
Besides that, there is CUDA. Yes, I know it's a closed standard, but there is a reason most GPU computing libraries, especially in the Deep Learning field, preferentially use CUDA: it's just easier to get more performance out of it with less hassle.
If you just want to play games and electricity costs are not a concern to you (so, most teenagers I suppose), Radeon is OK, but if you are not in that category, I find it hard not to choose a GeForce.
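The performance-per-watt point is easy to illustrate; the FPS numbers below are made-up placeholders, while the board-power figures are the vendors' typical ratings:

    # Performance per watt; fps values are placeholders, not benchmark results.
    cards = {
        "R9 380X": {"fps": 60, "watts": 190},
        "GTX 960": {"fps": 52, "watts": 120},
    }
    for name, c in cards.items():
        print(f"{name}: {c['fps'] / c['watts']:.2f} FPS per watt")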
Re: (Score:1)
Heterogeneous-compute Interface for Portability (HIP) – CUDA Compilation For AMD GPUs
some time.
Re: (Score:2, Insightful)
I'm always surprised to see this mentality in tech-oriented groups. AMD's contributions to GPU tech are huge, and very community friendly. They use and produce open source software, and actively support Khronos Group efforts. Their tech is always non-proprietary, and works across even non-AMD devices. For development of any kind, the debugging information provided by AMD gear is just plain more useful. As for CUDA - it is almost directly inferior to OpenCL. CUDA's prevalence is largely due to NVIDIA's attempts to jam it down every available throat.
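The cross-vendor claim is easy to demonstrate. A minimal PyOpenCL vector add, assuming the pyopencl package and any working OpenCL runtime (AMD, NVIDIA, or even a CPU implementation):

    # Minimal OpenCL vector add; the same code runs on any vendor's runtime.
    import numpy as np
    import pyopencl as cl

    ctx = cl.create_some_context()   # picks whatever OpenCL device is available
    queue = cl.CommandQueue(ctx)

    a = np.random.rand(1 << 20).astype(np.float32)
    b = np.random.rand(1 << 20).astype(np.float32)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)
    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)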
Re: (Score:2)
As for CUDA - it is almost directly inferior to OpenCL. CUDA's prevalence is largely due to NVIDIA's attempts to jam it down every available throat.
Not even close. CUDA came out well before OpenCL (CUDA in June 2007, OpenCL 1.0 in August 2009), and it has remained ahead in features, tools, and stability ever since (yes, I have used both). I would really like AMD + OpenCL to be better than NVIDIA + CUDA, but I've been wishing for that for the last 6 years and it has yet to happen.
Is AMD Better Now? (Score:3)
It's been a long time (relatively speaking) since I've played the graphics card game. I remember that AMD's cards were technically solid, but often plagued with driver issues. Even now I'm reading about performance issues with Fallout 4 (which is probably Bethesda's fault because it's an unpatched Bethesda game.)
Has the situation improved? Am I holding onto old biases?
(Alas, for the heady days of my Voodoo2.)
Re:Is AMD Better Now? (Score:5, Informative)
The minimum card to play Fallout 4 is a Radeon HD 7870 or a GeForce GTX 550 Ti.
When you compare the two cards, the gap is insane; the difference the drivers make is not to be taken lightly.
Radeon HD 7870 vs GeForce GTX 550 Ti
2,560 GFLOPS vs 691.2 GFLOPS
23,592 vs 9,923 3dMark Vantage score
80 GTexel/s vs 28.8 GTexel/s
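Those GFLOPS figures fall straight out of shader count x clock x 2 FLOPs per cycle (the GTX 550 Ti's shaders ran at a separate 1800MHz hot clock); a quick check:

    # Reproducing the quoted GFLOPS: shaders x clock (MHz) x 2 FLOPs / 1000.
    def gflops(shaders, clock_mhz):
        return shaders * clock_mhz * 2 / 1000

    print(f"HD 7870:    {gflops(1280, 1000):.1f} GFLOPS")  # 2560.0
    print(f"GTX 550 Ti: {gflops(192, 1800):.1f} GFLOPS")   # 691.2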
Basically, at this point the general advice is: if you want to play games, buy NVIDIA... if you want to mine cryptocurrency blockchains, buy Radeon.
Re: (Score:2, Interesting)
You're missing the entire point here.
nVidia is intentionally segmenting THEIR customers into consumer and professional grade, by INTENTIONALLY LIMITING the compute functionality of consumer-grade products.
ATI (or AMD, if you want to maintain the fiction) segments them in a similar fashion as well; however, they do not limit compute functionality on consumer-grade cards.
In either case you're missing out on ECC memory, so I guess it's down to how important it is that your results are as accurate as possible. So,
Re: Is AMD Better Now? (Score:2)
Re: (Score:1)
Re: (Score:3)
The new Star Wa [guru3d.com]
Re: (Score:1)
I have a Radeon 280X, and I guess that's pretty good (though lower than Fallout 4's recommended system requirements); the game automatically set everything to Ultra and it runs perfectly fine for me.
Just my personal experience.
Re: (Score:2)
I've got an AMD HD 7700, below Fallout 4's minimum specs. It runs the game just fine on medium settings.
Re: (Score:2)
Re: (Score:2)
I used to love AMD (even used to own shares in the company at one time)... and their graphics cards always had better specs for the price, but... no. Their drivers are crap.
More importantly, AMD and nVidia typically don't make their own graphics cards -- they just sell the chips and give a reference spec to others. Then they release regular driver updates to the spec, but caution that your card manufacturer may have better drivers and/or may not meet the spec, so the drivers may not work right. Most card
Re: (Score:2)
AMD works fine for me... (Score:2)
I have been playing Fallout 4 since release. I have an AMD card, and have had few problems. I was a bit worried, as my particular card model (7850) was technically below the "minimum" specs, which call for at least a 7870. That doesn't make a lot of sense, as the 7850 would beat the pants off the nVidia card they listed, and several others above that.
When I first tried to play the game, I had to swallow hard, as it initially refused to load, and dumped me a me
The power Intel would have as sole x86 vendor (Score:5, Insightful)
Why doesn't AMD just shut down and leave all the work to Intel?
Because AMD is the only way we have to keep Intel from going full monopolist. The qualitative effect on the market from the difference between one and two makers of a particular product, such as CPUs that run a particular instruction set, is far greater than that between two and three.
Re: (Score:2)
But is it AMD's to license? (Score:2)
They should take the ARM model and simply license their AMD64 core (i.e. instruction set and everything) to top of the line fabs aside from Intel
AMD has a cross-license with Intel to cover patented parts of x86 and x86-64 instruction sets. I'm not entirely certain to what extent this license extends to licensing AMD64 cores to SoC makers beyond what AMD is already doing with the APUs in Xbox One and PlayStation 4.
Re: (Score:2)
Re: (Score:2)
On Windows but not on Linux (Score:5, Informative)
A good deal, except that AMD's Linux drivers are pretty bad. Link [pcworld.com].
Here are the steps (Score:1)
1- Buy 100 of these and set up a sweet bitcoin miner rig
2- Mine billions of dollars worth of bitcoins
3- Use some of the money to design and kickstart an ASIC beowulf cluster of bitcoin miners that do TERAhashes per second
4- Take orders for the ASICs, then use the money to build the ASICs and use them to mine bitcoins before shipping them really, really late
5- Ship the ASICs to the customers after the difficulty rating is significantly above the point where any profit could be made off of them.
6- ???
7- Profit!
Good enough for Rift? (Score:2)
Is it good enough for the Oculus Rift? That's all I care about now, as I look to replace my 5-year-old laptop with something Rift-capable when it comes out next year.
Re: (Score:3)
No. The minimum requirement for the Rift is a GTX 970. This card is slower than that.
Waste (Score:1)
Re: (Score:2)
Some "old" Linux boxes don't need graphics ... (Score:2)
They pulled drivers for "obsolete" GPUs from the Linux kernel, making all of those cards broken, mine included. Nvidia drivers may be closed, but at least they support pretty much everything they ever made.
Not necessarily. I have old AMD cards in Linux boxes that are now headless servers in the closet. At most, a KVM switch will access them in text mode; their GUI days are over.
Re: (Score:2)
Use the open source drivers; they work well on older cards.
If you are talking about really old cards, like the ATI Rage and similar... that was not AMD, it was the Xorg developers, as the drivers were broken by X server changes and no one fixed them (because no one cared anymore). If you need those drivers, fix them or pay someone to fix them... or use older distros... because if the card is 15 years old, the rest of the hardware should also not be that recent... or get a better (AGP/PCI) card, like an ATI
Re: (Score:2)
They pulled drivers for "obsolete" GPUs from the Linux kernel, making all of those cards broken, mine included.
That's not true: older AMD cards are supported by the open source driver, and for these older, pre-OpenGL 4.0 cards the mesa implementation is actually quite good. Since it is open source, it will probably be maintained for a very long time.
It is true, however, that they pulled support for the Radeon HD 4xxx series from their Catalyst driver too soon, before the mesa implementation was in good shape.