AMD's New Radeon HD 7950 Tested
MojoKid writes "When AMD announced the high-end Radeon HD 7970, a lower-cost Radeon HD 7950 based on the same GPU was planned to arrive a few weeks later. The GPU, which is based on AMD's new architecture, dubbed Graphics Core Next, is manufactured using TSMC's 28nm process and features a whopping 4.31 billion transistors. In its full configuration, found on the Radeon HD 7970, the Tahiti GPU sports 2,048 stream processors with 128 texture units and 32 ROPs. On the Radeon HD 7950, however, a few segments of the GPU have been disabled, resulting in a total of 1,792 active stream processors, with 112 texture units and 32 ROPs. The Radeon HD 7950 is also clocked somewhat lower, at 800MHz, although AMD has claimed the cards are highly overclockable. Performance-wise, though the card isn't AMD's fastest, pricing is more palatable, and the new card actually beats NVIDIA's high-end GeForce GTX 580 by just a hair."
How is it at mining BitCoins? (Score:5, Funny)
What are the calculations per watt? Will I be able to put them in a CrossFire frankenbox to make my fortune?
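For the literal-minded, here's a back-of-the-envelope way to frame that question. The sketch below (Python) uses invented hash rates and board powers; none of these figures are measurements, just placeholders to show the arithmetic.

```python
# Back-of-the-envelope mining-efficiency comparison.
# Hash rates and board powers below are assumptions, not measurements.

def mh_per_watt(hashrate_mhs: float, board_power_w: float) -> float:
    """Megahashes per second delivered per watt of board power."""
    return hashrate_mhs / board_power_w

# Hypothetical cards: (hash rate in MH/s, board power in W)
cards = {
    "HD 7950 (guessed)": (450.0, 200.0),
    "HD 5870 (ballpark)": (400.0, 188.0),
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mh_per_watt(mhs, watts):.2f} MH/s per watt")
```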
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
The 6000 series was mostly the 5000 series. The high end may be a bit different, but the upper-midrange (6770, 6870 stuff) was literally the same chips with some minor stuff tacked on. Mainly 3D and some more advanced video support, IIRC.
Re: (Score:1)
More RAM, too. There were not many 2GB models in the 5000 series, but there are a bunch of 2GB or 3GB models in the 6000 series. Nvidia did the same thing with the 8000 and 9000 series: the 9000 series was the 8000 series with a few more features. I have an 8800GT 512MB card and a 9800GT 1GB card. The 9800GT doesn't take any extra power, while the 8800GT needed a 6-pin power plug to function correctly. The difference? Nothing that I can see. The 8800GT is actually 'faster' according to tests.
Re:How is it at mining BitCoins? (Score:4, Insightful)
So when will there be affordable cards (Score:3)
Re:So when will there be affordable cards (Score:5, Informative)
When Kepler comes out, expect all these cards to drop significantly in price.
GCN was a huge cost on AMD's part, and Kepler will be a refinement of Fermi, so Nvidia will price the 600 series aggressively (especially since they won't launch for another two months) and still make a profit on them. And expect AMD to take a loss on the investment, but not on the returns from fabrication of the 7900 series (assuming they fab the 7800 and lower cards on their old VLIW architecture, like the roadmap from last year said they would).
So when Kepler comes out, it will probably be aggressively priced, and AMD will drop prices to match. For now they are the only maker of "next gen" GPUs after 2010's 500 and 6000 series, and Kepler is two months away, so AMD is milking it.
Re: (Score:3)
According to MSI via Fudzilla, the 77xx series will launch in two weeks at $139/$149 and the 78xx series in March at $249/$299. After that the ball is in nVidia's court, but the current guesses are they're not ready until April, sometime around Intel's Ivy Bridge processors. I think it's working; I've looked at the 7950s and am tempted, but will probably wait until then to see if they bring better performance or lower prices, if nothing else to get a better price from AMD. Currently the 7950 costs about dou
Re: (Score:2)
That aside, I'll be looking for benchmarks, since it might have a bit more DX11 oomph in the same ~85W max TDP envelope.
Re: (Score:2)
I thought they added 100 to the version numbers for the 6000 series.
You should compare to the HD5600s.
Re: (Score:1)
The 7700 series will definitely be interesting if you want to build a quiet computer that can still handle most games (albeit not at the highest graphics settings).
My latest PC upgrade a few months ago used a 6770, and so far it has handled everything I've thrown at it.
Re: (Score:1)
Quiet computer? Well, I bought a 4870, which just burned up, thank god. I got tired of leaving my computer on all night because I couldn't get through boot-up without it sounding like a jet plane taking off and waking the whole house. At least I could control the fan in Windows for gaming, but in Linux it just sat there at 5k RPM. Well, in my old age I went back to school and gave up gaming; no time or money for a video card that costs as much as a console. BTW, playing even WoW, or Rift at ultimate cau
Re: (Score:2)
So when will there be cards affordable by normal people?
Well, the sweet spot is usually about 8-9 months after the release of a new card. That gives time to get the major manufacturing bugs out and the driver issues hammered out. And by then the prices have pretty much bottomed out, too.
Re: (Score:2)
Re: (Score:2)
Wait. New hardware is old and busted? Okay. I mean, it's not like the new stuff, based on the old stuff, doesn't support current-generation tech or anything. Like it did last time.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Game what at 1080p? Four-year-old games? 1080p is a resolution (1920x1080) and a refresh rate; it says nothing about the quality of the image being drawn, it just defines the size and refresh of the image. You could run the original X-COM at 1080p on a $25 air-cooled card, but that's not what you mean, is it?
If you want to play Arkham City with something close to max settings at 1080p, you need a better card than the 5750. If you're willing to take crappy settings then it *might* be possible for an
Re: (Score:2)
Not very likely to happen. Most modern games put a lot of stress on the GPU, which means you either forgo quality or fps, or you install a proper active cooling solution.
The market for functional "silent" solutions is generally an expensive one, as it either uses expensive fans with high-end bearings and bigger blades (allowing slower rotation speeds for the same airflow), or liquid cooling at the high end. You're not going to enter it with a sub-$150 card with passive cooling - these cards are notorious for b
Re: (Score:2)
Well, the 6870/6850 was pretty much the bang-for-your-buck card in the last gen, with the 6770/6670/6570 being really affordable for most any aspiring gamer - so I'd assume you'll need to wait for a 7870/7770/7670... shouldn't be too long now. I'm waiting for the 7770 (or the 7-series equivalent of the 6770) myself - should be a nice reduction in power consumption and noise, coming from an 8800GT.
Re: (Score:2)
You could have had one for a while now: the Gigabyte HD5770 Silent Cell.
Re: (Score:2)
Re: (Score:2)
I was just nitpicking.
Disabled? (Score:4, Interesting)
As in, can be re-enabled with a custom BIOS or something?
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
As in, can be re-enabled with a custom BIOS or something?
Probably. Though since the cards have a very uniform architecture, with many repeats of the same unit, my guess is that they bin the chips according to the number of stream processors that are defective. This lets them fab nothing but top-end chips and still get good yields of slightly-below-top-end cards.
GPU manufacturers certainly used to disable non-"pro" level features in cheaper cards (which could be re-enabled by various hacks), though the car
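That binning guess can be made concrete with a toy yield model. The sketch below (Python) assumes Tahiti's 2,048 stream processors are grouped into 32 clusters of 64 and invents a per-cluster defect probability; the real layout and defect rates are AMD's secret, so treat every number as a placeholder.

```python
import random

# Toy die-binning model: fab only full Tahiti dies, then sell dies with
# a few defective clusters as the cut-down part instead of scrapping them.
# Cluster layout and defect probability are invented for illustration.
CLUSTERS = 32            # assumed: 2048 SPs in 32 clusters of 64
P_DEFECT = 0.02          # invented per-cluster defect probability

def bin_die() -> str:
    bad = sum(random.random() < P_DEFECT for _ in range(CLUSTERS))
    if bad == 0:
        return "HD 7970 (2048 SPs)"
    if bad <= 4:         # >= 28 good clusters -> 1792 SPs, a 7950
        return "HD 7950 (1792 SPs)"
    return "lower bin / scrap"

random.seed(1)
counts: dict[str, int] = {}
for _ in range(10_000):
    label = bin_die()
    counts[label] = counts.get(label, 0) + 1
print(counts)
```

Under these made-up numbers, most dies bin as full 7970s and nearly all the rest still qualify as 7950s, which is exactly why selling cut-down parts beats scrapping them.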
Re: (Score:1)
Also, mods: WTF? Why is this post marked redundant?
I was curious about that myself.
But... (Score:4, Informative)
But does it run Linux? No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly.
The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia are a bit better, in my experience.
Re: (Score:1)
I've been running a dual-monitor setup since Ubuntu 8.04 with a Radeon 4850 + proprietary drivers without any trouble.
Re: (Score:3)
Re: (Score:3, Informative)
That's why the driver is called 'nvidia', not 'nv'. 'nv' is the incomplete OSS driver; 'nvidia' is the driver supported by nVidia. At its core, it's the same driver as on Windows.
Re: (Score:2)
Re: (Score:2)
Comment removed (Score:4, Interesting)
HDCP and protected path is only one aspect (Score:1)
So far, 3D acceleration is also significantly slower than in the closed-source Catalyst driver. Some of that technology may also be owned by third parties, but it is not as clear-cut as in the case of HDCP.
I suspect AMD's reasons for not releasing that stuff are part legal and part not wanting to give away the latest know-how.
But the latter seems a bit silly, as NVidia drivers already have the better reputation and probably the better code. AMD's advantage seems to be on the hardware side, with their chips cran
Re: (Score:2)
Re: (Score:3)
Re: (Score:2)
The Catalyst drivers just landed on the 27th of January, I think; before that there was a hotfix release for real enthusiasts. Open source support is, as far as I know, still missing, but basic support should not be far away. They've consistently come closer to release date with each release; last time it took 2.5 months, and I expect less this time. If you want it the moment it's released, expect to compile your own kernel/xorg/driver though. Don't expect any miracles from the OSS drivers though, as I understand it
Re: (Score:2)
But does it run Linux? No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly. The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia are a bit better, in my experience.
I have been doing dual monitor with an ATI/AMD X300 (BenQ Joybook 5200G), HD3470 (Toshiba Satellite M300), and HD5650 (Sony Vaio VPCEA36FG). The only time dual monitor failed me was when I was using Ubuntu 8.10. Currently I'm using 10.10, with a Samsung 43" LCD TV as a secondary monitor via HDMI. Mirror and splitscreen works
Re: (Score:2)
Well, I have an older ATI card in a Linux box in the other room; if I load the proprietary drivers, the screen shuts off as soon as X loads. So I agree, ATI + Linux = worthless.
Always has been, probably always will be.
Re: (Score:2)
Is the price really that horrible? (Score:5, Interesting)
When Nvidia puts out a $500 card, it's attractively priced [hothardware.com]. When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?
Re:Is the price really that horrible? (Score:5, Interesting)
Re: (Score:1)
You might not be joking. [phoronix.com]
Re: (Score:3)
People expect AMD to be cheaper, even when they are competitive from a performance standpoint. AMD usually aims more for the mid-range market, so I expect seeing a top-end card from them (at top-end prices) is a little surprising.
Re:Is the price really that horrible? (Score:4, Insightful)
When Nvidia puts out a $500 card, it's attractively priced [hothardware.com]. When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?
To be fair, that review you linked is from November 2010. Perhaps second-hand 580s are better value or something.
Re: (Score:2)
The market is changing, and the reviewer is reflecting that. People don't want to spend 600 dollars on a top-end card, even if five years ago the 'top end' cost 800 dollars (or whatever it was).
The perception is (rightly or wrongly) that all of these things should be getting faster and cheaper at the same time. That's not entirely wrong, but it's not entirely right either. A die shrink should mean a lower cost for the chip itself, depending on yields, but it has nothing to do with any of the other parts on the PC
Re: (Score:2)
Maybe that the first of the 28nm process generation costs about the same as the last of the 40nm process generation, released a year and a half ago? So far the effect on the price/performance ratio has been almost nothing; they've offered higher performance at a higher price. Yes, the 7950 is now beating the GTX 580 in most ways, but it's not exactly a massively better deal. Hopefully nVidia will be a bit more aggressive, but if they're both held back by TSMC's delivery capacity the duel could get a bit lame.
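To put rough numbers on that price/performance point, here's a quick sketch (Python). The prices and the relative-performance figure are approximations in the thread's ballpark, not benchmark data.

```python
# Rough price/performance check; prices and the relative-performance
# figure are approximations for illustration, not benchmark results.
cards = {
    "GTX 580": {"price_usd": 500, "relative_perf": 1.00},
    "HD 7950": {"price_usd": 450, "relative_perf": 1.02},  # "by a hair"
}

for name, c in cards.items():
    perf_per_dollar = c["relative_perf"] / c["price_usd"]
    print(f"{name}: {perf_per_dollar * 1000:.2f} perf per $1000")
```

On these made-up figures the 7950's ratio comes out only ~13% better, which is the complaint: a new process node used to move that ratio a lot more.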
I wasn't an ATI/AMD fan until... (Score:4, Interesting)
...well, let's clear things up: I was always an AMD fan. Their CPUs rocked. I had a seriously great time overclocking my SS7 gear until it boiled.
The graphics cards sucked, though. I'm talking about the old Radeon AGP cards. Put down your paddles, lads: 2006 was the last time I bought an ATI-branded card (an X1800), and IMHO it sucked monkey balls. I couldn't even get it to perform at low resolution in Unreal 2002. That's why I went straight back to the store and swapped it for an NVidia 7600GT. Oh, yeah, life was sweet after that.
A couple of weeks ago I bought a secondhand Sapphire HD3650 with 512MB DDR2. OK, it's a bloody old and very low-spec card by tech standards, but it blows my GF 7600GT right out of the water - even on a slower, single-core 64-bit processor running a 32-bit platform. That made me a fan of ATI/AMD graphics right there. The old machine (Core Duo) with the NVidia is now collecting dust.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Yeah, my 9600GT kicked my 7600GT right square in the nuts; actually, just about any card after the 7600GT would have rocked it. You're comparing a sports car to a Yugo. The 7600GT was the absolute worst waste of money I have ever spent on a video card, as my 6600GT actually performed just as well.
Re: (Score:2)
X950s are overclockable like hell (Score:2)
I'm on a 6950; it is clocked at 810MHz, but it can do 910MHz just by using the ATI Catalyst slider, no fancy stuff. If you go into serious overclocking, you can approach 1000MHz easily if you play with the voltages and such.
Moreover, X950s are generally unlockable. For example, I unlocked the 6950 I'm sitting on, unlocking 24-30 or so shaders, basically making it a 6970. I could also flash a 6970 BIOS and make it a full 6970, but that's totally unnecessary, since I can get more than that by overclocking.
a
Really? (Score:1)
"Beats NVIDIA's high-end GeForce GTX 580 by just a hair."
You don't say. Must not have factored in Nvidia's history of selling and shipping GPUs that were known to be defective and then conspiring with the purchasers to hide this fact from the users until after their warranties ran out.
If they had, this new GPU would outperform Nvidia's by huge leaps and bounds.
6150 Go. Look it up.
um yea that shit better come with an Asian hooker (Score:2, Insightful)
Really, what is the point of this any more? 90+% of your games are optimized for consoles first, giving you at best a GeForce 8800GT; computer monitors are not getting any higher resolution; and they still have not come up with a cooling system that doesn't clog with dust in a month!
Never mind the absolute shit drivers ATI ships.
Re: (Score:2)
I know my basement isn't that clean - I hardly sweep my office room - but clogging with dust in a month? Holy hell.
AMD/ATI still have scheissy Linux support (Score:1)
Re: (Score:1)
My 5770 works fine on my current KDE desktop, with multiple monitors at different resolutions and refresh rates. The GPU acceleration in Blender's new rendering engine is a lot of fun to mess about with.
Re: (Score:2)
Which APU family is this in? (Score:2)
Re: (Score:1)
Re: (Score:1)
Comment removed (Score:5, Interesting)
Re:Faster video card, huh? (Score:4, Interesting)
Re:Faster video card, huh? (Score:4, Informative)
There are console first person shooters, and then there are PC first person shooters.
Try running BF3 on high/ultra at high resolution. My reasonably overclocked GTX 560 Ti can just barely handle high at 1080p; ultra utterly murders it, with clear jerkiness present in many situations. On the other hand, it eats MW3 for breakfast at pretty much any resolution/quality I can throw at it. You don't need to crank out a "20 km horizon" to overload a modern card.
And frankly, if a game makes your card render 20km of ground at a level of detail that actually affects it, of which you will literally see only a few hundred meters, it's doing it wrong. Badly wrong.
Re: (Score:3)
Re: (Score:2)
Again, if your game renders the background the way you suggest it does, it does it TERRIBLY WRONG. I once again present the case study, Battlefield 3. It often renders huge backgrounds without a catastrophic increase in either video memory use or GPU load (i.e. the view from a plane looking over the entire map vs. the view of a foot soldier looking at his spawn).
This is done using various LOD techniques and is called "optimization". Notably, the end result looks worlds better than any of the games you presented as exa
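For anyone wondering what "various LOD techniques" boils down to, here's a minimal distance-based LOD selector (Python sketch). The thresholds and triangle counts are invented for illustration; real engines layer many more tricks (impostors, mipmapping, tessellation falloff) on top of this basic idea.

```python
from dataclasses import dataclass
from typing import Optional

# Minimal distance-based LOD selection: distant objects get progressively
# cheaper meshes, so a huge vista costs little GPU time or memory.
# Distances and triangle counts are invented.

@dataclass
class LodLevel:
    max_distance_m: float
    triangles: int

LODS = [
    LodLevel(50.0, 20_000),    # full-detail mesh up close
    LodLevel(300.0, 2_000),    # simplified mesh at mid range
    LodLevel(3000.0, 150),     # billboard-like proxy in the distance
]

def pick_lod(distance_m: float) -> Optional[LodLevel]:
    for lod in LODS:           # levels are ordered nearest-first
        if distance_m <= lod.max_distance_m:
            return lod
    return None                # past the last threshold: cull entirely

for d in (10, 120, 1500, 9000):
    lod = pick_lod(d)
    print(f"{d:>5} m -> {lod.triangles if lod else 0} triangles")
```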
Re: (Score:2)
lol. The maps in BF3 are *tiny*, just a few km by a few km. The backgrounds are merely animated 'sky domes'.
The equivalent sky dome in a flight simulation is much, much more distant than that. NB: BF3 is a 'game', and doesn't cut the mustard in *simulation* terms (that's OK, it's not trying to be a sim, but let's call it what it is). Even Arma2, which is vastly better than BF3 in terms of simulation of ground combat (which is OK, since Arma2 is a sim and BF3 is merely a game), is weak when it comes to air
Re: (Score:2)
No offence, but the way graphics are handled in most simulators nowadays is an afterthought at best. And it shows. BF3 can produce beautiful scenery for several kilometers, while the utterly ugly (both aesthetically and graphically) graphics in most modern simulators can eat almost as much of both GPU and memory, and the end result will look utterly horrible in comparison to BF3.
Has it ever occurred to you that most of graphics engine design is not about looking as realistic as possible, but
Re: (Score:2)
> No offence, but the way graphics are handled in most simulators nowadays is afterthought at best. And it shows.
No offense taken. You are wrong, however. Graphics are not an afterthought at all in sims (assuming you have actually used anything modern - oops, that's right, your card can't handle them, which was my point). Go and check out the in-cockpit shadows in the A-10C or the Ka-50. BF3's cockpits are lame in comparison (they have nice textures but are essentially static). The soldiers, foliage and
Re: (Score:2)
Re: (Score:2)
These are the circles I move in, here is someone talking about their setup:
http://forums.eagle.ru/showthread.php?p=1390782#post1390782 [eagle.ru]
It is good that lots of people like BF3; it is a good game. However, the original argument was not a lame "Is BF3 the bestest game out there" debate, but an assertion that an old video card is good enough. I said that there are people out there (e.g. myself and the other folks playing in my genre, as with the person linked above) for whom video cards can't ever have enough
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
That's what happens, though. Expect that the Xbox 3 and PS4 will have something on par with a 7000-series Radeon or a 600-series (not yet commercial) Nvidia card, at which point, to keep up with a console, you'll need something new. (I don't have any insider information here, but that would be consistent with the projected timelines and everything that has happened in the past.)
That's how the market has worked for a long time. The consoles come in and converge to performance parity with PCs by being sold at a loss f
Stuff that might NOT run on current consoles (Score:1)
Take any shooter or MMO with really large maps and corresponding memory requirements.
For instance, "All Points Bulletin" comes to mind. After a few minutes, it always brought my PC (AMD dual core, 2GByte RAM, NVidia 8600 GT) to its knees due to requiring 2GByte of memory or more for itself.
The CPU and GPU seemed to have no problem, as the game ran fine until the memory limitation kicked in. So I guess the CPU and GPU in current-gen consoles might be able to handle the load as well. But memory-wise, they would r
Re: (Score:3)
Memory on consoles is a completely different beast than on the PC. On a console you know exactly how quickly you can pull data in from the optical drive, and you have a good idea about the hard drive. On the PC you figure most people have a couple of gigs of RAM, so you may as well use it, and you have no control over what else is using those resources on the system, so you're better off using RAM than relying on disk access. You also have very different memory space requirements with the GPU (you might be mirrorin
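A toy illustration of that difference (Python sketch): on a console the memory budget is one fixed, hand-tuned number known at ship time, while a PC title has to size its pools per machine and leave headroom for whatever else is running. All sizes here are invented.

```python
# Console vs. PC memory budgeting, caricatured. On the console the pool
# is one fixed number known at ship time; on the PC it is computed per
# machine with headroom for the OS and other apps. Sizes are invented.

CONSOLE_POOL_MB = 256            # hand-tuned once, same on every unit

def pc_texture_pool(total_ram_mb: int) -> int:
    headroom = total_ram_mb // 2               # leave half for everyone else
    return min(total_ram_mb - headroom, 2048)  # cap at what we can use

print("console pool:", CONSOLE_POOL_MB, "MB (fixed)")
for ram in (1024, 2048, 4096, 8192):
    print(f"PC with {ram} MB RAM -> {pc_texture_pool(ram)} MB pool")
```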
Re: (Score:2)
Re: (Score:1)
Thanks for the hint, but that was several months ago and I have since upgraded one of my other computers (an old P4).
The upgrade consists of a new AM3 board, a Phenom II 910e quad core, 4GByte of DDR3 ECC RAM, a Radeon HD6670 and a new hard disk. While not the very fastest, this system is easy on the electricity bill (only about 80 watts when doing light duty) and should last me a few more years.
The ex-P4 is now my primary PC, and the dual core I tried APB on has been demoted to secondary.
Re: (Score:2)
Re: (Score:1)
Did you mean the 9100e, as in this link: http://www.starmicroinc.net/product/HD9100OBJ4BGD/-AMD-Phenom-X4-9100e-AM2-18GHZ-4MB-3200MHZ-HD9100OBJ4BGD-CPU-OEM/ [starmicroinc.net]?
I mean this one: http://products.amd.com/pages/DesktopCPUDetail.aspx?id=623 [amd.com]
The 910e is a "Deneb" core. At 2.6 GHz, it is not that bad in performance, and its official TDP is half that of most standard AMD quad cores at the time I bought it (65 W vs. 125 W). You may get lucky with a standard chip that happens to be close to the 910e in power consumption,
Re: (Score:2)
Re: (Score:2)
You should have bought a card from a proper manufacturer. 5970s are still monster cards.
Re: (Score:2)
Proper manufacturer wouldn't even matter.
Unless he's running Linux, for which I will admit ATI/AMD's driver support has been nearly or completely non-existent for quite some time, he's totally full of shit.
I have never gone more than a total of two weeks with a driver-related problem on an ATI card on a Windows-based system for any game. As opposed to NVidia, from whom I haven't purchased a card since the debacle where I couldn't play SW:KotOR for over a month due to their problems.
To be fair, if he's running
Re: (Score:2)
Unless he's running Linux, for which I will admit ATI/AMD's driver support has been nearly or completely non-existent for quite some time, he's totally full of shit.
I am able to easily play games like Mount & Blade: Warband under Wine in Ubuntu.
I have never gone more than a total of two weeks with a driver-related problem on an ATI card on a Windows-based system for any game
Further, I haven't had ANY problems with any games to this date. And I'm no usual gamer; I game at a 3-monitor Eyefinity resolution.
Re: (Score:2)
I meant ever. There was a problem with SW:KotOR 2 crashing like clockwork after an hour of playing that took 13 days to resolve. Other than that, and one 6-hour period where there was a conflict with WoW on my 4890 (which I heard about after it was already fixed), there haven't been any major problems that I know about that could possibly have affected me.