
AMD's New Radeon HD 7950 Tested 120

Posted by Soulskill
from the showing-those-pixels-who's-boss dept.
MojoKid writes "When AMD announced the high-end Radeon HD 7970, a lower cost Radeon HD 7950 based on the same GPU was planned to arrive a few weeks later. The GPU, which is based on AMD's new architecture dubbed Graphics Core Next, is manufactured using TSMC's 28nm process and features a whopping 4.31 billion transistors. In its full configuration, found on the Radeon HD 7970, the Tahiti GPU sports 2,048 stream processors with 128 texture units and 32 ROPs. On the Radeon HD 7950, however, a few segments of the GPU have been disabled, resulting in a total of 1,792 active stream processors, with 112 texture units and 32 ROPs. The Radeon HD 7950 is also clocked somewhat lower at 800MHz, although AMD has claimed the cards are highly overclockable. Performance-wise, though the card isn't AMD's fastest, pricing is more palatable and the new card actually beats NVIDIA's high-end GeForce GTX 580 by just a hair."
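For rough context on the raw-throughput gap between the two configurations in the summary, peak single-precision FLOPS can be estimated as 2 (one fused multiply-add per cycle) × stream processors × clock. A back-of-the-envelope sketch, not a benchmark; the 925MHz stock clock for the full HD 7970 is an assumption not stated in the summary:

```python
# Rough peak single-precision throughput for the two Tahiti
# configurations described above (an FMA counts as 2 FLOPs/cycle).
def peak_gflops(stream_processors: int, clock_mhz: int) -> float:
    return 2 * stream_processors * clock_mhz / 1000.0

hd7970 = peak_gflops(2048, 925)   # full Tahiti (assumed 925MHz stock clock)
hd7950 = peak_gflops(1792, 800)   # cut-down part at 800MHz

print(f"HD 7970: {hd7970:.0f} GFLOPS")   # ~3789 GFLOPS
print(f"HD 7950: {hd7950:.0f} GFLOPS")   # ~2867 GFLOPS
```

By this crude measure the 7950 keeps roughly three quarters of the 7970's peak throughput, which is consistent with it trading blows with a GTX 580 at a lower price.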
This discussion has been archived. No new comments can be posted.


  • by Anonymous Coward on Tuesday January 31, 2012 @10:08PM (#38885751)

    What's the calculations per watt? Will I be able to put them in a crossfire frankenbox to make my fortune?

  • by afidel (530433) on Tuesday January 31, 2012 @10:10PM (#38885773)
    So when will there be cards affordable by normal people? Also for me the biggest thing to come out of the new design is that we should be able to get a passively cooled card with more performance than the HD5750.
    • by Xanny (2500844) on Tuesday January 31, 2012 @10:17PM (#38885865)

      When Kepler comes out expect all these cards to significantly drop in price.

      GCN was a huge cost on AMD's part, and Kepler will be a refinement of Fermi, so Nvidia will aggressively price the 600 series (especially since they won't launch for another 2 months) and make a profit on them. And expect AMD to take a loss on the investment but not on the returns from fabrication on the 7900 series (assuming they fab the 7800 and lower cards on their old VLIW architecture like the roadmap from last year said they would).

      So when Kepler comes out, it will probably be aggressively priced, and AMD will drop prices to match. For now AMD is the only maker of "next gen" GPUs after 2010's GTX 500 and HD 6000 series, and Kepler is 2 months away, so they are milking it.

    • by Kjella (173770)

      According to MSI via Fudzilla, the 77xx series will launch in two weeks at $139/$149 and the 78xx series in March at $249/$299. After that the ball is in nVidia's court, but the current guesses are they're not ready until April, sometime around Intel's Ivy Bridge processors. I think it's working; I've looked at the 7950s and am tempted, but will probably wait until then to see if they bring better performance or lower prices, if nothing else to get a better price from AMD.

      • by afidel (530433)
        Bah, the 7700 series is only going to have ~10% more memory bandwidth than the 30-month-old HD5700 series. This is supposed to be progress?

        That aside I'll be looking for benchmarks since it might have a bit more DX11 oomph in the same ~85W max TDP envelope.
        • by nzac (1822298)

          I thought they added 100 to the version numbers for the 6000 series.
          You should compare to the HD5600s.

        • The 7700 series will definitely be interesting, if you want to build a quiet computer that still can handle most games (albeit not at the highest graphics settings).
          My latest PC upgrade a few months ago used 6770, and so far it has handled all I've thrown at it.

          • Quiet computer? Well I bought a 4870, which just burned up, thank god. I got tired of leaving my computer on all night because I couldn't get through boot-up without it sounding like a jet plane taking off and waking the whole house. At least I could control the fan in Windows for gaming, but in Linux it just sat there at 5k RPM. At my old age I went back to school and gave up gaming; no time or money for a video card that costs as much as a console.

    • by Mashiki (184564)

      So when will there be cards affordable by normal people?

      Well the sweet spot is usually about 8-9mo after the release of a new card. That gets all the major bugs out of the manufacturing, and all the driver issues hammered out. And the prices have pretty much bottomed out too.

      • by Squiddie (1942230)
        Yes, but at that point these cards will be old and busted and the new hotness will have been announced. It does backfire sometimes, though, you know Fermi and all.
        • by Mashiki (184564)

          Wait. New hardware is old and busted? Okay. I mean, it's not like the new stuff is based on the old stuff, or doesn't support current-generation tech or anything. Like it did last time.

          • by Squiddie (1942230)
            I was mostly making a joke, but it is true that eight months from now some will start to wait for next gen rather than buy current gen at a good price. The way I do it is to just buy whenever I think I need it. Like that there is no remorse.
            • by dadioflex (854298)
              People waiting for the latest next gen card are lucky. They never have to buy anything. Just wait.
    • Well, the 6870/6850 was pretty much the bang-for-your-buck card in the last gen, with the 6770/6670/6570 being really affordable for most any aspiring gamer - so I'd assume you'll need to wait for a 7870/7770/7670... shouldn't be all too long now. I'm waiting for the 7770 (or the 7 series equivalent of the 6770) myself - should be a nice reduction in power consumption and noise, coming from an 8800GT.

    • You could have one for a while now, Gigabyte HD5770 Silent Cell.

      • by afidel (530433)
        Yeah, the 5770 is a whole what 100MHz faster on the core clock and maybe a smidge more on the memory clock than my 5750, certainly not worth spending another $150 to "upgrade".
  • Disabled? (Score:4, Interesting)

    by schitso (2541028) on Tuesday January 31, 2012 @10:11PM (#38885785)
    "a few segments of the GPU have been disabled"

    As in, can be re-enabled with a custom BIOS or something?
    • Yes. I bought a first run 6950 and with a bios flash i now have a 6970 for $80 less.
    • by Squiddie (1942230)
      I sure hope so. I have a 6950 that I flashed. I wanted to get another, but I could not find any more. Still, I might just get the normal 7970 because it just overclocks so well. Still waiting on nvidia so that we can get some price drops, though.
    • As in, can be re-enabled with a custom BIOS or something?

      Probably. Though since the cards have a very uniform architecture, with many repeats of the same thing, my guess is that they bin the chips according to the number of stream processors that are defective. This lets them fab nothing but top-end dies and still get good yields of slightly-off-the-top-end cards.

      GPU manufacturers certainly used to disable non-"pro" level features in cheaper cards, which could be re-enabled by various hacks.
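The binning idea described above can be sketched with a toy model. The numbers in the summary imply 32 compute units of 64 stream processors each (32 × 64 = 2048, 28 × 64 = 1792); the per-unit defect rate below is made up purely for illustration:

```python
import random
from collections import Counter

CU_COUNT = 32       # full Tahiti: 32 compute units x 64 SPs = 2048
HD7950_CUS = 28     # 28 x 64 = 1792, matching the cut-down part

def bin_die(defective_cus: int) -> str:
    """Classify one die by how many compute units survived fabrication."""
    good = CU_COUNT - defective_cus
    if good == CU_COUNT:
        return "HD 7970"     # fully functional die
    if good >= HD7950_CUS:
        return "HD 7950"     # fuse off down to 28 CUs and sell it anyway
    return "scrap"           # (or potentially a lower-end salvage part)

random.seed(1)
# Toy yield model: each CU independently has a 2% chance of a fatal defect.
dies = [sum(random.random() < 0.02 for _ in range(CU_COUNT))
        for _ in range(10_000)]
print(Counter(bin_die(d) for d in dies))
```

The point of salvage binning is exactly what the parent says: almost every die that isn't perfect can still be sold, so effective yield stays high even with a large 4.31-billion-transistor chip on a new 28nm process.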

  • But... (Score:4, Informative)

    by goldaryn (834427) on Tuesday January 31, 2012 @10:15PM (#38885827) Homepage
    But does it run Linux?

    No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly.

    The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia is a bit better in my experience.
    • by Anonymous Coward

      I've been running a dual-monitor setup since Ubuntu 8.04 with a radeon 4850 + proprietary drivers without any troubles.

    • Really? Because getting the nv drivers to work correctly with a 1440x900 monitor was like pulling teeth, which is why I abandoned my brand of choice for an ATI card this latest go 'round.
      • Re: (Score:3, Informative)

        by Anonymous Coward

        That's why the driver is called 'nvidia' not 'nv'. 'nv' is the incomplete, OSS driver. 'nvidia' is the driver supported by nVidia. At its core, it's the same driver as on Windows.
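For anyone tripped up by this, the driver X.org loads is selected by the Driver line in the Device section of xorg.conf; a minimal sketch (the Identifier string is arbitrary):

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"    # proprietary driver; "nv" is the incomplete OSS one
EndSection
```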

    • by chromas (1085949)
      In my experience (OpenSuse, though, not Ubuntu), install first, add extra monitors later, especially if they run at different resolutions. If you use the official/proprietary drivers, be sure the open drivers are completely removed from your system or you'll have a conflict.
    • by Kjella (173770)

      The Catalyst drivers just landed on the 27th of January, I think; before that there was a hotfix release for real enthusiasts. Open source support is as far as I know still missing, but basic support should not be far away. They've consistently come closer to release date with each release; last time it took 2.5 months and I expect less this time. If you want it the moment it's released, expect to compile your own kernel/xorg/driver though. Don't expect any miracles from the OSS drivers though, as I understand it.

    • But does it run Linux? No, seriously... last time I tried to install Ubuntu with an ATI card (a few months ago), I couldn't get dual monitors to work correctly. The restricted drivers exist, but are unstable, awkward and painful. Linux and Nvidia - a bit better in my experience..

      I have been doing dual monitor with ATI/AMD X300 (Benq Joybook 5200G), HD3470 (Toshiba Satellite M300), and HD5650 (Sony Vaio VPCEA36FG). The only time dual monitor failed me was on Ubuntu 8.10. Currently I'm using 10.10, with a Samsung 43" LCD TV as a secondary monitor via HDMI. Mirror and split screen both work.

    • by Osgeld (1900440)

      well I have an older ATI card in a linux box in the other room; if I load the proprietary drivers, as soon as X loads the screen shuts off. So I agree, ATI+Linux = worthless

      always has been, probably always will be

    • by webheaded (997188)
      Actually the OSS ATI drivers aren't too bad on Linux. I hadn't really messed with any of that stuff before in KDE, so when I did my new Arch install I was surprised by how easy it was to configure. I was kind of irritated that hitting apply didn't save my settings, and it took me quite some time to figure out there was a separate "save" button somewhere in the display dialogs, but other than that it's not bad. The only thing that's kind of annoying is the power control.
  • by Anonymous Coward on Tuesday January 31, 2012 @10:15PM (#38885833)
    When Nvidia puts out a $500 card, it's attractively priced [hothardware.com].

    When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?
    • by Dyinobal (1427207) on Tuesday January 31, 2012 @10:29PM (#38885977)
      Nvidia pays better, and sends better swag I'd guess.
    • by Baloroth (2370816)

      People expect AMD to be cheaper, even when they are competitive from a performance standpoint. AMD usually aims more for the mid-range market, so a top-end card from them (at top-end prices) is a little surprising.

    • by goldaryn (834427) on Tuesday January 31, 2012 @10:30PM (#38885987) Homepage

      When Nvidia puts out a $500 card, it's attractively priced [hothardware.com]. When AMD puts out a faster card for 10% less, it draws complaints about the price from the same reviewer. What gives?

      To be fair, that review you linked is from November 2010. Perhaps second-hand 580s are better value or something.

    • by Kjella (173770)

      Maybe that the first of the 28nm process generation costs about the same as the last of the 40nm process generation released a year and a half ago? Currently the effect on the price/performance ratio has been almost nothing, they've offered higher performance at a higher price. Yes, the 7950 is now beating the GTX 580 in most ways but it's not exactly a massively better deal. Hopefully nVidia will be a bit more aggressive but if they're both held back by TSMC's delivery capacity the duel can get a bit lame.

  • by Tastecicles (1153671) on Tuesday January 31, 2012 @10:44PM (#38886087)

    ...well, let's clear things up: I was always an AMD fan. Their CPUs rocked. I had a seriously great time overclocking my SS7 gear until it boiled.

    The graphics cards sucked though. I'm talking about the old Radeon AGP cards. Put down your paddles, lads, 2006 was the last time I bought an ATI branded card (an X1800) and IMHO it sucked monkey balls. I couldn't even get it to perform at low resolution on Unreal 2002. That's why I went straight back to the store and swapped it for an NVidia 7600GT. Oh, yeah, life was sweet after that.

    A couple weeks ago I bought a secondhand Sapphire HD3650 with 512MB DDR2. OK, it's a bloody old and very low-spec card by tech standards, but it blows my GF 7600GT right out of the water, even on a slower, single-core 64-bit processor running a 32-bit platform. That made me a fan of ATI/AMD graphics right there. The old machine (Core Duo) with the NVidia is now collecting dust.

    • Lol. You replaced one old outdated card with another :) My personal experience has been that NVidia has excellent drivers. ATI/AMD have better hardware and better visual quality (NVidia often has strange visual artifacts). The downside of ATI is that their drivers are dodgy. It is always a risk upgrading an ATI driver; sometimes new drivers can break your favourite game until a hotfix comes out (usually takes a fortnight or so). So whether you go NVidia or AMD depends on what you want.
      • by webheaded (997188)
        ATi drivers aren't just dodgy... they are awful. I've had a 4870x2 for a while now and I've seen issues ranging from buggy games, to crashing video drivers playing Flash, and green video for Flash. I did a completely clean install for the last release and got about 2 days of being able to use Youtube before it started green-videoing again. It is truly incredible that they can make a video driver that can't properly play fucking YOUTUBE VIDEOS with hardware acceleration.
    • by Osgeld (1900440)

      yea, my 9600GT kicked my 7600GT right square in the nuts; actually just about any card after the 7600GT would have rocked it. You're comparing a sports car to a Yugo. The 7600GT was the absolute worst waste of money I have ever spent on a video card, as my 6600GT actually performed just as well

    • ATI has better hardware, but their drivers are pure, total crap. It's like buying a Ferrari and putting a mediocre driver behind the wheel. After many problems with drivers I gave up on buying a new Radeon and now I use a GTX580
  • I'm on a 6950; it is clocked at 810 MHz, but it can do 910 MHz by just using the ATI Catalyst slider. No fancy stuff. If you go into serious overclocking, you can approach 1000 MHz easily if you play with the voltages and stuff.

    Moreover, X950s are generally unlockable. For example I unlocked the 6950 I'm sitting on, unlocking 24-30 or so shaders, basically giving it a 6970's shader count. I could also flash a 6970 BIOS and make it a full 6970, but that's totally unnecessary, since I can get more than that by overclocking.


  • "Beats NVIDIA's high-end GeForce GTX 580 by just a hair."

    You don't say. Must not have factored in Nvidia's history of selling and shipping GPUs that were known to be defective and then conspiring with the purchasers to hide this fact from the users until after their warranties ran out.

    If they had, this new GPU would outperform Nvidia's by huge leaps and bounds.

    6150 Go. Look it up.

  • really, what is the point of this anymore? 90+% of your games are optimised for consoles first, giving you at best a GeForce 8800GT; computer monitors are not getting any higher resolution; and they still have not come up with a cooling system that doesn't clog with dust in a month!

    never mind the absolute shit drivers ATI ships

    • by karnal (22275)

      I know my basement isn't that clean - I hardly sweep my office room; but clogging with dust in a month? Holy hell.

  • I won't buy an ATI card until the Linux driver situation is fixed.
    • by Anonymous Coward

      My 5770 works fine on my current KDE desktop, multiple monitors with different resolutions and refresh rates. The gpu acceleration in Blender's new rendering engine is a lot of fun to mess about with.

      • by JonJ (907502)
        Lucky you. On my box, regardless of distro, kernel, and catalyst drivers, VLC always segfaults when trying to play accelerated video. It works fine with nvidia, so I have to conclude that the drivers for the AMD card are worthless.
  • Which APU platform is this a part of?
