Graphics Hardware

NVIDIA Launches New Midrange Maxwell-Based GeForce GTX 960 Graphics Card

MojoKid writes: NVIDIA is launching a new Maxwell desktop graphics card today, targeted at the sweet spot of the graphics card market ($200 or so) currently occupied by its previous-gen GeForce GTX 760 and the older GTX 660. The new GeForce GTX 960 features a brand-new Maxwell-based GPU dubbed the GM206. NVIDIA was able to improve the GM206's power efficiency without moving to a new process by tweaking virtually every part of the GPU. NVIDIA's reference specifications for the GeForce GTX 960 call for a base clock of 1126MHz and a boost clock of 1178MHz. The GPU packs 1024 CUDA cores, 64 texture units, and 32 ROPs, half of what's inside the top-end GeForce GTX 980. The 2GB of GDDR5 memory on GeForce GTX 960 cards is clocked at a speedy 7GHz (effective GDDR5 data rate) over a 128-bit memory interface. The new GeForce GTX 960 is a low-power upgrade for gamers with GeForce GTX 660-class cards or older, which make up a good percentage of the market now. It's usually faster than the previous-generation GeForce GTX 760 but, depending on the game title, can also trail it due to its narrower memory interface.
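For context, the bandwidth tradeoff described at the end is simple arithmetic. A back-of-the-envelope sketch (the GTX 960 numbers are the reference specs quoted above; the GTX 760's 256-bit/6GHz figures are its commonly listed reference specs, so treat them as assumptions):

    # Peak theoretical memory bandwidth = effective data rate (GT/s) x bus width (bytes)
    def peak_bandwidth_gb_s(effective_gtps, bus_width_bits):
        """Peak memory bandwidth in GB/s."""
        return effective_gtps * (bus_width_bits / 8)

    print(peak_bandwidth_gb_s(7.0, 128))  # GTX 960: 112.0 GB/s
    print(peak_bandwidth_gb_s(6.0, 256))  # GTX 760: 192.0 GB/s

On paper the new card has barely 60% of the older card's bandwidth; Maxwell's color compression has to close that gap, which is why the 960 can still trail the 760 in bandwidth-heavy titles.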
  • My "sweet spot" is $100, with perhaps $20 fudge factor for impatience and/or shipping. Hence, I am now running an Asus 450 GTS OC 1GB. If I were to buy a card today, which one would I buy assuming I weren't even considering throwing away money on an ATI card?

    • by slaker ( 53818 ) on Thursday January 22, 2015 @07:04PM (#48880855)

      You might be able to find a GeForce 750 for $125 or so. They're adequate but definitely not ideal for 1920x1080 in most PC games. They're also ridiculously efficient; they don't even need an extra PCIe power connector.
      If someone tells me they want to game on a PC and doesn't immediately mention a more demanding game, that's the hardware I use.

      I generally prefer ATI hardware because I think nVidia's stock cooling kills graphics cards, and I'd rather deal with crappy drivers, but the current ATI hardware is a complete non-starter. There's really no price point at all where it can be justified.

      • A Geforce 750 should be able to run 1920x1080 at high settings in most games, so I would say it's ideal. I run 2560x1440 on a Geforce 660 at medium and sometimes medium-high settings, no problem.

        • by epyT-R ( 613989 )

          Yes, if you're happy chugging along at 20-35fps with dips into the low teens.

          • Yes, if you're happy chugging along at 20-35fps with dips into the low teens.

            If you know what I mean.

      • by Salgat ( 1098063 )
        I use a 750 Ti on a setup with 3 monitors (the main one being 1440p) and it runs my games just fine on high settings, including WoW, LoL, Skyrim, Civ 5, and a few others. It's more than sufficient for most gaming on a budget, even at higher resolutions.
      • I generally prefer ATI hardware because I think nVidia's stock cooling kills graphics cards and I'd rather deal with crappy drivers, but the current ATI hardware is a complete non-starter. There's really no level at all where it can be justified.

        Well, yeah, because ATI doesn't exist anymore, even as a brand. AMD bought them in 2006 and retired the brand in 2010. The last ATI card was the 5870 Eyefinity Edition, which packs about as much punch as this 960, but with the size, noise, and power draw of a top-end card. Everything since has been AMD.

        I know exactly what you meant, and I even agree with you on your points; it's just hard to take those points seriously when you're using a half-decade-old name.

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          As a former ATI employee, I think many will join me in lamenting its demise. It was a great place to work, and I enjoyed contributing to some cool products.

          As a former AMD employee (as per the buy-out), all I can say is that AMD sucked. First, the "green" management couldn't get a product to production, year after year after year. Now they're trying to play as a mostly "red" team, and unfortunately don't have half the staff that made ATI work well. Rory was just weirdly goofy, and sometimes bizarrely candid with...

    • by mc6809e ( 214243 ) on Thursday January 22, 2015 @07:05PM (#48880859)

      Personally I love the GTX 750. It gives the biggest bang for the buck [videocardbenchmark.net], and at about 55 watts max it usually doesn't require a larger power supply. It can run entirely off motherboard power from a 16-lane, 75-watt PCIe slot.

      It's the perfect card for rescuing old systems from obsolescence, IMO.

      The only trouble you might have is finding a single-slot-wide card if your system doesn't have room for a double-slot card. In my case, I found a double-slot card that I could modify to fit in a single slot of an old Core 2 Duo E8500 system.

      And heat doesn't seem to be a problem at all, even with the mod I did. The low power of the card means less heat. Even if heat does become a problem, the card is capable of slowly clocking itself down, though I've never seen that happen, even running FurMark.

    • Tom's Hardware has the latest January update of its graphics card comparisons. The R7 260X is the best bang for the buck; the 750 Ti is not quite as good a value, but since you don't like ATI it may be the better choice. It all really comes down to why you're upgrading and what you play, though.
      http://www.tomshardware.com/re... [tomshardware.com]

    • by idealego ( 32141 )

      I managed to get a GTX 750 Ti for about $80 USD (I price matched it at $120 CAD and it had a $20 MIR, so it ended up being $100 CAD plus tax).

      I bought it because my old 6850 wasn't running Far Cry 4 all that well. The GTX 750 Ti runs it like a champ at 1080p with settings between medium and high. It's a huge performance increase over the 6850, it's much quieter, it doesn't need an external power connector, and I even sold my old card to someone for $70.

    • Psh. I spent $500 on my GTX 780 Ti. I have no regrets.
    • Right now the GTX 750 is the best price/performance. Saving the $25 to $35 just isn't worth it if you are a gamer.

      If you are not a gamer, then sure.

  • I think the 'range' depends on what resolution you are playing at:
    For 3840x2160 - Low end
    For 2560x1440 - 'Midrange'
    For 1920x1080 - High end

    • by vux984 ( 928602 )

      For 3840x2160 - Low end

      Are there any good monitors at that resolution, though? I bought a pair of 27" screens this holiday season and ended up opting for 2560x1440, because the 3840x2160 options were all terrible for gaming, with pretty much any video card it seemed.

      So if you are going to shell out for a top-of-the-line nvidia card... what monitor are you pairing it with? A 30Hz 4K monitor with high lag and latency?

      I don't get the logic of that.

      I couldn't find a 3840x2160 screen that was remotely any good, at least at...

      • by PIBM ( 588930 )

        Two of my friends bought 40" 4K Samsung TVs (HU7000) on Black Friday for around $600. I find the size just perfect, coming from multiple 30" 2560x1600 monitors. The colors were a bit off, but it had not been calibrated. The intensity was adjustable to a working range, and playing Battlefield 4 on it was fast enough to finish first on a 64-player, pistols-only hardcore server :) Last I heard they haven't been able to get 4:4:4 working, but that's supposed to be fixed by a firmware upgrade.
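        The 4:4:4 issue comes down to link bandwidth. A rough sketch of the arithmetic, counting active pixels only and ignoring blanking intervals (the HDMI payload figure below is the commonly cited one for HDMI 1.4 and is an assumption here):

            # Uncompressed video data rate = width x height x refresh rate x bits per pixel
            def video_gbps(width, height, hz, bits_per_pixel):
                """Video data rate in Gbps, active pixels only (blanking ignored)."""
                return width * height * hz * bits_per_pixel / 1e9

            print(video_gbps(3840, 2160, 60, 24))  # ~11.9 Gbps: 4K60 with full 4:4:4 chroma
            print(video_gbps(3840, 2160, 60, 12))  # ~6.0 Gbps: 4K60 with 4:2:0 subsampling

        HDMI 1.4 carries only about 8.2 Gbps of video payload, so 4K60 fits only with 4:2:0 chroma subsampling; full 4:4:4 at 60Hz needs HDMI 2.0, which is why it was waiting on a firmware fix.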

      • by zlives ( 2009072 )

        I have an Asus 4K monitor: 1 ms GtG, 60Hz, over DisplayPort. It works pretty well, though for gaming I'm still using 2560 on a single GTX 980.

        • by vux984 ( 928602 )

          I have an Asus 4K monitor: 1 ms GtG, 60Hz, over DisplayPort

          1 ms GtG, though, means a TN display, right? If performance/gaming is your primary and only driving consideration, that's fine.

          But I wanted something that does better with picture quality and color reproduction than a TN panel will deliver. I ended up with Asus as well, but selected a 27" QHD PLS-based panel (the pair of which I'm very happy with so far).

          But I know they'll be obsoleted by really good 4K stuff soon.

          Still, the fact that you are choosing to game...

          • by zlives ( 2009072 )

            "choosing to game at 256" the screen size comes into play. i would play 4k with maybe a 32+ inch screen but then it may be too close for a desktop experience. I output to a 4k projector if I truly need color corrected picture quality, plus my old eyes really appreciate the beauty of high res but at a much bigger screen.

            • by vux984 ( 928602 )

              the screen size comes into play. I would play 4K with maybe a 32+ inch screen, but then it may be too close for a desktop experience. I output to a 4K projector if I truly need color-corrected picture quality; plus, my old eyes really appreciate the beauty of high res, but on a much bigger screen.

              I don't get this at all. The only reasons ever not to game at the screen's native resolution are:

              a) framerate losses from pushing more pixels

              b) poor game designs where the fonts become unreadably small...

              • by zlives ( 2009072 )

                In one game, Wasteland 2, 4K native is too small for my eyes, so I stick with 2560.
                In the other, Star Citizen, 4K looks beautiful, but the game is in development and needs optimization, and even then I will need a second GTX 980 and maybe a third, because the game can use whatever you throw at it.

      • Re: (Score:3, Informative)

        by rcht148 ( 2872453 )

        I never said that I'm using 4K.
        After your response I think my original comment might make me look like a resolution whore.
        My point was NOT that the card should be labeled low-end because of 4K. Rather, it should be labeled high-end.
        I play at 1920x1080, so from my perspective the card is high-end.
        It's just that cards get labeled based on series (the Nvidia x60 series is considered midrange), which is primarily based on price.
        When there are graphics cards available from $30 to $3000, the low, mid and high end will...

      • I'm using an Asus PB287Q with a GTX 970. It sits on the boundary: Elite Dangerous and Wolfenstein are comfortable in 4K, while Far Cry and Metro need to drop to 2560x1440 to hit 50-60fps. Anything under 40fps is unplayable on this combo, not so much because it looks bad, but because the input lag feels like it jumps below that rate.

        The monitor looks alright at 1440p, a little soft and washed out, but still better than my previous monitors in their native modes. In 4K the picture is unbelievable. At this size and sitting...
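        The "input lag jumps below that rate" feeling matches basic frame-time math: frame time is the reciprocal of frame rate, so each frame per second lost at low rates costs far more latency than one lost at high rates. A trivial sketch (the 40fps cutoff itself is the poster's subjective threshold, not a universal constant):

            # Frame time is the reciprocal of frame rate; latency cost grows fast at low fps
            for fps in (60, 50, 40, 30, 20):
                print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
            # 60 -> 16.7 ms, 40 -> 25.0 ms, 30 -> 33.3 ms, 20 -> 50.0 ms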

        • by vux984 ( 928602 )

          Asus PB287Q with a GTX970

          Yeah, that's a TN panel. It's good for gaming, as it gets the response times etc. where they need to be, but it's not really suitable for anything that requires an accurate color space, which was one of my requirements.

          It's also telling that even with a GTX 970 you are finding 4K to be a bit hit and miss.

          1080p is painful to watch.

          And that's unfortunate too, because 99% of content is not yet available in QHD/UHD, so you're going to be looking at a lot of 1080p content for a while...

    • by epyT-R ( 613989 )

      It really depends on the game and your expectations for minimum framerate. If you want lots of gfx detail and high minimums, 1920x1080 still requires high end graphics.

    • by Cederic ( 9623 )

      Nah, it means that Nvidia offers a range of graphics cards for gaming, and this one is not "cheap and nasty" and it's not "top end". It's sort of near the middle of the range.

  • There are other, better reviews of this card, for instance at Tom's Hardware, but every single hardware review story on Slashdot seems to be obliged to link to HotHardware.
    So, which editor owns HotHardware shares?

    • Re: (Score:2, Interesting)

      by rahvin112 ( 446269 )

      Tom's a whore; I wouldn't be surprised at all if the ranking or quality of the product in his reviews is affected by the amount of money he was paid. Some of the recommended products he's had over the years indicate to me that the site's "approval" is for sale. I haven't been there in years, so it might have changed, but once someone sells out you shouldn't trust them again.

  • by Anonymous Coward

    Midrange Nvidia cards always seem to launch a little expensive. There's always an older model from AMD that's a better value on paper.

    But I'd take this in a heartbeat over an AMD counterpart. The Maxwell chips are leagues ahead of anything AMD's got: very low power consumption and solid performance, with great features. Maxwell is a huge leap over their previous offerings, with cards that are 150% faster while consuming 60% less energy than their previous-generation counterparts.

    "AMD has shitty drivers" is an

    • by sd4f ( 1891894 )

      I decided to go Nvidia this round; I had issues with my previous AMD GPU where certain functions didn't work properly. The one that annoyed me most: I have a 1920x1200 monitor, an older model that won't do 1:1 pixel scaling, so if it's fed a 1920x1080 signal it stretches it to full screen. The AMD software could be set to maintain the aspect ratio, so it didn't bother me, but after a certain update that feature stopped working, and no matter how hard I tried, it would always stretch the...

      • GPUs are easy to change; 16:10 monitors, not so much. There's no way I'm departing from 16:10!

        This. Love my 16:10s. They're getting harder and harder to find, so I can only hope mine keep holding up.

        • by Cederic ( 9623 )

          I gave up and went 16:9. I cheated though; I didn't drop to 1080 height, I switched to 2560x1440.

          The aspect ratio is less important to me than having good vertical resolution. This way I get more vertical than 1920x1200 and the bonus of a few extra pixels on the sides too.

    • by epyT-R ( 613989 )

      It might be an old meme, but it's still true, especially for OpenGL. Their D3D is OK if you plan to run games tweaked for that particular version of the driver.

    • by mc6809e ( 214243 )

      But I'd take this in a heartbeat over an AMD counterpart. The maxwell chips are leagues ahead of anything AMD's got.

      With one exception: the R9 280X when used for double-precision floating point compute.

      For about $250 you can get an R9 280X that will do one trillion double-precision floating point operations per second. That's about 10x faster than the Maxwell cards.

      With such a card, AMD should have had the scientist/engineer GPGPU space locked up by now.

      But, you know, they're AMD, so...
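      The "10x" figure roughly checks out on paper. A sketch using published shader counts and clocks (the FP64 rates, 1/4 of FP32 for Tahiti and 1/32 for consumer Maxwell, are the commonly cited ratios and are assumptions worth verifying):

          # Peak FP64 throughput = shaders x 2 ops/clock (FMA) x clock (GHz) x FP64 rate
          def dp_gflops(shaders, clock_ghz, fp64_ratio):
              """Theoretical peak double-precision GFLOPS."""
              return shaders * 2 * clock_ghz * fp64_ratio

          print(dp_gflops(2048, 1.0, 1 / 4))    # R9 280X (Tahiti): ~1024 GFLOPS, ~1 TFLOPS DP
          print(dp_gflops(1024, 1.178, 1 / 32)) # GTX 960 (GM206): ~75 GFLOPS DP

      That works out to roughly 13x, so "about 10x" is in the right ballpark.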

    • by Cederic ( 9623 )

      A legit complaint I can see is the 2GB of memory. Modern games are starting to crave lots of memory. I suspect Nvidia may be gating that feature for higher-tier SKUs, or maybe we'll see 4GB cards not long after launch.

      That's the bit that confuses and disappoints me. The consoles seem to be expecting 3GB of graphics RAM, so it feels pretty asinine to release a PC card with only 2GB. Surely matching console performance in a midrange PC is a fairly basic expectation?

  • "Raw Press Releases."

    C'mon, guys, this is copy-pasted marketing fluff. Better is expected of you.

  • From the reviews I've read, it's basically cool and quiet and has all the latest features, but in the end it has almost the same performance as the 760. Where the 970 went really aggressive on pricing, the 960 looks to be their "money maker", which TechPowerUp called "a cheap-to-make GPU they paired with an extremely cost-efficient PCB design that has loads of margins in it for future price wars with AMD".

    Not that I think AMD is in any mood for price wars after their Q4 financials; they posted a $330 million loss...

    • by Cederic ( 9623 )

      I practically begged a friend to get a 970 for the PC he bought a week ago. The pricing was so aggressive that as soon as you got within £40 of it, the performance uplift was just too appealing to turn down. You get a PC that'll play new-release games beautifully at 1920x1080 for another 4-5 years instead of just 2-3.

  • Isn't Maxwell that talking pig in the teevee commercials? Not the first thing I want people to think of when they hear about my new product.
    • Or a Beatles song... "Bang! Bang! Maxwell's silver hammer came down upon his head. Clang! Clang! Maxwell's silver hammer made sure that he was dead." (I bet the GTX 970 price is definitely hurting AMD).
    • by dbIII ( 701233 )
      The other poster did the hammer thing, so my second choice is it running like a demon (Maxwell's demon).
  • by rsilvergun ( 571051 ) on Thursday January 22, 2015 @07:19PM (#48880961)
    nVidia's been using that for years to keep their midrange from biting into their high end. At those price points I wish I were confident enough in AMD's driver stability to buy an R9 270X: a 256-bit interface and 1500MHz core clock for $210.
    • What's funny is that before the mid-2014 crypto-coin mining craze, the R9 270X was priced well below $200 CAD.

    • Normally the x60 card uses a 192-bit bus, but nVidia's betting big on their texture compression and insane clock rates. It's the same reason the 980 uses a 256-bit bus instead of a 384-bit one.

  • I'm glad to see ASUS understands that a lot of gamers have small PCs. Their Strix GeForce GTX 960 looks like it might fit in a Cooler Master Elite 110.

    • by SeaFox ( 739806 )

      I can't speak much for NVidia products, but Sapphire makes a short AMD-based card [newegg.com] that would fit in an Elite 110. I'm sure there are others (you might have to route the power connector into the cutout for the front panel if it's a model with a rear-facing power connector).

    • http://www.asus.com/Graphics_C... [asus.com]

      A short 970 from Asus that will fit in a 110 for sure.
  • My problem with this new card is that, as far as I can tell, it barely meets the minimum requirements for some next-gen games. Isn't the minimum requirement for Assassin's Creed Unity something like a GeForce 680?

    The specs for all new games are out of whack with the latest generation of video cards. If you buy a $200 card, you should be able to play any game released in the same year (though probably not on Ultra).

    I just don't want to pull the trigger on a $350 video card, but it looks like game devs are making that the entry level...

    • by Luckyo ( 1726890 )

      That's because it's Unity. Unity doesn't run well on anything.

      And I do mean anything. No hardware in the world runs Unity well. Not because the hardware is bad, but because Unity is a buggy piece of shit that won't run well even on SLI 980s.

      As for "playing games released on the same year", I'm yet to find a game that won't work with my 560Ti. Though many of these releases I had to bump quality down to medium to get solid frame rate.

    • Those minimum requirements are a joke. I've got a 580 and I can run everything at full detail at my monitor's resolution. None of them even stress the card. The fact is most games are designed, spec-wise, around consoles and won't even stress five-year-old graphics cards.

      • I've been using a 550 Ti for the last three years and it's been great. It could be better, I suppose, but I'm not some elite gamer who needs 60 FPS no matter what, or photo-realistic video quality.

        That said, I just worked a weekend and a holiday, so I had a little extra cash coming this paycheck and ordered my EVGA 960 last night. While the performance may not be spectacularly better than the previous generation, it will be a big upgrade for me, and with its lower power requirements it'll fit nicely in my computer...

  • You can get a used 280X for less that will wipe the floor with it. The dual-fan models are very quiet. They use a bit more power, but not at idle... so unless you game 24/7 and live in an area with super-pricey electricity OR have a garbage PSU, it's the far better deal. The plain old 280 is also better.
    • by jedidiah ( 1196 )

      You lost me at "used".

    • by Anonymous Coward

      If you like dealing with half-assed drivers, then maybe.

      • by oic0 ( 1864384 )
        I just swapped from an AMD card to a GTX 970 (I needed HDMI 2.0 for 4K). I would not say the Nvidia drivers are better. Not by a long shot. Alt-tabbing out often leaves my screen tinted a funny color until I change resolutions and back. A few of the driver options either don't work or mess things up (attempting to change color modes scrambled my screen even on compatible settings). Oh, and my cursor is all sorts of messed up. Yes, I uninstalled the AMD drivers and ran a cleaner. There are some features I really like...
  • I bought a 980 over the holidays. It runs 3x 1080p excellently. If I had the extra money to spend, I would have bought G-Sync monitors, but they would cost more than my PC did, so screw it.
  • There's finally a reasonable bump coming this year from cards with a new architecture and a totally different RAM system, with up to 9x higher bandwidth.

    No doubt they'll bleed us with slow bumps and increments initially, but nonetheless I suspect the first significant bump in years might occur when these cards come out in the next couple of months. Look for the 9xx series to drop in price if you prefer nvidia, but unless you urgently need an upgrade right now, apply some patience and wait to see...

  • says the guy who bought a 980 just before Christmas. Yeah... hypocrisy much.

    However, be aware that minimum specs for games are in a bit of a state of flux at the moment. In some senses it's not before time; they've risen only very slowly for many years, as development of most games was targeted first and foremost at the Xbox 360 and PS3, with PC versions usually not receiving much more than a few cosmetic upgrades. For quite a few years now, a reasonably recent i3 or middle-aged i5 (or AMD equivalent) and a...

  • 120 Watts

    The summary should have included this.
