NVIDIA Launches New Midrange Maxwell-Based GeForce GTX 960 Graphics Card
MojoKid writes: NVIDIA is launching a new Maxwell desktop graphics card today, targeted at the sweet spot of the graphics card market ($200 or so) currently occupied by its previous-generation GeForce GTX 760 and the older GTX 660. The new GeForce GTX 960 features a brand-new Maxwell-based GPU dubbed the GM206. NVIDIA was able to improve the GM206's power efficiency without moving to a new process by tweaking virtually every part of the GPU. NVIDIA's reference specifications for the GeForce GTX 960 call for a base clock of 1126MHz and a boost clock of 1178MHz. The GPU packs 1024 CUDA cores, 64 texture units, and 32 ROPs, half of what's inside NVIDIA's top-end GeForce GTX 980. The 2GB of GDDR5 memory on GeForce GTX 960 cards is clocked at a speedy 7GHz (effective GDDR5 data rate) over a 128-bit memory interface. The new GeForce GTX 960 is a low-power upgrade aimed at gamers with GeForce GTX 660-class cards or older, which make up a good percentage of the market. It's usually faster than the previous-generation GeForce GTX 760 but, depending on the game title, can trail it as well, due to its narrower memory interface.
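To see how that last sentence pencils out: peak memory bandwidth is just bus width times effective data rate, so the 960's narrower bus costs it raw bandwidth against the 760 despite the faster memory clock. A minimal sketch in CUDA/C++ host code (the 760 and 980 figures are the commonly published specs, not from the summary):

```cuda
// Back-of-the-envelope peak-bandwidth check:
// bandwidth = (bus width in bits / 8) bytes x effective data rate (GT/s).
#include <cstdio>

int main()
{
    printf("GTX 960: %.0f GB/s (128-bit @ 7 GT/s)\n", 128.0 / 8 * 7.0); // 112
    printf("GTX 760: %.0f GB/s (256-bit @ 6 GT/s)\n", 256.0 / 8 * 6.0); // 192
    printf("GTX 980: %.0f GB/s (256-bit @ 7 GT/s)\n", 256.0 / 8 * 7.0); // 224
    return 0;
}
```

On paper the 960 has roughly 60% of the raw memory bandwidth of the 760, which is why it can trail the older card in bandwidth-hungry titles.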
Re: (Score:2, Informative)
Is that one processor, or a collection?
Each "CUDA core" is a single processor, which can run a CUDA [wikipedia.org] instance, or an OpenCL [wikipedia.org] instance, or run a shading algorithm. The card has 1024 of them.
Re: (Score:3)
No, a CUDA core is better thought of as an ALU or sub-ALU; speaking in terms of "cores" is a clear abuse of the word that AMD and NVIDIA started using for marketing. As a limited analogy, the original Pentium has two integer pipelines but is not called a dual-core CPU.
"CUDA cores" are organised into units that house 128 of them here, called an "SMM". But to understand how things are dealt with from the software point of view (threads, warps) some extensive reading is needed.
Re: (Score:2)
This is incorrect. CUDA cores are at a higher level than ALUs or FPUs; they're like small, simple CPU cores. They can do integer and floating-point arithmetic, and they have hardware support for thread context switching, which they can generally do in a single clock tick. There can be varying numbers of CUDA cores in a streaming multiprocessor, but CUDA thread blocks are arranged in groups of 32 ("warps") which share a scheduling unit and which execute the same instruction in lock-step on different memory.
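For readers who want to see the warp model in code, here's a minimal CUDA sketch (purely illustrative, not from the comment above) showing 32-thread warps and what a divergent branch within a warp looks like:

```cuda
// Each warp of 32 threads executes the same instruction in lock-step.
// A branch that splits a warp ("warp divergence") forces the hardware
// to run the two paths one after the other.
#include <cstdio>

__global__ void warpDemo(int *out)
{
    int tid  = threadIdx.x;
    int lane = tid % 32;   // position within the warp
    int warp = tid / 32;   // which warp this thread belongs to

    if (lane < 16)
        out[tid] = warp * 100 + lane;      // first half-warp
    else
        out[tid] = -(warp * 100 + lane);   // second half-warp, serialized
}

int main()
{
    const int N = 64;                      // two warps of 32 threads
    int *d_out, h_out[N];
    cudaMalloc(&d_out, N * sizeof(int));
    warpDemo<<<1, N>>>(d_out);             // one block, 64 threads
    cudaMemcpy(h_out, d_out, N * sizeof(int), cudaMemcpyDeviceToHost);
    for (int i = 0; i < N; ++i)
        printf("thread %2d -> %4d\n", i, h_out[i]);
    cudaFree(d_out);
    return 0;
}
```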
Re: (Score:2)
Cool. Our research folks at $DAYJOB have been building GPU-computing clouds, and have found that for many workloads the GTX 750 Ti was extremely cost-effective (that's the predecessor to this card, and costs include the server you plug it into and electricity as well as the graphics card itself), compared to much higher-end computation-focused systems. But they bought their lab hardware months ago; this looks to be about 50% faster for a slightly higher price, so that's a win.
Re: (Score:2)
A CUDA core is basically an active hardware thread.
For example my GTX Titan has 2,688 CUDA Cores.
This number is derived from: 14 streaming multiprocessors × 192 cores/SM = 2,688 cores.
In practice that means you have 2,688 threads doing "real work" at any one time.
See this SO question/answer
http://stackoverflow.com/quest... [stackoverflow.com]
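As a hedged illustration of that arithmetic: the SM count can be queried at runtime, but the CUDA API does not expose cores-per-SM, so the per-architecture figure (192 for Kepler, 128 for Maxwell, per NVIDIA's published specs) has to be supplied by hand, as in this sketch:

```cuda
// Derive the advertised "CUDA core" count: SMs x cores-per-SM.
#include <cstdio>

int main()
{
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);

    int coresPerSM = 0;                          // not exposed by the API
    if (prop.major == 3)      coresPerSM = 192;  // Kepler (GTX Titan, 660)
    else if (prop.major == 5) coresPerSM = 128;  // Maxwell (GTX 960, 980)

    printf("%s: %d SMs x %d cores/SM = %d CUDA cores\n",
           prop.name, prop.multiProcessorCount, coresPerSM,
           prop.multiProcessorCount * coresPerSM);
    return 0;
}
```

On a GTX Titan this prints 14 x 192 = 2,688, matching the figure above.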
Re: (Score:1)
That's exactly what Maxwell did vs Kepler.
Re: (Score:2)
Not completely true. When a chip draws less power to perform the same functions, you can then push the clock, because consuming less power means less waste heat. As you raise the clock you get more speed. There is of course a balance to be struck, but regardless, figuring out how to accomplish the same work with less power lets you push performance further before hitting the limit of your heat-dissipation capabilities.
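That tradeoff follows from the standard dynamic-power approximation P ≈ C·V²·f (switched capacitance, voltage, frequency). A toy calculation with normalized constants (illustrative only, not real GPU figures):

```cuda
// If a tweak does the same work with 20% less switched capacitance,
// the clock can rise ~25% before power returns to the old ceiling.
#include <cstdio>

int main()
{
    double C = 1.0, V = 1.0, f = 1.0;      // normalized baseline
    double baseline = C * V * V * f;       // P = C * V^2 * f

    double C2 = 0.8 * C;                   // 20% less capacitance
    double p2 = C2 * V * V * f;            // power at the same clock
    double f2 = baseline / (C2 * V * V);   // clock that restores old power

    printf("power at same clock: %.2f (was %.2f)\n", p2, baseline);
    printf("clock headroom at same power: +%.0f%%\n", (f2 / f - 1) * 100);
    return 0;
}
```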
Awesome, I shall buy one in a year (Score:2)
My "sweet spot" is $100, with perhaps $20 fudge factor for impatience and/or shipping. Hence, I am now running an Asus 450 GTS OC 1GB. If I were to buy a card today, which one would I buy assuming I weren't even considering throwing away money on an ATI card?
Re:Awesome, I shall buy one in a year (Score:4, Interesting)
You might be able to find a Geforce 750 for $125 or so. They're adequate but definitely not ideal for 1920x1080 in most PC games. They're also ridiculously efficient; they don't even need an extra PCIe power connection.
If anyone tells me they want to game on a PC and doesn't immediately mention a game with more serious demands, that's the hardware I suggest.
I generally prefer ATI hardware because I think nVidia's stock cooling kills graphics cards and I'd rather deal with crappy drivers, but the current ATI hardware is a complete non-starter. There's really no level at all where it can be justified.
Re: (Score:1)
A Geforce 750 should be able to run 1920x1080 at high settings in most games, so I would say it's ideal. I run 2560x1440 on a Geforce 660 at medium and sometimes medium-high settings, no problem.
Re: (Score:2)
Yes, if you're happy chugging along at 20-35fps with dips into the low teens.
Re: (Score:2)
Yes, if you're happy chugging along at 20-35fps with dips into the low teens.
If you know what I mean.
Re: (Score:2)
Yeah, any card released in the last five years should be awesome for games released before 2005, so if those are the games you want to play then save your money and see how the onboard graphics cope with them before even buying a dedicated graphics card.
My year-old graphics card is still coping fine with games released in the past few months at full detail and 2560x1440, so that gives you an idea of how far ahead of current game demands the graphics cards are. 1920x1080 gaming is comparatively undemanding.
Re: (Score:1)
He seems to understand it perfectly well. Performance-wise, the 750 and 660 are similar, with the 660 a little ahead, but the 660 comes at a price premium. The 750 is already a little over the $100 budget mentioned; the 660 blows that budget completely. The 750 is excellent in the bang-for-buck range; the 660 less so, and it's overkill for many people.
Re: (Score:2)
I don't think you understand what the numbers mean. x50 and below (450, 650, 630, etc) are workstation kind of cards. Good for web browsing and regular desktop stuff.
I don't think YOU understand the target market or the numbers. Workstations that do web browsing and regular desktop stuff don't need standalone cards; they use integrated graphics.
The x00, x10, x20, x30, and x40 cards are for those who have some use for better 3D hardware than just browsing and "regular desktop stuff," but not enough to require a higher-end card. "Light gaming" like Minecraft, some F2P MMOs, or HTPC use. That's why those cards are popular among Linux users.
Re: (Score:2)
I generally prefer ATI hardware because I think nVidia's stock cooling kills graphics cards and I'd rather deal with crappy drivers, but the current ATI hardware is a complete non-starter. There's really no level at all where it can be justified.
Well, yeah, because ATI doesn't exist anymore, even as a brand. AMD bought them in 2006 and retired the brand in 2010. The last ATI card was the 5870 Eyefinity Edition, which packs about as much punch as this 960, but in a card with the size, noise and power draw of a top-end card. Everything since has been AMD.
I know exactly what you meant, and I even agree with you on your points, it's just hard to take those points seriously when you're using a half-decade-old name.
Re: (Score:2, Insightful)
As a former ATI employee, I think many will join me in lamenting its demise. Was a great place to work, and I enjoyed contributing to some cool products.
As a former AMD employee (via the buy-out), all I can say is AMD sucked. First the "green" management couldn't get a product to production, year after year after year. Now they're trying to play as a mostly "red" team, and unfortunately don't have half the staff that made ATI work well. Rory was just weirdly goofy, and sometimes bizarrely candid.
Re: (Score:2)
That's simply not true unless you're talking about edge cases like gaming on Linux.
It so happens that I game on Linux.
Re: (Score:2)
Aye aye. AMD drivers have been shit on Linux for, like, forever. Your only option there is to get an NVIDIA card. If it draws less power than the competition, like now, even better.
Re: (Score:2)
That's no longer true.
The Catalyst drivers have made *huge* gains over the past year to year and a half, both in terms of performance and stability.
Civ V and the recent Civ-BE preview run just fine on my HD 7870.
Re:Awesome, I shall buy one in a year (Score:5, Informative)
Personally I love the GTX 750. It gives the biggest bang-for-the-buck [videocardbenchmark.net] and, running at about 55 watts max, it usually doesn't require a larger power supply. It can run completely off motherboard power from a 16-lane, 75-watt PCIe slot.
It's the perfect card for rescuing old systems from obsolescence, IMO.
The only trouble you might have is finding a single-slot-wide card if your system doesn't have room for a double-slot card, though in my case I found a double-slot card that I could modify to fit in a single slot of an old Core 2 Duo E8500 system.
And heat doesn't seem to be a problem at all, even with the mod I did. The low power of the card means less heat. Even if heat becomes a problem, the card is capable of slowly clocking itself down, though I've never seen that happen, even running FurMark.
Re: (Score:2)
Tom's Hardware has its latest January update for graphics comparisons. The R7 260X is the best bang for the buck; the 750 Ti is not quite as good value, but as you don't like ATI it may be a better choice. It all really comes down to why you are upgrading and what you play, though.
http://www.tomshardware.com/re... [tomshardware.com]
Re: (Score:2)
I managed to get a GTX 750 Ti for about $80 USD (I price matched it at $120 CAD and it had a $20 MIR, so it ended up being $100 CAD plus tax).
I bought it because my old 6850 wasn't running Far Cry 4 all that well. The GTX 750 Ti runs it like a champ at 1080P with settings between medium and high. Huge performance increase over the 6850, it's much quieter, doesn't need an external power connector, and I even sold my old card to someone for $70.
Re: (Score:1)
Right now the GTX 750 is the best price/performance. Saving the $25 to $35 just isn't worth it if you are a gamer.
If you are not a gamer, then sure.
Midrange? (Score:1)
I think the 'range' depends on what resolution you are playing at..
For 3840x2160 - Low end
For 2560x1440 - 'Midrange'
For 1920x1080 - High end
Re: (Score:2)
For 3840x2160 - Low end
Are there any good monitors at that resolution, though? I bought a pair of 27" screens this holiday season and ended up opting for 2560x1440, because the 3840x2160 options were all terrible for gaming, with pretty much any video card it seemed.
So if you are going to shell out for a top-line NVIDIA card, what monitor are you pairing it with? A 30Hz UHD monitor with high lag and latency?
I don't get the logic of that.
I couldn't find a 3840x2160 screen that was remotely any good, at least at the time.
Re: (Score:1)
Two of my friends bought 40" 4K Samsung TVs (HU7000) on Black Friday for around $600. I find that the size is just perfect, coming from multiple 30" 2560x1600 screens. The colors were a bit off, but it had not been calibrated. The intensity was adjustable to a working range, and playing Battlefield 4 on it was fast enough to finish 1st on a 64-player, pistols-only hardcore server :) Last I heard they haven't been able to get 4:4:4 chroma working, but it's supposed to be fixed by a firmware upgrade.
Re: (Score:2)
I have an Asus 4K monitor, 1 ms GtG, 60Hz over DisplayPort. It works pretty well, though for gaming I am still running 2560 on a single GTX 980.
Re: (Score:2)
I have an Asus 4K monitor, 1 ms GtG, 60Hz over DisplayPort
1 ms GtG, though, means a TN display, right? If performance/gaming is your primary and only driving consideration, that's fine.
But I wanted something that does better with picture quality and color representation than a TN panel will deliver. I ended up with Asus as well, but selected a 27" QHD PLS-based panel (the pair of which I'm very happy with so far).
But I know they'll be obsoleted by really good 4K stuff soon.
Still, the fact that you are choosing to game at 2560 rather than 4K says something.
Re: (Score:2)
"choosing to game at 256" the screen size comes into play. i would play 4k with maybe a 32+ inch screen but then it may be too close for a desktop experience. I output to a 4k projector if I truly need color corrected picture quality, plus my old eyes really appreciate the beauty of high res but at a much bigger screen.
Re: (Score:2)
the screen size comes into play. I would play 4K with maybe a 32+ inch screen, but then it may be too close for a desktop experience. I output to a 4K projector if I truly need color-corrected picture quality; plus, my old eyes really appreciate the beauty of high res, but on a much bigger screen.
I don't get this at all. The only reasons ever not to game at the screen's native resolution are:
a) framerate losses from pushing more pixels
b) poor game design where the fonts become unreadably small
Re: (Score:2)
In one game, Wasteland 2, 4K native is too small for my eyes, so I stick with 2560.
The other game, Star Citizen, looks beautiful in 4K, but the game is in development and needs optimization, and even then I will need a second GTX 980, and maybe a third, because the game can use whatever you throw at it.
Re: (Score:3, Informative)
I never said that I'm using 4k.
After your response I think my original comment might make me look like a resolution whore.
My point was NOT that the card should be labeled low end because of 4k. Rather, it should be labeled high end.
I play at 1920x1080 resolution. So from my perspective, the card is high end.
It's just that cards get labeled by series (like NVIDIA's x60 series being midrange), which is primarily based on price.
When there are graphics cards available from $30 to $3000, the low, mid, and high end will always be relative.
Re: (Score:2)
I'm using an Asus PB287Q with a GTX 970. It sits on the boundary: Elite Dangerous and Wolfenstein are comfortable in 4K, while Far Cry and Metro need to drop to 2560x1440 to hit 50-60fps. Anything under 40fps is unplayable on this combo; it's not so much that it looks bad, it's that the input lag seems to jump below that rate.
The monitor looks alright at 1440p, a little soft and washed out but still better than my previous monitors in their native modes. In 4K the picture is unbelievable. At this size and sitting as close as I do, 1080p is painful to watch.
Re: (Score:2)
Asus PB287Q with a GTX970
Yeah, that's a TN panel. It's good for gaming, as it gets the response times where they need to be, but it's not really suitable for anything that requires an accurate color space, which was one of my requirements.
It's also telling that even with a GTX 970 you are finding running at 4K to be a bit hit and miss.
1080p is painful to watch.
And that's unfortunate too, because 99% of content is not yet available in QHD/UHD, so you're going to be looking at a lot of 1080p content for a while.
Re: (Score:1)
It really depends on the game and your expectations for minimum framerate. If you want lots of graphics detail and high minimums, 1920x1080 still requires high-end graphics.
Re: (Score:2)
Nah, it means that NVIDIA offers a range of graphics cards for gaming, and this one is not "cheap and nasty" but it's not "top end" either. It's sort of near the middle of the range.
Slashdot affiliated with Hothardware (Score:2)
There are other, better reviews of this card, for instance at Tom's Hardware, but every single hardware review story on Slashdot seems obliged to link to HotHardware.
So, which editor owns HotHardware shares?
Re: (Score:2, Interesting)
Tom's a whore; I wouldn't be surprised at all if the ranking or quality of a product in his reviews is impacted by the amount of money he was paid. Some of the recommended products he's had over the years indicate to me that the site's "approval" is for sale. I haven't been there in years, so it might have changed, but once someone sells out you shouldn't trust them again.
Some people say it's too pricy. (Score:1)
Midrange NVIDIA cards always seem to launch a little expensive. There's always an older model from AMD with better value on paper.
But I'd take this in a heartbeat over an AMD counterpart. The Maxwell chips are leagues ahead of anything AMD's got: very low power consumption and solid performance with great features. Maxwell is a huge leap over their previous offerings, with cards that are 150% faster while consuming 60% less energy than their previous-generation counterparts.
"AMD has shitty drivers" is an old meme at this point.
Re: (Score:2)
I decided to go NVIDIA this round; I had issues with my previous AMD GPU where certain functions didn't work properly. The one that annoyed me the most: I have a 1920x1200 monitor, an older model that won't do 1:1 pixel scaling, so if it's fed a 1920x1080 signal it will stretch it to full screen. The AMD software could be set to maintain the aspect ratio, so that didn't bother me, but after a certain update that feature stopped working, and no matter how hard I tried, it would always stretch the image.
Re: (Score:2)
GPUs are easy to change; 16:10 monitors, not so much. There's no way I'm departing from 16:10!
This. Love my 16:10s. They're getting harder and harder to find, so I can only hope mine keep holding up.
Re: (Score:2)
I gave up and went 16:9. I cheated though; I didn't drop to 1080 height, I switched to 2560x1440.
The aspect ratio is less important to me than having good vertical resolution. This way I get more vertical than 1920x1200 and the bonus of a few extra pixels on the sides too.
Re: (Score:2)
It might be an old meme, but it's still true, especially for OpenGL. Their D3D is OK if you plan to run games tweaked for that particular version of the driver.
Re: (Score:2)
With one exception: the R9 280X when used for DP floating-point compute.
For about $250 you can get an R9 280X that will do one trillion double-precision floating-point operations per second. That's about 10x faster than the Maxwell cards.
With such a card AMD should have had the scientist/engineer space for GPGPU locked up by now.
But, you know, they're AMD, so...
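For the curious, here is the rough arithmetic behind that claim, using the commonly published shader counts, clocks, and DP ratios (supplied by hand; treat the exact numbers as approximate): theoretical FLOPS = shaders × 2 (an FMA counts as two ops) × clock × DP ratio.

```cuda
// Illustrative peak-FLOPS comparison; figures are published specs.
#include <cstdio>

int main()
{
    // Radeon R9 280X (Tahiti): 2048 shaders @ ~1.0 GHz, DP = 1/4 of SP
    double sp_280x = 2048 * 2 * 1.0e9;
    printf("R9 280X: %.1f SP TFLOPS, %.2f DP TFLOPS\n",
           sp_280x / 1e12, sp_280x / 4 / 1e12);

    // GeForce GTX 980 (GM204): 2048 shaders @ ~1.13 GHz, DP = 1/32 of SP
    double sp_980 = 2048 * 2 * 1.13e9;
    printf("GTX 980: %.1f SP TFLOPS, %.2f DP TFLOPS\n",
           sp_980 / 1e12, sp_980 / 32 / 1e12);
    return 0;
}
```

That works out to roughly 1.0 DP TFLOPS for the 280X versus ~0.14 for the GTX 980, which is the gap the parent is describing.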
Re: (Score:2)
A legit complaint I can see is the 2GB of memory. Modern games are starting to crave lots of memory. I suspect NVIDIA may be gating that feature to higher-tier SKUs, or maybe we'll see 4GB cards not long after launch.
That's the bit that confuses and disappoints me. The consoles seem to be expecting 3GB of graphics RAM so it feels pretty asinine to release a PC card with only 2GB. Surely matching console performance in a midrange PC is a fairly basic expectation?
Proposed New Story Section: (Score:2)
C'mon, guys, this is copy-pasted marketing fluff. Better is expected of you.
Not all that impressive (Score:2)
From the reviews I've read, it's cool, quiet, and has all the latest features, but in the end it has almost the same performance as the 760. Where the 970 went really aggressive on pricing, the 960 looks to be their "money maker," which TechPowerUp called "a cheap-to-make GPU they paired with an extremely cost-efficient PCB design that has loads of margins in it for future price wars with AMD".
Not that I think AMD is in any mood for price wars after their Q4 financials; they posted a $330 million loss.
Re: (Score:2)
I practically begged a friend to get a 970 in the PC he bought a week ago. Their pricing was so aggressive that as soon as you got within £40 of it, the performance uplift was just too appealing to turn down. You get a PC that'll play new-release games beautifully at 1920x1080 for another 4-5 years instead of just 2-3.
Sounds Familiar (Score:2)
128-bit Memory Interface? (Score:3)
Re: (Score:2)
What's funny is that before the mid-2014 crypto-coin mining craze, the R9 270X was priced well below $200 CAD.
Re: (Score:2)
Normally the x60 card uses a 192-bit bus, but NVIDIA's betting big on its texture compression and insane clock rate. Same reason the 980 uses a 256-bit bus instead of 384-bit.
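A rough sketch of that bet (the ~25% traffic reduction used here is NVIDIA's marketing claim for Maxwell's delta color compression, not a measured figure): if compression cuts memory traffic, a narrow bus behaves like a wider one.

```cuda
// Effective bandwidth if compression removes ~25% of memory traffic.
#include <cstdio>

int main()
{
    double raw    = 128.0 / 8 * 7.0;       // GTX 960 raw: 112 GB/s
    double saving = 0.25;                  // claimed traffic reduction
    printf("raw: %.0f GB/s, effective: ~%.0f GB/s\n",
           raw, raw / (1.0 - saving));     // ~149 GB/s equivalent
    return 0;
}
```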
Shorter card (Score:2)
I'm glad to see ASUS understands that a lot of gamers have a small PC. Their Asus Strix GeForce GTX 960 looks like it might fit in a Cooler Master Elite 110.
Re: (Score:2)
I can't speak much for NVIDIA products, but Sapphire makes a short AMD-based card [newegg.com] that would fit in an Elite 110. I'm sure there are others (you might have to put the power connector into the cutout for the front panel if it's a card-rear power connector model).
Re: (Score:2)
Stupid gamepads with tiny, over-sensitive and useless analog thumbsticks? No thanks.
Re: (Score:2)
tiny, over-sensitive
You are a PC gamer, you've spent years not using your thumb for anything except the space bar. You will have issues when you try to use the thumbs on a gamepad because you haven't developed the skills/dexterity to do so.
The people who have been using their thumbs can use those thumbsticks, d-pads and buttons just fine.
Re: (Score:2)
I can use a D-pad and buttons just fine, I even grew up with consoles, starting with the Intellivision. I just never was able to use those thumbsticks - they're too sensitive and don't allow the same precision as a mouse.
There's a reason why first-person shooter games don't let PC gamers play with console gamers. The console gamers would have no hope against the PC gamers.
Re: (Score:2)
You mean like the one I use on my PC? I call it a controller, but that's because I perceive a gamepad to have no analog thumbsticks.
It's the joy of a PC. I can use keyboard and mouse for strategy, FPS and RPG games (online / MMO / offline or otherwise), use HOTAS for flight/space sims, use a wheel or a controller for driving games and just use the mouse for Hexcells.
Or combine options. For Saints Row III the keyboard and mouse give me the control and precision I want for moving around on foot; then I switch to the controller for driving.
Re: (Score:2)
There's a short 970 from Asus that will fit in a 110 for sure.
Will it play Batman Arkham Knight? (Score:2)
My problem with this new card is that, as far as I can tell, it barely meets the minimum requirements for some next-gen games. Isn't the minimum requirement for Assassin's Creed Unity something like a GeForce 680?
The specs for all new games are out of whack with the latest generation of video cards. If you buy a $200 card, you should be able to play any game released in the same year (though probably not on Ultra).
I just don't want to pull the trigger on a $350 video card, but it looks like game devs are making that the entry level for new releases.
Re: (Score:3)
That is because it's Unity. Unity doesn't run well on anything.
And I do mean anything. No hardware in the world runs Unity well. Not because the hardware is bad, but because Unity is a buggy piece of shit that won't run well even on SLI'd 980s.
As for "playing games released in the same year," I have yet to find a game that won't work with my 560 Ti, though for many recent releases I had to bump quality down to medium to get a solid frame rate.
Re: (Score:2)
Those minimum requirements are a joke. I've got a 580 and I can run everything at full detail for the monitor resolution I have. None of them even stress the card. The fact is most games are designed, spec-wise, around consoles and won't even stress five-year-old graphics cards.
Re: (Score:2)
I've been using a 550 Ti for the last three years and it's been great. It could be better, I suppose, but I'm not some elite gamer who needs 60 FPS no matter what, or photo-realistic video quality.
That said, I just worked a weekend and a holiday, so with a little extra cash coming this paycheck I ordered my EVGA 960 last night. While the performance may not be spectacularly better than the previous generation, it will be a big upgrade for me, and with its lower power requirements it'll fit nicely in my computer.
Weaksauce (Score:2)
Re: (Score:3)
You lost me at "used".
Re: (Score:1)
If you like dealing with half-assed drivers then maybe.
Re: (Score:2)
No, the card in question is the Radeon R9 280X, which is a rebranded, higher-clocked 7970 and does over 3 teraflops.
It's priced cheaper but is significantly faster (depending on the game) and has 3GB of memory, not 2GB. Sure, it uses about 200 watts.
Too late. (Score:1)
AMD has HBM coming (Score:2)
There's finally a reasonable bump coming this year, from cards with a new architecture and a totally different RAM system offering up to 9x higher bandwidth.
No doubt they'll bleed us with slow bumps and increments initially, but nonetheless I suspect the first significant bump in years might occur when these cards come out in the next couple of months. Look for the 9xx series to drop in price if you prefer NVIDIA, but unless you urgently need an upgrade right now, apply some patience and wait to see what ships.
Might want to wait a few months... (Score:1)
says the guy who bought a 980 just before Christmas. Yeah... hypocrisy much?
However, be aware that minimum specs for games are in a bit of a state of flux at the moment. In some senses it's not before time; they've risen only very slowly for many years, as development of most games was targeted first and foremost at the Xbox 360 and PS3, with PC versions usually not receiving much more than a few cosmetic upgrades. For quite a few years now, a reasonably recent i3 or middle-aged i5 (or AMD equivalent) and a solid midrange graphics card have been enough.
TDP? (Score:1)
120 Watts
The summary should have included this.