NVIDIA GeForce GTX 780 Offers 2,304 Cores For $650
Vigile writes "When NVIDIA released the GTX Titan in February, it was the first consumer graphics card to use NVIDIA's GK110 GPU, with 2,688 CUDA cores / shaders and an impressive 6GB GDDR5 frame buffer. However, it also had a $1000 price tag, and that was the limiting specification for most gamers. With today's release of the GeForce GTX 780, NVIDIA is hoping to utilize more of the GK110 silicon it gets from TSMC while offering a lower-cost version with performance within spitting distance. The GTX 780 uses the same chip but disables a handful more compute units to bring the shader count down to 2,304 — still an impressive bump over the 1,536 of the GTX 680. The 384-bit memory bus remains, though the frame buffer is cut in half to 3GB. Overall, the performance of the new card sits squarely between the GTX Titan ($1000) and AMD's Radeon HD 7970 GHz Edition ($439), just like its price. The question is, are PC gamers willing to shell out $220+ MORE than the HD 7970 for somewhere in the range of 15-25% more performance?" As you might guess, there's similarly spec-laden coverage at lots of other sites, including Tom's, ExtremeTech, and TechReport. HotHardware, too.
Re:Still? (Score:5, Informative)
Is anyone else getting real tired of companies purposely crippling their high end products in order to sell them for less money? It's like openly broadcasting that their cards cost way too much to begin with.
It's a question of tolerances. The chips that come out of the fab are not 100% perfect. The designs are amazingly complex, and the manufacturing process usually introduces some defects. If a die doesn't meet the high-end spec, maybe they can disable the broken cores, relabel it as a mid-range chip, and sell it for less money. That lets the usable yield be higher, and it lowers the price for ALL of the products.
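A rough sketch of why that helps the economics, assuming a toy binomial defect model (the cluster count matches GK110's 15 shader clusters, but the per-cluster defect rate below is invented purely for illustration):

```python
from math import comb

# Toy yield model: a GK110-class die has 15 shader clusters (SMX units), and we
# assume each cluster independently has some chance of containing a fatal defect.
CLUSTERS = 15
P_DEFECT = 0.05  # hypothetical probability that any one cluster is bad

def p_at_most(bad_allowed: int) -> float:
    """Probability that a die has at most `bad_allowed` defective clusters."""
    return sum(
        comb(CLUSTERS, k) * P_DEFECT**k * (1 - P_DEFECT)**(CLUSTERS - k)
        for k in range(bad_allowed + 1)
    )

print(f"Usable as a full 15-cluster part:            {p_at_most(0):.1%}")
print(f"Usable with 1 cluster fused off (Titan-ish): {p_at_most(1):.1%}")
print(f"Usable with 3 clusters fused off (780-ish):  {p_at_most(3):.1%}")
```

With those made-up numbers, less than half the dies would qualify as flawless parts, but nearly all of them become sellable once a few clusters are allowed to be fused off.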
Re:Still? (Score:4, Informative)
As long as Nvidia keeps crippling double-precision performance on their (non-Tesla) cards, I'll keep buying AMD.
One of the highlights of the GTX Titan was that the card could do double-precision floating point at its full rate (1/3 of the single-precision rate), just like Nvidia's Tesla products. That's no longer the case here - the GTX 780 performs double-precision at 1/24 of the single-precision rate, just like a standard desktop GPU.
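For a sense of scale, here is a back-of-the-envelope sketch of the theoretical throughput, assuming the approximate published base clocks (roughly 837 MHz for the Titan and 863 MHz for the GTX 780) and two floating-point operations per core per clock:

```python
# Rough theoretical FP throughput; clock figures are approximate published
# base clocks, not measurements, and real workloads won't hit these peaks.
def gflops(cores: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    return cores * clock_ghz * ops_per_clock

titan_sp = gflops(2688, 0.837)   # ~4,500 GFLOPS single precision
titan_dp = titan_sp / 3          # Titan: FP64 at 1/3 the FP32 rate
g780_sp  = gflops(2304, 0.863)   # ~3,980 GFLOPS single precision
g780_dp  = g780_sp / 24          # GTX 780: FP64 capped at 1/24 the FP32 rate

print(f"Titan   FP64 ~ {titan_dp:,.0f} GFLOPS")
print(f"GTX 780 FP64 ~ {g780_dp:,.0f} GFLOPS")
```

So for double-precision compute the gap is closer to an order of magnitude than the 15-25% gaming difference quoted in the summary.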
Re: (Score:2, Insightful)
Alternate description: "Nvidia lowers the cost of standard desktop GPUs by not including features for high-speed high-accuracy functions that serve no purpose in gaming".
Re: (Score:2)
Cutting silicon with a laser to cripple it does NOT lower the cost.
Re: (Score:2)
Uhm... no. They're cutting double-precision floating point performance through the driver! Hardware is otherwise the same.
Re: (Score:1)
As long as Nvidia keeps crippling double-precision performance on their (non-Tesla) cards, I'll keep buying AMD.
They aren't the only ones: http://youilabs.com/blog/mobile-gpu-floating-point-accuracy-variances/ [youilabs.com] (although this is targeted mainly at mobile GPUs, I suppose the same or very similar can be said about desktop GPUs)
Re: (Score:1)
The chips that come out of the fab are not 100% perfect.
While this may be true for these graphics cores, I don't think it's necessarily true for Intel's CPU chips. I think they have their design so refined that their yield is close to 100% for all but the highest density cores.
Otherwise they simply would not be able to offer multi-core chips. Maybe someone in the know could comment on this.
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
AMD sold tri-core processors for a while - most if not all of those were just quad-cores with one core either non-functional or intentionally crippled. Pretty smart move.
Re: (Score:1)
Re: (Score:1)
Re: (Score:3)
Re: (Score:3)
Nope. The NVIDIA GK110 design has 15 clusters of 192 CUDA cores each (2,880 in total). On the Titan, one of those clusters may be defective and is turned off, giving you a total of 2688 CUDA cores. In my opinion, that's a nice way of increasing their usable yield. On the GTX 780, they can have three defective clusters, giving a total of 2304 CUDA cores. So chips that they would otherwise trash can still be used in a nice high-end card. Isn't redundancy in design nice?
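The arithmetic, spelled out as a quick sanity check (cluster counts as commonly documented for these chips):

```python
CORES_PER_CLUSTER = 192                 # CUDA cores per SMX cluster
full_gk110 = 15 * CORES_PER_CLUSTER     # 2880 on a flawless die
titan      = 14 * CORES_PER_CLUSTER     # 2688 -> one cluster fused off
gtx_780    = 12 * CORES_PER_CLUSTER     # 2304 -> three clusters fused off
gtx_680    = 8  * CORES_PER_CLUSTER     # 1536 -> the smaller GK104 die (8 clusters)
print(full_gk110, titan, gtx_780, gtx_680)
```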
Re: (Score:2)
AMD did this back with their Phenom X3s. There was such high demand for the low-cost triple-core processors that they disabled a core on some four-core processors and sold them as X3s till they ramped up production. I was lucky enough to get one of the four-core processors and was able to re-enable the fourth core through my BIOS.
Graphics cards (Score:1)
Re: (Score:1)
You can still buy $2500+ video cards. The difference is they are targeted to the professional crowd who need warranty and support on professional applications like CAD software, Photoshop, and the sort. The FirePro (AMD) and Quadro (NVidia) lines aren't exciting as far as gaming and hardware goes, but you're not paying for gaming, you're paying for the support and certification that it will work for various professional applications.
Re: (Score:3)
Upwards of $1000 for a consumer-grade video card? I've spent less on road-worthy vehicles.
Re: (Score:2)
Meanwhile, my $200 graphics card runs all the games I own at max or high settings at 1920x1080.
This is a card for fanatics who want to run six monitors at > 1920x1080, not Joe Sixpack who bought a PC from Walmart.
Re: (Score:2)
Completely agree, the current game market can't utilize this. Further, the irony here is that most games don't work well in a multi-monitor environment. I still find myself turning my other one off to avoid accidental misclicks. The only game I've played that effectively compensates is Starcraft II; everything else has been an alt-tab PITA.
I think maybe where you get your money's worth here is that you can fire up VLC, a game, and Netflix together, which is basically what you're saying, but how many people is that worth it to?
Re: (Score:2)
Actually, these cards are legitimately used for the following reasons:
1. 120 FPS at 1080p in high-end games (usually needed for proper 3D).
2. Extremely high-resolution gaming at high detail (4K).
Notably, even the Titan chokes when trying to do both: rendering the most graphically intensive games at 4K, at 120 FPS, at high detail levels.
Re: (Score:2)
> at max or high settings at 1920x1080.
Good luck getting 60+ Hz on a $200 video card with modern (2013) games - you'll most likely be playing at a crappy 30 frames per second in titles like Tomb Raider (2013), BF3, etc. at Ultra settings.
I bought my Titan for a) Win & Linux CUDA, and b) to game at 120 Hz on the Asus VG248QE (the Asus VG278H is also good) using LightBoost, because I can tell instantly when the frame rate drops from 60 Hz down to 30 Hz.
/me *glares at Path of Exile*
Not everyone gives a
Re: (Score:2)
The key part in the statement is the:
"games I own"
You simply aren't playing games that need as many cycles as you can get, but they do exist. Simulations like DCS: World, DCS: A-10C, and DCS: Ka-50 all need a large amount of graphics power when you are at altitude. The Armed Assault III simulator also makes a lot of use of your CPU and GPU, since you are running around maps that are much bigger than the postage-stamp sizes of most games (e.g. the pitiful maps in the Call of Duty and Battlefield series).
I'd suggest checking
Re: (Score:1)
Re: (Score:2)
Nobody snivels at hardware at any price, as long as they're using it to make enough money to offset its cost. There are workstation GPUs just as pricey even now that people use for work.
Re: (Score:2)
what happened to 6 shader cores being a big deal? now we're in quadruple digits? holy cow.
Seriously? (Score:2)
Where have you been since 1997? Since then you've been able to consistently spend roughly $200 per video card and play the latest games at acceptable settings. That's why people sneer at a $1,000 card, let alone a $650 one, when that much money by itself could build a really fast computer.
Re: (Score:2)
"Im a little surprised at people snivelling over 1000.00 video cards."
Here's the problem: they're pulling the same blunder that got Intel in hot water back in the 90s.
People aren't stupid (although many of them act that way sometimes). If you can take a part that sells for $1,000, disable some of the functionality, and sell it for $650... then you can sell the whole unit for $650. It's a ripoff and people know it.
Now, if it's a matter of disabling cores that don't pass testing anyway, that might be an effective way to dump "defective" parts on the market and still profi
Re: (Score:2)
People aren't as stupid as you make them out to be (or in fact are). The costs in chip production are not in manufacturing alone - else the Taiwanese would have won the chipmaking competition long ago. Design costs are very significant, as are the costs of setting up the production process.
Re: (Score:2)
"The costs in chip production are not in manufacturing alone - else taiwanese would have won the chipmaking competition long ago. Design costs are very significant, as are the costs of setting up the production process."
And that is 100% irrelevant to the point I was making. Just as with Intel, the "disabled" chips only became available well after the full-featured chips. The design didn't happen until the manufacturer decided to create a branch of the product line from the existing branch.
Do you not remember the customer revolt over Intel doing this? I was systems manager when it happened. And believe me, people were pissed off.
Re: (Score:2)
When you're a business, you'll piss people off for the weirdest reasons.
Doesn't mean you have to care every time someone gets pissed. Else you'd be doing nothing but caring about pissed-off people. For the record, I don't think anyone who wasn't a "systems manager" was revolting. At least I never even heard about a revolt when Intel started doing it. The reaction was "meh, whatever, I still get the chip I paid for" at worst.
Re: (Score:2)
And so much of what people "know" is wrong.
It's entirely possible that selling only at $650/chip Intel would lose money, and selling only at $1000/chip Intel would lose money -- but selling at the two price points combined, Intel makes a profit.
Suppose Intel can sell 1 million chips at $650 but only 500 t
Re: (Score:1)
Now imagine that your company has correctly identified that pricing sweet spot, but some of your product is partially defective, though still useful. You cannot sell it at the same ideal price as the fully-featured product. So, you necessarily sell it at s
More Coors is always better (Score:5, Funny)
More Coors takes the pain of remembering away.
Re:More Coors is always better (Score:5, Funny)
Re: (Score:2)
Dude I think he covered that: Moore Coors.
Re: (Score:1)
Re: (Score:1)
Moore Coors?
Yes, the amount of Coors doubles every 2 years.
Coors vicious business model cycle! (Score:2)
1) Drink Coors
2) Drink more Coors to forget you are drinking Coors
3) PROFIT!!!
I can get an entire laptop for that cost (Score:5, Insightful)
I must have crossed the border into adulthood somewhere back there, because I would never pay that much for a performance uptick in a video game. I can get myself a nice new laptop for that cash, and it would still be proficient at 90% of today's games.
Re:I can get an entire laptop for that cost (Score:5, Insightful)
The good news is that all the "I've got to have the latest and best to make all my friends and e-buddies drool" crowd will start unloading their barely-used last generation cards on eBay, and those of us that want good performance at a good price will benefit.
Re: (Score:2)
It's a lot easier to afford a $650 video card when you're selling last year's card for $350.
Re: (Score:2)
Re: (Score:3)
1920x1080
High settings
1x AA (and honestly don't even need that when running at this rez or higher)
60+ FPS in nearly all games including your precious "modern 3d games"... because honestly they don't push any harder now than they did a bit ago.
the card that does this for me? a 4-year-old GTS 250
either your standards are too high, or you're deluding yourself as to the actual worth of running at 400000x rez with 20xAA and 50 bajillion gigawatts of memory.
probably both, as you've obviously fallen for the graphics
Re: (Score:3)
Note that a 2560x1600 panel has almost double (1.98x) the number of pixels of a 1920x1080 one, and given how ugly scaling tends to be, it can be entirely worthwhile to have a high end graphics card.
On the other hand, I still have a GTX 580 (and when I bought it, the mid-range card couldn't get a smooth framerate above 1920x1200), and I don't have any impetus to upgrade yet, as the difference isn't that great.
Re: (Score:2)
Your setup will barely run Battlefield 3 (it likely won't run at all, or will be an utter cripple, at ultra settings). Same for Crysis and other graphically intensive games.
But you will run many console ports just fine. After all, those are aimed at ancient hardware. Or at least you will for about a year more. Then console ports will become far more demanding as the console hardware gets upgraded.
The reality is, it's not that your card is fast. It's that your standards are very low.
Re: (Score:1)
You should thank God (in whatever form you worship his Noodly Appendages) that people do buy high end devices such as graphics cards. These spendthrifts create the market and provide the initial downward push on prices so that the rest of us can afford them.
I still remember paying £2500 ($4000) for a 486DX-33 with 8MB of RAM and a 200MB hard drive. Those were the days.....
Well, some people like to spend money on hobbies (Score:2)
Seriously, for some people, gaming is their hobby and that kind of money is not that much when you talk what people spend on hobbies. My coworker just bought himself like a $2000 turbo for his car, to replace (or augment, I'm not sure) the one that's already there. He has no need for it, but he likes playing with his car.
Now that you, and most others, don't want to spend that kind of money is understandable and not problematic. There's a reason why companies have a lineup of stuff and why the high end stuff
2,304 cores = 1 line of 2K HD (Score:1)
Ten years more and it will be one core per pixel. That's insane.
Re: (Score:2)
Sounds more like a regional thing.
Here in upper/middle NJ, $1000 gets you a pretty bare-bones apartment in most towns. Maybe a 2-bedroom in a worse area.
Unless you're splitting rent with someone, $1000 doesn't carry much weight here.
Re: Are you insane? (Score:2)
Re: (Score:2)
Yes, but people living in higher cost of living centers generally have higher relative incomes as well. So it is less of a proportional hit on the budget than elsewhere.
Re: (Score:2)
Great, yet another reason I wish I was Canadian.
NJ has a high cost of living, though it varies: if you want to live in western NJ near the PA border it's a little cheaper, and if you move way down to South Jersey it's a little cheaper still. Though I haven't seen a nice apt for $500.
But unfortunately there are a lot more jobs in northern NJ, and it's not worth driving 1+ hours each way let alone 2.
Re: (Score:2)
Hell, I could pay both my rent and electricity bill for two fucking months for the price of one GTX Titan card.
Then you're not the target market.
Some people buy $250,000 cars, some people buy $1,000 video cards, some people pay $2500 for Mac with the same hardware as a $1000 Windows PC. To most of us, they're just a curiosity.
Re: (Score:2)
At $650, the GTX 780 costs less than one monthly rent payment for me.
Well, that's lack of competition for you... (Score:3)
If this was a previous generation where AMD was actually still competitive, Titan would have been the high end part, and it would have cost $500 instead of $1000. The part known as GTX 780 would have been a slightly depopulated part capable of 90% the performance for a 20% savings or so and the rest of the line would have fallen under those two. Since AMD is no longer really a threat in the high-end GPU space, Nvidia can literally maintain the MSRPs of the old parts as if the new parts are merely higher performing extensions of the previous generation without any downward pricing pressure on anything.
I think you're missing something here (Score:2)
Their problem is that large-die processors are getting extremely expensive to implement compared to how it used to be. We used to see previous-generation processes used for high-end cores because the process maturity overcame the extra cost of the large die. But now that large dies are prohibitive (and assuming prices cannot grow), the graphic
Re: (Score:2)
What on earth are you talking about? AMD is very competitive still.
7870 vs 660 Ti: similar price, similar performance.
7970 GHz Edition vs. 680: Similar price, similar performance.
The two companies are battling it out in every segment, with neither having a clear lead anywhere. The exception is the GTX Titan and the 780 - both of which are brand new cards, and AMD just hasn't yet released its new batch of cards. If AMD takes months to come out with something, then they will no longer be competitive. But ri
Re: (Score:2)
For me, drivers are more important than hardware. The difference in speed between the flaky, wonky proprietary drivers and the fairly steady but dog slow open source drivers are on the order of 10x.
They still aren't competing for the Linux market. I have older low end stuff, an AMD machine (Phenom II with an HD 5450), and an Intel+Nvidia machine (Core 2 Quad Q6600 with a GeForce 8500GT that I recently replaced with a fanless GT 610), and in neither can I get satisfactory Linux support. The proprietary
Re: (Score:2)
For me, drivers are more important than hardware. The difference in speed between the flaky, wonky proprietary drivers and the fairly steady but dog slow open source drivers are on the order of 10x.
Not for AMD. Phoronix has plenty of benchmarks; the open source drivers get about 80% of the performance of the proprietary ones.
They still aren't competing for the Linux market. I have older low end stuff, an AMD machine (Phenom II with an HD 5450), and an Intel+Nvidia machine (Core 2 Quad Q6600 with a GeForce 8500GT that I recently replaced with a fanless GT 610), and in neither can I get satisfactory Linux support. The proprietary driver on the Nvidia box has the best performance. Next best is the AMD box with the open source driver. (I haven't tried Catalyst, so I don't know how good AMD can be.) The Nouveau driver is horrible for 3D acceleration. ATI/AMD has repeatedly promised they would help open source drivers use the full potential of their hardware, but thus far they haven't delivered. NVidia has flat out refused to help, and has tried to claim that keeping their proprietary driver up to date is being supportive of open source.
The Linux market is full of masochists that continue to purchase and recommend the company that hates them (Nvidia) and shun the one that's actually doing what the community is asking for (AMD). AMD *has* delivered on the open source drivers. They *have* delivered on the specs. Everything the community has asked for, AMD has done. And yet, the community continues to buy Nvidia while compl
Yowsa (Score:1)
Imagine a Beowulf cluster of these babies rendering images of Natalie Portman covered in hot grits!
$500 is already close to insanity (Score:2)
Re: (Score:2)
Re: (Score:2)
Doesn't that advice apply to just about everything? TVs, cars, medicine...The ultra high end or bleeding edge technology usually isn't "worth it". Yet that edge always seems to move along and what is today's best, most expensive tech is tomorrow's everyday tech.
Depends on what your target is (Score:2)
If you want higher resolutions and frame rates, you need more powerful GPUs to handle it. For example moving to 2560x1600 or to 120fps doubles the pixel requirement over 1920x1080@60fps. So whatever amount of power you needed to achieve 1080p60, double that for either of those targets. 4k will require a quadrupling, and 120fps 4k would require 8x the power.
All this is assuming you are getting 60fps in the first place. Now maybe you are fine with trading off lower frame rates, or lower resolutions, that's al
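The scaling described above is just pixels-per-second arithmetic; here is a quick sketch (the resolutions and refresh rates are the common ones, nothing vendor-specific):

```python
def pixel_rate(width: int, height: int, fps: int) -> int:
    """Pixels the GPU has to shade per second at a given resolution and frame rate."""
    return width * height * fps

baseline = pixel_rate(1920, 1080, 60)
targets = {
    "2560x1600 @ 60":      pixel_rate(2560, 1600, 60),
    "1920x1080 @ 120":     pixel_rate(1920, 1080, 120),
    "3840x2160 @ 60 (4K)": pixel_rate(3840, 2160, 60),
    "3840x2160 @ 120":     pixel_rate(3840, 2160, 120),
}
for label, rate in targets.items():
    print(f"{label}: {rate / baseline:.1f}x the pixel rate of 1080p60")
```

The output lines up with the claim above: roughly 2x for either 2560x1600 or 120 fps, 4x for 4K, and 8x for 4K at 120 fps.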
Re: (Score:2)
Try a seven-year-old game like Oblivion. Bethesda's next-generation 'Skyrim' was a downgrade in graphics to suit the Xbox.
Oblivion can easily max out multi-GPU card setups, so I'd guess you don't know what 'well' means.
It's not about the game, but about the quality and density of the textures -- do they approach what the naked eye can see? Can they support multiple monitors, or even one 4K monitor?
Considering the HDMI cable spec tops out at 1920x1080... that's a jolt right there when you want to run 2560x1600.
Problem is most of the progr
Re: (Score:2)
With the new consoles, we'll be back to seeing benefit from better video cards.
Except the new consoles' graphics will probably be just about on par with next year's integrated GPUs.
Bitcoin / Litecoin mining? (Score:2)
Until I was asked to write a few tech. articles on bitcoin and other virtual currencies last year, I didn't really pay a lot of attention to them. But I've learned that high end ATI video cards are pretty much the "engines" required for any respectable bitcoin/litecoin mining rig to work successfully.
(As a rule, nVidia cards have been ignored as "not as good of performers as ATI" for this specific use -- though I wonder how this GTX 780 would do?)
People building these mining rigs generally cram 3 - 4 of the
Re: (Score:2)
ASIC machines are coming on the market for bitcoin mining. They're blowing any GPU rig out of the water in terms of BTC mined/watt-hour. GPU-mining is officially dead.
Re: (Score:2)
Not for scrypt based coins like litecoin. GPUs are still the best option.
Re: (Score:2)
It'll perform a bit worse than a GTX Titan, which gets in the region of 330Mhash/sec [bitcointalk.org]. For comparison, an AMD HD5870 from 2009 managed about 400Mhash/sec.
Well? We're waaaaaaaiting! (Score:1)
Welp, it took about 5 years [cnet.com].
Can somebody put this into non-gaming terms? (Score:2)
We do love our big numbers, but there are limits to what our eyes can perceive in FPS. What does this mean for real world applications like video encoding and password cracking? How long do we anticipate having to wait for tech like this to get affordable? Also, how does this compare to the nVidia Tesla, the current gold standard in password cracking?
I saw only one reference to nVidia Tesla (and no references to password cracking or video encoding) in those reviews (@Tech Report [techreport.com]), and it might be damni
Not Just Paying For the Cores (Score:2)
Re: (Score:3, Insightful)
Implementation trumps architecture. There's a reason nobody who's interested in power efficiency, noise and/or heat uses AMD products.
Re: (Score:3)
Re:Still slower than AMD (Score:4, Informative)
In GPU terms, yes. The shaders and cores are very different between AMD and NVIDIA (that's why NVIDIA can have 1536 and compete with an HD 7970 that has 2048 shaders).
Re: Still slower than AMD (Score:2)
Re: Still slower than AMD (Score:5, Insightful)
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
GPU manufacturers have a tendency to use the word "core" to mean "one ALU in the middle of a vector unit". It's not really very different in principle from saying that an AVX unit is 8 cores, though, so you have to be careful with comparisons.
If you look at the AMD architecture, each compute unit is not so different from the cores you see on the CPU side, so it's much fairer to call the 7970 a 32-core chip. The way that a work item in OpenCL, say, or a shader instance in OpenGL maps down to one of
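To make that concrete, here is how the marketing "core" counts decompose for the two chips discussed in this thread (the per-unit breakdowns are the commonly documented ones, used here just for illustration):

```python
# Vendor "cores"/"shaders" are really vector lanes (ALUs), not CPU-style cores.
# HD 7970 (Tahiti): 32 compute units, each with 4 SIMDs of 16 lanes.
hd7970_cus, lanes_per_cu = 32, 4 * 16
# GTX 680 (GK104): 8 SMX units, each with 192 CUDA "cores" (lanes).
gtx680_smx, lanes_per_smx = 8, 192

print("HD 7970 'shaders':", hd7970_cus * lanes_per_cu)    # 2048
print("GTX 680 'cores':  ", gtx680_smx * lanes_per_smx)   # 1536
# Counted the CPU way, the 7970 looks more like a 32-core chip with wide vector units.
```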
Re: (Score:3)
Re: (Score:1)
Re: (Score:1)
Since WoW is CPU-bound the answer is a clear, resounding "it depends".
Re: (Score:2)
Re: (Score:2)
and still have a PC that beats your 8-year-old console, since the console hasn't been / can't be upgraded when it comes to hardware.
This is something I don't get. Why does the existence of both PCs and consoles have to be a "competition" that "just one" wins? There's room for both.
Sometimes I think PC gamers prefer playing benchmarks for bragging rights to actually playing games.
Re: (Score:2)
An eight year old console will play the latest games better than an eight year old PC.
Re: (Score:2)
Then you are essentially a bit like a wheelchair-bound person arguing with experienced runners that you don't need >100 euro running shoes, because 20 euro ones from the supermarket are good enough. They are good enough for you, because you don't run.
Re: (Score:3)
Does AMD support VDPAU these days? Because VA-API support is mighty poor in my experience. I broke down and bought a card when I had a perfectly fine integrated one for my TV box, because I couldn't get VA-API to work with mplayer. There is a source version that supports it, though I couldn't get it to compile cleanly.
I also had a ton of trouble with the "legacy" vs. new ATI drivers (the computer was low-end, but only a few months old when this nonsense happened). Not sure what caused that split, but it was hell
Re: (Score:2)
Does AMD support VDPAU these days? Because VA-API support is mighty poor in my experience.
Yes, actually! A month or two ago, AMD released VDPAU support for their open source driver [phoronix.com].
Bizarrely, the closed source driver is still XvBA only.
I broke down and bought a card when I had a perfectly fine integrated one for my TV box, because I couldn't get VA-API to work with mplayer.
I did actually get VA-API / XvBA working on my AMD system, but it could only do h264 and MPEG2. You could forget xvid, forget advanced GPU deinterlacing, etc. Since it was a weak E-350 box, that left it able to play the highest bitrate bluray rips, but not broadcast TV (MPEG2, 1080i). Replaced it with an Intel Atom / Nvidia ION2 box.
Re: (Score:2)
Run Battlefield 3 at 4K with 3D glasses on a "standard 3D card" of your choice. Run it again on a Titan. This isn't audiophile stuff. There is a very real and easily quantifiable difference known as "frames per second".