Graphics Hardware

NVIDIA GeForce GTX 780 Offers 2,304 Cores For $650

Vigile writes "When NVIDIA released the GTX Titan in February, it was the first consumer graphics card to use NVIDIA's GK110 GPU, with 2,688 CUDA cores / shaders and an impressive 6GB GDDR5 frame buffer. However, it also had a $1,000 price tag, which was the limiting specification for most gamers. With today's release of the GeForce GTX 780, NVIDIA is hoping to utilize more of the GK110 silicon it gets from TSMC while offering a lower-cost version with performance within spitting distance. The GTX 780 uses the same chip but disables a handful more compute units, bringing the shader count down to 2,304 — still an impressive bump over the 1,536 of the GTX 680. The 384-bit memory bus remains, though the frame buffer is cut in half to 3GB. Overall, the performance of the new card sits squarely between the GTX Titan ($1,000) and AMD's Radeon HD 7970 GHz Edition ($439), just like its price. The question is, are PC gamers willing to shell out $220+ more than the HD 7970 for somewhere in the range of 15-25% more performance?" As you might guess, there's similarly spec-laden coverage at lots of other sites, including Tom's, ExtremeTech, and TechReport. HotHardware, too.
  • I'm a little surprised at people snivelling over $1,000 video cards. I was in printing for 10 years back in "the day", and a SuperMac Thunder card (with on-board JPEG acceleration) was $2,500.
    • by Anonymous Coward

      You can still buy $2500+ video cards. The difference is that they are targeted at the professional crowd who need warranty and support on professional applications like CAD software, Photoshop, and the like. The FirePro (AMD) and Quadro (NVidia) lines aren't exciting as far as gaming and hardware go, but you're not paying for gaming; you're paying for the support and certification that the card will work with various professional applications.

      • by TWX ( 665546 )
        I am so glad that I grew up in the 3dfx Voodoo days for my gaming. The cards were relatively cheap, and spending $200 on one seemed like a huge sum of money. Like, this-is-your-only-Christmas-present money.

        Upwards of $1000 for a consumer-grade video card? I've spent less on road-worthy vehicles.
        • by 0123456 ( 636235 )

          Meanwhile, my $200 graphics card runs all the games I own at max or high settings at 1920x1080.

          This is a card for fanatics who want to run six monitors at > 1920x1080, not Joe Sixpack who bought a PC from Walmart.

          • Completely agree; the current game market can't utilize this. Further, the irony here is that most games don't work well in a multi-monitor environment. I still find myself turning my other one off to avoid accidental misclicks. The only game I've played that effectively compensates is Starcraft II; everything else has been an alt-tab PITA.

            I think maybe where you get your money's worth here is that you can fire up VLC, a game, and Netflix together, which is basically what you're saying, but how many people is that worth

          • by Luckyo ( 1726890 )

            Actually these cards are legitimately used for the following reasons:

            1. 120 FPS at 1080p in high end games (usually needed for proper 3D).
            2. Extremely high resolution gaming at high detail (4k).

            Notably, even the Titan chokes when trying to do both: 4K rendering of the most graphically intensive games at 120 FPS at high detail levels.

          • > at max or high settings at 1920x1080.
            Good luck getting 60+ Hz on a $200 video card with modern (2013) games - you'll most likely be playing at a crappy 30 frames per second in titles like Tomb Raider (2013), BF3, etc. at Ultra settings.

            I bought my Titan for a) Win & Linux CUDA, and b) gaming at 120 Hz on the Asus VG248QE (the Asus VG278H is also good) using LightBoost, because I can tell instantly when the frame rate drops from 60 Hz down to 30 Hz (sketched below).

            /me *glares at Path of Exile*

            Not everyone gives a
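
            A minimal sketch of why that 60-to-30 drop is so abrupt, assuming plain double-buffered vsync on a 60 Hz panel (the numbers and function below are illustrative, not from any driver API):

            import math

            REFRESH_HZ = 60
            REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms budget per refresh

            def effective_fps(render_ms: float) -> float:
                # With vsync, a finished frame waits for the next refresh
                # boundary, so exceeding the budget even slightly halves the rate.
                refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
                return REFRESH_HZ / refreshes

            for ms in (10, 16, 17, 25, 34):
                print(f"{ms} ms/frame -> {effective_fps(ms):.0f} fps")
            # 16 ms -> 60 fps, but 17 ms -> 30 fps: the rate snaps rather than
            # degrading smoothly, which is exactly what you notice in-game.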

          • The key part of the statement is:

            "games I own"

            You simply aren't playing games that need as many cycles as you can get, but they do exist. Simulations like DCS: World, DCS: A-10C, and DCS: Ka-50 all need a large amount of graphics power when you are at altitude. The Armed Assault III simulator also makes heavy use of your CPU and GPU, since you are running around maps that are much bigger than the postage-stamp sizes of most games (e.g. the pitiful maps of the Call of Duty and Battlefield series).

            I'd suggest checking

    • by jma05 ( 897351 )

      Nobody snivels at hardware at any price, as long as they're using it to make enough money to offset its cost. There are workstation GPUs just as pricey even now that people use for work.

    • by dywolf ( 2673597 )

      What happened to 6 shader cores being a big deal? Now we're in quadruple digits? Holy cow.

    • Where have you been since 1997? Since then you've been able to consistently spend roughly $200 per video card and play the latest games at acceptable settings. That's why people sneer at a $1,000 card, let alone a $650 card, when that kind of money on its own could build a really fast computer.

    • "Im a little surprised at people snivelling over 1000.00 video cards."

      Here's the problem: they're pulling the same blunder that got Intel in hot water back in the 90s.

      People aren't stupid (although many of them act that way sometimes). If you can take a part that sells for $1,000, disable some of the functionality, and sell it for $650... then you can sell the whole unit for $650. It's a ripoff and people know it.

      Now, if it's a matter of disabling cores that don't pass testing anyway, that might be an effective way to dump "defective" parts on the market and still profi

      • by Luckyo ( 1726890 )

        People aren't as stupid as you make them out to be (or in fact are). The costs in chip production are not in manufacturing alone - else the Taiwanese would have won the chipmaking competition long ago. Design costs are very significant, as are the costs of setting up the production process.

        • "The costs in chip production are not in manufacturing alone - else taiwanese would have won the chipmaking competition long ago. Design costs are very significant, as are the costs of setting up the production process."

          And that is 100% irrelevant to the point I was making. Just as with Intel, the "disabled" chips only became available well after the full-featured chips. The design didn't happen until the manufacturer decided to create a branch of the product line from the existing branch.

          Do you not remember the customer revolt over Intel doing this? I was a systems manager when it happened. And believe me, people were pissed off.

          • by Luckyo ( 1726890 )

            When you're a business, you'll piss people off for the weirdest reasons.

            Doesn't mean you have to care every time someone gets pissed. Else you'd be doing nothing but caring about pissed-off people. For the record, I don't think anyone who wasn't a "systems manager" was revolting. At least I never even heard about a revolt when Intel started doing it. The reaction was "meh, whatever, I still get the chip I paid for" at worst.

      • by mc6809e ( 214243 )

        People aren't stupid (although many of them act that way sometimes). If you can take a part that sells for $1,000, disable some of the functionality, and sell it for $650... then you can sell the whole unit for $650. It's a ripoff and people know it.

        And so much of what people "know" is wrong.

        It's entirely possible that at $650/chip Intel loses money and at $1000/chip Intel loses money -- but at the two price points combined, Intel makes a profit.

        Suppose Intel can sell 1 million chips at $650 but only 500 t
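
        To make that concrete, a back-of-the-envelope sketch in Python with entirely made-up numbers (the fixed cost, unit cost, and volumes are assumptions for illustration, not Intel's or NVIDIA's actual figures):

        # Assumed figures, for illustration only.
        FIXED_COST = 500_000_000  # shared design + process setup cost, $
        UNIT_COST = 200           # marginal cost to build one chip, $

        def profit(price: int, units: int) -> int:
            # Profit if this product alone had to carry the fixed cost.
            return (price - UNIT_COST) * units - FIXED_COST

        print(profit(650, 1_000_000))  # -50,000,000: loses money on its own
        print(profit(1000, 500_000))   # -100,000,000: loses money on its own

        # Selling at both price points to different buyers shares the fixed cost:
        combined = ((650 - UNIT_COST) * 1_000_000
                    + (1000 - UNIT_COST) * 500_000
                    - FIXED_COST)
        print(combined)                # 350,000,000: profitable together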

      • by synaptik ( 125 ) *
    Companies are not looking to maximize price. They are looking to maximize the area of the rectangle defined by Price times Volume. If they think the area of that rectangle will be larger at a smaller price, then they will sell at that smaller price.
    Now imagine that your company has correctly identified that pricing sweet spot, but some of your product is partially defective, though still useful. You cannot sell it at the same ideal price as the fully-featured product. So, you necessarily sell it at s
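
    A minimal sketch of that rectangle-maximizing idea, assuming a made-up linear demand curve (every number below is hypothetical):

    def units_sold(price: int) -> int:
        # Hypothetical demand: 2M buyers at $0, none at $1,000.
        return max(0, 2_000_000 - 2_000 * price)

    def revenue(price: int) -> int:
        # The "rectangle": price times volume.
        return price * units_sold(price)

    best = max(range(0, 1001, 10), key=revenue)
    print(best, revenue(best))  # 500 500000000 -> the sweet spot is $500 here

    # A partially defective bin can't command that same $500, so it
    # necessarily sells below the fully-featured part, as described above.
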
  • by TWiTfan ( 2887093 ) on Thursday May 23, 2013 @10:01AM (#43803105)

    More Coors takes the pain of remembering away.

  • by MetalliQaZ ( 539913 ) on Thursday May 23, 2013 @10:02AM (#43803115)

    I must have crossed the border into adulthood somewhere back there, because I would never pay that much for a performance uptick in a video game. I can get myself a nice new laptop for that cash, and it would still be proficient at 90% of today's games.

    • by MachineShedFred ( 621896 ) on Thursday May 23, 2013 @10:13AM (#43803239) Journal

      The good news is that all the "I've got to have the latest and best to make all my friends and e-buddies drool" crowd will start unloading their barely-used last generation cards on eBay, and those of us that want good performance at a good price will benefit.

    • If by 90% of today's games you mean 90% of today's Facebook games, then yes, I suppose ;). But seriously, are games less technically demanding than they were 10 years ago? Yes. Are you really going to have that much fun trying to play actual modern 3D games on medium-low detail settings, no AA, low resolution, at ~30 fps? You might; obviously a lot of people wouldn't, myself included.
      • by dywolf ( 2673597 )

        1920x1080
        High settings
        1x AA (and honestly you don't even need that when running at this rez or higher)
        60+ FPS in nearly all games, including your precious "modern 3d games"...because honestly they don't push any harder now than they did a bit ago.

        the card that does this for me? a 4-year-old GTS 250

        either your standards are too high, or you're deluding yourself as to the actual worth of running at 400000x rez with 20xAA and 50 bajillion gigawatts of memory.
        probably both, as you've obviously fallen for the graphics

        • Note that a 2560x1600 panel has almost double (1.98x) the number of pixels of a 1920x1080 one, and given how ugly scaling tends to be, it can be entirely worthwhile to have a high end graphics card.

          On the other hand, I still have a GTX 580 (and when I bought it, the mid-range card couldn't get a smooth framerate above 1920x1200), and I don't have any impetus to upgrade yet, as the difference isn't that great.

        • by Luckyo ( 1726890 )

          Your setup will barely run Battlefield 3 (it likely won't run at all, or will be an utter cripple, at ultra settings). Same for Crysis and other graphically intensive games.

          But you will run many console ports just fine. After all, those are aimed at ancient hardware. Or at least you will for about a year more. Then console ports will become far more demanding as console hardware gets upgraded.

          The reality is, it's not that your card is fast. It's that your standards are very low.

    • You should thank God (in whatever form you worship his Noodly Appendages) that people do buy high end devices such as graphics cards. These spendthrifts create the market and provide the initial downward push on prices so that the rest of us can afford them.

      I still remember paying £2500 ($4000) for a 486DX-33 with 8MB of RAM and a 200MB hard drive. Those were the days.....

    • Seriously, for some people gaming is their hobby, and that kind of money is not that much when you consider what people spend on hobbies. My coworker just bought himself something like a $2000 turbo for his car, to replace (or augment, I'm not sure) the one that's already there. He has no need for it, but he likes playing with his car.

      Now that you, and most others, don't want to spend that kind of money is understandable and not problematic. There's a reason why companies have a lineup of stuff and why the high end stuff

  • Ten years more and it will be one core per pixel. That's insane.

  • by Gordo_1 ( 256312 ) on Thursday May 23, 2013 @10:22AM (#43803347)

    If this was a previous generation where AMD was actually still competitive, Titan would have been the high end part, and it would have cost $500 instead of $1000. The part known as GTX 780 would have been a slightly depopulated part capable of 90% the performance for a 20% savings or so and the rest of the line would have fallen under those two. Since AMD is no longer really a threat in the high-end GPU space, Nvidia can literally maintain the MSRPs of the old parts as if the new parts are merely higher performing extensions of the previous generation without any downward pricing pressure on anything.

    • If this was a previous generation where AMD was actually still competitive, Titan would have been the high end part, and it would have cost $500 instead of $1000.

      Their problem is that implementing large-die processors is getting extremely expensive compared to how it used to be. We used to see previous-generation processes used for high-end cores because the maturity overcame the extra cost of the large die. But now that large dies are prohibitive (and assuming prices cannot grow), the graphic

    • What on earth are you talking about? AMD is very competitive still.

      7870 vs 660 Ti: similar price, similar performance.
      7970 GHz Edition vs. 680: Similar price, similar performance.

      The two companies are battling it out in every segment, with neither having a clear lead anywhere. The exceptions are the GTX Titan and the 780 - both of which are brand new cards, and AMD just hasn't yet released its new batch of cards. If AMD takes months to come out with something, then they will no longer be competitive. But ri

      • For me, drivers are more important than hardware. The difference in speed between the flaky, wonky proprietary drivers and the fairly steady but dog-slow open source drivers is on the order of 10x.

        They still aren't competing for the Linux market. I have older low end stuff, an AMD machine (Phenom II with an HD 5450), and an Intel+Nvidia machine (Core 2 Quad Q6600 with a GeForce 8500GT that I recently replaced with a fanless GT 610), and in neither can I get satisfactory Linux support. The proprietary

          For me, drivers are more important than hardware. The difference in speed between the flaky, wonky proprietary drivers and the fairly steady but dog-slow open source drivers is on the order of 10x.

          Not for AMD. Phoronix has plenty of benchmarks, the open source drivers have 80% the performance of the proprietary ones.

          They still aren't competing for the Linux market. I have older low end stuff, an AMD machine (Phenom II with an HD 5450), and an Intel+Nvidia machine (Core 2 Quad Q6600 with a GeForce 8500GT that I recently replaced with a fanless GT 610), and in neither can I get satisfactory Linux support. The proprietary driver on the Nvidia box has the best performance. Next best is the AMD box with the open source driver. (I haven't tried Catalyst, so I don't know how good AMD can be.) The Nouveau driver is horrible for 3D acceleration. ATI/AMD has repeatedly promised they would help open source drivers use the full potential of their hardware, but thus far they haven't delivered. NVidia has flat out refused to help, and has tried to claim that keeping their proprietary driver up to date is being supportive of open source.

          The Linux market is full of masochists that continue to purchase and recommend the company that hates them (Nvidia) and shun the one that's actually doing what the community is asking for (AMD). AMD *has* delivered on the open source drivers. They *have* delivered on the specs. Everything the community has asked for, AMD has done. And yet, the community continues to buy Nvidia while compl

  • by Anonymous Coward

    Imagine a Beowulf cluster of these babies rendering images of Natalie Portman covered in hot grits!

  • Anything more than $250 is overkill; you're unlikely to find any game out there that can't be played well with a $250 card.
    • Definitely, don't ever shell out the money it costs to have the #1 best fastest hottest GPU (I wouldn't recommend the #2 either). It's basically digital viagra. Lasts about as long too.
      • by cdrudge ( 68377 )

        Doesn't that advice apply to just about everything? TVs, cars, medicine...The ultra high end or bleeding edge technology usually isn't "worth it". Yet that edge always seems to move along and what is today's best, most expensive tech is tomorrow's everyday tech.

    • If you want higher resolutions and frame rates, you need more powerful GPUs to handle them. For example, moving to 2560x1600 or to 120fps doubles the pixel requirement over 1920x1080@60fps. So whatever amount of power you needed to achieve 1080p60, double that for either of those targets. 4k requires a quadrupling, and 120fps 4k would require 8x the power (a quick check of the arithmetic follows below).

      All this is assuming you are getting 60fps in the first place. Now maybe you are fine with trading off lower frame rates, or lower resolutions, that's al
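
      A quick check of that arithmetic (pixel throughput = width x height x refresh rate, everything relative to 1920x1080 at 60 fps):

      BASE = 1920 * 1080 * 60  # pixel throughput of 1080p60

      targets = [
          ("1920x1080 @ 120", 1920, 1080, 120),
          ("2560x1600 @ 60", 2560, 1600, 60),
          ("3840x2160 @ 60", 3840, 2160, 60),
          ("3840x2160 @ 120", 3840, 2160, 120),
      ]
      for name, w, h, fps in targets:
          print(f"{name}: {w * h * fps / BASE:.2f}x")
      # 2.00x, 1.98x, 4.00x, 8.00x -- matching the doubling, quadrupling,
      # and 8x figures above.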

    • by lpq ( 583377 )

      Try a seven-year-old game like Oblivion.

      Bethesda's next-generation 'Skyrim' was a downgrade in graphics, for the sake of the Xbox.

      Oblivion can easily saturate multi-GPU card setups, so I'd guess you don't know what 'well' means.

      It's not about the game, but about the quality and density of the textures -- do they approach what the naked eye can see? Can they support multiple monitors, or even one 4K monitor?

      Considering the HDMI cable spec tops out at 1920x1080... that's a jolt right there when you want to run 2560x1600.

      Problem is most of the progr

  • Until I was asked to write a few tech articles on Bitcoin and other virtual currencies last year, I didn't really pay a lot of attention to them. But I've learned that high-end ATI video cards are pretty much the "engines" required for any respectable bitcoin/litecoin mining rig to work successfully.

    (As a rule, nVidia cards have been dismissed as "not as good performers as ATI" for this specific use -- though I wonder how this GTX 780 would do?)

    People building these mining rigs generally cram 3 - 4 of the

    • ASIC machines are coming onto the market for bitcoin mining. They're blowing any GPU rig out of the water in terms of BTC mined per watt-hour. GPU mining is officially dead.

    • by Fweeky ( 41046 )

      It'll perform a bit worse than a GTX Titan, which gets in the region of 330Mhash/sec [bitcointalk.org]. For comparison, an AMD HD5870 from 2009 managed about 400Mhash/sec.
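
      For context, those Mhash/sec figures count double-SHA-256 evaluations of an 80-byte block header. A toy CPU version of the hash being benchmarked (a sketch, not a usable miner; the dummy header below is an assumption):

      import hashlib, time

      def btc_hash(header: bytes) -> bytes:
          # One Bitcoin "hash" is SHA-256 applied twice.
          return hashlib.sha256(hashlib.sha256(header).digest()).digest()

      base = bytes(76)  # dummy 76-byte header prefix; a miner varies the nonce
      start, n = time.time(), 200_000
      for nonce in range(n):
          btc_hash(base + nonce.to_bytes(4, "little"))
      print(f"~{n / (time.time() - start) / 1e6:.3f} Mhash/sec")
      # A CPU manages a tiny fraction of the Titan's ~330 Mhash/sec,
      # which is why GPUs, and now ASICs, took over.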

  • We do love our big numbers, but there are limits to what our eyes can perceive in FPS. What does this mean for real world applications like video encoding and password cracking? How long do we anticipate having to wait for tech like this to get affordable? Also, how does this compare to the nVidia Tesla, the current gold standard in password cracking?

    I saw only one reference to nVidia Tesla (and no references to password cracking or video encoding) in those reviews (@Tech Report [techreport.com]), and it might be damni

  • You're paying for the luxury of what appears to be the world's first high-end video card with a built-in speaker. Nvidia finally reached the point where the polygons their products could produce exceeded the nominal human capacity to perceive them, so now they've added the ability to hear the extra polygons you can't see, as ultra-soothing HD Brown noise [wikipedia.org]! The only side effect is that it reduces your available gaming time by increasing the number of bathroom breaks you need to take.
