Nvidia GeForce GTX 780 Ti Review: GK110, Fully Unlocked 88

An anonymous reader writes "Nvidia lifted the veil on its latest high-end graphics board, the GeForce GTX 780 Ti. With a total of 2,880 CUDA cores and 240 texture units, the GK110 GPU inside the GTX 780 Ti is fully unlocked. This means the new card has one more SMX block, 192 more shader cores, and 16 more texture units than the $1,000 GTX Titan launched back in February! Offered at just $700, the GTX 780 Ti promises to improve gaming performance over the Titan, yet the card has been artificially limited in GPGPU performance, no doubt to make sure the pricier card remains relevant to those unable or unwilling to spring for a Quadro. The benchmark results largely bear out the GTX 780 Ti's on-paper specs. The card beat AMD's just-released flagship, the Radeon R9 290X, by margins ranging from single digits up to more than 30%, depending on the variability of AMD's press and retail samples."
  • heh (Score:5, Funny)

    by jakobX ( 132504 ) on Saturday November 09, 2013 @04:31PM (#45378683)

    Offered at JUST 700 dollars. Nice try, anonymous Nvidia.

    • That is just at stock clocks; GK110 is known to take a good 200 MHz overclock, and at least another 250-500 MHz on memory, so that lead gets a little bigger. The AMD card, given how HOT it runs, can't be overclocked.
      • Who said you can't overclock the R9? Just add better cooling. Amazing how suddenly there's a problem because of heath.
        • Amazing how suddenly there's a problem because of heath.

          Unless you're talking about The Dark Knight, you may have misspelled.

        • by smash ( 1351 )
          By that reasoning, just apply more (equivalent to AMD) cooling to the GeForce and overclock it more?
    • by Fwipp ( 1473271 )

      For a $150 price premium over the 290X, I'd expect more than "single-digit percentages." I know there's always a tax on the high-end cards, but 27% pricier for (up to) 9% more speed doesn't seem like a great trade-off.

      • by Fwipp ( 1473271 )

        Aaaaaand I just looked back at the summary and noticed the "up to 30%." I should really double-check before I post these things.
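        As a rough sketch of the price/performance math being argued over here, in Python: the launch prices of roughly $550 for the R9 290X and $700 for the 780 Ti are assumptions, and the 9% and 30% speedups are the figures quoted in the summary, not measurements.

            # Rough price/performance comparison; prices and speedups are assumed
            # figures from the summary and comments, not measured values.
            r9_290x_price = 550.0    # approximate R9 290X launch price, USD (assumed)
            gtx_780ti_price = 700.0  # GTX 780 Ti launch price, USD

            premium = (gtx_780ti_price - r9_290x_price) / r9_290x_price
            print(f"price premium: {premium:.0%}")

            for speedup in (0.09, 0.30):  # low/high ends quoted in the summary
                amd_perf_per_dollar = 1.0 / r9_290x_price
                nv_perf_per_dollar = (1.0 + speedup) / gtx_780ti_price
                ratio = nv_perf_per_dollar / amd_perf_per_dollar
                print(f"at +{speedup:.0%} speed, the 780 Ti gives {ratio:.2f}x the 290X's performance per dollar")

        On those assumed prices, the 780 Ti trails on performance per dollar at the 9% end and only roughly breaks even near the 30% end.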

        • Re:heh (Score:5, Informative)

          by Billly Gates ( 198444 ) on Saturday November 09, 2013 @05:12PM (#45378867) Journal

          Tom's Hardware is known to be biased, since they take in ad money and have partnerships with Nvidia and Intel. They include x87 non-IEEE FPU tests where Intel's own chips win, and declare anything AMD/ATI a loser as a result rather than measuring real-world performance. As an example, they do not benchmark the later versions of Skyrim, which have proper FPU support.

          For a more accurate benchmark, click here [extremetech.com].

        • Re:heh (Score:5, Insightful)

          by PopeRatzo ( 965947 ) on Saturday November 09, 2013 @06:18PM (#45379189) Journal

          No, you were right the first time. Tom's Hardware is the only site claiming these benchmark victories for the nVidia card. I'm not saying they allow their advertising department to influence their reporting and rankings, but it's a bit fishy that they're such an outlier regarding the flagship video cards of the two manufacturers.

          It's also worth noting that comparing these cards without taking AMD's Mantle technology into account is, to say the least, incomplete.

          • This sounds a bit biased to me. Firstly, Tom's isn't the only site. Secondly, taking into account something that currently doesn't exist to test is pretty tricky.

            What performance advantage are you going to give the Mantle API without knowing anything about it? The best they can do is give the figures they currently get for both cards and add an addendum stating that Mantle may improve performance for the AMD cards. They should probably add that in real-world situations the ATI may have performance

            • These are both reference cards. Once Gigabyte and MSI and Sapphire get hold of them, I think the AMDs will run cooler and the nVidia cards will become a little more cost-effective.

              All in all, it's great to have them actually competing.

            • by Dr Max ( 1696200 )
              Water cooling is getting pretty cheap, and it's quite effective; it will also eliminate a lot of noise. If you run Linux it's worthwhile going AMD; they are more open and it shows (like being able to give a virtual machine access to the graphics card).
              • by aliquis ( 678370 )

                But aren't their drivers also slower? Because that's why I _don't_ want to go AMD on Linux.

                • by Dr Max ( 1696200 )
                  What? I always thought it was the other way around. Maybe with SteamOS coming out, Nvidia has stepped up its game, but I haven't seen any proof yet. Feel free to enlighten me.
                  • by aliquis ( 678370 )

                    As far as I know the Nvidia drivers have always been better.

                    Linus and friends may have liked AMD more for providing more information and hence enabling a better open-source driver.

                    Personally I wouldn't want to use an open-source driver if it gave half the performance, even if that would be better than, say, 10% of the performance =P. And at least previously I was under the impression that not even the closed driver from AMD was competitive against the Nvidia one. But that could have changed, I suppose.

                    Also Nvidias one also existed f

          • by aliquis ( 678370 )

            It's also worth noting that comparing these cards without taking AMD's Mantle technology into account is, to say the least, incomplete.

            If it's game benchmarks, what are they supposed to do?

            For anything except Battlefield 4 I think it's correct, because that's the only title I know of which makes use of it so far.

            Maybe it would be nice to see a demo which really uses it vs. some other traditional method, and in the future for more games, but for now this is what we have.

            • Mantle won't appear in Battlefield 4 until later, but it'll be very interesting to see how it benefits performance.

          • by smash ( 1351 )
            AMD's mantle technology was taken into account. They ran all the common software currently utilising it.
    • This card is only $700. And how much are you paying for that iPhone and its data plan that you almost never use?

      • Well, according to those who are broke, the iPhone is free of charge!

        It won't cost anything to use at all, compared to these silly users who pay up front ... well, off to go pay $120 a month for my one-user phone bill. Wow, I can't imagine why it is so high and why I can't leave for 2 years?!

    • I like the part where they said "Depending on the variability of AMD's press and retail samples."

      The variability in the results was mostly caused by some last-minute driver changes that boosted performance on the Radeon 290 and 290X cards, but the submitter seems to make it look like some sort of "golden sample" conspiracy from AMD.

  • Basically the Titan was a publicity stunt. The 780 Ti is faster and a lot cheaper. Then again, the R290s are just as fast as a Titan and a lot cheaper, but it seems they have some cooling (throttling) issues. One should wait for the custom-cooled R290s if the target is games, and just get a 780 Ti if you really need the CUDA processing (e.g. Adobe applications like Premiere). Also, as for the price, it's top-end; it's normal to be expensive. One can game with a GTX 670 or a GTX 770, which are more than enough
    • by Anonymous Coward

      The Titan is, and always has been, geared toward entry-level 3D graphics/compute applications. Nvidia noticed that many game/movie creators who were just starting out were using hacks so that gaming cards costing less than a thousand dollars could be used instead of the 2-8 thousand dollar professional series. This is why it has 6GB of RAM: it's a cheap option for those just getting into that type of system.

      The fact that it worked great for games was a side benefit, allowing nVidia to grab some

    • > Basically the Titan was a publicity stunt.

      Right, you might want to tell that to the world's #1 supercomputer, which has 18,688 Tesla K20X GPUs.
      http://en.wikipedia.org/wiki/Titan_(supercomputer) [wikipedia.org]

      Second, you are glossing over the fact that:

      Titan = 6 GB VRAM, 780 Ti = 3 GB
      Titan FP64 performance = 1/3 of FP32, 780 Ti = 1/24 of FP32

      Gamers couldn't give a shit about that. The fact that the Titan could also game was a bonus. It was never primarily targeted at gamers, just budget scientific computing.

  • At least I do not have driver issues that plague Nvidia cards, like the 320.x drivers which are known to brick cards and mess around with Aero on Windows 7.

    Before I get modded down or flamed galore for this ... I am talking about 2010-2013 ATI drivers vs. Nvidia ones. I kept having quality issues when I was an Nvidia fanboy; I switched to an ATI 5750 and now a 7850 with HDMI, and things have been great since! 2002 with ATI Rage Pros ... well, that is a different story altogether compared to today.

    If I

    • So one small bad set of Nvidia drivers vs. all the years of bad AMD drivers? How many years did AMD's CrossFire not even really work right, and whose tool was it that was released before AMD even did something about it? So if you want to complain about driver issues, keep in mind AMD has had issues since day one and still does.
  • by xhrit ( 915936 ) on Saturday November 09, 2013 @05:26PM (#45378935) Journal
    What does it matter, since no game that will be released in the next 10 years is going to need more graphics power than the shitty Xbox One can crap out?
    • What does it matter, since no game that will be released in the next 10 years is going to need more graphics power than the shitty Xbox One can crap out?

      ... uh, Crysis and anything at 4K. This card is still not fast enough at that resolution.

      Graphics still are not photorealistic, and we have a long way to go before that happens. At 1080p these would suffice, but Battlefield 4 barely plays at a lousy 40 fps at 4K even with this Titan?!

      • by smash ( 1351 )
        I think the guy's point is that no one is going to write games that REQUIRE that, because they're all likely to be directly ported between Xbone/PS4/PC, as it's all PC hardware.
    • by nhat11 ( 1608159 )

      4K easily... in fact 2+ monitors are becoming the norm. People keep underestimating the need for more powerful gfx cards.

  • The main advantage the TITAN has over the 780 Ti is the memory: 6GB compared to the 780 Ti's 3GB. If you're only looking at running 1080p that's not such an issue, but if you're one of those people with more money than sense and you're looking at running a 4K panel, it is.

    I really don't know why they didn't stuff 6GB on the 780 Ti. My *580* has 3GB. I'd really expect two series down the line for there to be a bit more RAM on it as standard (3GB wasn't the standard memory configuration for a 580; t

    • by gman003 ( 1693318 ) on Saturday November 09, 2013 @06:21PM (#45379217)

      The other advantage of the Titan is the double-precision performance. Almost all of Nvidia's cards, including the 780 Ti, run double-precision floating-point calculations at 1/24th the rate of single-precision, but for the Titan and the Tesla pure-GPGPU cards, it's 1/3rd the rate.

      While I'm not sure if that's an actual hardware difference, or if it's some software limitation, or a mix of both or whatever, it's definitely real. That's the main reason a Titan is still $1000 - it's being sold as a low-end compute card, not a high-end gaming card.

      • It's a limitation implemented in firmware/microcode for marketing purposes.

        The Titan, GTX 780 and GTX 780 Ti all use the same physical chip.
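        To give a back-of-the-envelope feel for what the 1/3 vs. 1/24 ratio means, here is a small Python sketch. The core counts and boost clocks are approximate published figures and the FP64 ratios are the ones quoted above; treat the output as illustrative, not as a benchmark.

            # Theoretical peak throughput from core count, clock, and FP64 ratio.
            # Assumes 2 FLOPs per CUDA core per clock (one fused multiply-add).
            def peak_gflops(cuda_cores, clock_mhz):
                return cuda_cores * clock_mhz * 1e6 * 2 / 1e9

            cards = {
                # name: (CUDA cores, approx. boost clock in MHz, FP64 rate vs FP32)
                "GTX Titan":  (2688, 876, 1 / 3),
                "GTX 780 Ti": (2880, 928, 1 / 24),
            }

            for name, (cores, clock, fp64_ratio) in cards.items():
                fp32 = peak_gflops(cores, clock)
                print(f"{name}: ~{fp32:,.0f} GFLOPS FP32, ~{fp32 * fp64_ratio:,.0f} GFLOPS FP64")

        On those assumed numbers the 780 Ti has slightly more single-precision throughput on paper, but the Titan ends up with several times the double-precision throughput, which is what the compute crowd is paying the extra $300 for.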

  • Are there that many people running multi-monitor flight sims and driving sims? I know there's the guy on here that bought a $1000.00 card 10 years ago, sells it every year for $800 and then buys another $1000.00 card with the proceeds. But I can't believe there are that many people on the bleeding edge. Heck, Crysis 3 ran on a 360...
    • by Nemyst ( 1383049 )
      Flight sims and driving sims aren't graphics showpieces anymore, and haven't been for a while. Shooters are usually where it's at, and games like Crysis 3 or Battlefield 4 can put very high-end cards through their paces, especially at >1080p resolutions. Crysis 3 ran on a 360 at a sub-720p resolution and with a lot of settings notched down... You can see the difference very easily between PC and consoles.
    • Shit, there are people who pay $250,000 for a freaking bathtub and do not blink buying a $50,000 Ford F-350 complete with $10,000 shocks, struts and tires to show off how much money they have compared to you and me on the road.

      People have money, and even if the HPs and Dells are in decline, the high-end motherboard industry is taking off as more people play PC games. So yes, $400 is cheap for an $1,800 computer to MMO with their friends.

      Not everyone is a poor young professional with student loan debt. However, that

      • by Cederic ( 9623 )

        More to the point, the poor young professionals that grew up PC gaming have now cleared their student debt, the mortgage is a pitiful percentage of their net pay and the wife's earning as much as they are.

        Gaming isn't just for kids.

        • Still have 40k on mine and no house yet.

          Value cards it is for me, for now. If you graduated after 2006, $40k to $100k of debt with only $10-12/hr temp jobs when you graduate seems to be the new norm in the Great Recession.

          But if you graduated in the 1990s you paid $15k to $25k and could get a house for a quarter of the cost someone graduating today would pay. Economists do not count homes, services, rent, or food in inflation indexes, which is silly because those are a big problem too, as is counting only income and not debt.

  • It's nice that Nvidia is lowering their prices, but they are just not that competitive if you use these cards for password hashing or OpenCL. I had been using Nvidia for the last 12 years, but I recently switched to an AMD card since, at half the price, it was still faster at brute-forcing crypto than the Nvidia board was. I think Nvidia should work on their OpenCL performance, and AMD should work on the number of shaders and such on their chipsets.
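    For anyone wanting to sanity-check which OpenCL devices a hashing or cracking tool would actually see, here is a minimal enumeration sketch in Python. It assumes the pyopencl package and working vendor OpenCL drivers are installed, and it only lists the headline figures people tend to compare (compute units, clock, memory); it does not benchmark anything.

        # List visible OpenCL platforms/devices and their basic capabilities.
        # Requires pyopencl plus the vendor's OpenCL runtime (AMD or Nvidia).
        import pyopencl as cl

        for platform in cl.get_platforms():
            print(f"Platform: {platform.name}")
            for device in platform.get_devices():
                print(f"  {device.name}")
                print(f"    compute units : {device.max_compute_units}")
                print(f"    max clock     : {device.max_clock_frequency} MHz")
                print(f"    global memory : {device.global_mem_size / 2**30:.1f} GiB")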
  • by fa2k ( 881632 )

    Great that there's competition. I'm sure this blows the AMD ones away in Linux, but do they still have the tearing problems for video? Video has looked like crap on both my recent nVidia cards because it's splitting in the middle, though only when using two monitors. The exception is a few video players that use vdpau, but I can't expect all my videos to be playable on those. While I've preferred AMD before, I'm trying to not care about the brand as long as they don't do something hostile to users (Sony), s

  • "the GK110 GPU inside the GTX 780 Ti is fully unlocked" ... "yet the card has been artificially limited in GPGPU performance"... ... So which is it? To me "fully unlocked" can't be true of a card that is "artificially limited"
