Graphics Hardware

NVIDIA Launches GeForce GTX 560 Ti 448-Core GPU

MojoKid writes "NVIDIA has just launched the GeForce GTX 560 Ti with 448 cores. Though the branding is perhaps a bit unimaginative, the new card is outfitted with the same GF110 GPU that powers the high-end GeForce GTX 570 and GTX 580 cards, but with a couple of its streaming multiprocessors fused off. The card has 448 CUDA cores arranged in 14 SMs, with 56 texture units and 40 ROPs. Reference specifications call for a 732MHz core clock with 1464MHz CUDA cores. 1.25GB of GDDR5 memory is linked to the GPU via a 320-bit bus, and the memory is clocked at an effective 3800MHz data rate. Performance-wise, the new GPU proved to be about 10 to 15 percent faster than the original GeForce GTX 560 Ti and a few percentage points slower than the GeForce GTX 570."
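The quoted specs are enough to sanity-check the card's peak memory bandwidth yourself. A minimal Python sketch, using only the figures in the summary rather than an official spec sheet:

    bus_width_bits = 320           # memory bus width from the summary
    effective_rate_hz = 3800e6     # 3800MHz effective GDDR5 data rate

    cuda_cores, sms = 448, 14
    print(cuda_cores // sms, "CUDA cores per SM")        # 32, as on other GF110 parts

    bytes_per_transfer = bus_width_bits // 8             # 40 bytes per transfer
    bandwidth_gb_s = bytes_per_transfer * effective_rate_hz / 1e9
    print(f"~{bandwidth_gb_s:.0f} GB/s peak memory bandwidth")   # ~152 GB/s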
  • by Anonymous Coward on Wednesday November 30, 2011 @01:29PM (#38216094)

    I bought a 560 Ti just a month ago and now this? FFFFFFfffffffffff...

    • by ebombme ( 1092605 ) on Wednesday November 30, 2011 @01:56PM (#38216466)
If it's an EVGA card, I believe they have a trade-in program for situations just like this: within a certain window, you send in your old card, pay the difference, and they send you the updated card of your choice.
• If it's an EVGA card, I believe they have a trade-in program for situations just like this: within a certain window, you send in your old card, pay the difference, and they send you the updated card of your choice.

I just bought the 560 Ti 2GB EVGA. Your post just made my day.

    • Comment removed based on user account deletion
      • by mikael ( 484 )

I believe a stream processor is a processor dedicated to processing streams of data, optimized more for sequential memory access and arithmetic/trigonometric calculations than for branching and conditional instructions.

A core is just a basic CPU, with or without an FPU, but with support for multiple threads (shared code/data space but separate execution points). Multiple cores will have specially adapted cache memory to allow data sharing.

• nVidia and AMD do it differently. nVidia counts each of their processing sub-units as a "core". They aren't quite a core as you'd think of it on a CPU, but similar. AMD counts each execution sub-unit in their "cores" as a "stream processor". So roughly speaking, for the 4000 and 5000 series there are 5 SPs for each core.

That doesn't quite tell the whole story, though, as what each core can do differs between vendors. More or less, to the extent it is useful information at all, it is only comparing within a single vendor's lineup.
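To make the two counting conventions concrete, here is a rough Python sketch (the per-unit breakdowns are simplified from public specs of the era; the variable names are illustrative, not vendor terminology):

        # NVIDIA (Fermi GF110, the GPU in this card): each scalar ALU lane
        # counts as one "CUDA core".
        sms = 14                # streaming multiprocessors enabled on this part
        lanes_per_sm = 32       # scalar lanes per SM on GF110
        print("NVIDIA count:", sms * lanes_per_sm, "CUDA cores")    # 448

        # AMD (VLIW5, HD 4000/5000 era): each lane of a 5-wide VLIW unit
        # counts as one "stream processor", i.e. 5 SPs per "core".
        vliw_units = 320        # e.g. Radeon HD 5870
        lanes_per_unit = 5
        print("AMD count:", vliw_units * lanes_per_unit, "stream processors")  # 1600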

    • by mjwx ( 966435 )

      I bought a 560 Ti just a month ago and now this? FFFFFFfffffffffff...

I knew NVidia were doing this six weeks ago, so I didn't buy a 560 Ti in the lead-up to BF3. But look on the bright side, AC: you still own a pretty good graphics card. I'm still on my 3-year-old GF 285.

      • by bronney ( 638318 )

Got a Gigabyte 560 Ti OC last month for BF3; it totally blew the 285 away. I know the feeling, man. Having a 285 was awesome, but man, the 560 Ti runs cool and OCs very well. Give it a try and smile :)

        • by mjwx ( 966435 )

Got a Gigabyte 560 Ti OC last month for BF3; it totally blew the 285 away. I know the feeling, man. Having a 285 was awesome, but man, the 560 Ti runs cool and OCs very well. Give it a try and smile :)

          Yeah, it's summer here so a cooler card would be great.

But I just bought a 1920x1200 monitor, so that old 285 is going to show its age real soon.

          • by bronney ( 638318 )

Ah, I was always talking about 1920x1080. I run 2x 24" LG monitors here at that res. When I had the 285 I was running 2x 20" at 1680x1050. Before that it was 2x 9800GT. SLI is a lie, and it rhymes!

But yeah, get one. In fact, get the old one, volt it to 1.1V at 900/1800, and leave the mem alone. Completely solid; it made BF3 and all previous-gen games so smooth. Skyrim lags a bit outdoors on high but is perfectly playable. And DX11 makes it so beautiful.

When I got the 560 Ti, I had 580 money in my pocket ready to blow.

    • by noodler ( 724788 )

      "I bought a 560 Ti just a month ago and now this? FFFFFFfffffffffff..."

      Don't worry.
There won't be games that use your card's full power for at least 2 or 3 years.
Current game developers produce art that fits the consoles, so PC gamers are stuck with sub-par graphics that run great on 2-year-old hardware.

  • by Anonymous Coward

    They couldn't call it a 561 Ti, or a 560 Pt? Or Au, or Ir, or whatever other element is "better" than Titanium.

  • "but with a couple of its streaming multiprocessors fused off."

    fused off? Really?!

I don't know how they would do that, except by simply not connecting them in the blueprints.

Or are they just "defective" 570s and 580s, relabeled?

    • by rahvin112 ( 446269 ) on Wednesday November 30, 2011 @01:46PM (#38216324)

They use lasers to cut traces on the processor, plus firmware to disable what they can't cut. The chips are designed with this capability so they can bin parts, disabling sections to differentiate models and to make use of parts with defects. All the different model numbers are just binned parts with the bad sections disabled.

      • Re: (Score:2, Flamebait)

        by mr1911 ( 1942298 )
Or they use fuses, which leads to the "fused off" language. The fuses are generally only accessible via the IC tester, and the pads are not bonded out when packaged.
    • by mprinkey ( 1434 ) on Wednesday November 30, 2011 @01:53PM (#38216420)

Probably referring to eFuses that can be burned out on the die. These are common and allow CPUs/GPUs to have unit-specific information (serial numbers, crypto keys, etc.) coded into otherwise identical parts from the fab. Video game systems like the 360 use them as an anti-hacking measure, disallowing older versions of firmware from running on systems that have certain eFuses "blown." Likely, there is an eFuse for each core or group of cores. Those can be burned out if a section is found to be defective, or simply to cripple a portion of the part for down-binning. That practice is at least as old as the Pentium II.

    • yes, really (Score:5, Informative)

      by OrangeTide ( 124937 ) on Wednesday November 30, 2011 @01:55PM (#38216458) Homepage Journal

All parts have defects, although sometimes companies don't test thoroughly enough to find them, and usually the defects don't impact normal operation. But when the potential for problems exists, they are forced to either scrap the part or bin [wikipedia.org] it as a lower-spec part. Binning improves yield and helps keep prices down on the higher-end parts that do pass tests.

But here's the problem with binning from a marketing standpoint: "and a few percentage points slower than the GeForce GTX 570". This binned 570 is $60+ cheaper and will likely slide down to the old 560 Ti's price (the naming is confusing!). So now they've created a cheaper version with almost the same performance, and they run the risk that customers will choose the cheaper product over the more expensive (and, I assume, higher-margin) one.

      • by Anonymous Coward

They have probably been filling the bin since the 570s started rolling out of the factory. I'm sure they waited until they had stopped producing regular 560s and the remaining supply was waning before announcing this "new" 560. They probably have a person or team of people dedicated to figuring out the perfect time to stop production of one card and release another. This isn't a small, inexperienced start-up we're talking about here.

      • Re:yes, really (Score:5, Informative)

        by billcopc ( 196330 ) <vrillco@yahoo.com> on Wednesday November 30, 2011 @02:45PM (#38217120) Homepage

        Strictly speaking, it costs the same to make this new 560 Ti chip as a balls-out 580 chip. They're identical from the fab's perspective. In practice, the 560 Ti is a way to maximize yield by salvaging defective 580s. This is very much like Celerons being Pentiums with the defective parts lasered or fused off.

        If it weren't for this binning, they would have to toss these chips in the garbage. In theory, salvaging imperfect chips allows them to price things more aggressively across the product line, since the sunk cost of manufacturing is averaged out over a much greater volume.

Here's an example; note these numbers are purely arbitrary, as I know nothing about fab economics:

Suppose they had to throw 4 out of every 5 GTX chips away, and each one cost $100 to make; then each good GTX would cost $500 on average. The yield is thus 20%.

If, instead, they can sell those 4 bad chips as lower-spec products, each chip costs $100. The effective yield is now 100%.
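Worked through in a couple of lines of Python (a sketch using the same purely arbitrary numbers as above, not real fab data):

            chips_made = 5       # chips fabbed per batch in the example
            cost_each = 100      # manufacturing cost per chip, good or bad
            fully_good = 1       # chips that pass full-spec (580-grade) testing

            # No binning: the whole batch's cost lands on the one good chip.
            print(chips_made * cost_each / fully_good)   # 500.0 per sellable chip (20% yield)

            # Binning: every chip is sellable at some spec, so each carries only its own cost.
            print(chips_made * cost_each / chips_made)   # 100.0 per sellable chip (100% yield)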

        So yes, the higher binned cards theoretically cost the same to build as the cheap ones (assuming similar memory/outputs/PCB). They are thus higher-margin, but also scarce due to manufacturing limitations. The smaller the pitch, the harder it is to produce a perfect chip. This scarcity is what leads to the increased cost. After all, if a 580 cost the same as a 560, everyone would want the faster one and no one would buy the rejects. OEMs partially compensate by bundling a bunch of stuff with the high-end cards, like pack-in games, DP converters and assorted swag - cheap stuff with a higher perceived value. Put it this way: a recent title like Battlefield 3 might sell for $69 in stores, but you can be sure the OEM is paying much less to bundle it with their product. That goes a long way toward buttering up prospective buyers.

        I'm sure NVIDIA would rather be stuck with a handful of "true" 570s than a shit ton of useless defective chips. People will choose the cheaper product, that's the point! If it weren't for binning, these chips would be worth zero dollars, destined for the incinerator. Pricing doesn't really matter that much, this late in the generation. The GTX 6xx (Kepler) is due out in March 2012, and will probably be available only as a high-end part at first. Bleeding-edge nutjobs like myself will be able to blow $1500 on a pair of the latest and greatest, while the sane people buy out the remainder of 5xx inventory at clearance prices, and only then will the new low-end cards be launched. They've got it down to a science.

        • Bleeding-edge nutjobs like myself will be able to blow $1500 on a pair of the latest and greatest, while the sane people buy out the remainder of 5xx inventory at clearance prices, and only then will the new low-end cards be launched. They've got it down to a science.

          ...and some of us will get them years later, used, and they'll still run everything we want to run. And on behalf of this group, I want to thank you for spending the big bucks so that they're motivated to keep cranking out newer, bigger, and faster cards, which I enjoy but do not want to pay full price for.

        • by isorox ( 205688 )

Suppose they had to throw 4 out of every 5 GTX chips away, and each one cost $100 to make; then each good GTX would cost $500 on average. The yield is thus 20%.

If, instead, they can sell those 4 bad chips as lower-spec products, each chip costs $100. The effective yield is now 100%.

But what if the availability of those $100 chips reduces demand for the $500 ones? You could end up selling the four $100 ones but be unable to shift the $500 one.

Look at airlines. They fly a plane from London to New York; it has 200 economy seats and 50 business seats.

Economy sells for $500, business for $2000.

After everyone's on board, the cabin crew notice that business class has only 30 people in it (economy has 150). They hold an auction to upgrade economy travellers, and the remaining 20 seats go for $200 each.
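The revenue math behind the analogy, sketched in Python with the comment's numbers (the totals are illustrative, not airline data):

              revenue = 150 * 500 + 30 * 2000   # economy + business fares already paid
              print(revenue)                     # 135000

              # The 20 empty business seats fly either way, so auctioning upgrades
              # at $200 each is pure incremental revenue on top of the fares above.
              print(revenue + 20 * 200)          # 139000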

          On

          • The fundamental difference is that economy and business class seating both arrive at the same destination, at the same time. I couldn't care less about the cheap seat, as long as I get there in one piece. A high-end GPU runs a lot faster than the low-end variants, which for some of us is all that matters.

            I like to run my games at 2560x1440. Why ? Because that's my native resolution on these 27" LCDs. Sure, I could use 1920x1080, or even 1280x720, but I like the higher res. If I can spend a little more

• The thing is, you pay by the wafer when making chips. A wafer costs a certain amount to make depending on the process, the size, the fab, and so on. That is how the company that makes GPUs is charged. So the more chips that come off the wafer, the more the cost of that wafer can be spread out. That means not only having smaller chips, which of course fit more per wafer, but also having fewer defective chips.

Hence binning based on units that work (or don't). As you say, it brings up yields and thus brings down unit cost.

      • by Kjella ( 173770 )

It's half and half; obviously quite a few are binned because of real defects, which I suspect is also why Intel has so many confusing variations with various virtualization features. But it also happens that the mix isn't what the market demands; for example, say they're producing too many fully functional 2600Ks while the market wants 2500Ks. Those who do the math say those customers can't be sold up to the 2600K because they're cash-limited, and slashing the 2600K prices would reduce total revenue from all t

      • It has been suggested that this will be a limited-production chip/card.

They may have simply accumulated enough Fermi chips with a particular defect profile that it made sense to introduce this "new" version, letting them clear out that accumulated inventory.

        G.

  • by stevegee58 ( 1179505 ) on Wednesday November 30, 2011 @01:41PM (#38216250) Journal
    Yay gotta get me one o' those!
    • by Mashiki ( 184564 )

Last time I looked, it cost more in hydro to make a bitcoin than the bitcoin was actually worth.

      • Not if you rent with power included in your bill :p

        (Not that I'd bother with something like that, but I know people who would... nutters)

      • What's hydro? Some kind of Canadian thingy?
• AFAICT, in Canada (and possibly parts of the US too) it is common to refer to power from the electricity grid as "hydro". I believe this is because those places historically got most of their electricity from hydroelectric dams.

• It's also slang in some places for hydroponically grown weed, which will also cost you loads of electricity, so in some ways it's interchangeable.
  • by Anonymous Coward on Wednesday November 30, 2011 @01:43PM (#38216280)

    The summary doesn't make it clear, but... how many cores does this new video card have?

    • Re: (Score:2, Funny)

      by ebombme ( 1092605 )

      The summary doesn't make it clear, but... how many cores does this new video card have?

      448 cores

      • by Anonymous Coward

Whoosh? The AC was making a joke, numbnuts. The 448-core thing is mentioned in the title and the first three sentences of the summary; they were mocking the redundancy of it all.

    • More importantly, what does "core" really mean in this context?
    • Crazy (Score:5, Funny)

      by Guppy ( 12314 ) on Wednesday November 30, 2011 @05:10PM (#38218842)

      Meanwhile, at AMD/ATI Headquarters:

      "Well, fuck it. We're going to 449 cores."

• ATi redefined what they call a "core" some time ago to basically mean each sub-processing unit of their cores, so they say their Radeon 6970s have "1536 Stream Processors". I'm sure at some point nVidia will redefine theirs to mean each bit of an operand or something, and we'll have cards with millions of "cores" before long.

  • What is with the branding scheme on these things? I see a summary with lots of letters and numbers and almost no useful information as to what the hell good they all are.
  • They already implied that these are salvaged 570 parts for a limited holiday production run. If you snag one with a free game you want, not a bad deal. Think carefully if you might go SLI in the future, since they'll be hard to find later.

    • Not so hard to find, if you check ebay or craigslist.

      The ones that are truly hard to find are the super-high-end dual-GPU cards. And by "hard to find", I mean you usually have to pay some goddamned scalper a ton of money because the retail inventory sold out in mere weeks.

  • Impressive specs (Score:4, Insightful)

    by ifiwereasculptor ( 1870574 ) on Wednesday November 30, 2011 @02:04PM (#38216570)

As a whole, it's impressive that we can build such a thing. It's equally impressive that the number one reason for such an advanced piece of technology is so people can virtually shoot the currently unfashionable Eastern Europeans using more polygons.

    • by Anonymous Coward

It's equally impressive that the number one reason for such an advanced piece of technology is so people can virtually shoot the currently unfashionable Eastern Europeans using more polygons.

      Speak for yourself. Some of us use these things in HPC projects and couldn't care less about their ability to render 3D shit.

  • Expensive much? (Score:4, Informative)

    by RobinEggs ( 1453925 ) on Wednesday November 30, 2011 @02:17PM (#38216772)
I still can't fathom spending $300 on a video card... and feeling like I got a slammin' deal in the process.

    What happened to the red-hot competition of 2008, when I built my first modern system and got a newly released Radeon 4850 for $150? That card was maybe the fourth most powerful you could get; there was no serious improvement to be had without adding more dies, via either X2 cards or crossfire.

Today the 560 Ti and the 6950 occupy the same relative position in the GPU hierarchy that my 4850 held in 2008, yet rather than being brand new and $150, those two cards are almost a year old and $250-$300.

    Ouch.
    • by modecx ( 130548 )

      I remember when a megabyte of RAM cost about $150, so I don't feel too bad about what a $150 video card will do these days. That's about where the sweet spot is, unless you really need to push very high resolution displays.

      • Re:Expensive much? (Score:4, Interesting)

        by Calibax ( 151875 ) * on Wednesday November 30, 2011 @05:12PM (#38218862)

        I remember when a megabyte of RAM dropped to under $1,000,000, when we switched from core to semiconductor technology circa 1972. And we thought it a great advance.

        Now get off my lawn, noob.

      • by Raenex ( 947668 )

        Considering the price of computers these days and the diminishing returns on graphic cards, I'd say $50 is the sweet spot.

        Seriously, the games were already looking pretty amazing not long after 2000. Just how much better does a game look with this year's model versus a card from three years ago?

    • by mikael ( 484 )

      My first PC in 1988 was around £2000 or $3000.

Now even smartphones and USB sticks are getting GPUs to do texture-mapping at HD resolutions.

To think that it used to cost $150,000+ just to get a basic 24-bit color framebuffer and a basic graphics API.

    • > I still can't fathom spending $300 on a video card....and feeling like I got a slammin deal in the process.

      Did you miss the $300 Radeon 5970 on the NewEgg Black Friday sale too? =)
      http://www.newegg.com/Product/Product.aspx?Item=N82E16814103195 [newegg.com]

While I agree it's hard to justify that price point, THAT bang/buck is phenomenal!
      http://www.tomshardware.com/reviews/battlefield-3-graphics-performance,3063.html [tomshardware.com]

      Specifically ...
      http://www.tomshardware.com/reviews/battlefield-3-graphics-perfor [tomshardware.com]

    • by karnal ( 22275 )

I bought a new 8800GT at $300 and felt I got my money's worth out of it. Of course, now I've got a gently used GTX 280 (for free) and haven't played games for about a year... heh.

  • Nice summary, more redundancy please!
    • by Surt ( 22457 )

      It's not the summary's fault. 448 cores is actually in the product title.

  • That's pretty impressive.... It takes the current crown for fastest video card... all so I can play a dumbed-down Skyrim with its dumbed-down console interface and low-res console textures. What is it *for*?
• More interesting news is that Nvidia's next high-end card will probably arrive in late 2012, or even 2013, while AMD's high-end card, Tahiti, is expected in January 2012.

    http://www.edn.com/article/520175-Nvidia_Kepler_GPUs_to_trail_AMD_s_next_generation.php [edn.com]

• Impressive! Imagine what a Beowulf Cluster of these things could . . . wait a minute! This chip IS its own Beowulf Cluster! =)
  • by bryan1945 ( 301828 ) on Wednesday November 30, 2011 @03:57PM (#38217980) Journal

    Does it make my speedy bird go really really REALLY fast?

    • by Surt ( 22457 )

      There actually is an easter egg in there for owners of high end graphics cards. If you have a sufficiently good one your speedy bird will cross the light speed barrier, creating a warp wake that creates a wave of destruction. Sorry if you're missing out with your wimpy card though.

• Still trying to figure out the method to the madness of going backwards. They had a 9000 series, then a 200 series, and now a 500 series. I'm waiting until they come out with the zero series. And it's not just Nvidia; a lot of companies seem to be doing it.
  • by Lawrence_Bird ( 67278 ) on Wednesday November 30, 2011 @05:03PM (#38218734) Homepage

It's pretty difficult to get precise power figures on graphics cards; reviews always rate against 'total system' power, but never give us the system's power use without the card (or with an onboard video solution) for reference. In any event, all modern cards are total power pigs. At a time when Intel and AMD are trying very hard to reduce CPU power consumption, graphics cards are using up many multiples of those savings. I'm not sure where 'it has a wall outlet plug' gave these card and GPU producers license to subsidize the power companies.

• Are all these companies (AMD, nVidia, Intel) slapping more "cores" on their products and only yielding 10-15% performance improvements kind of lackluster? I think if companies can't at least double performance, they shouldn't bother releasing a new product. The trickle release of half-hearted performance improvements has to end.
• Maybe the 560 Tis will come down in price to that magical $150-175 threshold, above which I refuse to spend on a video card.
