Graphics Hardware

NVIDIA Launches GeForce GTX 560 Ti 448-Core GPU 127

MojoKid writes "NVIDIA has just launched the GeForce GTX 560 Ti with 448 cores. Though perhaps a bit unimaginative in terms of branding, the new GeForce GTX 560 Ti with 448 cores is outfitted with the same GF110 GPU powering high-end GeForce GTX 570 and GTX 580 cards, but with a couple of its streaming multiprocessors fused off. The card has 448 CUDA cores arranged in 14 SMs, with 56 texture units and 40 ROPs. Reference specifications call for a 732MHz core clock with 1464MHz CUDA cores. 1.2GB of GDDR5 memory is linked to the GPU via a 320-bit bus and the memory is clocked at an effective 3800MHz data rate. Performance-wise, the new GPU proved to be about 10 to 15 percent faster than the original GeForce GTX 560 Ti and a few percentage points slower than the GeForce GTX 570."
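From the reference specs quoted above, the peak memory bandwidth follows directly from the bus width and effective data rate. A quick sanity check (plain arithmetic, no assumptions beyond the quoted numbers):

```python
# Peak memory bandwidth from the quoted reference specs:
# 320-bit bus at an effective 3800MHz GDDR5 data rate.
bus_width_bits = 320
effective_rate_hz = 3800e6  # effective data rate (transfers/sec)

# bytes per transfer * transfers per second
bandwidth_gbs = (bus_width_bits / 8) * effective_rate_hz / 1e9
print(bandwidth_gbs)  # 152.0 GB/s
```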
  • by rahvin112 ( 446269 ) on Wednesday November 30, 2011 @01:46PM (#38216324)

    They use lasers to cut the traces on the processor and firmware to disable what they can't cut. The chips are designed with this ability so they can bin and disable to differentiate models and use parts with defects. All the different model numbers are just binned parts with the bad sections disabled.

  • by mprinkey ( 1434 ) on Wednesday November 30, 2011 @01:53PM (#38216420)

    Probably referring to efuses that can be burned out on the die. These are common and allow CPUs/GPUs to have unit-specific information (like serial numbers, crypto keys, etc.) coded into otherwise identical parts from the fab. Video game systems like the 360 use them as an anti-hacking measure, disallowing older versions of firmware to run on systems that have certain efuses "blown." Likely there is an efuse for each core or group of cores. Those can be burned out if they are found to be defective, or simply to cripple a portion of the part for down-binning. That is a practice at least as old as the Pentium 2.
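    As a purely hypothetical sketch of the arithmetic (the function name, the per-SM bitmask layout, and the idea that firmware derives the core count from a fuse mask are illustrative assumptions, not NVIDIA's actual mechanism): GF110 has 16 SMs of 32 CUDA cores each, so fusing off 2 SMs yields exactly the 448 cores in this part.

```python
# Hypothetical sketch: deriving an active core count from a per-SM
# fuse mask. GF110: 16 SMs x 32 CUDA cores = 512 cores total.
CORES_PER_SM = 32
TOTAL_SMS = 16

def active_cores(fuse_mask: int) -> int:
    """Each set bit marks an SM disabled by a blown efuse."""
    disabled = bin(fuse_mask & ((1 << TOTAL_SMS) - 1)).count("1")
    return (TOTAL_SMS - disabled) * CORES_PER_SM

print(active_cores(0b11))  # 2 SMs fused off -> 448 (this card)
print(active_cores(0b01))  # 1 SM fused off  -> 480 (a GTX 570)
```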

  • yes, really (Score:5, Informative)

    by OrangeTide ( 124937 ) on Wednesday November 30, 2011 @01:55PM (#38216458) Homepage Journal

    All parts have defects, although sometimes companies don't test thoroughly enough to find them, and usually the defects don't impact normal operation. But when the potential for problems exists, they are forced to either scrap the part or bin [wikipedia.org] it as a lower-spec part. Binning improves yield and helps keep prices down on the higher-end parts that do pass tests.

    But here's the problem with binning from a marketing standpoint: "and a few percentage points slower than the GeForce GTX 570". This binned 570 is about $60+ cheaper and will likely slide down to the old 560 Ti (naming is confusing!) prices. So now they've created a cheaper version that is almost the same performance, and run the risk that customers will choose the cheaper product over the more expensive (and I assume higher margin) product.

  • by ebombme ( 1092605 ) on Wednesday November 30, 2011 @01:56PM (#38216466)
    If it is an EVGA-brand card, I believe they have a trade-in program for situations just like this: within a certain window, you send in your old card, pay the difference, and they send you the updated card of your choice.
  • Expensive much? (Score:4, Informative)

    by RobinEggs ( 1453925 ) on Wednesday November 30, 2011 @02:17PM (#38216772)
    I still can't fathom spending $300 on a video card....and feeling like I got a slammin deal in the process.

    What happened to the red-hot competition of 2008, when I built my first modern system and got a newly released Radeon 4850 for $150? That card was maybe the fourth most powerful you could get; there was no serious improvement to be had without adding more dies, via either X2 cards or crossfire.

    Today the 560 Ti and the 6950 occupy the same relative position in the hierarchy of GPUs that my 4850 held in 2008...yet rather than being brand new and $150, those two cards are almost a year old and $250-$300.

    Ouch.
  • Re:yes, really (Score:5, Informative)

    by billcopc ( 196330 ) <vrillco@yahoo.com> on Wednesday November 30, 2011 @02:45PM (#38217120) Homepage

    Strictly speaking, it costs the same to make this new 560 Ti chip as a balls-out 580 chip. They're identical from the fab's perspective. In practice, the 560 Ti is a way to maximize yield by salvaging defective 580s. This is very much like Celerons being Pentiums with the defective parts lasered or fused off.

    If it weren't for this binning, they would have to toss these chips in the garbage. In theory, salvaging imperfect chips allows them to price things more aggressively across the product line, since the sunk cost of manufacturing is averaged out over a much greater volume.

    Here's an example, and note these numbers are purely arbitrary; I know nothing about fab economics:

    Suppose they had to throw 4 out of every 5 GTX chips away, and each one cost $100 to make. Then each good GTX would cost $500 on average. The yield is thus 20%.

    If instead they can sell those 4 bad chips as lower-spec products, each chip effectively costs $100. The yield is now 100%.
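    The toy economics above, written out (same arbitrary numbers as the comment, not real fab costs):

```python
# Toy yield model, using the arbitrary numbers from the comment above.
cost_per_die = 100   # $ to fabricate one die
dies_per_batch = 5   # dies made per good high-end die
good_dies = 1        # dies that pass full spec

# Scenario 1: scrap the 4 bad dies; all cost lands on the one good chip.
cost_if_scrapped = cost_per_die * dies_per_batch / good_dies
print(cost_if_scrapped)  # 500.0 -> effective yield 20%

# Scenario 2: sell the bad dies as down-binned parts;
# cost spreads over all 5 sellable chips.
cost_if_binned = cost_per_die * dies_per_batch / dies_per_batch
print(cost_if_binned)    # 100.0 -> effective yield 100%
```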

    So yes, the higher binned cards theoretically cost the same to build as the cheap ones (assuming similar memory/outputs/PCB). They are thus higher-margin, but also scarce due to manufacturing limitations. The smaller the pitch, the harder it is to produce a perfect chip. This scarcity is what leads to the increased cost. After all, if a 580 cost the same as a 560, everyone would want the faster one and no one would buy the rejects. OEMs partially compensate by bundling a bunch of stuff with the high-end cards, like pack-in games, DP converters and assorted swag - cheap stuff with a higher perceived value. Put it this way: a recent title like Battlefield 3 might sell for $69 in stores, but you can be sure the OEM is paying much less to bundle it with their product. That goes a long way toward buttering up prospective buyers.

    I'm sure NVIDIA would rather be stuck with a handful of "true" 570s than a shit ton of useless defective chips. People choosing the cheaper product is the point! If it weren't for binning, these chips would be worth zero dollars, destined for the incinerator. Pricing doesn't really matter that much this late in the generation. The GTX 6xx (Kepler) is due out in March 2012, and will probably be available only as a high-end part at first. Bleeding-edge nutjobs like myself will be able to blow $1500 on a pair of the latest and greatest, while the sane people buy out the remainder of 5xx inventory at clearance prices, and only then will the new low-end cards be launched. They've got it down to a science.
