NVIDIA GTX 295 Brings the Pain and Performance
Vigile writes "Dual-GPU graphics cards are all the rage and it was a pair of RV770 cores that AMD had to use in order to best the likes of NVIDIA's GeForce GTX 280. This time NVIDIA has the goal of taking back the performance crown and the GeForce GTX 295 essentially takes two of the GT200 GPUs used on the GTX 280, shrinks them from 65nm to 55nm, and puts them on a single card. The results are pretty impressive and the GTX 295 dominates in the benchmarks with a price tag of $499."
Drivers drivers drivers (Score:5, Interesting)
For me personally, I couldn't care less if the card hardware is great if the drivers suck. NVIDIA, fix your Linux drivers, please. Next time I'll give a much harder look at AMD.
Microstutter (Score:3, Interesting)
I wonder if this card will suffer from microstutter. The 9800GX2 benchmarked very well, but real-world performance was lacking because frame delivery alternated between very short and very long frame times, so the game still felt laggy even though the average fps looked decent.
With these dual-GPU cards it's best to look at their minimum fps figures. An average fps number is often misleading.
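The point above is easy to see with numbers: two cards can report the same average fps while one of them stutters badly. A minimal sketch (the frame-time traces here are made up for illustration, not measured data):

```python
# Hypothetical frame-time traces in milliseconds per frame (illustrative only).
smooth = [20.0] * 100        # steady 20 ms/frame -> a constant 50 fps
stutter = [5.0, 35.0] * 50   # alternating fast and slow frames (microstutter)

def avg_fps(frame_times_ms):
    """Average fps over the whole run: total frames / total seconds."""
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    """Instantaneous fps of the single slowest frame."""
    return 1000.0 / max(frame_times_ms)

print(avg_fps(smooth), min_fps(smooth))    # 50.0 average, 50.0 minimum
print(avg_fps(stutter), min_fps(stutter))  # 50.0 average, ~28.6 minimum
```

Both traces average exactly 50 fps, but the stuttering one dips to roughly 28.6 fps on its slow frames, which is what the eye actually notices.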
Re:Really, though. (Score:2, Interesting)
Now, which one (given those numbers) would you expect to look better?
Which one actually looks (a lot) better?
There's your problem.
Also, it has a spectacular memory leak that sees it using up all available physical memory after a while, grinding to a halt and refusing to load any new textures, which is actually pretty funny the first time: driving around in a flying car, dodging invisible buildings.
Point: Missed (Score:4, Interesting)
For the interested, there's a great article at anandtech [anandtech.com] talking about how the RV770 came to be pretty awesome... Really, though, it's not a super-high-end part.
The Real Question (Score:2, Interesting)
Personally I'm betting on the former being far more likely than the latter.
Re:Really, though. (Score:5, Interesting)
Please cite that.
I'm running L4D on my (very) old computer, a 1.6 GHz AMD single-core with a 7600 GS and 1.5 GB of RAM. The game runs fine at medium settings (despite the fact that I am way, way under the minimum system requirements), and when I briefly swapped out the 7600 for a 7900, I was able to turn a few of the settings up from medium to high (1024x768, textures low, medium effects -> 1280x1024, textures medium, high effects) and still get a stable 20-25 fps average.
Not quality benchmarks, I know, but the engine hasn't changed drastically since HL2 except for graphical improvements (i.e., GPU-limited work), so claims about it being CPU-limited haven't been true since the first public version of the Source engine, and that's assuming they were even true back in 2004.