NVIDIA GTX 295 Brings the Pain and Performance
Vigile writes "Dual-GPU graphics cards are all the rage and it was a pair of RV770 cores that AMD had to use in order to best the likes of NVIDIA's GeForce GTX 280. This time NVIDIA has the goal of taking back the performance crown and the GeForce GTX 295 essentially takes two of the GT200 GPUs used on the GTX 280, shrinks them from 65nm to 55nm, and puts them on a single card. The results are pretty impressive and the GTX 295 dominates in the benchmarks with a price tag of $499."
Holiday Shopping (Score:2, Insightful)
ugh (Score:3, Insightful)
This is like the razor wars (double blade! triple blade! quad blade! penta blade!). With OpenCL and DirectGPU (or whatever MS is calling it this week), this should be good for anyone trying to build a cheap superGPU cluster.
It's great that there's a market for this stuff... (Score:5, Insightful)
Re:i hate fans (Score:5, Insightful)
Easy--they're deaf. After years of working on building (near) silent PCs, I've learned that what many people/reviewers consider to be 'quiet' is nowhere near my definition of 'quiet'. I'm not quite sure how loud some gamers have their sound systems turned up, or if they play with the window open or what, but I simply can't trust a review on newegg or most websites when someone says a piece of equipment is 'silent'. There are a few websites like silentpcreview.com that do a good job, but if a piece of equipment isn't reviewed there or in the forums, you're SOL (or you get to be the guinea pig).
fix (Score:4, Insightful)
NVIDIA, fix your linux drivers please.
NVIDIA, open your linux drivers please.
This is all so 1998 (Score:2, Insightful)
Ten years ago the video card wars were in full swing. Each generation brought amazing new advances and impressive technology.
But nVidia and ATI haven't realized that we passed the point of diminishing returns years ago. Mobility and battery life are what matter. And I know there are hardcore PC gamers out there, but there are only a handful of companies even bothering to push the high-end graphics, so you buy a $500 video card and there are exactly ZERO games that take full advantage of it. Wait a year or so, and you may find that one or two of the few high-end PC game makers decide to throw you a bone and add support. And as a bonus, you get SIGNIFICANTLY increased power consumption, and the video card addicts are just wasting resources so they can all whack-off to Shader 30.0 soft shadows on eyelashes.
It's a weird, captive, completely pointless market unless you're doing 3D rendering for a living (for movies, for commercials, for product design, etc.).
Re:It's great that there's money for this stuff... (Score:3, Insightful)
You are also kind of donating the hardware, which is a much bigger cost than the power. $10 worth of electricity will do more of these calcs than a $10 donation would enable.
Re:Taking back the performance crown? (Score:5, Insightful)
No, the funny thing here is that AMD *did* have the performance crown, even though they had planned to give it up. Before the GTX 295, the Radeon 4870x2 was the top of the pile for single-card graphics.
Re:Really, though. (Score:3, Insightful)
People need to understand that sometimes those detail sliders aren't meant to be cranked all the way up on current hardware. They're there to "future-proof" the game, so that it can still look pretty 2 or 5 years down the road. Wing Commander 4 did this, for example.
Of course, it's not a huge incentive for developers to future-proof a game when all they get for it is a forum-bashing like "omg the game sux i can't get 50 fps on my 1337 rig".