Hands On With Nvidia's New GTX 280 Card
notdagreatbrain writes "Maximum PC magazine has early benchmarks on Nvidia's newest GPU architecture — the GTX 200 series. Benchmarks on the smokin' fast processor reveal a graphics card that can finally tame Crysis at 1900x1200.
'The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23 percent faster than that card running on Vista. In fact, it looks as though a single GTX 280 will be comparable to — and in some cases beat — two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.'"
Yeah but... (Score:2, Funny)
Re: (Score:2, Insightful)
Same answer as all cool new hardware: NO!
Re: (Score:3, Insightful)
Easy counter-example would be any new CPU architecture, which is generally adopted by Linux faster than the competition (especially Windows, which is probably what you're comparing Linux to, given the context). AMD64 (and Itanium 2, for that matter) is an example. While Linux can be slow to get support for some things, that's certainly not true for all cool new hardware. What about the PS3? Pandora? Heck, some cool hardware Linux supports would be impossible for a g
Re: (Score:3, Informative)
Don't worry.... (Score:2, Funny)
Power vs Intel (Score:4, Interesting)
How is Nvidia able, year after year, to make these amazing advances in power while Intel makes only modest (although still great) advances?
As I said, I don't know anything about chip design, so please correct me on any points.
Re:Power vs Intel (Score:5, Informative)
Re: (Score:3, Informative)
For example, with the 8000 series, pixel shaders had become very important in modern games, so the cards were optimised for pixel-shading performance much more than the 7000 series was. There is simply no equivalent for CPUs - even something like the SSE extensions is really just doing the same work in a more parallel way; it isn't a radically new way of doing things.
Re: (Score:2, Troll)
Re:Power vs Intel (Score:4, Informative)
Precisely. This is something that can be solved by simply throwing more transistors in. Their biggest challenge is probably power and heat, not architecture.
Not to mention that "programs" on GPUs are ridiculously simple compared to something running on a general-purpose CPU. Next time you write a shader, try branching (i.e., if/else); your shader will slow to a relative crawl.
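To make that concrete, here is a minimal CUDA sketch of the same penalty (my own toy example, not anything from the parent post or from Nvidia): threads in a warp execute in lockstep, so when neighbouring threads take different branches the hardware runs both paths and masks off the inactive lanes.

```cuda
// Toy divergence demo (illustrative only). Odd and even threads in each
// warp take different branches, so every warp must execute both paths.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void divergent(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = out[i];
    if (i & 1) {                        // odd lanes
        for (int k = 0; k < 256; ++k) x = x * 1.0001f + 0.5f;
    } else {                            // even lanes: different, equally long path
        for (int k = 0; k < 256; ++k) x = x * 0.9999f - 0.5f;
    }
    out[i] = x;
}

__global__ void uniform(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = out[i];
    for (int k = 0; k < 256; ++k)       // every lane takes the same path
        x = x * 1.0001f + 0.5f;
    out[i] = x;
}

int main() {
    const int n = 1 << 20;
    float *d;
    cudaMalloc((void **)&d, n * sizeof(float));
    cudaMemset(d, 0, n * sizeof(float));

    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);

    cudaEventRecord(t0);
    divergent<<<n / 256, 256>>>(d, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float msDiv;
    cudaEventElapsedTime(&msDiv, t0, t1);

    cudaEventRecord(t0);
    uniform<<<n / 256, 256>>>(d, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float msUni;
    cudaEventElapsedTime(&msUni, t0, t1);

    printf("divergent: %.3f ms   uniform: %.3f ms\n", msDiv, msUni);
    cudaFree(d);
    return 0;
}
```

On real hardware the divergent kernel typically takes close to twice as long as the uniform one, even though each thread does the same amount of arithmetic.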
Re: (Score:2)
How is Nvidia able, year after year, to make these amazing advances in power while Intel makes only modest (although still great) advances?
There is more room for improvement in the graphics card/GPU arena than in the CPU arena. Since the market is so much larger surrounding CPUs, more research has been done and the chips are closer to "perfectly" using available technology and continually expanding the realm of what technology is available.
And I'll echo the_humeister's statement that graphics operations are much more easily done in parallel than generic computing. You can throw processors/cores at the problem pretty easily and continue to s
Re: (Score:3, Informative)
Intel and AMD are having issues getting over 4 cores per die right now, while this card "... packs 240 tiny processing cores into this space, plus 32 raster-operation processors".
Re:Power vs Intel (Score:5, Interesting)
Intel's chips have to WORK, and I mean WORK ALL THE TIME. Getting a single calculation wrong is mega, mega hell. Remember the Pentium FDIV bug?
People will calculate invoices and bank statements with that Intel chip. It might control airplanes or God knows what. It needs to be foolproof and highly reliable.
Graphics chips draw pretty pictures on the screen.
It's a different ballgame. As a game dev, my 100% priority for any new chip is that it ships with stable, tested drivers that are backwards compatible, not just great with DirectX 10 and 11.
If someone wrote code that adhered correctly to the DirectX spec on version 5 or even 2, the new cards should render that code faithfully. Generally, they don't, and we have to explain to gamers why their spangly new video card is actually part of the problem in some situations.
Re:Power vs Intel (Score:4, Interesting)
Nvidia is increasingly marketing its chips as "stream processors" rather than "graphics processors." They are increasingly being used for scientific computation, where reliability and accuracy are just as important as in the general-purpose case (which reminds me, I need to check whether they support double precision and IEEE 754 yet). In a few years, the structural analysis for the building you're in might be done by a program running on one of these chips.
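If you want to answer the double-precision question from code, here is a small CUDA sketch (my own, purely illustrative): double-precision support arrived with compute capability 1.3, the GT200 generation, so querying each device's compute capability tells you.

```cuda
// Query each CUDA device and report whether it can do double precision
// (compute capability 1.3 or higher -- i.e. the GT200 generation and up).
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        bool hasDouble = prop.major > 1 || (prop.major == 1 && prop.minor >= 3);
        printf("%s: compute capability %d.%d, double precision: %s\n",
               prop.name, prop.major, prop.minor, hasDouble ? "yes" : "no");
    }
    return 0;
}
```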
Re: (Score:2)
A CPU consists of a single pipelined processor with all sorts of tricks to optimise performance. The Intel P6 [wikipedia.org] article at wikipedia gives some explanation of these:
* Speculative executio
Power Consumption (Score:5, Interesting)
Something that has always concerned me (more as I play games less often now) is how much power these cards draw when they aren't pumping out a zillion triangles a second playing DNF.
Most of the time (probably 90%+) I'm just doing very simple desktop-type things. While it's obvious from the heat output that these cards aren't running flat out when redrawing a desktop, surely they must be using significantly more power than a simple graphics card that could perform the same role. Does anyone have any figures showing how much power is being wasted?
Perhaps we should have two graphics cards in the system now - one that just does desktop-type things and one for when real power is required. I would have thought it would be fairly simple to design a motherboard with an internal-only slot to accept the latest and greatest 3D accelerator card that supplemented an onboard dumb-as-a-brick graphics card.
Re:Power Consumption (Score:4, Informative)
You may be wondering, with a chip this large, about power consumption -- as in: Will the lights flicker when I fire up Call of Duty 4? The chip's max thermal design power, or TDP, is 236W, which is considerable. However, Nvidia claims idle power draw for the GT200 of only 25W, down from 64W in the G80. They even say GT200's idle power draw is similar to AMD's righteously frugal RV670 GPU. We shall see about that, but how did they accomplish such a thing?

GeForce GPUs have many clock domains, as evidenced by the fact that the GPU core and shader clock speeds diverge. Tamasi said Nvidia implemented dynamic power and frequency scaling throughout the chip, with multiple units able to scale independently. He characterized G80 as an "on or off" affair, whereas GT200's power use scales more linearly with demand. Even in a 3D game or application, he hinted, the GT200 might use much less power than its TDP maximum. Much like a CPU, GT200 has multiple power states with algorithmic determination of the proper state, and those P-states include a new, presumably relatively low-power state for video decoding and playback. Also, GT200-based cards will be compatible with Nvidia's HybridPower scheme, so they can be deactivated entirely in favor of a chipset-based GPU when they're not needed.
Re:Power Consumption (Score:5, Informative)
Re: (Score:2)
Re: (Score:3, Informative)
Re: (Score:2)
Re: (Score:2, Interesting)
Ok, I just went over your original posting again and wanted to address the issues you spoke of.
There is a price to pay for powering up/down sections of a chip. It takes time and power. One needs to limit the number of times this is done. But the basic idea is good and I believe this is exactly what the card in question does. However, it is not as simple as implied.
Re: (Score:2)
Real-time idle for video cards? (Score:2)
Re: (Score:2)
It works in Vista only, though, and of course that OS is still showing signs of flopping in the games area, despite DX10 and SP1.
Re: (Score:2)
RTFA (Score:3, Informative)
Power Considerations
Nvidia has made great strides in reducing its GPUs' power consumption, and the GeForce 200 series promises to be no exception. In addition to supporting Hybrid Power (a feature that can shut down a relatively power-thirsty add-in GPU when a more economical integrated GPU can handle the workload instead), these new chips will have performance modes optimized for times when Vista is idle or the host PC is running a 2D application, when the user is watching a movie on Blu-ray or DVD, and when full 3D performance is called for. Nvidia promises the GeForce device driver will switch between these modes based on GPU utilization in a fashion that's entirely transparent to the user.
So, yes, they hear you, and are making improvements in this area.
What about Aero graphics? (Score:2)
Re: (Score:2)
MY Space Heater! (Score:2)
-Another good article on the GTX280 (GT200 GPU) at TR: http://www.techreport.com/articles.x/14934 [techreport.com]
Re: (Score:3, Funny)
GX2 Cheaper and Faster (Score:5, Informative)
I'm not well versed in the causes of micro-stutter, but the result is that frames aren't spaced evenly from each other. In a 30 fps situation, a single card will give you a frame at 0 ms, 33 ms, 67 ms, 100 ms, etc. Add a second card in SLI and let's say you get 100% scaling, which is overly optimistic. Frames now render at 0 ms, 8 ms, 33 ms, 41 ms, 67 ms, 75 ms, 100 ms, and 108 ms. You get twice the frames per second, but they're not evenly spaced. In this case, which uses realistic numbers, you're getting 60 fps, but the output looks about the same as 40 fps, since the delay between every other frame is 25 ms.
It would probably look a bit better than 40 fps, since between each 25 ms delay you get an 8 ms delay, but beyond the reduced effective fps there are other complications as well. For instance, the jitter is very distracting to some people. Also, most LCD monitors, even those rated at 2-5 ms response times, will have issues showing the 33 ms frame completely free of ghosting from the 8 ms frame before the 41 ms frame shows up.
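For what it's worth, here is the arithmetic above as a tiny program (host-only code, no GPU calls; the 30 fps base rate, perfect SLI scaling, and 8 ms intra-pair gap are the same hypothetical numbers used above):

```cuda
// Effective frame rate under micro-stutter (plain host-side code).
#include <cstdio>

int main() {
    const float cycleMs  = 1000.0f / 30.0f;  // one single-GPU frame every 33.3 ms
    const float offsetMs = 8.0f;             // spacing inside each frame pair

    float reportedFps = 2.0f * 1000.0f / cycleMs;  // what an fps counter shows: 60
    float worstGapMs  = cycleMs - offsetMs;        // ~25 ms between pairs
    float feltFps     = 1000.0f / worstGapMs;      // roughly 40 fps effective

    printf("reported: %.0f fps, worst frame gap: %.1f ms, feels like ~%.0f fps\n",
           reportedFps, worstGapMs, feltFps);
    return 0;
}
```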
Most people only look at fps, though, which makes the 9800 GX2 a very attractive choice. Because I'm aware of micro-stutter, I won't buy a multi-GPU card or SLI setup unless it's more than 50% faster than a single-GPU card, and that's still ignoring price. That said, I'm sort of surprised to find myself now looking mostly to AMD's 4870 release next week instead of going to Newegg for a GTX280, since the 280 results, while not bad, weren't quite what I was hoping for in a $650 card.
Re: (Score:2)
Short answer: it's fine. If you have the money and want to play at extreme resolutions, get SLI.
It almost seems like micro-stutter is some kind of viral ATI anti-marketing BS.
Re:GX2 Cheaper and Faster (Score:5, Interesting)
Thank goodness! (Score:2)
Noise level (Score:5, Informative)
no surprise (Score:3, Informative)
I don't get why people fall for that: push a chip to its limits, put a noisy fan on it, and sell it as a high-performance card.
At least with the ATI 3870 you can decide whether you're going to overclock the card and endure the noise or not.
Re: (Score:3, Informative)
Not at all surprising. Did you see the size of that chip die? You can fit 6 Penryns on it!! I used to work for a semiconductor company, and the larger the chip, the more expensive it gets. This is because the larger the die is, the less likely it is to be defect-free when it comes out of the fab.
I call bull on those conclusions. (Score:3, Insightful)
and in some cases beat -- two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.
Bullshit. The 9800 GX2 is consistently quite a bit faster (TechReport's very detailed review here [techreport.com]), and it costs around $450, while the GTX 280 costs $650 (with its younger brother, the 260, at $400), the only drawbacks being higher power draw and more noise. Even then, I think it's a no-brainer.
Don't get me wrong, these are impressive single-GPU cards, but their price points are TOTALLY wrong. ATI's 4870 and 4850 cards are coming up at $450 and $200 respectively, and I think they'll eat these for lunch, at least from a value angle.
Re: (Score:2)
Let's see how the reviews of the 4800 series pan out.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Basically he spends USD400, plays computer games for a few nights, and actually ends up with more money than he would otherwise (I actually know someone who did save some money in a similar way). In contrast if it were USD4000 for a vid card, the calculation could be different - he could get bored of the various games and go back to
Re: (Score:2)
A $400 video card really is a smart business decision when you look at entertainment-hours per dollar.
So, practically who bought 9800 (Score:3, Interesting)
Well done, Nvidia. Very Microsoft of you.
Re:So, practically who bought 9800 (Score:5, Insightful)
If you buy a graphics card in the hope that it's going to be the top-of-the-line card for longer than a few months, then you're very much mistaken.
Buy a card that will do what you need it to, and then just stick with that until it stops being powerful enough for you. Anyone hoping their computer will be "future proof" is heading towards disappointment very fast.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
ATI's Response? (Score:2, Interesting)
Based on the information I've seen on it, it will be pretty comparable in terms of performance, but at a far lower price.
I'm hoping that the new ATI card performs within 10-15% or so of the GTX 280, because I'm getting a bit tired of the issues I have with my current nVidia 8800 GTS cards (SLI).
I cannot set the fan speed in a manner that will "stay" after a reboot.
My game of choice actually has some moderate-to-severe issues with nVidia cards and crashes at le
Re: (Score:2)
Looks like it's being released right now. Best wait a week or two until the reviews are out and you can compare the two before wasting your valuable beer tokens.
Inquirer camping outside the NDA session. [theinquirer.net]
Tame Crysis at 1900x1200? (Score:3, Insightful)
If I want more speed, i'll get another 8800. That card is phenomenal, and about to get a lot cheaper.
Re: (Score:2)
I understand using Crysis as a benchmark, but pretending that there wasn't any setup capable of running Crysis at 1900x1200 is exaggerating.
Great news - not that I want to buy the thing... (Score:2)
Noise... (Score:3, Funny)
The scene has changed. (Score:5, Interesting)
This year I put my disposable income towards getting in on all three next generation consoles, and the PC will languish for a long time yet.
I don't think I've changed, I think the market has changed.
Graphics cards are getting bigger and hotter, and no longer feel like cutting-edge kit. They feel like an attempt to squeeze more life out of old technology.
DirectX 10 as a selling point is a joke; with the accompanying baggage that is Vista, all it does is slow games down, and none of them look any better for it yet. In any case, there are only five or six DX10 games. You can pick up an 8800GT 512 for less than 150 dollars these days, and it's a powerhouse unless you're gaming in full 1080p. There is no motivation to put one of those power-hungry bricks in my rig. Nothing gets any prettier these days, and FPS is well taken care of at 1680x1050 or below.
Game over, graphics cards.
I wonder what will happen if everyone figures this out? Imagine a world in which the next gen of consoles is no longer subsidised, or driven, by PC enthusiasts...
I have to agree (Score:2)
Only now with the release of the 8800GT and 9600GT is the power consumption/performance ratio getting reasonable (and yes, the ATI 3870 has similar power consumption to the 8800GT, but cannot mat
Re: (Score:3, Informative)
Texture size and the number of objects in a scene in Crysis would be the best examples; there is a difference. Games are moving to a level (especially for HD or 1920x1200-and-up players) where the texture limitations of DX9.0c can't deliver the detail needed, and this is just one 'tiny' aspect of DX10.
http://www.tomsgames.com/us/2007/09/18/dx10_part3/page3.html [tomsgames.com]
Bundling it into Vista is bad, for a slew of reasons, and the shit they've pu
Does it sound like a jet engine? (Score:3, Funny)
The way things are going, you will need two power supplies in a PC: one for the video card and one for everything else.
Well, there goes my upgrade plan (Score:4, Insightful)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
To be fair, he's an extreme case. He's also waiting for an Xbox 360... waiting for the RRODs to be solved in the 3rd or 4th production generation, which will address graphics cooling (unlike the last one which changed the h
Re: (Score:2)
My worst experience was researching video cards. This GTX200 series just popped ou
And it uses more power than your A/C (Score:2)
I doubt it's noisier than MY A/C (Score:2)
I sat for a year next to a Silicon Graphics twin-tower GTX. Now THAT was a noisy machine. Any GeForce is whisper-quiet compared to that.
Impressive. But impractical. (Score:4, Insightful)
Obviously, the above numbers are wild speculation, but the punchline is that these parts can't possibly be cheap to manufacture. I suspect that NVIDIA will see some nice sales to lunatic early adopters, and they'll probably have a compute-only version of this card for high-end computing, but there is no way that it could hit mass-distribution price points. Even at $650, I'm not sure that NVIDIA's margins are all that exciting on this particular part.
We're gonna need CUDA benchmarks (Score:3, Insightful)
Someone please develop CUDA [wikipedia.org] benchmarks to be included in future reviews.
We need several apps: one with a kernel that is trivial enough to be constantly starved for memory, one that is the opposite (compute-heavy, memory-light), integer vs. FP, and something that specifically benefits from the new double-precision floating point that only the newer stuff has. A rough sketch of the first two is below.
Get back to me soon, mmmmK?
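In that spirit, here is a rough sketch of the first two (my own toy kernels, not an established benchmark suite): one kernel that streams data and is limited by memory bandwidth, and one that grinds through a long dependent FMA chain and barely touches memory.

```cuda
// Two toy micro-benchmarks (illustrative only):
// streamKernel is limited by memory bandwidth, computeKernel by ALU throughput.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void streamKernel(const float *in, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * 2.0f + 1.0f;   // one FMA per 8 bytes of traffic
}

__global__ void computeKernel(float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = 1.0f + i * 1e-7f;
    for (int k = 0; k < 4096; ++k)
        x = x * 1.000001f + 0.000001f;         // long dependent FMA chain
    out[i] = x;                                // one write per 8192 flops
}

int main() {
    const int n = 1 << 22;                     // 4M floats, 16 MB per buffer
    float *in, *out;
    cudaMalloc((void **)&in,  n * sizeof(float));
    cudaMalloc((void **)&out, n * sizeof(float));
    cudaMemset(in, 0, n * sizeof(float));

    cudaEvent_t t0, t1;
    cudaEventCreate(&t0);
    cudaEventCreate(&t1);

    cudaEventRecord(t0);
    streamKernel<<<n / 256, 256>>>(in, out, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float msMem;
    cudaEventElapsedTime(&msMem, t0, t1);
    printf("memory-bound:  %.2f ms  (%.1f GB/s)\n",
           msMem, 2.0 * n * sizeof(float) / (msMem * 1e6));

    cudaEventRecord(t0);
    computeKernel<<<n / 256, 256>>>(out, n);
    cudaEventRecord(t1);
    cudaEventSynchronize(t1);
    float msAlu;
    cudaEventElapsedTime(&msAlu, t0, t1);
    printf("compute-bound: %.2f ms  (%.1f GFLOP/s)\n",
           msAlu, 2.0 * 4096 * n / (msAlu * 1e6));

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

Comparing the reported GB/s against the card's theoretical memory bandwidth, and the GFLOP/s against its theoretical shader throughput, gives a quick sense of which resource each kernel is actually starved for.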
Re: (Score:2, Insightful)
Re: (Score:2)
Re:Vista cuts performance... (Score:5, Interesting)
When Vista is faster there's 'something wrong'? (Score:2)
DirectX 10 is the reason (Score:5, Informative)
When run under Vista, it features tons of additional effects. That's why the speed improvement in Crysis isn't as impressive under Vista.
PS: And for the record, the Radeon HD 3870 X2 uses the exact same GDDR3, not GDDR4 as TFA's review says. ATI chose to go with GDDR3 to cut the costs of the dual-GPU setup. (Only a few non-standard boards by 3rd-party manufacturers use GDDR4 and a PCI Express 2.0 bridge.)
Re:DirectX 10 is the reason (Score:4, Informative)
I suppose if the reviewers were stupid, they might not have run the game in DX9 mode on both XP and Vista, which would account for the difference even if the graphical options were set to the same levels. But running the game in DX10 mode doesn't make it look any better; the only difference is that it's slower.
For those who aren't familiar with this: in XP/DX9 mode the highest-level graphical options are grayed out. This is an entirely artificial limitation; the configuration changes I mentioned simply replace graphical options with higher ones (they're just integers in a plaintext file), so for example 'High' (DX9) becomes 'Very High' (the 'DX10' effects) in practice.
Re: (Score:2, Interesting)
There is no "Vista DRM." That little copy protection stuff lets you play Blu-Ray discs.
You can rip CDs, DVDs, and pirate t3h internetz if you want. I do so on a daily basis on my Vista x64 machine.
Now, if OS support for DRM bothers you, take it up with the studios that require it. Not playing DVDs is not an option.
Re: (Score:2)
Re: (Score:2)
Smaller companies can get away with saying "the studios require it". Microsoft can't.
Why? Microsoft is an insignificant player in the content creation and delivery marketplace.
Re: (Score:2)
Re: (Score:2)
Do you really think anyone could sell video content that wouldn't play on Vista?
Re:Vista cuts performance... (Score:4, Insightful)
Microsoft owns the desktop. Content creation and delivery folks want the desktop. What does their (lack of) position in the content market matter?
The content folk are, at best, highly suspicious of "the desktop". With good reason.
Do you really think anyone could sell video content that wouldn't play on Vista?
Of course they could. Most people consume their content from standalone commodity appliances like DVD players and iPods. This hasn't changed in the last few decades (substitute "VHS", "Cassette", "LP", etc as necessary) and there's little reason to think it will in the future.
Re: (Score:2)
Old content, sure. New content embraces it. That said, old content isn't that wary -- they keep selling DVDs with bonus data-track content, after all.
Most people also tend to wait it out before adopting new technologies; being an early winner or an early loser can make a difference between being Blu-Ray or HD-DVD.
All that said,
Re:Anandtech and TechReport reviews (Score:4, Informative)
In fact, from the article:
The GTX 280 delivered real-world benchmark numbers nearly 50 percent faster than a single GeForce 9800 GTX running on Windows XP, and it was 23-percent faster than that card running on Vista. In fact, it looks as though a single GTX 280 will be comparable to--and in some cases beat--two 9800 GTX cards running in SLI, a fact that explains why Nvidia expects the 9800 GX2 to fade from the scene rather quickly.
Which leads me to the question, are you trolling?
Re: (Score:2)
Re: (Score:3, Informative)
Re:Anandtech and TechReport reviews (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Informative)
So an increase in CPU speed would probably benefit the GTX 280 more than the 9800 GX2.
Of course, we can't know if my theory is true unless we test it, but it seems logical to me.
Re: (Score:2)
Re: (Score:2)
Still some way to go to catch up with ATI though. Unfortunately all these benchmarks are largely academic until we have matching 4850/4870 ones.
Re: (Score:2)
Re: (Score:2)
You'd think they would be aiming for something that runs faster on Vista than XP, if they are trying to step forwards, because although XP still has the s
Research before you spread FUD. (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
But there are a great many of us who can translate benchmark scores into real world performance.
All it takes is a comparison of those scores to your own scores to know what type of performance to expect.
I was considering upgrading to a 9800GX2, but given these scores, I think I will wait. My 8800GTS 640MB is performing fairly well.