Previewing ATi's Radeon X800 XT & X800 Pro
Giant_Panda writes "A few short weeks ago, it looked like NVIDIA was back on track as they were
able to overtake ATi and reclaim the 3D performance crown with their GeForce 6800
Ultra. Now, it seems like ATi has fired back with a killer card of their own.
HotHardware just posted a preview of the new
12-Pipe ATi Radeon X800 Pro ($399) and 16-pipe ATi Radeon
X800 XT ($499). The X800 XT seems to be faster than even the new GeForce 6800 Ultra
Extreme cards that were rumored to exist on a few sites this past weekend
and the X800 Pro is a great performer as well. (Other sites have just
posted previews:
TechReport,
Hexus, Lost Circuits)"
Video Arms Race (Score:3, Insightful)
Re:Video Arms Race (Score:3, Funny)
Re:Video Arms Race (Score:5, Insightful)
Buying a card just because you "prefer" that particular brand is stupid. There's nothing wrong with brand loyalty, but true enthusiasts will always go with the best product.
I was an Nvidia "fanboy" for quite a while, until their cards started to suck. My latest video card purchase was a Radeon 9800 pro, and I couldn't be happier.
Re:Video Arms Race (Score:3, Insightful)
Re:Video Arms Race (Score:5, Insightful)
Once I learned how to set up my Riva TNT2 with the NVidia drivers, I didn't have much of a problem doing it again whenever I upgraded my kernel.
However, that didn't prepare me for the obstacles involved in setting up my recently-bought ATI Radeon 9000. I'm not saying it was harder, just different.
I would have preferred to upgrade to a new NVidia card, but I didn't want to go back to a 2.4 kernel. (At the time, you needed to apply a third-party patch to the driver glue to get it to work with the 2.5/2.6.0-pre* kernels.)
Now, I'm happy to say that my Radeon works fine, and I don't need to reinstall a driver every time I upgrade my kernel.
Difference this time (Score:5, Insightful)
I guess I just see that two-slot, power-sucking design as a huge hassle. I can't imagine how noisy it must be, though I haven't heard it really addressed in any review. But I think the non-fanboys will take a look at the two cards, see that one takes up one slot and the other takes up two, and go with the one...
Uh (Score:3, Insightful)
The X800 has a lower transistor count and power requirements. How about reading all the reviews and not just Tom's Hardware (who always loves nVidia).
Two slots? Huge and noisy? Forget that. Next.
Re:Video Arms Race (Score:3, Insightful)
Just like every advanced commercial technology, not many people 'need' the power of the most high-end products. But for those of us who buy at the more affordable price points, the release of these cards is just as significant. I'm sure soon enough you'll be able to pick up a 9800pro for dirt cheap, and for people like you that's probably great. In another year's tim
Re:Video Arms Race (Score:3, Informative)
Not me at least (Score:2)
1999: Voodoo3 AGP
2001: GeForce 2 MX
2003: GeForce FX 5200
2004: ATI Radeon 9700 Pro
Many I know follow the benchmarks and nothing more when buying. The only reason I used to be loyal to nVidia is because I used to run linux (ATI has shit linux drivers).
Actually no (Score:2, Interesting)
But over the Christmas holiday, it finally came time to upgrade. I decided to save a few bucks (actually, this was more a mandate from the wife) and build the box myself. This actually meant that I had to do some research instead of the click an
Re:Video Arms Race (Score:2, Interesting)
Primary reason I'll be going ATI (Score:3, Insightful)
The X800 matches or betters the nVidia card while having a lower transistor count and a lower power supply requirement (350W), meaning I can run the damn thing in just one slot!
OEMs are going to balk at needing to suck up two slots when they can just go to ATI and get an equal card that takes up one.
The only difference I can see is PS3.0, which ATI chose not to bother with.
Re:Primary reason I'll be going ATI (Score:3, Informative)
Doesn't happen with everyone, but since I've always had the spare room, I've always just moved the cards off so that I had a space between everything.
Re:Video Arms Race (Score:3, Interesting)
Here is a good example why being a fan boy is plain stupid. Look
Re:Video Arms Race (Score:3, Insightful)
Re:Video Arms Race (Score:4, Interesting)
Guess again. Medical [utah.edu] volume [siggraph.org] visualization. [computer.org]
Now, if your point is that for MOST consumers they're only good for games, you may have a point. But the other way to look at it is that, since consumers have demanded such amazing video technology, the price of delivering advanced medical visualizations to doctors has dropped dramatically.
What you used to need a $40,000 SGI O2 for, now you can do with a $1000 computer from Best Buy. That computer might actually save your life some day. Pretty amazing, if you think about it.
Re:Video Arms Race (Score:3, Interesting)
I'm not an EE, but even I can see where better engineering comes into play. I just bought a new case with a 380W power supply, and I about choked on my soda when I read the 480W recommendation from nVidia (although
Re:Video Arms Race (Score:3, Informative)
Since the two cards are manufactured similarly, we can rule out any manufacturing technology differences as causing this.
Next since the cards perform similarly and take up about the same die space, we can rule out the possibility that one company just has "better" designs for the internal components.
My money is on ATI either having or hiring a thermo expert (you'd be surprised how many EE's
Case of Engineering for Two Different Ends (Score:4, Interesting)
PS 3.0 offers 32 bit precision and an "unlimited pipeline", vertex textures, etc.,. Here's a good article on the differences [elitebastards.com].
Let's put it this way: ATI pretty much just doubled the vertex and pixel pipelines and did not change much architecture-wise beyond its last generation of cards, the R350. NVIDIA's new card is actually much more innovative, but it's questionable whether its timing is right given the current lack of PS 3.0-capable games. Also, a bad omen for NVIDIA is the fact that ATI's PS 3.0 R500 architecture is nearing completion and they have already shown their PS 3.0 cards, if you will.
It's also unfortunate for NVIDIA that these R420 ATI cards still beat the 6800s in a lot of the current benchmarks, despite the 6800's superior tech.
Personally, I'm sticking with my second-hand R350 ATI 9800 Pro, which overclocks to 9800 XT speeds, and I'll skip this iteration of cards. The 9800 will do PS 2.0 plenty quick (at a slightly lower res.) for the latest games, including Far Cry, and Doom 3 and HL2 when they come out.
Looks like no more soft-mods (Score:5, Interesting)
Re:Looks like no more soft-mods (Score:3, Interesting)
Couldn't you just reconnect them by soldering wires?
The 9800se pipe unlock worked at about a 30% success rate.
Complete list of articles (Score:5, Informative)
Half-life 2 (Score:5, Funny)
Re:Half-life 2 (Score:2, Funny)
Re:Half-life 2 (Score:5, Funny)
If I had one of those, I wouldn't have enough time to frag people in HL2. The bundling of a girlfriend is a downgrade, dude!
Damn... (Score:3, Funny)
Re:Damn... (Score:5, Funny)
Soon we will move to external video card RAIDs with their own AC units
Re:Damn... (Score:5, Funny)
They've been there, they've done that. [tweakers.net] It was not terribly popular though. :-)
(Could the way 3dfx used several chips working in parallel be considered "video card RAID"?)
Re:Damn... (Score:3, Informative)
Too much hype over having the "best" card? (Score:5, Informative)
Just pick whichever brand you like better and you'll feel better off letting go of that $500...
Re:Too much hype over having the "best" card? (Score:3, Insightful)
Sure, both nvidia's and ATI's latest cards will play all current games at great framerates, but once you start to pile on things like high resolution, anti-aliasing, anisotropic filtering... you need all the performance you can get. Even these newest cards probably won't be able to play FarCry perfectly at 1600x1200 16xAA 16xAF with full details...
More performance is never superfluous.
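To put some rough numbers on that: the cost of those quality settings multiplies out fast. Here's a back-of-the-envelope sketch (the sample counts are illustrative only; real AA implementations don't scale this naively, and AF adds its own cost on top):

```python
# Rough model: color samples a card must shade per second.
# Treats AA as a straight multiplier, which overstates real-world cost.
def samples_per_second(width, height, aa_samples, fps):
    return width * height * aa_samples * fps

modest = samples_per_second(1024, 768, 1, 60)    # no AA at 1024x768
maxed = samples_per_second(1600, 1200, 16, 60)   # 16xAA at 1600x1200

print(f"workload multiplier: {maxed / modest:.1f}x")
```

Under this naive model, maxing out resolution and AA alone multiplies the per-frame workload by roughly 39x, before anisotropic filtering even enters the picture.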
Re:Too much hype over having the "best" card? (Score:5, Interesting)
Re:Too much hype over having the "best" card? (Score:5, Funny)
The nVidia 6800 Ultra requires two dedicated molex power connectors, and it also requires a 480W power supply. More details. [anandtech.com] Now that's a lot of power.
Also, the cooling setup on the 6800 Ultra takes up a slot of its own, which means you lose a PCI port as well, although now that most of the features PCI had (such as sound and NICs) are integrated in the motherboards, it's not too big of a deal.
Lastly, nVidia is releasing a lower-powered 6800 GT which is approximately equivalent to the X800 Pro, and they just recently announced a 6850 Ultra which is basically an OEM-overclocked 6800 Ultra. That thing will probably take up 5 slots, have a built-in A/C unit, and have its own cold fusion reactor as well.
Comment removed (Score:4, Insightful)
Re:Too much hype over having the "best" card? (Score:3, Informative)
Re:Too much hype over having the "best" card? (Score:3, Informative)
It's worth noting that I did a clean install on my system when I switched from Nvidia to ATI.
One thing you absolutely DO NOT want is nvidia leftovers when you install ATI drivers. That may account for the problems you are having.
Certainly don't need to reboot in VGA mode or uninstall old drivers these days if everything is installed properly.
N.
Re:Too much hype over having the "best" card? (Score:3, Interesting)
200fps is an estimate based on the electrochemical reaction rate required to change the signal going into your brain. Beyond 200fps or so (depending on the person) you reach the point where the eye simply doesn't make changes fast enough to transmit different data to the brain.
The tests I have run with people run along the following lines:
At what point can a person no longer identify a letter placed in a single erroneous frame. (as an example you see a movie, one f
I really want to buy this card.... (Score:2, Insightful)
Ati: If you want to have my money, you better pull your thumbs out of your ass and write some Linux-drivers!
Or maybe I will buy this card, and hope it works well with the Generic Ati-drivers that ship with Xorg/Xfree...
Re:I really want to buy this card.... (Score:3, Insightful)
Huh? (Score:2)
I'm running 2.6.5 right now with Nvidia's drivers on my Debian system. I'm having no problems whatsoever. What kernel release are you talking about?
Re:Huh? (Score:2)
Re:I really want to buy this card.... (Score:5, Insightful)
I don't mean to troll, but every time there's a post about some new bleeding edge video card, there's always someone getting modded up to +5, insightful for saying he'd buy it if it weren't due to lack of driver support, and I'm left wondering what the hell for?
Re:I really want to buy this card.... (Score:5, Insightful)
Anyhow, the original poster is wrong and therefore this discussion is irrelevant.
Re:I really want to buy this card.... (Score:3, Informative)
Re:I really want to buy this card.... (Score:2)
Re:I really want to buy this card.... (Score:2)
The drivers Ati provides are nowhere near as good as the ones NVIDIA provides. And since my next system will have an Athlon 64 and the OS will be 64-bit Linux, I NEED 64-bit drivers! NVIDIA has them; does Ati? I have heard some vague rumours that they _might_ make 64-bit drivers available "sometime in the summer", but that's it. Until that happens, I either have to buy an NVIDIA card, or use some generic drivers that give me a
Re:I really want to buy this card.... (Score:4, Insightful)
Re:I really want to buy this card.... (Score:4, Informative)
ATI, on the other hand, was a complete nightmare the last time I installed their drivers on a Linux box for someone. I'm fairly proficient in Linux, and he was running Slackware which is the distro I run myself day in and out. It still took us a couple of hours of playing around to get the drivers working properly due to a combination of quirky behavior and EXTREMELY poor documentation. I wouldn't mind doing it all manually, as long as the documentation is clear and concise and helps you get things done in a reasonable amount of time.
Personally, I do keep a Windows box around for gaming, but the parts from this get hand-me-downed to the Linux machines as I upgrade. For that reason, Linux drivers are important to me, and I'll be buying nVidia next time I upgrade. I can deal with spending 5 minutes on a shell script and a reboot to upgrade my video card - I can't handle 2 hours to do the same thing with an ATI card.
Re:I really want to buy this card.... (Score:5, Informative)
Re:I really want to buy this card.... (Score:5, Insightful)
Seeing as how none of the other replies mentioned it, one reason is to do cutting-edge OpenGL development under Linux. There is significant interest in doing Linux game development using cross-platform toolkits of various types. One example is Garage Game's Torque engine [garagegames.com]. Write to that, and get Windows, Mac and Linux support with very little (if any) tweaking. IMO, Linux is the best and most cost-effective platform for game development.
This is why, once again, my next video card purchase will most likely be from NVIDIA. I'll get ATI if I manage a G5... ;-) (I wonder how soon the G5s will get these cards?)
Re:I really want to buy this card.... (Score:5, Informative)
Could the drivers be better? Oh yes. Are they up to nvidia's standard? No. But they ARE listening, and since the last update you can play winex games with hardware acceleration, so there's no problem there...
Re:I really want to buy this card.... (Score:4, Informative)
I bought a 9600 Pro thinking that whatever drivers ATI had would be 'good enough'. Well, they aren't. Not by a long shot. If I weren't so fundamentally opposed to separate power connectors for video cards, I might've traded it in for a nvidia months ago. Those drivers are the sole cause of instability in my system. If you're buying a card for Linux, buy Nvidia. Case closed.
Re:I really want to buy this card.... (Score:3, Insightful)
Re:I really want to buy this card.... (Score:2)
Not good enough... (Score:4, Funny)
Jason.
Re:Not good enough... (Score:2)
then ATI will release the ATI K-RADEOM XXX800 Pro Double Plus Infinity Plus One, and I'll buy it just to be l33t3r than you.
Re:Not good enough... (Score:2)
Other reviews (Score:5, Informative)
Re:Other reviews (Score:2, Informative)
Question (Score:5, Insightful)
Is there any point in getting one of these cards for any reason other than playing the latest games?
Re:Question (Score:4, Informative)
nVidia cards tend to have good OpenGL support and OpenGL is used by a number of "high end" CAD and rendering packages. These cards will work well for folks who don't want to spend the $1500 for the high end CAD cards which are almost the same thing (there are some differences but these will do well on a smaller budget, though $500 for a card is pretty pricey to me
ATI just has 2.0 versions of shaders (Score:3, Informative)
At my company, we had considered using hardware for the final rendering of some of the shots in our current visual effects movie, but the 2.0 shaders just didn't have the capability -- they really are suited only for games (not too surprising; that's where 99% of the market is). The lack of fully-functional floating-point buffers, the limitation on the size of the shader programs, the lack of texture mapping in the vertex shaders -- these are all devastating to the notion of doing high-quality hardware rendering.
All of these limitations, and more, were addressed in the new 3.0 shaders.
I am sure that ATi will support these features eventually, as games come to require them -- but right now you are really comparing apples and Porsches when you compare ATi's and Nvidia's latest offerings.
Thad Beier
Re:ATI just has 2.0 versions of shaders (Score:2)
I haven't seen it, but by all accounts, what ATi's managed to do with PS 2.0 in their Ruby demo makes PS 3.0's use seem rather superfluous. And we all know that within a couple months, we'll be seeing the X850 and X900, that probably will have PS 3.0 support.
If inclusion of PS 3.0 an as-of-yet unused and still far-in-the-future spec is the sole factor you're taking in to account in terms of "quality," I can see why you're let down, bu
Re:ATI just has 2.0 versions of shaders (Score:2, Insightful)
The real question for the gamer is how large the intersection is between the set of games that will (in the future) run on these cards at a playable speed and the set of games that will use this feature. It's not clear to me that this intersection would be large.
Case in po
Re:ATI just has 2.0 versions of shaders (Score:5, Informative)
That means that ATI has decided not to compete with NVidia on compatibility. On shader quality, the screen shots at Toms Hardware [tomshardware.com] suggest that it is NVidia that has chosen not to compete. Why would you care about a 3.0 shader language from a card that still doesn't give you correct output of 2.0 shaders?
Wordperfect scrolling test? (Score:5, Funny)
Cost-performance ratio (Score:5, Insightful)
Maybe I'll do it if no one else can be bothered.
Re:Cost-performance ratio (Score:5, Insightful)
That, and it would seem that each card has its respective wins in different disciplines anyways... Radeon = better in newer games (Far Cry, etc.) and in situations where you have a lot of options on, while nVidia tends to be better in older games, but isn't a slouch in any particular discipline either, so it would be hard to figure out what index you would want to use for this particular graph.
Re:Cost-performance ratio (Score:4, Interesting)
http://www.tomshardware.com/graphic/20031229/vga-ft
Silly question (Score:3, Insightful)
Make no mistake, I'll eventually buy one like these .. after it's well down the price curve, bugs fixed, drivers updated, in a couple years.
Big advantage for ATI (Score:5, Interesting)
nVidia still don't get it. (Score:2, Interesting)
Have a look through the feature sets between ATI, nVidia and DirectX9 - nVidia supports the barest of minimums to work with DirectX9 written games.
No wonder Carmack shunned nVidia
There has to be a time when they support the games, instead of just paying for a prissy ad at the start of a game.
Re:nVidia still don't get it. (Score:2)
Re:nVidia still don't get it. (Score:2)
Didn't he shun ATI for having crappy OpenGL? Furthermore, how do you explain this [bluesnews.com]?
Doesn't id use OpenGL, not the Direct3D crap, anyway?
Re:nVidia still don't get it. (Score:2)
Proper Linux drivers? (Score:5, Interesting)
I know that ATI has their little RPMs going, but the reason I switched to nVidia is the crap that went on with ATI and their lack of Linux support. And now they've finally released some drivers, but with no support for older cards, and no way to actually install them properly on a Debian system.
nVidia at least allows for distribution of their drivers [debian.org]
This is the only reason why I switched to nVidia. I don't see how anyone using Linux can support the bad support for Linux from ATI (as compared to nVidia, of course).
As to the card itself, well, I think nVidia and ATI was always close enough :) Sometimes competition works, and ATI & nVidia are prime examples of that.
PS. Please, don't troll me about the free drivers. I want/need real drivers, and not some partial implementation.
Re:Proper Linux drivers? (Score:3, Informative)
Wow ultra uber speed (Score:3, Insightful)
The fanboy following video cards is endlessly annoying. I own a Radeon 9800, and it was good value for the dollar all around, but quite frankly, the support sucks.
ATI relies on big benchmark numbers over real-world results; I guess that's what 'uber pc geeks' want. nVidia seems to cater to gamers by working with developers to make sure games USE all those fancy new functionalities of the GPU, e.g. nVidia's "The Way It's Meant to Be Played" program. ATI pays lip service to it with its "Get in the Game" program, but they don't provide the same support (like sample code for killer shader effects, etc.)
So we end up with TRON 2.0 having really cool glowing effects on nVidia, but flat and tacky looking on ATI. We have soft shadows in Splinter Cell for nVidia, blocky PSX-era crap for ATI.
Hell, I could go on for months listing all the anomalies in actual real-life games I've encountered. Texture corruptions in Tomb Raider: AOD, outright crashes in Halo.
For all the hype around FSAA and anisotropic filtering - just about EVERY GAME I've enabled them for has crashed hard. Unreal 2, Halo, XIII.
Oh, and the worst, the absolute worst, is frame drops to 5fps and worse in CounterStrike when there's smoke onscreen. I mean COME ON, I had a RivaTNT2 that played the game properly. There's no excuse for that, save a piss poor opengl implementation.
So I tried Will Rock, the game whose screenshots were on my 9800's box and which is a member of the "Get in the Game" program. This ought to SMOKE on an ATI card, right? Almost: awful-looking texture corruption in menus, stuttering in-game for no apparent reason (nothing on screen).
Missing proprietary nVidia features is fine, substitute your proprietary ATI features. Just make them stable and working.
I've used ATI forever, they used to be a cut above the other retail level cards. Now they've slipped hard.
This is a case where nVidia will slowly strangle the competition, because the competition sucks. I'd really like to see ATI turn around and focus on the gaming experience, not the mutual masturbation you see on rage3d.com (the unofficial "support" forum) - a bunch of kids comparing benchmarks and overclocks, with two or three frustrated folks chronically posting for advice on which mishmash of driver files will actually work with Counter-Strike.
Anyhow, hooray for leapfrogging nVidia in phony-baloney do-nothing benchmarks. Will this fabulous new technology actually work with games or is this just more MARKETING BULLSHIT for the likes of toms hardware and hardocp to spread?
Re:Wow ultra uber speed (Score:5, Insightful)
The fanboy following video cards is endlessly annoying.
Please. There are NO differences between the companies as far as "caring about gamers" is concerned. Both exist to make a profit. Period. Several people I know are big independent ATI developers. ATI provides them with code samples, driver updates, etc., gratis. Anything you say that generalizes one or the other of the companies makes you a "fanboy." It's no different than Ford vs. Chevy. Each has some advantages and some disadvantages. And they both have some rabid fan base that will make it their sole priority to bash the other. *yawn*
Also, I don't get the whole "hooray for leapfrogging nVidia in phony-baloney do-nothing benchmarks" when every single review I read included all the current DX9 games with commentary on stability and visual quality, as well as performance. I don't even think Anandtech showed a 3DMark03 score. If so, I didn't pay attention to it. I agree, games are all that matter. Fortunately, that's what was tested.
Hmm... (Score:5, Interesting)
As a 3D developer, one of the most exciting things to come about recently is Shader Model 3.0. It lets you achieve greater effects with fewer operations using some new developments. However, it requires 32-bit precision. Read more about it here [microsoft.com].
ATI has chosen to continue with its 24-bit precision architecture. While fine for most applications, some of the exciting new developments require this newer spec. I'm sure it will be interoperable, but all that speed may end up being wasted on certain operations.
I'm left wondering why I would buy a brand spankin' new card video card when it doesn't support the newest APIs all that well. Oh well, I guess I get to stick with nVidia...
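For the curious, the precision gap the thread keeps mentioning can be sketched numerically. This is a crude model only: it just rounds the mantissa, ignoring exponent range and denormals, and assumes FP24 carries roughly 16 explicit mantissa bits versus FP32's 23:

```python
import math

def quantize(x, mantissa_bits):
    # Round x to a float with the given number of explicit mantissa bits.
    # Crude model: ignores exponent range, denormals, and rounding modes.
    m, e = math.frexp(x)                  # x == m * 2**e, 0.5 <= |m| < 1
    scale = 2 ** (mantissa_bits + 1)
    return math.ldexp(round(m * scale) / scale, e)

x = math.pi
err_fp24 = abs(quantize(x, 16) - x) / x   # ~FP24, as on ATI's current parts
err_fp32 = abs(quantize(x, 23) - x) / x   # FP32, as SM 3.0 requires

print(f"relative error at 24-bit: {err_fp24:.2e}, at 32-bit: {err_fp32:.2e}")
```

The per-operation error is tiny either way, but it compounds over long shader programs, which is part of why SM 3.0 mandates the higher precision.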
Re:Hmm... (Score:3, Insightful)
Nothing on the horizon seems to make use of any of what you mentioned, so it'd be safe to buy either card and be totally happy.
There is no such thing as an upgrade that will keep you happy in 2 years if you need to see all the eye-candy. Even though the 6800 supports PS 3.0 and 32bit I highly doubt it'll hold a candle to the cards that are coming out
Off Topic Sponsored Links in that Article (Score:2)
Are Nvidia and ATI the only choice for Linux? (Score:5, Insightful)
|_|b3r |33t w00t! (Score:3, Funny)
I don't like the way this is headed (Score:2)
For two generations now, ATi has tended towards smaller, sleeker, more elegant designs, while nVidia's products keep getting larger, noisier, hotter, and more power-hungry. They're typically more expensive, to boot. Making the decision for which card to purchase right now is an absolute no-brainer.
On one hand, ATi's X800 draws little power, has superior image quality, doesn't take up multiple slots,
L33t nuu Video cardz... (Score:5, Interesting)
I'd love to see some program that does "reverse VRAM reclaiming" so those of us who don't need 128mb of video RAM power can get some of that ram back for compiling or something.
Okay... that WAS geeky.
Re:L33t nuu Video cardz... (Score:3, Funny)
Two weeks ago I went and disabled all the fancy GFX in the game. Much more enjoyable to play when there are no distracting effects in the game.
(have GF4-Ti4200)
Re:L33t nuu Video cardz... (Score:3, Funny)
Not worth the upgrade (Score:5, Interesting)
A good denominator is fpspb (frames per second per buck, a made-up value from Tom's Hardware). For the cash, you can squeeze a lot more out of a $200 Radeon 9800 Pro (especially with overclocking) than you can out of anything else right now. You're only talking a marginal difference in fps between this generation and the last at high (1600x1200) resolutions, and an almost non-existent difference at "normal" resolutions. The $200-300 price premium isn't worth those extra frames.
The X800 XT is not all that much faster (Score:2, Interesting)
Yay! (Score:2)
More Reviews (Score:5, Informative)
HardOCP [hardocp.com]
Ascully [ascully.com]
DriverHeaven [driverheaven.net]
TrustedReviews [trustedreviews.com]
K-Hardware [k-hardware.de]
Hardware Analysis [hardwareanalysis.com]
Hexus [hexus.net]
The Tech Report [techreport.com]
Beyond3D [beyond3d.com]
Neoseeker [neoseeker.com]
ExtremeTech [extremetech.com]
Gamers Depot [gamers-depot.com]
Lost Circuits [lostcircuits.com]
Firing Squad [firingsquad.com]
Tom's Hardware [tomshardware.com]
Bjorn3D [bjorn3d.com]
Hot Hardware [hothardware.com]
Question (Score:2)
Ok ATi... (Score:2)
Difference between gaming and workstation card? (Score:3, Interesting)
"XT" as the new "top-of-the-line" standard? (Score:5, Funny)
Maybe ATi will come out with these cards next.
Radeon X800 AT
Radeon X800 386
Radeon X800 486
And then they'll run into trademark problems with a certain other semiconductor manufacturer...
Cost per Frame comparison, Geforce & Radeon (Score:4, Informative)
*CPF = Cost per Frame
**Per Aquamark 3: 1024, P4 3.2, 1024MB CAS2, i875P
Radeon X800 XT
Cost: $499 (MSRP)
FPS: 57.96
CPF: $8.60
Radeon X800 Pro
Cost: $399 (MSRP)
FPS: 54.89
CPF: $7.26
Radeon 9800 XT
Cost: $396 (Pricewatch.com)
FPS: 47.9
CPF: $8.26
GeForce 6800 Ultra
Cost: $499 (MSRP)
FPS: 62.65
CPF: $7.96
GeForce 6800 GT
Cost: $399 (MSRP)
FPS: 61.3
CPF: $6.50
GeForce FX 5950 Ultra
Cost: $365 (Pricewatch.com)
FPS: 50.93
CPF: $7.16
Winner: GeForce 6800 GT
NOTE:
This is ignoring other factors that go into TCO such as power consumption (the Radeons use far less power and may not require a power supply upgrade)
This is based on the Aquamark 3 benchmarks at 1024x768 only. If you wish to gather the mean of the other benchmarks in the linked review to figure a more precise CPF, please reply.
This is intended to make you think about what you're getting when you pay the extra $100 for the top-of-the-line card.
If you were wondering, I'm an ATI fanboy and would personally buy the Radeon X800 Pro if I had $400 to blow.
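For anyone who wants to check the arithmetic or plug in street prices, CPF here is just cost divided by the Aquamark 3 frame rate. A quick sketch reproducing the table from the figures above (the parent's numbers look truncated rather than rounded to the cent, so the last digit may differ by one):

```python
# Recompute cost-per-frame from the prices and Aquamark 3 fps above.
cards = {
    "Radeon X800 XT":        (499, 57.96),
    "Radeon X800 Pro":       (399, 54.89),
    "Radeon 9800 XT":        (396, 47.9),
    "GeForce 6800 Ultra":    (499, 62.65),
    "GeForce 6800 GT":       (399, 61.3),
    "GeForce FX 5950 Ultra": (365, 50.93),
}

cpf = {name: cost / fps for name, (cost, fps) in cards.items()}
for name in sorted(cpf, key=cpf.get):
    print(f"{name:23s} ${cpf[name]:.2f} per frame")
print("Winner:", min(cpf, key=cpf.get))
```

Swapping MSRP for street prices (or averaging in other benchmarks) will shift the numbers, though at these particular figures the 6800 GT still comes out ahead.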
Re:it burrnsss us (Score:3, Funny)
Re:THG (Score:2)