Futuremark Replies to Nvidia's Claims 317
Nathan writes "Tero Sarkkinen, Executive Vice President of Sales and Marketing at Futuremark, has commented on the claims by Nvidia that 3DMark2003 intentionally puts the GeForce FX in a bad light, after Nvidia declined to become a member of Futuremark's beta program. This issue looks like it will get worse before it gets better." ATI also seems to be guilty of tweaking their drivers to recognize 3DMark.
I don't care how fast it is... (Score:4, Insightful)
You're backwards (Score:4, Informative)
Nothing will even use the new kit to its fullest for that long.
I'm rapidly coming to the conclusion that the way to go with video cards is to buy one a year old. It's much cheaper, and typically handles all current and near future games perfectly well. The new gizmos, and speed boosts, on these cards rarely provide worthwhile bang for your buck these days.
Use the money you save to buy a faster processor, more RAM, a RAID array or something else that provides a useful improvement in performance outside of the theoretical. Or if you're buying/upgrading card + monitor together, get an extra couple of inches of screen real estate or go for a nice flat panel. The difference in price really is of that order, yet the difference in ability is irrelevant for almost all real applications.
nVidia vs. ATI (Score:4, Funny)
ATI: "Well, this shader program isn't optimally coded - here is a more optimally coded shader that does the exact same thing but more quickly."
nVidia: "Well, you caught us, but we have to cheat because you have it in for us!"
ATI: "Well, you caught us, and although we were doing the exact same thing (only faster), we will remove that code ASAP."
Re:nVidia vs. ATI (Score:5, Funny)
Re:nVidia vs. ATI (Score:2, Funny)
Wouldn't that be:
SCO: Hey, we have a patent on SOMETHING you did, pay us money!
Re:nVidia vs. ATI (Score:5, Interesting)
Re:nVidia vs. ATI (Score:5, Insightful)
ATI recognized the 3dmark executable and special cased for it. Which is misleading and wrong. The performance is enhanced for 3DMark and 3DMark alone.
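For context, executable-name detection is trivial to implement at the driver level. Below is a minimal, purely hypothetical sketch (not ATI's or NVIDIA's actual code) of the mechanism being described, using the standard Win32 call for finding out which executable loaded the driver DLL:

```cpp
// Hypothetical sketch of application detection in a user-mode driver DLL.
#include <windows.h>
#include <algorithm>
#include <cctype>
#include <string>

static bool HostLooksLike3DMark()
{
    char path[MAX_PATH] = {};
    // A null module handle means "the .exe that started this process".
    GetModuleFileNameA(nullptr, path, MAX_PATH);

    std::string exe(path);
    std::transform(exe.begin(), exe.end(), exe.begin(),
                   [](unsigned char c) { return static_cast<char>(std::tolower(c)); });

    // Matching on the file name is exactly why renaming the benchmark
    // executable makes this kind of "optimization" vanish.
    return exe.find("3dmark03.exe") != std::string::npos;
}
```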
Re:nVidia vs. ATI (Score:4, Insightful)
Re:nVidia vs. ATI (Score:2, Interesting)
It's kind of like ATI finding a performance bug and working around it, but not telling anyone else about it. It's more opportunistic cheating. It's not blatant. hehe, I just don't feel as bad about th
Re:nVidia vs. ATI (Score:2, Insightful)
Hey, on the upside (Score:5, Funny)
The real reason this is important. (Score:5, Interesting)
This is where the money really is, and what is worth fighting for.
Re:The real reason this is important. (Score:4, Informative)
Benchmarks that differ by a couple of percent depending on which test is run are not going to make a big difference in the overall decision process. If they made decisions based on benchmarks then ATI would have closed its doors many years ago, since until very recently they had been consistently outclassed performance-wise by their competitors for several years. However, ATI has done VERY well in the OEM market during this time not due to better performance, but due to the factors I listed.
Re:The real reason this is important. (Score:2, Insightful)
Re:The real reason this is important. (Score:2)
You have to expect it (Score:3, Informative)
It's quite simple really (Score:4, Insightful)
Point is, they can come out of this wearing the white hat, because they were the first to be such good guys about the issue.
The fact is, even with all Nvidia's optimizations in place, their high-end card will just barely edge out a 9800 Pro without optimizations. Add to this the fact that ATI, 3dmark and the community will hound them and discount Nvidia's optimizations until they are removed, and you've got an all-out win for ATI.
Remember folks: everyone cheats. Fools take things too far and get caught. ATI has played the fool before, Nvidia plays it now; that is the game.
ATi wasn't so bad (Score:5, Funny)
Re:ATi wasn't so bad (Score:2, Insightful)
Re:ATi wasn't so bad (Score:2)
I guess they learned a bit from ATI and their quack.exe debacle.
Re:ATi wasn't so bad (Score:2)
Not entirely true, as ATI and nVidia both work closely with big-name game studios to make sure that optimizations such as these are in the game. Obviously the benchmarks didn't use the optimizations they asked for, so they took it into their own hands. Sneaky, yes, but it is reflective of performance in real games (at least big-name ones -snicker-)
Re:ATi wasn't so bad (Score:2)
Any modifications are wrong. If you cheat a little, you will cheat a lot. I cannot imagine anyone at ATI agreeing to ONLY a 1.9% cheat. That is only asking to be caught with no benefit from the cheat. I am sure we will fi
If it weren't for standards ...... (Score:2, Interesting)
Does this guy [slashdot.org] work for NVidia?
Actually it's a pretty poor DX9 benchmark. (Score:5, Informative)
They do a good job of dissecting the benchmark, and I'd have to agree that as a DX9 benchmark it fails.
Whatever, it's still just a synthetic mark and nothing more.
Re:If it weren't for standards ...... (Score:4, Informative)
We need to look at the new shader features offered by DirectX9. These are:
- Pixel and Vertex shaders 2.0 (supported by ATI R3xx line and GeForceFX)
- extended Pixel and Vertex shaders 2.0 (supported only by GeForceFX)
- Pixel and Vertex shaders 3.0 (no support until R400/NV40)
Now let's look at the features which are used by 3DMark03:
- Game 1: no shaders at all, only static T&L
- Game 2: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
- Game 3: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
- Game 4: vertex shader 2.0 and pixel shader 1.4+2.0
This means that:
-DirectX9 offers three new different shaders.
-Three of four 3DMark03 demos don't use new DirectX9 shaders at all
-Three of four 3DMark03 demos use Pixel Shader 1.4 which was introduced with DirectX8.1 and isn't natively supported by NVIDIA cards
-Only one set of new DirectX9 shaders is partially used in one 3DMark03 demo
Thus 3DMark03 shouldn't be called a "DirectX9" benchmark. The quote "If hardware performs well in 3DMark03, it performs well in all applications that use DirectX 9" should be changed to "If hardware performs well in 3DMark03, it performs well in all applications that use Pixel Shader 1.4".
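Incidentally, the shader level a card exposes is something anyone can query directly. A minimal sketch, assuming the DirectX 9 SDK headers (the helper function name is mine):

```cpp
// Query the pixel shader version a card's driver reports via Direct3D 9 caps.
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

bool SupportsPixelShader(DWORD major, DWORD minor)
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DCAPS9 caps = {};
    bool ok = SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                           D3DDEVTYPE_HAL, &caps)) &&
              caps.PixelShaderVersion >= D3DPS_VERSION(major, minor);
    d3d->Release();
    return ok;
}

// e.g. compare SupportsPixelShader(1, 4) against SupportsPixelShader(2, 0).
// A DX9-class part reports at least ps_2_0; how efficiently it runs ps_1_4
// internally is another question, which is the crux of the argument above.
```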
Re:If it weren't for standards ...... (Score:4, Insightful)
No, but they use shaders which are generally only supported on DX9 cards and a few older ATI cards. Just because you have a PS2.0 card doesn't mean you have to use PS2.0 if PS1.4 can do the same thing: why deliberately make more work for yourself by not supporting older cards?
"Three of four 3DMark03 demos use Pixel Shader 1.4 which was introduced with DirectX8.1 and isn't natively supported by NVIDIA cards"
Support for PS1.4 is a requirement of DX9, so if the GF FX is a DX9 card then it supports PS1.4, and your claim is therefore bogus. If it doesn't support PS1.4, then it's not a real DX9 card.
MOD PARENT UP! (Score:2)
Re:If it weren't for standards ...... (Score:4, Insightful)
So this is how it should look, properly:
- Game 1: no shaders at all, only static T&L (DX7-class effects, given comparatively little weighting in overall score)
- Game 2: vertex shader 1.1 and pixel shader 1.4 (natively supported by GFFX, ATI Radeon 8500 and above)
- Game 3: vertex shader 1.1 and pixel shader 1.4 (natively supported by GFFX, ATI Radeon 8500 and above)
- Game 4: vertex shader 2.0 and pixel shader 1.4+2.0 (DX9 cards only, Radeon 9x00 and GFFX)
Nvidia's lack of support for PS1.4 is their own design choice, and now they have to live with it. The GF4 was released after DX8.1 came out, which contained the PS1.4 spec, but they chose not to support it. The ATI Radeon 8500 and above have no problem with this because they supported DX8.1 from the get-go, but nvidia did not change and continued their 8.0 support. As was previously mentioned in the article, nvidia was participating in the developer beta until Dec 2002, well into the development period for 3dm03 and a month after they paper-launched the GFFX, so they knew what was going on with the benchmark for a long time beforehand and didn't change their stance for a while. Presumably, as a beta member up until Dec 2002, if they didn't like the extensive use of PS 1.4 they could've said something earlier.
The key to understanding 3dm03 is its goal as a forward-looking benchmark. Both DX8 and DX9 games are currently in development, and many DX7 games are still in existence (remember, HL2 doesn't require anything above a DX6 card), so in this respect 3DM03 is still fair in its test design.
Re:If it weren't for standards ...... (Score:5, Insightful)
Re:If it weren't for standards ...... (Score:2)
Membership in the beta-program costs about 5000 dollars. That's peanuts for companies like Ati and NVIDIA.
open source (Score:5, Insightful)
Re:They should be sued to open their drivers. (Score:2)
Yep, that nVidia, always contributing to Microsoft's monopoly with their well made Linux drivers. Dash them. Dash them straight to heck!
Evaluating the evaluation. (Score:5, Insightful)
And, since one of the main reasons people will buy this is to play flashy and pretty games, ignoring the performance in those games is ridiculous.
Re:Evaluating the evaluation. (Score:2)
Which is one of the reasons most people would look at publications that provide multiple types of benchmarks including performance in various popular games or game engines.
State of nvidia development team (Score:5, Interesting)
Of course it will work better when you do it their way; it was 3dfx's strength in the beginning, and its downfall in the end.
But I believe that their current development team has yet to hit its stride, and future offerings will see the trophy going back to nvidia...
driver tweaking (Score:5, Informative)
I bought a Radeon 9700 Pro. The driver issues almost make it not worth the increased FPS over my Ti4400.
The refresh rate problem in XP is annoying as hell. ATI handles it even worse than NVidia: you set your "maximum" refresh rate and your "highest" resolution, and the driver assumes that all lower resolutions can only handle that refresh rate.
There's no way to tell your ATI card, "My monitor can do 1280x1024 @ 85hz, but 1600x1200 @ only 75hz." You either get 75hz in everything if you want 1600x1200, or you get 85hz up to 1280x1024, and have to avoid 1600x1200 use lest ye risk getting "frequency over range".
NV handles it better with the driver, allowing you to set maximum refresh rates for every resolution individually.
These refresh rate tweaking programs don't help either, since half the games out there choke when you use them. PC 3d is in a bit of a sorry state right now, and I'm tired of it.
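For the curious, those refresh-forcing utilities are essentially just walking the mode list the driver exposes and applying the best match for a given resolution. A rough sketch using stock Win32 calls (how individual games react to the forced mode is, as noted, another matter):

```cpp
// Enumerate the display modes the driver reports and force the highest
// refresh rate available for one specific resolution.
#include <windows.h>

bool ForceBestRefresh(DWORD width, DWORD height)
{
    DEVMODE best = {}; best.dmSize = sizeof(best);
    DEVMODE mode = {}; mode.dmSize = sizeof(mode);

    for (DWORD i = 0; EnumDisplaySettings(nullptr, i, &mode); ++i) {
        if (mode.dmPelsWidth == width && mode.dmPelsHeight == height &&
            mode.dmDisplayFrequency > best.dmDisplayFrequency) {
            best = mode;  // remember the highest refresh the driver exposes
        }
    }
    if (best.dmDisplayFrequency == 0)
        return false;  // the driver reports no such mode at all

    best.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;
    return ChangeDisplaySettings(&best, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```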
Re:driver tweaking (Score:3, Informative)
I have a 9700 and don't have ANY driver problems; what sort of issues are you having?
Re:driver tweaking (Score:5, Informative)
Besides, this is more of a Windows quirk than a driver thing, as MS requires the driver to behave like this to pass its WHQL tests.
Re:No problems here...PEBKAM (Score:2)
Re:driver tweaking (Score:5, Insightful)
Video cards have a simple job when it comes to resolution and refresh rate: When using resolution X, use the best refresh rate Y, and if I have to tell it what that is, so be it. They can't do this.
A lot of this is due to games and their poor detection of capabilities, and lack of effort to try for the best refresh rate. However, it's hard to pin it all on them. Games generally don't have trouble detecting if you have EAX capability, or detecting how many axes are on your joystick, or whether you have a third button on your mouse. Sure, I've seen problems in these areas too, but the video card situation feels like someone just invented VGA yesterday and the video card manufacturers are struggling to make it all work.
I'm using the Cat 3.4 drivers. I can set the refresh separately too, but a few times both the ATI driver tabs and the XP display properties reported that I was in one refresh rate, but my monitor OSD said differently. Inexcusable. It was due to that "maximum capability" setting, and as a result it didn't mind lying to me as long as it avoided going over the maximum. Glad it was able to "protect" me.
But that's not the half of it. I set the refresh rate, and when entering a game, it changes, usually back to 60hz. When entering games, the resolution changes a lot, and it seems completely random what it ends up on.
Other problems I'm having include that mode switching in general takes three times as long as the NV card. Switching back to Windows from games results in a very long black screen, and until Cat 3.4 came along, I couldn't switch out of CS/HL at all without crashing the entire OS.
Let me give you an example of my typical day. I set the display capability to 1280x1024 @ 85hz because I want 85hz in CS. In CS, I accidentally had the mode set for 1600x1200. With ATI3.3, it would crash. With ATI3.4, it would actually draw a 1600x1200 screen in a 1280x1024 window. Yep, it was cut off, with part of the screen literally extending off the monitor into the void.
I change the capability to 1600x1200 @ 75hz and play a while. I quit and fire up BF1942, which due to CPU constraints, runs better at 1024x768. But, I'm at 75hz, because I have no way to tell the card that while it only supports 1600x1200 @ 75hz, it does 85hz in every other mode. I have to change the capability to 1280x1024 @ 85hz. BF1942 runs.
I run another game at 1600x1200. Unlike CS, where it drew off the screen, this one would simply blackscreen as a result of trying to go into 1600x1200 @ 85hz (since 85hz is my "maximum" refresh rate). I reboot, and the first few times it happened, I looked for game patches before realizing that this stupid ATI driver was the cause.
The constant mode switches between games take several seconds, and perform an odd "screen wiping" effect that reeks of cheesy hardware. The NV switches modes smooth as butter.
I'm scared as hell to hook this thing into my TV. It might try to pump 2048x1024 @ 100hz at it and cause an explosion.
Re:driver tweaking (Score:2)
BF1942 has a major problem with these applications. It seems to force the card into 60hz regardless of anything. Removing this mode using one of these programs thus results in an unstoppable force meeting an immovable object - the game wants X, the card allows Y, and they do not come to an agreement.
While these pr
Re:driver tweaking (Score:2)
Sure you can... this is Slashdot.
What a mess! (Score:5, Informative)
Now they have painted themselves into a corner and how this will turn out is anyone's guess.
*DX8.1 has PS 1.4 which is actually much closer (AFAIK) to PS 2.0 than PS 1.3 (DX8).
Re:What a mess! (Score:3, Informative)
other people have mentioned this, but take
Preposterous! (Score:5, Funny)
Once Again A Call For Open Source Benchmarks (Score:4, Insightful)
The alternative is what we have now: hand-waving voodoo. Not only do we have to take the vendor's word that they aren't monkeying around with the driver to match execution of the benchmark, but now we have to question where the allegiance of the benchmark makers lies.
Re:Once Again A Call For Open Source Benchmarks (Score:5, Insightful)
Better to have open source drivers so we can inspect the driver for cheating/optimisations.
In fact, open source them all if the hard numbers are that important!
Somebody still has to make the standard (Score:2)
Well, that's all well and good, but what does it accomplish? How do we decide who is allowed to work on the standard, when virtually everyone with sufficient skills and clout will have an angle? Do we let hardware developers do it? No, that's like having the fox guard the proverbial henhouse. Game developers? Maybe, although they're tainted too - for one, because they partner with hardware makers, and second, there are instances where they might want to
Re:Once Again A Call For Open Source Benchmarks (Score:2)
nvidia - ati cheat difference (Score:5, Informative)
http://www.extremetech.com/print_article/0,3998
It is cheating on both sides. (Score:3, Informative)
But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines are often optimized-for, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.
That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a goo
Get over it....just look at it how YOU will use it (Score:5, Interesting)
On a side note, my team and I many, many years ago designed what was at the time one of the fastest chipsets for the blinding new 12 MHz 386 PC. We had discovered that the Norton SI program everyone was using to benchmark PCs based most of its performance rating on a small 64-byte (yes, that is not a typo, 64 BYTE) loop. We had considered putting a 64-byte cache in our memory controller chip, but our ethos won at the end of the day, as clearly what we had done would have been discovered and the benchmark rewritten. Had we done it, however, for our 15 minutes of fame our benchmark numbers would have been something crazy like 10x or 100x better than anything out there.
Re: (Score:2)
Re:Get over it....just look at it how YOU will use (Score:2, Interesting)
The cache comment is correct, regardless of the CPU it was going to work with. We reverse-engineered the Norton SI ben
What exactly are kid gloves? (Score:2, Informative)
Get the answer at Straight Dope [straightdope.com]
Just because the Register says so? (Score:3, Informative)
It's a real shame that The Register obscured the truth here with an article that attacks ATI for conservatively removing optimizations while giving the real miscreant a free pass. ATI should leave their optimizations in, IMHO, but maybe you disagree because their mathematically equivalent optimization is not general enough; it's a close call, but they don't deserve the distorted treatment given in The Register.
Re:Just because the Register says so? (Score:2)
You're losing sight of what 3DMark03 is trying to achieve here; the objective of a benchmark is to provide two or more systems with the same input, and therefore the same workload, and see which system performs more favora
Re:Just because the Register says so? (Score:2)
If ATI had triggered on the shader name, application name or modified the results of the shader to be functionally different, I'd be right with you calling it a cheat, but they didn't.
Re:Big deal, ATI cheated WAY worse before - Quack. (Score:2)
Re:Big deal, ATI cheated WAY worse before - Quack. (Score:2)
All IHVs provide per-game "hints" to the driver of how to work for a given game, as well as "hints" of how to work with an unknown game. But this is usually onl
Confused (Score:3, Insightful)
I'm confused about what this means. Is the 1.9% difference in ATI performance between Game Test 4 with correct and modified names, or between the current driver and an older version?
Most people here seem to think it's the latter, and I'd agree that they did nothing wrong if that's the case. But it's not obvious to me that they're not accused of the same thing as NVIDIA.
The more things change... (Score:3, Interesting)
Quake III at 300 FPS (Score:5, Insightful)
The issue with low FPS is a game problem 9 out of 10 times. The faster the video card, the less the game development houses work to streamline and improve their framerate.
Re:Quake III at 300 FPS (Score:3, Interesting)
Word.
Lots of development houses are focusing exclusively on the high-high-end graphics card market and are forgetting that their LOD rendering engine *could* display the characters as a Virtua Fighter-esque 50-shaded-polygon mess. I was personally blown away when the Half-Life 2 development team decided that the onboard i810 would be
Cheating??? (Score:3, Insightful)
Nvidia is cheating and acting like a child, er, large corporation...but that isn't at all what Ati is doing.
Re:Cheating??? (Score:3, Insightful)
Re:Cheating??? (Score:2)
Still, I own a Radeon 9800 and I'm pissed off at ATI: there are still bugs in their driver (Splinter Cell, and IL-2 Forgotten Battles) and they used some developer's time to optimise for 3DMark instead of debugging the driver!!
but this doesnt seem fishy? (Score:3, Insightful)
I've read 5-6 reviews of the FX 5900 and everyone seems to think it's great, and rightly gives Nvidia the 3d crown (especially concerning Doom ]|[).
If you read the interview, it's even brought up that the 5900 seems to do just fine in all other benchmarks; only Futuremark seems to give it a hard time, and I'm not buying that crap about Doom 3 benchmarks not being readily available.
If I remember, Tom's had a good review of that....
Re:but this doesnt seem fishy? (Score:2, Insightful)
Re:but this doesnt seem fishy? (Score:2)
Does anyone think it's coincidence (Score:5, Funny)
to quake3.exe removes those pesky leaves, revealing her subtle nature, and that renaming it to 3dmark2003.exe removes the leaves and her wings? Is the inside joke that they leave "certain things out" of quake3 and 3dmark? Does the government know of the existence of aliens and wormhole portals to other worlds?
Re:Does anyone think it's coincidence (Score:3, Informative)
Coincidence? (Score:4, Insightful)
What upsets me is not that you lied to me, but that from now on I can no longer believe you. -- Nietzsche
Classic.
This is a good thing (Score:3, Insightful)
Think of it this way: when's the last time you saw PC World roast a product that truly deserved it? How many review sites gave WinMe a thumbs up when it's widely viewed in the industry as MS's worst OS to date? We (the public) simply aren't being served if the test companies are cooperating with the companies they're testing. Look, if a testing company, review site or whatever other lab doesn't occasionally come out and just say "this sucks" then you know they aren't credible. There's too much out there that sucks, and too few reviewers willing to let the public know before they waste their money.
It's the same reasoning that dictates why Consumer Reports will buy their cars anonymously from dealers through third parties instead of getting "special" delivery directly from the manufacturer. What we should really see from the behaviour we're observing so far is an impetus to develop an open source benchmark application. By doing this we would ensure that the results can't be bought, just as has become common practice in so many other industries.
I'm shocked. (Score:2)
SHOCKED! to find that there is optimization going on here.
(Alphonse enters.)
Your SPECmarks, sir.
Thank you.
Teach the subject, not the test (Score:5, Informative)
For a driver benchmark to be useful at all, it needs to be kept absolutely secret from the chip manufacturers before the test, and once it is used and revealed, that benchmark needs to be retired. The next generation of testing should be designed to concentrate on the new features that the graphics card developers are expected to put in their next generation of cards, which will be used in the next generation of games.
In short, the best benchmark will always be based on "that sure-to-hit game that's just about to come out."
Take a step back and look at the big picture. (Score:2)
In Doom ]|[, the most graphically advanced game out there currently, NVDA smokes ATI, on test equipment for which NVDA provided only the card.
Am I going to "play" 3DMark, or am I going to play DOOM 3? For all of you who will be playing 3DMark more than Doom, go ahead and get the ATI card. I'll make my decision based upon the stellar DOOM 3 performance.
Re:Take a step back and look at the big picture. (Score:2)
By that time newer, faster cards will have been released. If you intend to buy a new card now then look at what games you currently play as the market will certainly change when DOOM 3 hits the shelves.
So what? Who cares? (Score:2, Insightful)
All this proves is that a benchmark is a highly isolated incident of observable performance.
For example, most charts I see rate the P4 as "faster than the Athlon" at the same clock rate. Yet when I benchmarked my bignum math code I found that the Athlon completely kicked the P4's ass
[K7]
http://iahu.ca:8080/ltm_log/k7/index.html
[P4]
http://iahu.ca:8080/ltm_log/p4/index.html
Does this single test prove the athlon
Simple enough solution to cheating (Score:2)
It would be easy enough for ALL of the testing labs to simply rename the benchmark and game application EXE to something random before starting the tests as a matter of course. If they state this fact up front for all to see, it would make special casing like this extinct overnight.
It wouldn't prevent cheating. Data profiling could p
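As a concrete illustration of the renaming idea, here is the sort of wrapper a review lab could script before each run (paths and names are hypothetical examples). As noted above, it defeats name matching but not data profiling:

```cpp
// Copy the benchmark under a throwaway random name, run it, and wait.
#include <windows.h>
#include <cstdio>
#include <cstdlib>
#include <ctime>

int main()
{
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    char randomized[MAX_PATH];
    std::sprintf(randomized, "C:\\bench\\run_%08x.exe",
                 static_cast<unsigned>(std::rand()));

    // Copy the original benchmark executable under the throwaway name.
    if (!CopyFileA("C:\\bench\\3DMark03.exe", randomized, FALSE))
        return 1;

    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = {};
    if (!CreateProcessA(randomized, nullptr, nullptr, nullptr, FALSE,
                        0, nullptr, "C:\\bench", &si, &pi))
        return 1;

    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
```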
Cheating can damage the industry as a whole (Score:2)
In this day and age, top-of-the-line video cards have more than 100 million transistors, and have become increasingly "intelligent" in sorting and rearranging data and instructions and choosing algorithms to get the best performance out of a given engine. Some engines perform better than others under different circumstances, making benchmarks even more subjective than ever.
ATI Cheated worse in the past (Score:3, Interesting)
See here [slashdot.org] for the original /. story describing the Quack / Quake 3 cheat ATI had a while back. MUCH worse than the current NVidia cheat IMO.
Regardless of whether you think it is worse, the point is that BOTH companies have cheated in benchmarks, so there is NO point in "glorifying" ATI at NVidia's expense. They are just as bad if not worse (and their drivers blow ass).
Solution: Open Source. (Score:2)
Name Change Redux (Score:2)
I wrote to nvidia...here is their reply (Score:3, Interesting)
I like nvidia but I'm disappointed that the reply sounds like a justification. From Derek Perez (dperez@nvidia.com):
Since NVIDIA is not part of the FutureMark beta program (a program which costs hundreds of thousands of dollars to participate in) we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer.
We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom3 shows that The GeForce FX 5900 is by far the fastest graphics on the market today.
dp
Who cares about benchmarks anyway? (Score:3, Insightful)
I don't believe claims anyway; ATI says their card is faster than NVidia's. NVidia says theirs is faster than ATI's. Bleh....
Live demos? (Score:2)
A better thing would be to have a demo with a simple multi-room house or whatnot. Various lighting effects, a few characters perhaps, window panes, maybe the outside is blocked by lava or something so that yo
Tired of hearing this. (Score:3, Insightful)
And how have people figured this out time and time again? Oh, they renamed the executable...
Why does the benchmarking software not rename the executable to some-254-character-long-file-name-random-string.exe?
I'm sure there is some way that Nvidia and ATI could get around even this, but what are they gonna do, make a 75MB driver in retaliation for what the benchmark companies do?
Are there any *VISUAL* differences? (Score:2)
Obviously any professional game engine is going to have optimization profiles for the major cards, so I don't see this as a big deal.
Futuremark ATI Version 1.0 (Score:2)
Their comment that the benchmark ran great on the released version but showed errors on the unreleased private developer version is like saying "Well, it may get 120fps in QuakeIII, but if you run it on the super-secret unreleased QuakeIII beta 2 it has errors."
If I have an NVIDIA, S3, Matrox, etc. and they did not pay for the FutureMark beta program then
Does it -REALLY- matter? (Score:4, Interesting)
I think benchmarking these days is almost trivial. The scoring difference between a 9800 Pro and a 5900 Ultra, in the end, will only mean about a 15fps difference in your gaming experience. Honestly now, does playing UT2003 at 60fps vs. 45fps really pump your nuts? If so, then you can go ahead and moderate this as flamebait.
And as far as 'optimizing code' for benchmarks goes, it's industry-wide. Intel releases custom compilers just so programs will run faster on P4 chips! Is that cheating? Not really. The programs still run the same, just better on the hardware they chose. Same with nVidia in this situation: the picture still LOOKED the same (unless you enabled the free viewing). So who cares what happens in the background?
My point is, people need to make decisions on their own when it comes to purchasing hardware. It all boils down to personal choice. Some people are hardcore ATI fans no matter what the benchmarks say, others are nVidia fans until the bitter end.
Personally, I choose nVidia because of hardware compatibility issues in the past with several chipsets I used to have; now it's just habitual. People who are on the fence and really don't have their feet in the water when it comes to hardware might be sold by the gold PCB.
In the end, well, it boils down to this. You know what they say about opinions
Driver strategies (Score:5, Insightful)
Rewriting a shader so that it does exactly the same thing, but in a more efficient way, is generally acceptable compiler optimization, but there is a range of defensibility from completely generic instruction scheduling that helps almost everyone, to exact shader comparisons that only help one specific application. Full shader comparisons are morally grungy, but not deeply evil.
The significant issue that clouds current ATI / Nvidia comparisons is fragment shader precision. Nvidia can work at 12 bit integer, 16 bit float, and 32 bit float. ATI works only at 24 bit float. There isn't actually a mode where they can be exactly compared. DX9 and ARB_fragment_program assume 32 bit float operation, and ATI just converts everything to 24 bit. For just about any given set of operations, the Nvidia card operating at 16 bit float will be faster than the ATI, while the Nvidia operating at 32 bit float will be slower. When DOOM runs the NV30 specific fragment shader, it is faster than the ATI, while if they both run the ARB2 shader, the ATI is faster.
When the output goes to a normal 32 bit framebuffer, as all current tests do, it is possible for Nvidia to analyze data flow from textures, constants, and attributes, and change many 32 bit operations to 16 or even 12 bit operations with absolutely no loss of quality or functionality. This is completely acceptable, and will benefit all applications, but will almost certainly induce hard to find bugs in the shader compiler. You can really go overboard with this -- if you wanted every last possible precision savings, you would need to examine texture dimensions and track vertex buffer data ranges for each shader binding. That would be a really poor architectural decision, but benchmark pressure pushes vendors to such lengths if they avoid outright cheating. If really aggressive compiler optimizations are implemented, I hope they include a hint or pragma for "debug mode" that skips all the optimizations.
John Carmack
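As a back-of-the-envelope illustration of the precision point above (my own toy example, not anything from an actual driver): when the final result lands in an 8-bit-per-channel framebuffer, trimming intermediate mantissa precision often changes nothing observable. The "reduced precision" here is simulated by rounding a float's 23-bit mantissa to 10 bits, roughly what fp16 keeps; fp16's narrower exponent range is ignored.

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

static float RoundMantissaTo10Bits(float x)
{
    std::uint32_t bits;
    std::memcpy(&bits, &x, sizeof(bits));
    bits += 0x1000u;          // round to nearest at the 10-bit mantissa boundary
    bits &= 0xFFFFE000u;      // keep sign, exponent, and the top 10 mantissa bits
    std::memcpy(&x, &bits, sizeof(bits));
    return x;
}

static int ToFramebuffer8(float c)   // quantize a [0,1] colour to an 8-bit channel
{
    if (c < 0.0f) c = 0.0f;
    if (c > 1.0f) c = 1.0f;
    return static_cast<int>(c * 255.0f + 0.5f);
}

int main()
{
    // A toy "fragment shader": modulate a texel by a diffuse term, add ambient.
    const float texel = 0.73f, ndotl = 0.41f, ambient = 0.15f;

    float full    = texel * ndotl + ambient;
    float reduced = RoundMantissaTo10Bits(
                        RoundMantissaTo10Bits(texel * ndotl) + ambient);

    std::printf("32-bit path -> %d, reduced-precision path -> %d\n",
                ToFramebuffer8(full), ToFramebuffer8(reduced));
    // Both paths land on the same 8-bit value here; long dependent shader
    // chains or wide-range intermediate math are where they start to diverge.
    return 0;
}
```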
JC: WHEN WILL YOU FINISH DUKE NUKEM FOREVER? (Score:3, Funny)
Re:Driver strategies (Score:4, Informative)
Re:Driver strategies (Score:3, Insightful)
Not to nitpick, but the ARB specs do specify a minimum precision for ARB_fragment_program. From the latest GL documentation:
RESOLVED: We've decided not to include precision queries.
Implementations are expected to meet or exceed the precision guidelines set forth in the core GL spec, section 2.1.1, p. 6, as amended by this extension.
To summarize section 2.1.1, the maximum representabl
No it is DX 9 (Score:3, Informative)
ATIs "optimized code" (Score:3, Insightful)
This isn't about Nvidia vs. ATI or about defending Nvidia; what NV did by clipping planes was even worse. It's just that there is no justification for cheating on the benchmarks, even if the graphical results are the same. The benchmarks should be an indicator of how the card will perform in real-world scenarios (i.e. games), and any driver tweaks that are benchmark-specific but don't help performance otherwise are just cheating and make-believe.
Re:why tweak for the better? (Score:2)
Re:Same old story.... (Score:5, Insightful)
I don't even bother with 3DMark scores when I read reviews; I skip straight to the tested games and get a look at the FPS at various levels of detail.
Then it's easy to realize that card A gives 201 FPS, card B gives 199 FPS, and the answer is: buy whichever is cheaper.
This gives me much more useful information that relates to what I want the card for - playing games.
Re:NVIDIA DID NOT CHEAT (Score:3, Funny)