Futuremark Replies to Nvidia's Claims
Nathan writes "Tero Sarkkinen, Executive Vice President of Sales and Marketing at Futuremark, has commented on Nvidia's claims that 3DMark2003 intentionally puts the GeForce FX in a bad light, after Nvidia declined to join Futuremark's beta program. This issue looks like it will get worse before it gets better." ATI also seems to be guilty of tweaking their drivers to recognize 3DMark.
The real reason this is important. (Score:5, Interesting)
This is where the money really is, and what is worth fighting for.
If it weren't for standards ...... (Score:2, Interesting)
Does this guy [slashdot.org] work for NVidia?
State of nvidia development team (Score:5, Interesting)
Of course it will work better when you do it their way; it was 3dfx's strength in the beginning, and its downfall in the end.
But I believe that their current development team has yet to hit its stride, and future offerings will see the trophy going back to nvidia...
Re:nVidia vs. ATI (Score:5, Interesting)
Get over it....just look at it how YOU will use it (Score:5, Interesting)
On a side note, my team and I many, many years ago designed what was at the time one of the fastest chipsets for the blindingly fast new 12 MHz 386 PC. We had discovered that the Norton SI program everyone was using to benchmark PCs based most of its performance score on a small 64-byte (yes, that is not a typo: 64 BYTE) loop. We considered putting a 64-byte cache in our memory controller chip, but our ethos won at the end of the day, as clearly what we had done would have been discovered and the benchmark rewritten. Had we done it, however, for our 15 minutes of fame our benchmark numbers would have been something crazy like 10x or 100x better than anything out there.
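The effect described above can be illustrated with a toy direct-mapped cache model. The sizes and access patterns here are illustrative assumptions, not details of the actual chipset or of Norton SI; the point is just that a hot loop whose working set fits entirely in a tiny cache hits nearly 100% of the time, while a realistic workload sweeping a larger buffer does not:

```python
# Toy model: why a 64-byte cache would ace a benchmark whose hot loop
# touches only 64 bytes. All sizes are illustrative assumptions.

def hit_rate(addresses, cache_bytes=64, line_bytes=8):
    """Simulate a tiny direct-mapped cache; return the fraction of hits."""
    n_lines = cache_bytes // line_bytes
    tags = [None] * n_lines              # one tag slot per cache line
    hits = 0
    for addr in addresses:
        line = addr // line_bytes        # which memory line this byte is in
        idx = line % n_lines             # direct-mapped: line -> fixed slot
        if tags[idx] == line:
            hits += 1
        else:
            tags[idx] = line             # miss: fill the slot
    return hits / len(addresses)

# A benchmark loop cycling through the same 64 bytes over and over:
tiny_loop = [a % 64 for a in range(10_000)]
# A workload striding through a 64 KB buffer instead:
big_sweep = [(a * 8) % 65_536 for a in range(10_000)]

print(f"64-byte loop hit rate: {hit_rate(tiny_loop):.2%}")   # near 100%
print(f"64 KB sweep hit rate:  {hit_rate(big_sweep):.2%}")   # near 0%
```

A benchmark built on such a loop would reward exactly the kind of 64-byte cache the poster describes, which is the whole problem with tiny synthetic workloads.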
The more things change... (Score:3, Interesting)
Re:but this doesnt seem fishy? (Score:1, Interesting)
Ah yes... Tom's Hardware...
The guys who compared a dual Opteron system with 2 GB of RAM with a dual Xeon system with only 512 MB of RAM.
A great source for unbiased reviews and comparisons...
Re:but this doesnt seem fishy? (Score:1, Interesting)
I'll be perfectly blunt though, I just plain don't trust Tom's Hardware.
If it would have made no difference had it been 512 MB instead of 2 GB, then they should have run it with that; you want to make the playing field at least appear even.
ATI Cheated worse in the past (Score:3, Interesting)
See here [slashdot.org] for the original /. story describing the Quack / Quake 3 cheat ATI pulled a while back. MUCH worse than the current NVidia cheat, IMO.
Regardless of whether you think it is worse, the point is that BOTH companies have cheated in benchmarks, so there is NO point in "glorifying" ATI at NVidia's expense. They are just as bad, if not worse (and their drivers blow ass).
Re:Get over it....just look at it how YOU will use (Score:2, Interesting)
The cache comment is correct, regardless of the CPU it was going to work with. We reverse-engineered the Norton SI benchmark and found out what it was doing. We were tempted but did not go forward, as it would have been a useless feature. The point is, any silicon vendor out there hawking their wares knows what the benchmarks are doing and will do "whatever it takes" to either explain away the bad marks or figure out how to make their silicon look better. It becomes a gray area when you start tweaking just to tweak, as opposed to adding anything of real value.
I wrote to nvidia...here is their reply (Score:3, Interesting)
I like nvidia but I'm disappointed that the reply sounds like a justification. From Derek Perez (dperez@nvidia.com):
Since NVIDIA is not part of the FutureMark beta program (a program which costs hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer.
We don't know what they did, but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom 3 shows that the GeForce FX 5900 is by far the fastest graphics card on the market today.
dp
Re:Quake III at 300 FPS (Score:3, Interesting)
Word.
Lots of development houses are focusing exclusively on the high-high end graphics card market and are forgetting that their LOD rendering engine *could* display the characters as a Virtua Fighter-esque 50-polygon shaded mess. I was personally blown away when the Half-Life 2 development team decided that the onboard i810 would be a low-end target: these people really know that not all gamers bought 9700 Pros. As an 8500/128 owner, I appreciate the added bonuses of a videocard, but quite frankly the difference in image quality between what is available in Warcraft 3 (which my card runs quite well) and what is being offered by Doom 3 (which my card probably won't run) is negligible. Look at screenshots of the upcoming Age of Empires, and compare them to screenshots of the 3-year-old Empire Earth. Graphics are fine, and have been so for quite some time. Let's focus on something else, like gameplay, shall we?
Reviewers won't run a game on the minimum system specs and then complain about the graphics. Why not put that LOD system to good use and drop down to lower-poly models for those of us with older machines? Can't write a script to shave off vertices? Artistic vision snobbery?
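The "script to shave off vertices" asked for above is not far-fetched. A naive vertex-clustering decimator can be sketched in a few lines; the mesh format (vertex list plus triangle index list) and the grid cell size here are illustrative assumptions, not any particular engine's representation:

```python
# Naive vertex-clustering decimation: snap vertices to a coarse grid,
# merge vertices that land in the same cell, drop collapsed triangles.
# Mesh format and cell size are assumptions for illustration only.

def decimate(vertices, triangles, cell=1.0):
    remap = {}            # old vertex index -> new vertex index
    new_vertices = []
    cells = {}            # grid cell -> new vertex index
    for i, (x, y, z) in enumerate(vertices):
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in cells:
            cells[key] = len(new_vertices)
            new_vertices.append((x, y, z))
        remap[i] = cells[key]
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if a != b and b != c and a != c:   # skip degenerate triangles
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles

# A unit quad (two triangles) plus a sliver vertex near one corner:
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0), (0.1, 0, 0)]
tris = [(0, 1, 2), (0, 2, 3), (0, 4, 1)]
v2, t2 = decimate(verts, tris, cell=0.5)
print(len(verts), "->", len(v2), "vertices;", len(tris), "->", len(t2), "triangles")
```

Real engines use smarter metrics (edge collapse with error quadrics rather than grid snapping), but even this crude pass shows that generating lower-poly models automatically is a solved problem.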
Re:nVidia vs. ATI (Score:2, Interesting)
It's kind of like ATI finding a performance bug and working around it, but not telling anyone else about it. It's more opportunistic cheating; it's not blatant. Hehe, I just don't feel as bad about the ATI cheat at this point. But we should wait till Futuremark finishes auditing the ATI drivers...
Does it -REALLY- matter? (Score:4, Interesting)
I think benchmarking these days is almost trivial. The scoring difference between a 9800 Pro and a 5900 Ultra will, in the end, only mean about a 15 fps difference in your gaming experience. Honestly now, does playing UT2003 at 60 fps vs. 45 fps really pump your nuts? If so, then you can go ahead and moderate this as flamebait.
And as far as 'optimizing code' for benchmarks goes, it's industry-wide. Intel releases custom compilers just so programs will run faster on P4 chips! Is that cheating? Not really. The programs still run the same, just better on the hardware they chose. Same with nVidia in this situation: the picture still LOOKED the same (unless you enabled the free viewing). So who cares what happens in the background?
My point is, people need to make decisions on their own when it comes to purchasing hardware. It all boils down to personal choice. Some people are hardcore ATI fans no matter what the benchmarks say, others are nVidia fans until the bitter end.
Personally, I choose nVidia because of hardware compatibility issues in the past with several chipsets I used to have; now it's just habitual. People who are on the fence and really don't have their feet in the water when it comes to hardware might be sold by the gold PCB.
In the end, well, it boils down to this. You know what they say about opinions