
Futuremark Replies to Nvidia's Claims

Nathan writes "Tero Sarkkinen, Executive Vice President of Sales and Marketing at Futuremark, has commented on the claims by Nvidia that 3DMark2003 intentionally puts the GeForce FX in a bad light after Nvidia declined to join Futuremark's beta program. This issue looks like it will get worse before it gets better." ATI also appears to be guilty of tweaking its drivers to recognize 3DMark.
  • by Cannelbrae ( 157237 ) on Tuesday May 27, 2003 @12:35PM (#6048906)
    It's about the OEMs as much as, or more than, the consumer market. They watch the benchmarks closely -- and make decisions based on the results.

    This is where the money really is, and what is worth fighting for.
  • by oliverthered ( 187439 ) <oliverthered@hotmail. c o m> on Tuesday May 27, 2003 @12:36PM (#6048920) Journal
    "3DMark03 was developed strictly according to DirectX9 standard in very close cooperation with Microsoft and other BETA members. If hardware performs well 3DMark03, it performs well in all applications that use DirectX 9. Note that since 3DMark is designed to be an objective evaluation tool, it does _not_ include manufacturer-specific optimizations. This is why it is exceptionally well suitable for objective performance measurement. "

    Does this guy [slashdot.org] work for NVidia?
  • by cmburns69 ( 169686 ) on Tuesday May 27, 2003 @12:38PM (#6048946) Homepage Journal
    Back when Nvidia acquired 3dfx, they began to merge their development teams. The FX is the first card by Nvidia to be developed by engineers from both the Nvidia and 3dfx groups.

    Of course it will work better when you do it their way; it was 3dfx's strength in the beginning, and its downfall in the end.

    But I believe that their current development team has yet to hit its stride, and future offerings will see the trophy going back to Nvidia... but who knows! I'm no fortune teller.
  • Re:nVidia vs. ATI (Score:5, Interesting)

    by asdkrht ( 655186 ) on Tuesday May 27, 2003 @12:44PM (#6049017)
    From what I've heard, Nvidia totally replaced the shader programs with ones that they wrote. All ATI did was reorder some of the instructions in the shaders to "optimize" them. The optimized and the original shader programs were functionally equivalent, sort of like what happens when a compiler optimizes code. The same can't be said for what Nvidia did.
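    To make the distinction concrete, here is a minimal C sketch (entirely made up, not ATI's or Nvidia's actual shader code) of what an order-only optimization looks like: two independent operations are swapped, yet both versions return the same result. Wholesale replacement of the routine with different math would not have that property.

        #include <assert.h>
        #include <stdio.h>

        /* Hypothetical "original" routine, written the way an author might
           naturally order the instructions. */
        static float shade_original(float light, float albedo, float ambient)
        {
            float diffuse = light * albedo;    /* independent multiply #1 */
            float base    = ambient * albedo;  /* independent multiply #2 */
            return diffuse + base;
        }

        /* "Reordered" version: the two independent multiplies are issued in
           the opposite order, e.g. to schedule better on some pipeline.
           The arithmetic is unchanged, so the result is bit-identical. */
        static float shade_reordered(float light, float albedo, float ambient)
        {
            float base    = ambient * albedo;
            float diffuse = light * albedo;
            return diffuse + base;
        }

        int main(void)
        {
            float a = shade_original(0.8f, 0.5f, 0.1f);
            float b = shade_reordered(0.8f, 0.5f, 0.1f);
            assert(a == b);  /* same math, same answer */
            printf("%f == %f\n", a, b);
            return 0;
        }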
  • by rimcrazy ( 146022 ) on Tuesday May 27, 2003 @12:45PM (#6049021)
    I've worked in the PC industry more years than I care to think about. All graphics card vendors tweak their drivers and BIOS to make their cards look better. If people didn't put so much emphasis on benchmarks for buying decisions, there would not be much reason to tweak things, but the reality of the world is that they do.

    On a side note, my team and I many, many years ago designed what was at the time one of the fastest chipsets for the blindingly fast new 12 MHz 386 PC. We had discovered that the Norton SI program that everyone was using to benchmark PCs based most of its performance score on a small 64-byte (yes, that is not a typo: 64 BYTE) loop. We had considered putting a 64-byte cache in our memory controller chip, but our ethos won at the end of the day, as clearly what we had done would have been discovered and the benchmark rewritten. Had we done it, however, for our 15 minutes of fame our benchmark numbers would have been something crazy like 10x or 100x better than anything out there.
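    A rough illustration of why such a tiny timed loop is so easy to game (the C below is a made-up stand-in, not Norton SI's actual code): when the entire measured body fits in 64 bytes, a 64-byte cache in the memory controller would serve essentially all of its fetches, so the "system speed" score would mostly reflect that one trick.

        /* Hypothetical stand-in for a benchmark whose whole timed workload is
           one tiny loop.  On a 386-class machine the compiled body of a loop
           like this easily fits in 64 bytes, so a 64-byte cache would capture
           virtually all of its memory traffic. */
        volatile unsigned long counter;

        static void timed_loop(unsigned long iterations)
        {
            for (unsigned long i = 0; i < iterations; i++)
                counter++;  /* the entire measured workload */
        }

        int main(void)
        {
            timed_loop(1000000UL);
            return 0;
        }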

  • by jridley ( 9305 ) on Tuesday May 27, 2003 @12:49PM (#6049069)
    I remember video card companies cheating on benchmarks 10 or more years ago. This was when PCI was the latest thing on the block and was competing with VESA local bus. They wrote drivers specifically to detect when they were being called by the PC Magazine benchmark program, and they'd do things like just returning from every other call, since the program was just calling the same thing 10,000 times.
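    The kind of application-detection cheat being described can be sketched in a few lines of C (the function names and the executable string below are invented for illustration, not taken from any real driver): recognize the benchmark, then silently skip part of the work.

        #include <stdbool.h>
        #include <stdio.h>
        #include <string.h>

        /* Hypothetical check: drivers of the era matched on executable names
           or window titles; this string is just a placeholder. */
        static bool caller_is_benchmark(const char *exe_name)
        {
            return strcmp(exe_name, "PCBENCH.EXE") == 0;
        }

        /* Hypothetical driver entry point: if the caller looks like the
           benchmark, drop every other call instead of doing the real work. */
        static void driver_draw_call(const char *exe_name)
        {
            static unsigned call_count;

            if (caller_is_benchmark(exe_name) && (++call_count % 2 == 0))
                return;  /* pretend the work was done */

            printf("doing the real (expensive) drawing for %s\n", exe_name);
        }

        int main(void)
        {
            for (int i = 0; i < 4; i++)
                driver_draw_call("PCBENCH.EXE");  /* only half do real work */
            return 0;
        }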
  • by Anonymous Coward on Tuesday May 27, 2003 @01:03PM (#6049200)
    If I remember, Tom's had a good review of that...

    Ah yes... Tom's Hardware...
    The guys who compared a dual Opteron system with 2 GB of RAM with a dual Xeon system with only 512 MB of RAM.
    A great source for unbiased reviews and comparisons...
  • by Anonymous Coward on Tuesday May 27, 2003 @01:13PM (#6049287)
    I have no idea how memory-intensive the benchmarks were, since I can't actually see everything they ran, just what they said and the results.

    I'll be perfectly blunt though, I just plain don't trust Tom's Hardware.
    If it would have made no difference to run with 512 MB instead of 2 GB, then they should have done so; you want the test field to at least appear even.
  • by brunes69 ( 86786 ) <`gro.daetsriek' `ta' `todhsals'> on Tuesday May 27, 2003 @01:34PM (#6049484)

    See here [slashdot.org] for the original /. story describing the Quack/Quake 3 cheat ATI had a while back. MUCH worse than the current NVidia cheat, IMO.

    Regardless of whether you think it is worse, the point is that BOTH companies have cheated in benchmarks, so there is NO point in "glorifying" ATI at NVidia's expense. They are just as bad, if not worse (and their drivers blow ass).

  • by rimcrazy ( 146022 ) on Tuesday May 27, 2003 @01:39PM (#6049542)
    I think not, but it has been so long. I was design manager at VLSI Technology. We made a 12 MHz chipset, which was our first; I thought it was for the 386, but it may have been for the 286. Our chipset was used by IBM for their "reentry" back into ISA bus PCs after their big Micro Channel effort resulted in them losing half of their market share by dropping the ISA box for a while.

    The cache comment is correct, regardless of the CPU it was going to work with. We reverse engineered the Norton SI benchmark and found out what it was doing. We were tempted but did not go forward, as it would have been a useless feature. The point is, any silicon vendor out there hawking their wares knows what the benchmarks are doing and will do "whatever it takes" either to explain away the bad marks or to figure out how to make their silicon look better. It becomes a gray area when you start tweaking just to tweak, as opposed to adding anything of real value.

  • by Call Me Black Cloud ( 616282 ) on Tuesday May 27, 2003 @02:07PM (#6049827)

    I like Nvidia, but I'm disappointed that the reply sounds like a justification. From Derek Perez (dperez@nvidia.com):

    Since NVIDIA is not part of the Futuremark beta program (a program which costs hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer.

    We don't know what they did, but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom 3 shows that the GeForce FX 5900 is by far the fastest graphics card on the market today.

    dp
  • by cgenman ( 325138 ) on Tuesday May 27, 2003 @02:56PM (#6050260) Homepage
    The issue with low FPS is a game problem nine times out of ten. The faster the video card, the less game development houses work to streamline and improve their framerate.

    Word.

    Lots of development houses are focusing exclusively on the high-end graphics card market and are forgetting that their LOD rendering engine *could* display the characters as a Virtua Fighter-esque 50-polygon shaded mess. I was personally blown away when the Half-Life 2 development team decided that the onboard Intel i810 would be a low-end target: these people really know that not all gamers bought 9700 Pros. As an 8500/128 owner, I appreciate the added bonuses of a video card, but quite frankly the difference in image quality between what is available in Warcraft 3 (which my card runs quite well) and what is being offered by Doom 3 (which my card probably won't run) is negligible. Look at screenshots of the upcoming Age of Empires and compare them to screenshots of the three-year-old Empire Earth. Graphics are fine, and have been for quite some time. Let's focus on something else, like gameplay, shall we?

    Reviewers won't run a game on the minimum system specs and then complain about the graphics. So why not put that LOD system to good use and drop down to lower-poly models for those of us with older machines? Can't write a script to shave off vertices? Artistic-vision snobbery?
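    The fallback being asked for is not exotic. Here is a hedged C sketch (the type, tier numbers, and distance cutoffs are invented for illustration, not from any real engine) of picking a lower-polygon model based on a crude hardware tier and the object's distance from the camera:

        #include <stdio.h>

        /* Hypothetical level-of-detail choice: cheaper meshes for weaker
           hardware or distant objects.  Cutoffs are illustrative only. */
        typedef enum { MESH_HIGH, MESH_MEDIUM, MESH_LOW } mesh_lod;

        static mesh_lod pick_lod(int gpu_tier /* 0 = integrated .. 2 = high end */,
                                 float distance)
        {
            if (gpu_tier == 0 || distance > 50.0f)
                return MESH_LOW;     /* Virtua Fighter-ish polygon counts */
            if (gpu_tier == 1 || distance > 20.0f)
                return MESH_MEDIUM;
            return MESH_HIGH;
        }

        int main(void)
        {
            printf("%d\n", pick_lod(0, 5.0f));   /* integrated GPU  -> MESH_LOW  */
            printf("%d\n", pick_lod(2, 60.0f));  /* far away        -> MESH_LOW  */
            printf("%d\n", pick_lod(2, 5.0f));   /* close, fast GPU -> MESH_HIGH */
            return 0;
        }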
  • Re:nVidia vs. ATI (Score:2, Interesting)

    by dnoyeb ( 547705 ) on Tuesday May 27, 2003 @03:20PM (#6050440) Homepage Journal
    You are absolutely correct, especially considering that ATI is part of the beta program. This means that other cards will not benefit from the 3DMark03 optimizations that ATI made. The correct procedure for ATI would have been to tell Futuremark that they needed to optimize at a certain point.

    It's kind of like ATI finding a performance bug and working around it, but not telling anyone else about it. It's more opportunistic cheating; it's not blatant. Hehe, I just don't feel as bad about the ATI cheat at this point. But we should wait till Futuremark finishes auditing the ATI drivers...
  • by WndrBr3d ( 219963 ) * on Tuesday May 27, 2003 @04:22PM (#6051122) Homepage Journal
    I mean, seriously now, folks. I think benchmarking these days is nothing compared to what it used to mean. When the Voodoo3s rolled out to tackle the TNT2s, there was a considerable gap between the two. When nVidia introduced the GeForce GPU, the whole game world was changed and the 3DMark score of a GeForce box was 3x that of one without.

    I think benchmarking these days is almost trivial. The scoring difference between a 9800 Pro and a 5900 Ultra will, in the end, only mean about a 15 fps difference in your gaming experience. Honestly now, does playing UT2003 at 60 fps vs. 45 fps really pump your nuts? If so, then you can go ahead and moderate this as flamebait.

    And as far as 'optimizing code' for benchmarks goes, it's industry-wide. Intel releases custom compilers just so programs will run faster on P4 chips! Is that cheating? Not really. The programs still run the same, just better on the hardware they chose. Same with nVidia in this situation; the picture still LOOKED the same (unless you enabled the free camera view). So who cares what happens in the background?

    My point is, people need to make decisions on their own when it comes to purchasing hardware. It all boils down to personal choice. Some people are hardcore ATI fans no matter what the benchmarks say, others are nVidia fans until the bitter end.

    Personally, I choose nVidia because of hardware compatibility issues in the past with several chipsets I used to have; now it's just habit. People who are on the fence and really don't have their feet in the water when it comes to hardware might be sold by the gold PCB.

    In the end, well, it boils down to this. You know what they say about opinions ;-)
