NVidia Accused of Inflating Benchmarks

Junky191 writes "With the NVidia GeForce FX 5900 recently released, this new high-end card seems to beat out ATI's 9800 pro, yet things are not as they appear. NVidia seems to be cheating on their drivers, inflating benchmark scores by cutting corners and causing scenes to be rendered improperly. Check out the ExtremeTech test results (especially their screenshots of garbled frames)."
  • Re:Hmmmm (Score:5, Informative)

    by drzhivago ( 310144 ) on Thursday May 15, 2003 @09:15AM (#5963285)
    Do you remember how a year or so ago ATI released a driver set that reduced image quality in Quake 3 to increase frame rate?

    Here [tech-report.com] is a link about it in case you forgot or didn't know.

    It just goes to show that both companies play that game, and neither comes out looking good.
  • by krog ( 25663 ) on Thursday May 15, 2003 @09:21AM (#5963335) Homepage
    The point is that visual quality *was* fine... within the benchmark's prescribed path. "Off the rail" is when problems started occurring.

    This is why all software and hardware should be open-source.
  • Re:whatever (Score:1, Informative)

    by Anonymous Coward on Thursday May 15, 2003 @09:23AM (#5963350)
    The issue is that the driver problems don't occur when you run the benchmark normally. They had a special version of the benchmark that let them stop and fly around, which then revealed the graphical errors. Of course, all of this is explained if you actually read the article.
  • Re:whatever (Score:5, Informative)

    by Pulzar ( 81031 ) on Thursday May 15, 2003 @09:26AM (#5963375)
    Instead of only looking at the pictures, read the whole article before making decisions on whether it's a driver "fuckup" or an intentional optimization.

    The short of it is that nVidia added hard-coded clipping of the scenes for everything that the benchmark doesn't show in its normal run, and which gets exposed as soon as you move the camera away from its regular path.

    It's a step in the direction of recording an mpeg of what the benchmark is supposed to show and then playing it back at 200 fps.
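    A minimal sketch of the distinction being described (hypothetical, not actual driver code; objects are 1-D points and "visibility" is a crude ahead-of-camera test) is that legitimate culling recomputes visibility from wherever the camera actually is, while the alleged cheat clips against planes fixed in advance for the benchmark's known rail:

    ```python
    # Hypothetical illustration of path-dependent vs. hard-coded culling.

    def frustum_cull(objects, cam_x, cam_dir):
        """Legitimate: visibility recomputed from the current camera.
        An object is 'visible' if it lies ahead of the camera."""
        return [x for x in objects if (x - cam_x) * cam_dir > 0]

    # Clip plane chosen in advance for the benchmark's known camera
    # (assumed here: camera sits at x=0 facing +x for the whole rail).
    HARDCODED_MIN_X = 0.0

    def hardcoded_clip(objects):
        """Cheat: clip against a plane fixed at driver-build time.
        Correct only while the camera stays on the expected rail."""
        return [x for x in objects if x > HARDCODED_MIN_X]

    objects = [-5.0, 3.0, 10.0]

    # On the rail, both methods agree, so rendered frames look fine:
    assert frustum_cull(objects, 0.0, +1) == hardcoded_clip(objects) == [3.0, 10.0]

    # Off the rail (camera turned around), the cheat still culls the
    # same half of the scene, producing a garbled frame:
    assert frustum_cull(objects, 0.0, -1) == [-5.0]
    assert hardcoded_clip(objects) == [3.0, 10.0]
    ```

    The off-rail case is exactly what the developer version of the benchmark exposes: the precomputed answer is wrong the moment the camera leaves the path it was computed for.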

  • Not a big deal. (Score:5, Informative)

    by grub ( 11606 ) <slashdot@grub.net> on Thursday May 15, 2003 @09:32AM (#5963416) Homepage Journal

    One has to take all benchmarks with a grain of salt if they come from a party with financial interests in the product. Win 2K server outperforms Linux, a Mac is 2x the speed of the fastest Wintel box, my daddy can beat up your daddy..

    It's not surprising but it is somewhat disappointing.
  • by mr_luc ( 413048 ) on Thursday May 15, 2003 @09:38AM (#5963467)
    Targeting performance for benchmarks is one thing.

    These drivers were written with specific limits built in that make the drivers COMPLETELY irrelevant to ordinary gaming, as ET demonstrates by moving the camera just a bit from the designated path.

    This would be like chopping the top off of a car to make it lighter, to reduce the distance it takes for it to decelerate in a brake test. Or compensating for a crappy time off the starting line by removing the back half of the car and bolting a couple of RATO rockets where the back seats used to be. Or loading the car up with nitro, or something. You think Car and Driver Magazine wouldn't say something?

    These drivers make the card completely unsuitable for ordinary gaming. They aren't 'more powerful' -- they are a completely altered version of the drivers that are ONLY good at improving one particular set of benchmarks.
  • by Pulzar ( 81031 ) on Thursday May 15, 2003 @09:39AM (#5963470)
    Please try reading the article in more detail.

    The developer version is not a pre-release, it's the same version with some extra features that let you debug things, change scenes, etc.

    As soon as you move the camera away from its usual benchmark path, you can see that nVidia hard-coded clipping of the benchmark scenes to make it do less work than it would need to in a real game, where you don't know where the camera will be in advance.

    As I mentioned in another post, it's a step in the direction of recording an mpeg of the benchmark and playing it at a high fps rate.
  • by Anonymous Coward on Thursday May 15, 2003 @09:44AM (#5963512)
    Overclockers.com [overclockers.com] has a very well thought out editorial on this issue titled "Trust is Earned". It is well worth the read.
  • Re:Hmmmm (Score:4, Informative)

    by D3 ( 31029 ) <daviddhenning@gma i l .com> on Thursday May 15, 2003 @09:53AM (#5963600) Journal
    Actually ATI has done this as far back as the Xpert@Play series from 1997/98. They wrote drivers that gave great benchmarks with the leading benchmark tests. Then people started using game demos as benchmarks and the cards showed their true colors. This is why places like Tom's Hardware use a variety of games to make it hard for manufacturers to cheat.
  • Re:Random Rail (Score:2, Informative)

    by stratjakt ( 596332 ) on Thursday May 15, 2003 @10:03AM (#5963703) Journal
    But then the benchmark would be useless, unless you repeated it a few dozen times and averaged the results.

    By sheer luck, card A could get a 'rail' that drags it along a plain brick wall with nothing fancy to render, and card B could go through the heart of some mega explosion with fragments and fire and smoke and all that. Card A would get 4000000 fps, card B gets 20.

    It would be fine to take them off the rails to "keep em honest", but you need to run both cards in the exact same situation for your test to have any sort of merit at all.
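    The variance argument above can be sketched numerically (all numbers invented for illustration; a "rail" is just a random scene-complexity draw): a single randomized run is luck-dominated, while the average over many runs is a stable figure you can fairly compare across cards.

    ```python
    # Hypothetical model: one random "rail" hits a scene whose rendering
    # load varies from a plain brick wall to a mega explosion.
    import random

    def run_benchmark(card_speed, rng):
        """One randomized run: fps depends heavily on the luck of the rail."""
        scene_load = rng.uniform(0.1, 10.0)   # brick wall ... mega explosion
        return card_speed / scene_load        # frames per second

    rng = random.Random(42)

    single = run_benchmark(1000.0, rng)       # one run: anywhere from 100 to 10000 fps
    runs = [run_benchmark(1000.0, rng) for _ in range(1000)]
    average = sum(runs) / len(runs)           # many runs: converges to a stable score
    ```

    Either way, both cards still need the exact same sequence of rails (the same RNG seed, in this sketch) for a head-to-head comparison to mean anything.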
  • by AndrewHowe ( 60826 ) on Thursday May 15, 2003 @10:19AM (#5963816)
    Well you clearly didn't, because it explicitly says that they don't know how much the errors affected the result. You must feel pretty stupid right now.
  • Re:Enough of that... (Score:2, Informative)

    by brer_rabbit ( 195413 ) on Thursday May 15, 2003 @11:07AM (#5964261) Journal
    A quack3 joke mod'd interesting? Funny, yes, but not interesting. Maybe the mods don't remember the ATI quack3 thing...

  • NVidia not cheating (Score:4, Informative)

    by linux_warp ( 187395 ) on Thursday May 15, 2003 @11:14AM (#5964347) Homepage
    hardocp.com [hardocp.com] on the front page has a great writeup on this.

    But basically, ExtremeTech is just a little bit mad because they were excluded from the doom3 benchmarks. Since nvidia refused to pay the tens of thousands of dollars to be a member of the 3dmark03 board, they have absolutely no access to the software used to create this bug.

    Here is the full excerpt from hardocp.com:

    3DMark Invalid?
    Two days after Extremetech was not given the opportunity to benchmark DOOM3, they come out swinging with heavy charges of NVIDIA intentionally inflating benchmark scores in 3DMark03. What is interesting here is that Extremetech uses tools not at NVIDIA's disposal to uncover the reason behind the score inflations. These tools are not "given" to NVIDIA anymore as they will not pay the tens of thousands of dollars required to be on the "beta program" for 3DMark "membership".

    nVidia believes that the GeForceFX 5900 Ultra is trying to do intelligent culling and clipping to reduce its rendering workload, but that the code may be performing some incorrect operations. Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues.

    I am pretty sure you will see many uninformed sites jumping on the news reporting bandwagon today with "NVIDIA Cheating" headlines. Give me a moment to hit this from a different angle.

    First off it is heavily rumored that Extremetech is very upset with NVIDIA at the moment as they were excluded from the DOOM3 benchmarks on Monday and that a bit of angst might have precipitated the article at ET, as I was told about their research a while ago. They have made this statement:

    We believe nVidia may be unfairly reducing the benchmark workload to increase its score on 3DMark2003. nVidia, as we've stated above, is attributing what we found to a bug in their driver.

    Finding a driver bug is one thing, but concluding motive is another.

    Conversely, our own Brent Justice found a NVIDIA driver bug last week using our UT2K3 benchmark that slanted the scores heavily towards ATI. Are we to conclude that NVIDIA was unfairly increasing the workload to decrease its UT2K3 score? I have a feeling that Et has some motives of their own that might make a good story.

    Please don't misunderstand me. Et has done some good work here. I am not in a position to conclude motive in their actions, but one thing is for sure.

    3DMark03 scores generated by the game demos are far from valid in our opinion. Our reviewers have now been instructed to not use any of the 3DMark03 game demos in card evaluations, as those are the section of the test that would be focused on for optimizations. I think this just goes a bit further showing how worthless the 3DMark bulk score really is.

    The first thing that came to mind when I heard about this, was to wonder if NVIDIA was not doing it on purpose to invalidate the 3DMark03 scores by showing how easily they could be manipulated.

    Thanks for reading our thoughts; I wanted to share with you a bit different angle than all those guys that will be sharing with you their in-depth "NVIDIA CHEATING" posts. While our thoughts on this will surely upset some of you, especially the fanATIics, I hope that it will at least let you look at a clouded issue from a different perspective.

    Further on the topic of benchmarks: we addressed them earlier this year, which you might find to be an interesting read.

    We have also shared the following documentation with ATI and NVIDIA while working with both of them to hopefully start getting better and more in-game benchmarking tools. Please feel free to take the documentation below and use it as you see fit. If you need a Word document, please drop me a mail and let me know what you are trying to do.

    Benchmarking Benefiting Gamers

    Objective: To gain reliable benchmarking and image quality tools
  • Short Description. (Score:3, Informative)

    by BrookHarty ( 9119 ) on Thursday May 15, 2003 @12:43PM (#5965281) Journal
    Reading the posts, I don't think everyone is understanding the point of the rail test.

    Using the rail test, Nvidia excluded almost all non-visible data. This shows nvidia tweaked its drivers to only render data seen on the rail test, which would only happen if you tweak your drivers for the benchmarks. (aka the cheat)

    I'd like it better if benchmarks used average FPS in a game, and you went and PLAYED the game and watched for yourself.

    Try 1024x768/1280x1024/1600x1200 with all AA/AF modes. Also stop using 3ghz P4's for the benchmarks, use a mix of 1ghz/2ghz/3ghz AMD/Intel boxes so we can know if the hardware is worth the upgrade.
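    The full test matrix being suggested is small enough to enumerate (a rough sketch; the CPU mix is taken from the comment, the AA/AF mode names are illustrative placeholders):

    ```python
    # Sketch of the proposed benchmark matrix: every CPU x resolution x
    # AA/AF combination gets its own in-game run per card.
    from itertools import product

    CPUS = ["1GHz AMD", "1GHz Intel", "2GHz AMD",
            "2GHz Intel", "3GHz AMD", "3GHz Intel"]
    RESOLUTIONS = ["1024x768", "1280x1024", "1600x1200"]
    AA_AF_MODES = ["no AA / no AF", "2xAA / 4xAF", "4xAA / 8xAF"]  # illustrative

    test_matrix = list(product(CPUS, RESOLUTIONS, AA_AF_MODES))
    # 6 CPUs x 3 resolutions x 3 modes = 54 in-game runs per card
    ```

    The point of the CPU spread is that it separates GPU-bound results from CPU-bound ones, which a single top-end P4 cannot do.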

  • Re:whatever (Score:3, Informative)

    by MrBlue VT ( 245806 ) on Thursday May 15, 2003 @12:56PM (#5965413) Homepage
    My opinion (being a 3D programmer) is that the situation is most likely a bug in the 3DMark program itself that then compounds a driver bug in the nVidia drivers. Since the driver itself does not have access to the program's data structures, it would be impossible for the driver to throw away undrawn objects before the point where it would normally do so when clipping. Just because these "leet" game playerz at ExtremeTech think they know anything about graphics programming, doesn't mean they actually do.
