FutureMark Confirms nVidia's Benchmark Cheating

jlouderb writes "As first reported by ExtremeTech, Futuremark has confirmed that nVidia is cheating on its 3DMark2003 benchmark through eight driver optimizations. The 3D graphics performance war just keeps getting more and more interesting!" See our previous story.
  • by Anonymous Coward
    Test with the applications/games people really use, and they can't optimize for them without, well, optimizing for them! If they want to make Quake III faster, great.
    • by mskfisher ( 22425 ) * on Friday May 23, 2003 @12:35PM (#6025302) Homepage Journal
      Wrong - as they point out in the article, these "optimizations" are usually reductions in quality. They don't just improve performance.
      • I meant to say, they can add "optimizations" for other applications as well. Quake III is a notorious target for "optimizations".
        As the report said, the drivers can even detect when you're really playing the game as opposed to running a benchmark, and adjust visual quality appropriately.
        Nothing is safe from these... though your original point of diversifying could help.
      • a mirror (Score:3, Informative)

        by abhisarda ( 638576 )
        Mirror [lycos.co.uk]. Slashdot into oblivion.
      • Worse than that! (Score:5, Informative)

        by siskbc ( 598067 ) on Friday May 23, 2003 @01:10PM (#6025649) Homepage
        Wrong - as they point out in the article, these "optimizations" are usually reductions in quality. They don't just improve performance.

        According to the article, that's only half the story. I could almost accept it if they were "optimizing" in the sense that, in certain situations, they slightly reduced image quality for a significant gain. That's kind of sketchy, as the card isn't then doing what it's claiming, but you could argue, perhaps, that the tradeoff is worth it. And if this activity were optional, it might be a benefit.

        What they're doing here is different, and much worse. They're actually detecting what program is running - whether it is 3DMark or not. Effectively, the driver disobeys 3DMark, and only 3DMark, when it issues certain commands that would reduce throughput. That has no purpose but to deceive.

        So, not only are these not optimizations in that they don't really improve performance, they're not optimizations in that they don't even take effect when you run a program not called 3DMark.

        Quite frankly, I think this could be considered false advertising and nVidia should get in deep shit for this. This is the worst kind of cheating, and quite frankly, this could be what puts nVidia down the Voodoo path. I don't know whether I'll ever buy another of their cards.

        • Re:Worse than that! (Score:4, Informative)

          by p7 ( 245321 ) on Friday May 23, 2003 @03:11PM (#6026768)
          I guess you won't buy an ATI either, since they did the degrade-image-quality-under-quake.exe cheat. Remember the guys who renamed quake.exe to quack.exe? ATI's framerates dropped, and in screenshots you could see where the image quality was reduced.
          • Re:Worse than that! (Score:4, Informative)

            by ergo98 ( 9391 ) on Friday May 23, 2003 @05:20PM (#6027735) Homepage Journal
            This is the specific point of the parent and grandparent post: ATI's action was questionable and bordering on fraudulent, but they were "optimizing" a game that people actually play, with a specific branch for quake that altered settings accordingly: 99.9+% of the times that this "optimization" would take effect would be people actually playing the game, versus gathering benchmark numbers. The quake hack didn't have an "if (bBenchmarking)" condition.

            From what it sounds like, nvidia purportedly altered something for the specific purposes of deceiving a benchmark. A benchmark has the sole purpose of benchmarking, so there is absolutely no justification for "optimizations" for a benchmark.

            The point is that ATI had a pretty tenuous justification (that they were optimizing for Quake 3 as it's the engine behind a large number of games), but if this is the case then nvidia has none.
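
            To make the distinction concrete, here is a minimal, purely hypothetical C++ sketch of a name-based application check of the kind being described; the profile functions and the quake3/quack3 names are invented for illustration and are not taken from any real ATI or nVidia driver. It also shows why simply renaming the executable defeats this sort of detection.

            // Hypothetical illustration only -- not actual driver code.
            // A driver could branch on the name of the running executable; renaming
            // the game (e.g. quake3.exe -> quack3.exe) is enough to fall back to the
            // default path, which is how the Radeon 8500 "Quack" test worked.
            #include <windows.h>
            #include <algorithm>
            #include <cctype>
            #include <cstdio>
            #include <string>

            static void applyQuakeProfile()   { std::puts("quake profile: reduced texture detail"); }
            static void applyDefaultProfile() { std::puts("default profile: full quality"); }

            static void selectProfileForCurrentProcess() {
                char path[MAX_PATH] = {0};
                GetModuleFileNameA(nullptr, path, MAX_PATH);   // full path of the running .exe
                std::string exe(path);
                std::transform(exe.begin(), exe.end(), exe.begin(),
                               [](unsigned char c) { return static_cast<char>(std::tolower(c)); });

                if (exe.find("quake3") != std::string::npos)   // name-based detection
                    applyQuakeProfile();
                else
                    applyDefaultProfile();
            }

            int main() { selectProfileForCurrentProcess(); }

            A benchmark-specific branch would look the same, except the detected name would be the benchmark itself - exactly the "if (bBenchmarking)" condition a defensible optimization must not have.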
            • by Sj0 ( 472011 )
              The point is that ATI had a pretty tenuous justification (that they were optimizing for Quake 3 as it's the engine behind a large number of games), but if this is the case then nvidia has none.

              Only if they didn't understand what they were doing, which I doubt. There aren't many Quake-based games that are actually named quake.exe, Quake 3 was by then an aging game used mainly for benchmarks, and the two cases are strikingly similar - you're just searching for a way to justify it.

              This case certainl
    • So cheating at game and app benchmarks through driver tweaks is OK?
    • These cheats could be used in game benchmarks as well. At least in the case of 3DMark, we have proper methods of detecting those cheats.
    • by rmarll ( 161697 ) on Friday May 23, 2003 @12:49PM (#6025449) Journal
      Partially true... Trouble is, there aren't any games out yet that exploit pixel/vertex shader features to the extent that Futuremark does, and that gives us insight into how hardware will perform on next-generation games. It's not a be-all-end-all benchmark, even by Futuremark's own PR. It is a tool to be used alongside current-generation titles to measure differing aspects of hardware.

      It is through Nvidia's negligence that the optimisations were found. That's why (among other things) the beta program exists with those features. I think we can probably expect these and other cheat-hampering features in future versions.
    • Who modded this up? They say specifically in the article that this is still *not* optimization, it is cheating!

      In fact, they say in the article that with "applications/games people really use", it is even harder to detect driver cheats.

    • by paranode ( 671698 ) on Friday May 23, 2003 @01:08PM (#6025635)
      This is a problem with Nvidia. The only reason they are competing well with ATI is because they cut so many corners to get their benchmark scores up. It certainly would be nice if Nvidia concentrated on real-world apps and games but it seems like they do not. If you look at the benchmarks historically between ATI and Nvidia's closely competing cards, you'll find that they are closely matched in default runs. However, try turning on 4x anti-aliasing or anisotropic filtering and watch the older, slower, ATI cards beat out the shiniest new Nvidia cards. ATI's image quality has always been superior to Nvidia's. They are all about quantity and need to be focusing more on quality.
  • This is why.. (Score:5, Insightful)

    by craigtay ( 638170 ) on Friday May 23, 2003 @12:34PM (#6025295) Journal
    You don't base your findings on one benchmark. Whenever I go to a site like tomshardware.com, they have several different ways to benchmark. Each card has its own strengths, and if a card has cheated, it will show up.
    • by shdragon ( 1797 ) * on Friday May 23, 2003 @01:54PM (#6026087) Homepage Journal
      If you want better

      [Next Page]

      reviews that

      [Next Page]

      don't read like Cat in

      [Next Page]

      the Hat with ads, you

      [Next Page]

      should try

      [Next Page]

      AnandTech [anandtech.com] or ExtremeTech [extremetech.com] or even HardOCP [hardocp.com].

      • by Idarubicin ( 579475 ) on Friday May 23, 2003 @02:44PM (#6026524) Journal
        ...or ExtremeTech...

        The linked

        [Next Page]

        article was at

        [Next Page]

        ExtremeTech, and it

        [Next Page]

        still ran to ten or so pages, most having two or fewer paragraphs. Maybe that's why the site wasn't Slashdotted--nobody had the patience to click through the whole article.

  • Cheaters! (Score:5, Funny)

    by DarkHelmet ( 120004 ) <<ten.elcychtneves> <ta> <kram>> on Friday May 23, 2003 @12:35PM (#6025305) Homepage
    Futuremark has confirmed that nVidia is cheating

    WHAT?? My FX 5800 Leaf Blower only has a range of five feet and not six? I want a refund!

  • lies and statistics. (Score:4, Interesting)

    by acomj ( 20611 ) on Friday May 23, 2003 @12:35PM (#6025306) Homepage
    There are lies, damn lies, and statistics.

    I remember when SPEC benchmarking meant something, and companies putting in special routines to make chips seem faster than they were.

    That's why "real world testing" is important. While not always the greatest comparison, it's much better in most cases.

    • by blackmonday ( 607916 ) on Friday May 23, 2003 @01:58PM (#6026131) Homepage
      The problem with real world testing: Should I go out and buy 3 video cards and then return 2 to the store? Especially with CompUSA's 15% "restocking" fee...
    • by Surak ( 18578 ) * <.surak. .at. .mailblocks.com.> on Friday May 23, 2003 @02:15PM (#6026290) Homepage Journal
      The problem with 'real world testing' when it comes to video cards aimed at the gamer market is that a difference of a few lousy FPS between the two top-of-the-line cards (each with similar features, performance-wise) will be virtually indistinguishable in most cases.

      I think people shouldn't get all macho when it comes to this stuff. Honestly, it's like the difference between a 350 hp engine and a 351 hp engine. It doesn't amount to a hill of beans worth of difference except on paper.

      Get over it people.
      • The other problem (Score:3, Insightful)

        by roystgnr ( 4015 )
        "Real World Testing" in general means that they're testing the card on games that are out on the shelf, finished products, right now; i.e. games which were targeted at video cards years old. In other words, one card does 150fps at the highest quality settings, another does 155fps, and when both of them are run on my 80hz refreshing monitor, the results are exactly the same.

        Instead, I want testing that approximates the sorts of games that I'll want to buy years from now. Unfortunately those games don't ex
  • by Nathan Ramella ( 629875 ) on Friday May 23, 2003 @12:38PM (#6025331) Homepage
    There's no law about fudging benchmarks on a third party application.

    While this isn't a huge surprise, I am happy that there are smart folks out there who spend time to uncover this kind of information. Kudos to you for your efforts!

    Videocard benchmarks are about as believable as the 'World's Best Grampa' award.

    -n

    • by James Lewis ( 641198 ) on Friday May 23, 2003 @01:03PM (#6025587)
      It doesn't need to be against the law. Their motive for doing this in the first place was the expectation that their card would gain a better reputation by doing well in that benchmark by cheating. Instead, it has backfired and seriously hurt their reputation. Having a community that can uncover these unsavory practices is deterrent enough.
    • Those who read the article, which is probably a small percentage of /.ers, know that ATI was caught cheating as well. They just weren't caught doing as many things as NVidia was. It is possible that both are cheating the same amount.

      Of course if the article title was, "Everybody cheats on our benchmark!" then that would do more to undermine their benchmark than anything else. Instead they made the focus of the article the fact that NVidia is cheating.

    • by yamla ( 136560 ) <chris@@@hypocrite...org> on Friday May 23, 2003 @01:19PM (#6025726)
      Actually, it is against the law, at least in Canada.

      380. (1) Fraud -- Every one who, by deceit, falsehood or other fraudulent means, whether or not it is a false pretence within the meaning of this Act, defrauds the public or any person, whether ascertained or not, of any property, money or valuable security or any service [is guilty of fraud, a criminal offence]...


      Nvidia (and ATI before) are guilty of using deceit to attempt to sell more video cards. Thus, they are guilty of fraud.
      • Nvidia (and ATI before) are guilty of using deceit to attempt to sell more video cards. Thus, they are guilty of fraud.

        No, they are not guilty of fraud. They did not misrepresent their benchmark score; they merely optimized for the benchmark. Whether or not benchmark scores are representative of general real-world performance is not their responsibility.

        This is similar to Intel realizing that MHz meant everything to silly consumers, and optimizing their CPUs to achieve the highest MHz rating possib

      • Let me play devil's advocate. IANAL and all, but here's my devil's advocate view.

        Has nVidia (or ATI for that matter) ever claimed that any benchmark was indicative of real world performance? Sure, they may boast individual benchmark numbers and say that their card is fast, despite having optimized routines for individual benchmarks, but they're really only claiming that their cards are fast, which is a subjective measure, and achieve those numbers on those tests, which they do. The benchmark writers may try

  • by Lieutenant_Dan ( 583843 ) on Friday May 23, 2003 @12:38PM (#6025337) Homepage Journal
    How can a company proceed to do its business while blatantly lying to its customers!!??

    Oh wait, my medication just kicked in. It's just business as usual. I will just go on checking my MSN e-mail, while watching MSNBC, drinking my Coke and eating my McDonalds burger.

    Never mind.

  • correction (Score:3, Funny)

    by Anonymous Coward on Friday May 23, 2003 @12:39PM (#6025347)
    • The 3D graphics performance war just keeps getting more and more interesting!
    Wrong
  • by dvanduzer ( 563848 ) <dvd@tennica.net> on Friday May 23, 2003 @12:40PM (#6025361)
    According to the ExtremeTech article, it's entirely plausible that this isn't entirely intentional on NVidia's part:
    nVidia believes that the GeForceFX 5900 Ultra is trying to do intelligent culling and clipping to reduce its rendering workload, but that the code may be performing some incorrect operations. Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues.
    So it's quite likely that NVidia was just anticipating optimizations and not outright "cheating."
  • Re: (Score:2, Interesting)

    Comment removed based on user account deletion
  • by Anonymous Coward
    Calling them optimizations gives what nVidia is trying to do a level of legitimacy which is undeserved. If you read the Futuremark paper, you will see that they are clearly cheating.

    It would be as if a CPU manufacturer substituted its own algorithms stealthily in a CPU performance benchmark and only when running that benchmark.

    Sure, you get a higher number, but you aren't measuring what the benchmark designer intended to measure.
  • by ymgve ( 457563 ) on Friday May 23, 2003 @12:42PM (#6025379) Homepage
    Thank you for submitting this to Slashdot. With Futuremark slashdotted to death, NOBODY will be able to get the evidence! *maniacal laughter*
  • by voxel ( 70407 ) on Friday May 23, 2003 @12:42PM (#6025383)
    This has been done for many years, even the last decade. A good friend of mine works for, and has worked for, almost every major video card company in the business over the last decade. What is his job? Making sure THEIR video card gets the best scores on the latest and greatest benchmarks.

    I am sorry to tell you all, but just because Nvidia was CAUGHT this time, doesn't mean they haven't been "cheating" (by optimizing for a specific benchmark) for the last 6 years.

    I would bet every driver release contains code to help out benchmarks and even specific games. Why do you think Nvidia just said, with their latest driver release, "Up to 30% faster frame rates (with Unreal Tournament 2002)"?

    It's just that once in a great while someone notices a performance jump that's TOO big, or just wants some newsworthiness, and decides to put out a nice PDF file.

    - Jeff

  • Doom3 (Score:4, Interesting)

    by Blaster Jaack ( 536777 ) on Friday May 23, 2003 @12:44PM (#6025399)
    From what I read of [H]ardOCP's [hardocp.com] benchmark with Doom 3, it kills nVidia's card. And who cares? Aren't you supposed to optimize your card?

    They also have another benchmark here [hardocp.com] where they compare the 5900 Ultra and the Radeon 9800 Pro. In that article it says that NVIDIA told them not to use 3DMark03. I recommend reading that article.
  • by bilbobuggins ( 535860 ) <bilbobuggins@@@juntjunt...com> on Friday May 23, 2003 @12:44PM (#6025400)
    that's right
    9th grade, you told me cheaters never make money

    well 'pbhtbhtbthbth'

  • ATI Did The Same... (Score:3, Informative)

    by SgtClueLs ( 54026 ) <sgtcluels.gmail@com> on Friday May 23, 2003 @12:44PM (#6025401)
    I thought that ATI did the same [hardocp.com] with their Radeon 8500 drivers 2 years ago, making their Quake 3 scores look better by "cheating". Isn't that just the status quo in the video card manufacturing world?
    • It's not the same.

      ATI was trying to make my Quake3 faster. That's good. They screwed up and hampered my image quality. Innocent mistake while trying to make my life better.

      nVidia was blatantly cheating by hardcoding viewpoints. That's bad. You can't do that in a real-world driver, so it's blatant and evil.

      You can't compare these two incidents. Maybe ATI has done similar things, but they have not been caught at anything as bad as this.

      Bryan
      • by TrancePhreak ( 576593 ) on Friday May 23, 2003 @01:36PM (#6025896)
        No, ATI forced you to medium quality no matter what so that it would seem like high quality scores were better.
      • It's the exact same thing. Both companies tried to get higher performance out of their hardware on one specific piece of software by writing different routines for that software. Don't try to tell me that higher fps in Quake3 didn't help ATI sell more cards. The claim was made by ATI and people testing it on Quake3 that on a certain hardware spec, it got this performance. It's all marketing. Hell, I'm more likely to buy a card based on real world performance like a game than based on a benchmark, so as
  • ATI cheating too (Score:3, Insightful)

    by IpsissimusMarr ( 672940 ) * on Friday May 23, 2003 @12:45PM (#6025404) Journal
    Our investigations reveal that some drivers from ATI also produce a slightly lower total score on this new build of 3DMark03. The drop in performance on the same test system with a Radeon 9800 Pro using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to 8.2% difference in the game test 4 result, which means that the test was also detected and somehow altered by the ATI drivers. We are currently investigating this further.

    It's not about cheating... but about how much you cheat.
  • In a deposition today, the benchmarks demanded sole custody of all offspring, and alimony of an undisclosed amount.

    NVidia's response was brief and to the point...

    I wouldn't have cheated on them (the benchmarks), but they have been sleeping around with ATI for years... and I can't stand being cheated on with someone who can't even write good drivers...
  • It's still faster than my trusty GeForce2, which I keep because it still renders games faster than I can physically detect...
  • PDF Mirror (Score:5, Informative)

    by Cable_Monkey ( 516166 ) on Friday May 23, 2003 @12:47PM (#6025426)
    http://198.3.92.62/3dmark03_audit_report.pdf Just don't kill me now. ;-)
  • by Anonymous Coward on Friday May 23, 2003 @12:48PM (#6025432)
    A test system with GeForceFX 5900 Ultra and the 44.03 drivers gets 5806 3DMarks with 3DMark03 build 320. The new build 330 of 3DMark03 in which 44.03 drivers cannot identify 3DMark03 or the tests in that build gets 4679 3DMarks - a 24.1% drop.

    Our investigations reveal that some drivers from ATI also produce a slightly lower total score on this new build of 3DMark03. The drop in performance on the same test system with a Radeon 9800 Pro using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to 8.2% difference in the game test 4 result, which means that the test was also detected and somehow altered by the ATI drivers. We are currently investigating this further.
  • by WPIDalamar ( 122110 ) on Friday May 23, 2003 @12:48PM (#6025441) Homepage
    We should have a constant for each 3D company that we can multiply their benchmarks against...

    Maybe nvidia is 0.80 and ATI is 0.90...

    so then 100 fps on a GeForce card is really 80 fps, and it would be 90 on an ATI...
  • The "optimization" relied on the benchmark camera being on 'rails'. It always shows the exact same angles, and there are some things that the benchmark would have the graphics card render, even though it's impossible for the viewer to see.

    HOWEVER, in the development version of 3dmark 2k3, you can take the camera "offroading". When you do that, it becomes apparent that things are being drawn incorrectly -- that there are hard-coded limits that result in the video card doing less work than the program requests.

    For those of you whining about how they should use "real life" games for benchmarks, this technique could be applied to anything where the camera path is predetermined. It has nothing to do with 3dmark 2k3 specifically.
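
    As a rough illustration of why an off-rails camera exposes this, here is an invented C++ sketch (not taken from any driver) of a culling shortcut keyed to one pre-recorded camera path: the hard-coded plane is only valid while the camera stays on the rails, so free camera movement immediately reveals missing geometry.

    // Invented sketch of a "camera on rails" shortcut -- purely illustrative.
    #include <cstdio>

    struct Vec3  { float x, y, z; };
    struct Plane { float a, b, c, d; };                      // ax + by + cz + d = 0

    // Hard-coded for one specific scripted camera path (the "cheat"):
    // anything behind this plane is assumed invisible on that path.
    static const Plane kBakedClipPlane = { 0.0f, 0.0f, 1.0f, -50.0f };

    static bool behindBakedPlane(const Vec3& p) {
        return kBakedClipPlane.a * p.x + kBakedClipPlane.b * p.y +
               kBakedClipPlane.c * p.z + kBakedClipPlane.d < 0.0f;
    }

    // The driver skips work behind the baked plane regardless of where the camera
    // actually is -- correct on the rails, visibly wrong once you go "offroading".
    static void submitObject(const Vec3& center, const char* name) {
        if (behindBakedPlane(center)) {
            std::printf("culled   %s (assumed invisible on the scripted path)\n", name);
            return;
        }
        std::printf("rendered %s\n", name);
    }

    int main() {
        submitObject({0.0f, 0.0f, 100.0f}, "building");      // in front of the baked plane
        submitObject({0.0f, 0.0f,  10.0f}, "terrain");       // wrongly culled off the rails
    }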
  • So what? (Score:2, Insightful)

    by bobm17ch ( 643515 )
    Different graphics cards have different strengths and weaknesses - much more so than in previous years.

    e.g. fillrate, vertex manipulation, texture rasterization, shader technology, texture sampling techniques, shadow buffering, etc.

    Some cards will be better than others at these tasks, and some games will take advantage of differing ratios of these technologies.

    The Unreal engine has a reliance on poly count and texture resolution, and it looks like the Doom engine will tend to tax shader, and multitextu
  • by Anonymous Coward on Friday May 23, 2003 @12:52PM (#6025482)
    Let me just say that this occurs not just on this test, but on all imaginable tests, as well as all games that are used anywhere as benchmarks. Many of the cheats are hard to detect because they don't break the test in the way that this cheat did. For instance, at some point there was a trick for a test with lots of occlusion to clip (discard) polygons that would eventually be occluded. However, these discarded polygons were actually calculated at run-time and not precomputed, so if you changed the test, it would still work right. For Quake (I or II, can't remember) they had a hack where they wouldn't need to clear the framebuffer. That version of Quake would do a glClear at each frame, which takes some time, and prior to framebuffer compression, there was a hack where you wouldn't need to clear the framebuffer if you swapped the Z-check and only used half of the Z span every frame. That hack's probably been backed out now, because with framebuffer compression you're actually better off doing the glClear each frame.

    Anyway, I'm posting this as an AC for obvious reasons.
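
    For the curious, here is a hedged OpenGL/C++ sketch of the generic no-Z-clear trick described above; it is a well-known old-school technique, not code from any vendor's driver, and it assumes the caller already has a GL context and redraws every pixel's color each frame.

    // Sketch of the classic "skip clearing the depth buffer" trick: each frame uses
    // only half of the depth range and flips the depth test, so stale depth values
    // left over from the previous frame can never occlude the current frame.
    #include <GL/gl.h>

    void beginFrameWithoutDepthClear(unsigned long frameIndex) {
        if ((frameIndex & 1) == 0) {
            // Even frames: depths map into [0.0, 0.5]; nearer fragments are smaller.
            glDepthRange(0.0, 0.5);
            glDepthFunc(GL_LEQUAL);
        } else {
            // Odd frames: depths map into [0.5, 1.0] with the mapping reversed, so
            // nearer fragments are larger and the test is flipped to match.
            glDepthRange(1.0, 0.5);
            glDepthFunc(GL_GEQUAL);
        }
        // Leftover depth values from the previous frame sit in the other half of the
        // range, so the first fragment written to each pixel always passes the test
        // and no glClear(GL_DEPTH_BUFFER_BIT) is needed.
    }

    As the parent notes, with modern depth-buffer compression a plain glClear is usually faster, so this trick has largely disappeared.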
    • by Anonymous Coward on Friday May 23, 2003 @01:18PM (#6025721)
      I too am an ex-Nvidia employee. It isn't just driver cheats that go on at nVidia. There are black spells and rituals, sometimes involving human sacrifice. The driver team will stop at nothing. I finally broke when asked to cruise kindergartens looking for virgins. When I spoke up and said "ATI doesn't rely on the power of Satan, why should we?" I was fired. They called it "gross incompetence" but we all knew it was because of my threat of whistle blowing. Stick with ATI if you want less baby killing.
    • For instance, at some point there was a trick for a test with lots of occlusion to clip (discard) polygons that would eventually be occluded. However, these discarded polygons were actually calculated at run-time and not precomputed, so if you changed the test, it would still work right.

      You have just described an optimization, not a cheat. The point of cheats is that they take advantage of knowledge that's not available to normal processes. If your "cheat" takes no such advantage (e.g. calculating its s

  • trusty bit torrent (Score:3, Informative)

    by Neophytus ( 642863 ) on Friday May 23, 2003 @12:54PM (#6025495)
    the pdf for bittorrent [supersheep.org]
  • Who found it? (Score:5, Interesting)

    by anon*127.0.0.1 ( 637224 ) <slashdot@NosPAm.baudkarma.com> on Friday May 23, 2003 @01:18PM (#6025717) Journal
    I wonder why this driver cheat was discovered by ExtremeTech. If you're a video card manufacturer, wouldn't you have your engineers go over every one of the competition's driver releases with a fine-toothed comb, just hoping to find some kind of cheat? You'd think ATI has better testing facilities and resources than ET.

    Certainly any negative publicity for NVidia is good for ATI and vice versa.

    • Re:Who found it? (Score:4, Interesting)

      by Quasar1999 ( 520073 ) on Friday May 23, 2003 @01:37PM (#6025898) Journal
      I wonder why this driver cheat was discovered by Extremetech

      Simply put, if ATI brings it to light, many people would claim it was planted, biased, etc... if Extremetech (or another source not directly attached to ATI) brings it to light, then ATI still gets the benefit of burning Nvidia, but without the negative PR they might generate. I wouldn't be surprised if ATI tipped off the people over at Extremetech... ;)
    • Re:Who found it? (Score:5, Interesting)

      by User 956 ( 568564 ) on Friday May 23, 2003 @02:44PM (#6026527) Homepage
      Well, if you read Hard|OCP last week, you might have gotten the impression that Extremetech was making the whole thing up. They said "I have a feeling that Et has some motives of their own that might make a good story"

      Right, like maybe getting a fix posted? Oh, wait, looks like Hard|OCP is taking credit for that:

      "Futuremark has released a patch for 3DMark 2003 that eliminates "artificially high scores" for people using NVIDIA Detonator FX drivers. This is in response to the news item we posted last week. According to the PDF on Futuremark's site, the patch causes a 24.1% drop in score for NVIDIA..."

      I'm amazed at the OCP's coverage of this whole deal. They didn't break the story, so they cast doubt on ExtremeTech's findings and alluded to suspicious "motives" that were never proven.

      Then, when the fix is released, they claim the fix is released "in response to a news item we posted last week", as if they're directly responsible. A week ago they're bashing ExtremeTech for even insinuating driver cheating, and this week they're taking credit for getting the fix released (as if they broke the story themselves).
  • Wasted Code (Score:5, Insightful)

    by JeffRC ( 103922 ) on Friday May 23, 2003 @01:29PM (#6025805)
    Just think about this the next time you do a 5MB driver download. How much of that code is specifically for detecting and defeating benchmarks? How many of the cheats are part of the instability problems in your system?
  • by mzs ( 595629 ) on Friday May 23, 2003 @01:31PM (#6025816)
    Here is an interesting quote from the article that seems to have been overlooked so far.

    "Our investigations reveal that some drivers from ATI also produce a slightly lower total score on this new build of 3DMark03. The drop in performance on the same test system with a Radeon 9800 Pro using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to 8.2% difference in the game test 4 result, which means that the test was also detected and somehow altered by the ATI drivers. We are currently investigating this further.

    Gasp, what a shock. Everyone seems to be guilty of having cheated on synthetic benchmarks at some time. This has happened before, it will happen again.
  • by destiney ( 149922 ) on Friday May 23, 2003 @02:20PM (#6026321) Homepage

    From the article:

    Our investigations reveal that some drivers from ATI also produce a slightly lower total score on this new build of 3DMark03. The drop in performance on the same test system with a Radeon 9800 Pro using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to 8.2% difference in the game test 4 result, which means that the test was also detected and somehow altered by the ATI drivers. We are currently investigating this further.
  • by egarland ( 120202 ) on Friday May 23, 2003 @02:47PM (#6026555)
    I think it's awesome that Futuremark has come out swinging on this one. NVidia has obviously cheated horribly on these benchmarks. ATI apparently has also taken the low road on these, but not as low as NVidia.

    NVidia is losing. Their chips and cards are worse than ATI's. What's worse than that, though, is that they are still trying to pretend that it's not the case. They need to seriously sit down and work on their designs, but instead they are pissing money away working on cheating on benchmarks. That is a really bad sign for a company. It means management is diverting money away from becoming successful toward appearing to be successful. A mentality like that is disastrous to the real value of a company.

    SELL! SELL NOW! Buy again when they have fixed their management and design issues.

    Controversial != Overrated. Reply if you disagree, I'll read it.
    • by DeathPenguin ( 449875 ) * on Friday May 23, 2003 @03:23PM (#6026885)
      >> What's worse than that, though, is that they are still trying to pretend that it's not the case.

      Since when? Jen-Hsun Huang admits defeat [bayarea.com] (But promises a comeback):

      "Tiger Woods doesn't win every day. We don't deny that ATI has a wonderful product and it took the performance lead from us. But if they think they're going to hold onto it, they're smoking something hallucinogenic."
  • Quack (Score:4, Informative)

    by DeathPenguin ( 449875 ) * on Friday May 23, 2003 @03:12PM (#6026780)
    Let's not jump on nVidia too harshly for this. Sure, this spectacle seems to have gained a lot more publicity than ATi's own cheating ( link [tech-report.com] link [hardocp.com] link [tomshardware.com] ). At least when nVidia cheated in 3DMark, they publicly denounced [tomshardware.com] synthetic benchmarks.
  • game or demo? (Score:3, Interesting)

    by Anonymous Coward on Friday May 23, 2003 @04:03PM (#6027195)
    Is 3DMark03 a synthetic benchmark or eye candy?
    If I remember correctly, some of the people who founded Futuremark had something to do with a demo named "Second Reality" - a good old-school demo on 2 discs.

    If 3DMark were TRULY a benchmark, it would rely on code that we find in games!! Optimizations are expected for those... even more for stuff...

    What if you told Carmack that the optimizations he made for Quake and the tweaked OpenGL implementations are just cheats? Surely you remember the 3dfx OpenGL implementation and the Riva128 drivers...

    What if you told people from the demoscene that their demo sucks because it doesn't properly handle Z-buffering?

    They all rely on tricks (a better word than "optimizations" or "cheating", from a coder's point of view); even processors rely on those. They're judged on user experience, not BogoMIPS or whatever. Page-flipping was improper behavior back when VESA wasn't yet VESA and the scene called it Mode X; eventually it became best practice. Hard-coding sprites in assembly was the same, and most 2D shooters are based on that.

    I'm pretty sure the people at Futuremark include some kind of sleazy code in their benchmark, as coders always do.

    The only difference between cheating and proper optimization is PR. If nVidia told us "wow! we made an optimization that runs 3DMark faster", as it would with a game, no one would complain.
    It's just that for a lot of us 3DMark is supposedly an untouchable thing. It's not. It should reflect real-world 3D, and in real life you expect that kind of code workaround.
    Then I ask myself a question... why doesn't Futuremark freely distribute a playable benchmark?
    Why put us in front of a demo, claim it's a synthetic benchmark, and then ask why we aren't believing it?
    Because it's a lie. Either they're real-world gaming and tricks are OK, or they're pure demos and tricks are not an option.
  • thanks, but... (Score:4, Insightful)

    by Shadestalker ( 598690 ) on Friday May 23, 2003 @04:13PM (#6027271)
    Dear nvidia / ATI / etc.,

    Please optimize your drivers and hardware for the actual applications and games I run, not the synthetic benchmarks designed to simulate workloads. Benchmarks don't use your products, end-users do.

  • by cicatrix1 ( 123440 ) <cicatrix1NO@SPAMgmail.com> on Friday May 23, 2003 @08:27PM (#6028660) Homepage
    NVidia immediately put out a rebuttal to these claims, and I'm not sure why it wasn't reported along with this article. But I guess I really can't say that I'm not used to biased or ignorant reporting from Slashdot.

    From Bluesnews [bluesnews.com] (from an unlinked CNet article):

    "Recently, there have been questions and some confusion regarding 3DMark 03 results obtained with certain Nvidia" products, Futuremark said in the statement. "We have now established that Nvidia's Detonator FX drivers contain certain detection mechanisms that cause an artificially high score when using 3DMark 03."

    A representative at Nvidia questioned the validity of Futuremark's conclusions. "Since Nvidia is not part of the Futuremark beta program (a program which costs of hundreds of thousands of dollars to participate in), we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer," the representative said. "We don't know what they did, but it looks like they have intentionally tried to create a scenario that makes our products look bad."
  • by captaineo ( 87164 ) * on Friday May 23, 2003 @10:20PM (#6029107)
    There isn't really much difference between a "cheat" and a true optimization. As long as the "cheated" driver produces acceptable results, and produces them faster, I don't see what the problem is.

    Some of the cheats potentially reduce image quality, but we are talking about OpenGL and DirectX here - nobody really aims for 100% visual quality, and indeed there is no target to shoot for since neither standard specifies "correct" rendering down to the pixel level.

    You might complain that 3DMark is being treated specially, that other software wouldn't receive the same speedups. That is true. But application-specific optimization has a long history. Just look at Windows - the more recent versions detect and flag certain programs that are known to break or run slowly due to compatibility issues. Nobody says Windows is "cheating" because it refuses to install a driver that its internal knowledge base knows will trash your system. In the CAD world, video card makers almost always tweak drivers to support specific CAD and 3D applications. (3DLabs' control panel used to have a box where you could select "optimize for AutoCAD/3D Studio/Maya/etc...")
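
    To illustrate the per-application profile idea (the "optimize for AutoCAD/3D Studio/Maya" dropdown mentioned above), here is a small, invented C++ sketch; the executable names and settings are made up for the example and don't reflect any real driver's contents.

    // Invented per-application profile table -- illustrative only.
    #include <cstdio>
    #include <map>
    #include <string>

    struct AppProfile {
        bool relaxedPrecision;    // allow lower-precision shading paths
        bool aggressiveCulling;   // enable app-specific culling heuristics
        int  maxAnisotropy;       // cap on anisotropic filtering
    };

    static const std::map<std::string, AppProfile> kProfiles = {
        { "acad.exe",   { false, false, 16 } },   // correctness first for CAD
        { "3dsmax.exe", { false, false, 16 } },
        { "quake3.exe", { true,  true,   4 } },   // trade quality for frame rate
    };

    static AppProfile profileFor(const std::string& exeName) {
        auto it = kProfiles.find(exeName);
        return it != kProfiles.end() ? it->second
                                     : AppProfile{ false, false, 8 };  // generic default
    }

    int main() {
        const AppProfile p = profileFor("quake3.exe");
        std::printf("relaxedPrecision=%d aggressiveCulling=%d maxAniso=%d\n",
                    p.relaxedPrecision, p.aggressiveCulling, p.maxAnisotropy);
    }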

    ATI should be happy that NVIDIA engineers are wasting time fixing specific benchmarks when they could instead be improving performance in general. But I wouldn't read much more than that into this.

    Making your buying decision based on a synthetic benchmark, rather than in-context with your intended application, is always going to distort the picture. (Looking at SPEC benchmarks, Itanium blows the competition away - just tell that to the millions of people who are *not* buying IA64 chips!)

    If you, the OpenGL developer, end up writing the next wildly-successful game, I'm sure NVIDIA will be happy to tweak their drivers for it.

"Conversion, fastidious Goddess, loves blood better than brick, and feasts most subtly on the human will." -- Virginia Woolf, "Mrs. Dalloway"

Working...