Intel Graphics Technology

Intel Caught Cheating In 3DMark Benchmark

EconolineCrush writes "3DMark Vantage developer Futuremark has clear guidelines for what sort of driver optimizations are permitted with its graphics benchmark. Intel's current Windows 7 drivers appear to be in direct violation, offloading the graphics workload onto the CPU to artificially inflate scores for the company's integrated graphics chipsets. The Tech Report lays out the evidence, along with Intel's response, and illustrates that 3DMark scores don't necessarily track with game performance, anyway."

  • by Shadow of Eternity ( 795165 ) on Tuesday October 13, 2009 @12:05AM (#29728355)

    Thanks for telling all of us that the best measure of hardware's performance ingame is... to benchmark it with a game.

    • by cjfs ( 1253208 )

      Thanks for telling all of us that the best measure of hardware's performance ingame is... to benchmark it with a game.

      Except the article clearly shows that the name of the game's executable determines frame rates in some cases. It then goes on to state:

      the very same 785G system managed 30 frames per second in Crysis: Warhead, which is twice the frame rate of the G41 with all its vertex offloading mojo in action. The G41's new-found dominance in 3DMark doesn't translate to superior gaming performance, even in this game targeted by the same optimization.

      This kind of offloading is definitely shady. I can't see how they'd get the driver approved.

      • "The G41's new-found dominance in 3DMark doesn't translate to superior gaming performance, even in this game targeted by the same optimization."

    • The driver apparently detects "crysis.exe" and inflates performance metrics by offloading processing, whereas renaming the executable to "crisis.exe" gives realistic performance scores. Please RTFA before replying.
      • "The G41's new-found dominance in 3DMark doesn't translate to superior gaming performance, even in this game targeted by the same optimization."

        To me at least that reads as "it cheats in 3DMark, but you catch it red-handed if you benchmark with a game."

      • by Fred_A ( 10934 ) <fred@NOspam.fredshome.org> on Tuesday October 13, 2009 @03:43AM (#29729325) Homepage

        The driver apparently detects "crysis.exe" and inflates performance metrics by offloading processing, whereas renaming the executable to "crisis.exe" gives realistic performance scores. Please RTFA before replying.

        Thanks for the tip, I've now renamed all my games to "crysis.exe" and am enjoying a major speed boost. You've given my laptop a new lease on life!

        I can finally get rid of that cumbersome i7 box with that noisy nVidia!

  • Eh? (Score:2, Interesting)

    by Tyler Eaves ( 344284 )

    I thought offloading graphics computations to the CPU was the whole *point* of integrated video.

    • Re:Eh? (Score:5, Informative)

      by The MAZZTer ( 911996 ) <(megazzt) (at) (gmail.com)> on Tuesday October 13, 2009 @12:29AM (#29728507) Homepage
      And here I thought the whole point of not doing video on the CPU was to offload it to a dedicated chip!
    • Re:Eh? (Score:5, Insightful)

      by parallel_prankster ( 1455313 ) on Tuesday October 13, 2009 @12:40AM (#29728565)
      Effectively dividing tasks among CPUs is not the issue here. They want to benchmark the GPU, and they want to make sure you don't enable optimizations targeted specifically at the benchmark, which is exactly what Intel was shamelessly doing.
      • Mod Parent Up (Score:4, Insightful)

        by causality ( 777677 ) on Tuesday October 13, 2009 @12:52AM (#29728615)

        Effectively dividing tasks among CPUs is not the issue here. They want to benchmark the GPU, and they want to make sure you don't enable optimizations targeted specifically at the benchmark, which is exactly what Intel was shamelessly doing.

        Please mod this up; it really is that simple.

        • The behaviour their driver exhibits in the benchmark is also used in several games, e.g. Crysis Warhead. RTFA.

          • Correct. (Score:3, Insightful)

            by InvisiBill ( 706958 )

            The behaviour their driver exhibits in the benchmark is also used in several games, e.g. Crysis Warhead. RTFA.

            The issue is that the driver treats different games differently, based on filename. Some get this boost and some don't. Whether you put 3DMark into the boosted or unboosted category, its results will be indicative of some games and not of others.

    • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday October 13, 2009 @12:49AM (#29728599) Journal

      That was my first thought, too.

      Here's the thing, though: They took 3DMarkVantage.exe and renamed it to 3DMarkVintage.exe, and much of that offloading was dropped. So this isn't a general-purpose optimization, which would make sense -- it's a targeted optimization, aimed at and enabled specifically for a benchmark, in order to get higher scores in said benchmark.

      It reminds me of the days when Quake3.exe would give you higher benchmarks, but worse video, than Quack3.exe.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        Here's the thing, though: They took 3DMarkVantage.exe and renamed it to 3DMarkVintage.exe, and much of that offloading was dropped. So this isn't a general-purpose optimization, which would make sense -- it's a targeted optimization, aimed at and enabled specifically for a benchmark, in order to get higher scores in said benchmark.

        A practice which is explicitly forbidden by the guidelines. I know lots of Slashdotters don't read the article, but I am really beginning to wonder what part of that is so hard.

      • They should remarket it as "application-targeted profiles" and sell it as an optimization feature.

        • No, they should either target behaviors, rather than executables, or make these available for the application to request. Or they should improve the overall performance for everything -- starting with, oh, making a better chipset.

          Targeting specific executables, even if they do end up improving the performance of specific games, has the effect of raising the barrier of entry to that market -- I mean, it's hard enough to optimize a game engine without having to develop a business relationship with Intel.

    • Not so; an integrated GPU is simply an (often low-power) GPU which uses the system's RAM instead of its own RAM. Because system memory buses are usually much, much slower than the ones on dedicated graphics cards, and because the IGP shares that bandwidth with the CPU, the IGP is in turn relatively slow.

      It (normally) has nothing to do with using the CPU to do graphics computations.

  • Hmm... (Score:5, Informative)

    by fuzzyfuzzyfungus ( 1223518 ) on Tuesday October 13, 2009 @12:15AM (#29728445) Journal
    On the one hand, a mechanism that uses the CPU for some aspects of the graphics process seems perfectly reasonable (whether or not it is a good engineering decision is another matter, and would depend on whether it improves performance under desired workloads, what it does to energy consumption, total system cost, etc.), so I wouldn't blame Intel for that alone.

    On the other hand, though, the old "run 3Dmark, then run it again with the executable's name changed" test looks pretty incriminating. Historically, that has been a sign of dodgy benchmark hacks.

    In this case, however, TFA indicates that the driver has a list of programs for which it enables these optimizations, which includes 3DMark, but also includes a bunch of games and things. Is that just an extension of dodgy benchmark hacking, taking into account the fact that games are often used for benchmarking? Or is this optimization feature risky in some way (either unstable, or degrades performance) and so only enabled for whitelisted applications?

    If the former, Intel is being scummy. If the latter, I'm not so sure. From a theoretical purist standpoint, the idea that graphics drivers would need per-application manual tweaking kind of grosses me out; but, if in fact that is the way the world works, and Intel can make the top N most common applications work better through manual tweaking, I can't really say that that is a bad thing (assuming all the others aren't suffering for it).
    • Re: (Score:3, Interesting)

      by Sycraft-fu ( 314770 )

      I'm inclined to give Intel the benefit of the doubt here. Few reasons:

      1) Nobody buys Intel integrated chips because of how they do on 3D mark. Nobody thinks they are any serious kind of performance. Hell, most people are amazed to find out that these days they are good enough that you can, in fact, play some games on them (though not near as well as dedicated hardware). So I can't imagine they are gaining lots of sales out of this. Remember these are chips on the board itself. You either got a board with on

      • Re: (Score:3, Insightful)

        Nobody buys Intel integrated chips because of how they do on 3D mark. Nobody thinks they are any serious kind of performance. Hell, most people are amazed to find out that these days they are good enough that you can, in fact, play some games on them (though not near as well as dedicated hardware). So I can't imagine they are gaining lots of sales out of this. Remember these are chips on the board itself. You either got a board with one or didn't. You don't pick one up later because you liked the numbers.

        Th

  • by Jonboy X ( 319895 ) <jonathan.oexner@ ... u ['lum' in gap]> on Tuesday October 13, 2009 @12:16AM (#29728451) Journal

    Just look at the pics. Changing the name of the executable changed the results dramatically. The driver is apparently detecting when it's running 3DMark (or some other specific apps) and switching to some other mode to boost its scores/FPS numbers.

    • by Eil ( 82413 ) on Tuesday October 13, 2009 @01:08AM (#29728697) Homepage Journal

      But see also Intel's response on page 2:

      We have engineered intelligence into our 4 series graphics driver such that when a workload saturates graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMarkVantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hardware available in the system. Our driver is currently in the certification process with Futuremark and we fully expect it will pass their certification as did our previous DX9 drivers.

      And the rest of page 2 indicates that offloading some of the work to the CPU does, for certain games, improve performance significantly. Offhand, this doesn't necessarily seem like a bad thing. Intel is just trying to make the most of the hardware in the whole machine. One would also do well to bear in mind that the GPU in question is an integrated graphics chipset: they're not out to compete against a modern gaming video adapter and thus have little incentive to pump their numbers in a synthetic benchmark. Nobody buys a motherboard based on the capabilities of the integrated graphics.

      The question that should be asked is: What is the technical reason for the drivers singling out only a handful of games and one benchmark utility instead of performing these optimizations on all 3D scenes that the chipset renders?

      • by Guspaz ( 556486 )

        It makes sense; you'd only want to perform these optimizations in games where you're significantly GPU bound. CPU-heavy games, such as Supreme Commander, are probably better off spending the CPU time on the game itself.

        I'd see this as more laziness than anything else; it's easier to just hard-code in a list of GPU-bottlenecked games than it would be to actually have your driver auto-detect if there is idle CPU time that could be better spent on offloading.

        I don't really see much of an issue with what Intel
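
For illustration only, here is a minimal sketch of the kind of load-based auto-detection described above, as opposed to a hard-coded executable whitelist. The thresholds and load inputs are invented for the example; none of this reflects Intel's actual driver logic.

```python
# Hypothetical sketch of a runtime offload heuristic: hand geometry work to
# the CPU only when the GPU is the bottleneck and there is spare CPU
# capacity. The load numbers would come from whatever counters a real
# driver exposes; nothing here is Intel's actual driver code.

GPU_BUSY_THRESHOLD = 0.90   # fraction of frame time the GPU is busy
CPU_IDLE_THRESHOLD = 0.25   # fraction of CPU capacity left unused


def should_offload_geometry(gpu_busy: float, cpu_idle: float) -> bool:
    """Return True when offloading vertex/geometry work is likely to help."""
    return gpu_busy >= GPU_BUSY_THRESHOLD and cpu_idle >= CPU_IDLE_THRESHOLD


# A GPU-bound game with idle cores benefits; a CPU-bound game does not.
print(should_offload_geometry(0.98, 0.60))  # True
print(should_offload_geometry(0.55, 0.05))  # False
```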

  • by iYk6 ( 1425255 ) on Tuesday October 13, 2009 @12:16AM (#29728459)

    Is 3DMark the benchmark that will give a higher score to a VIA graphics card if the Vendor ID is changed to Nvidia?

  • by Anonymous Coward on Tuesday October 13, 2009 @12:22AM (#29728485)

    Intel fully admits that the integrated chipset graphics aren't that great. They freely admit that they offload rendering to the CPU in some cases. This isn't a secret.

    • I think GAMES is the operative word here. A benchmark shouldn't be targeted in such a fashion.
      • Why not? Does this mean no software DirectX implementation can be benchmarked?

        Intel use this optimisation in both games and in benchmarks... There's no problem here.

  • by mpapet ( 761907 ) on Tuesday October 13, 2009 @12:39AM (#29728561) Homepage

    In true White Goodman fashion, cheating is something losers come up with to make them feel better about losing.

    Is Intel the 500 lb. gorilla in chipsets? Sure, and they got there by 'cheating.' Which is winning.

    Ain't capitalism grand?

    For all you losers who don't know who the great White Goodman is: http://www.imdb.com/title/tt0364725/ [imdb.com]

    • Re: (Score:3, Insightful)

      by Grishnakh ( 216268 )

      Is Intel the 500 lb. gorilla in chipsets? Sure, and they got there by 'cheating.' Which is winning.

      To be fair, I'm pretty sure that Intel has made the highest-performing chipsets for Intel processors for quite a long time now, occasionally competing with Nvidia (who recently gave up). The other makers like VIA and SiS never had chipsets that worked as well as Intel's. Of course, this is for chipsets which don't have built-in graphics.

      Intel's entries into the 3D graphics market have never been very good, o

      • Re: (Score:3, Interesting)

        by Kjella ( 173770 )

        But why they even bother trying to rig benchmarks like this is beyond me. No one who's serious about graphics performance would use Intel's built-in video.

        No, but there are a lot of 3D games that aren't FPS junkie, the-sky-is-the-limit craving games. For example, I liked King's Bounty which has minimum requirements of "Videocard nVidia GeForce 6600 with 128 Mb or equivalent ATI". Tales of Monkey Island says: "Video: 64MB DirectX 8.1-compliant video card (128MB rec.)". You won't find any of these in the latest AMD/nVidia review, but just pretending to have a little 3D performance can make a difference between "no 3D games at all" and "some non-intensive 3D gam

  • They all cheat (Score:5, Insightful)

    by GF678 ( 1453005 ) on Tuesday October 13, 2009 @01:03AM (#29728679)

    I'm not defending Intel at all, but...

    ATI's done it: http://www.xbitlabs.com/news/video/display/20030526040035.html [xbitlabs.com]

    NVIDIA's done it: http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177 [theinquirer.net]

    They've probably done it several times in the past with other benchmarking software as well.

    They're all dishonest. Don't trust anyone!

  • I'm seeing a potential other side to this that doesn't seem to be getting explored (unless I've missed something) -- if the optimizations are specific to .exes listed in the driver's .inf file, has anyone tried adding other games to the list (or alternately, just renaming another executable to match one in the list)?

    It would seem like an interesting turn if the optimizations are generic, but only enabled for games/applications that Intel has spent time testing them on.

  • by zullnero ( 833754 ) on Tuesday October 13, 2009 @01:48AM (#29728873) Homepage
    You'd think you'd have logic in the GPU that could determine when a certain load was being achieved, certain 3D functionality was being called, etc., and offload some work to a multicore CPU if it was hitting a certain performance threshold (as long as the CPU itself wasn't being pounded...but most games are mainly picking on the GPU and hardly taking full advantage of a quad core CPU or whatever). That makes a degree of sense...using your resources more effectively is a good thing. If that improves your performance scores, well...so what? It measures the fact that your drivers are better than the other card's drivers. That seems like fair play, from a consumer's standpoint. If the competitors can't be bothered to write drivers that work efficiently, that's their problem. Great card + bad drivers = bad investment, as far as I'm concerned. That's the real point of these benchmarking tests, anyway. It's just product marketing.

    But trapping a particular binary name to fix the results? That's being dishonest to customers. They're deliberately trying to trick gamers who just look at the 3DMark benchmarks into buying their hardware, but giving them hardware that won't necessarily perform at the expected level of quality. I generally stick up for Intel, having worked there in the past as a contractor and generally liking the company and people...but this is seriously bad form on their part. I'm surprised this stuff got through their validation process...I know I'd have probably choked on my coffee laughing if I were on that team and could see this in their driver code.
  • SOP (Score:3, Insightful)

    by OverflowingBitBucket ( 464177 ) on Tuesday October 13, 2009 @01:59AM (#29728913) Homepage Journal

    Hasn't every chipset maker -- ever -- been busted for fudging benchmark results at some point? Multiple times, usually?

    And then they get caught out by the old exe-renaming technique.

    Why do they keep trying it? The mind boggles.

    I would have thought by now that a standard tool in the benchmarker's repertoire would be one that copied each benchmark exe to a different name and location and launched that, followed by a launch under the default name; and that the more popular benchmarks would have options to tweak the test ordering and methodology slightly to make application profiling difficult.
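
Such a launcher could be as simple as the following sketch. It assumes the benchmark will run from a bare copied executable (a real install with DLL dependencies may need the whole directory copied), and the path and the score comparison are placeholders, not any real benchmark's interface.

```python
# Rough sketch of a rename-and-rerun launcher: run the benchmark under its
# real name, then again under a neutral name from another directory, and
# compare the reported scores. Assumes the benchmark works when launched
# from a bare copied .exe; real installs may need the whole folder copied.
import shutil
import subprocess
import tempfile
from pathlib import Path


def run_once(exe: Path) -> None:
    subprocess.run([str(exe)], check=True)  # wait for the run to finish


def rename_test(original_exe: str) -> None:
    original = Path(original_exe)

    # Baseline: the name a driver could be matching on.
    run_once(original)

    # Disguised copy under an innocuous name, in a different location.
    with tempfile.TemporaryDirectory() as tmp:
        disguised = Path(tmp) / "not_a_benchmark.exe"
        shutil.copy2(original, disguised)
        run_once(disguised)

    # A large score gap between the two runs suggests name-based detection.


if __name__ == "__main__":
    rename_test(r"C:\path\to\3DMarkVantage.exe")  # placeholder path
```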

    • Marketing execs change all the time. Each one says "Hey! I have an idea...." The programmer who is asked to put in the cheat is not wildly enthusiastic about the idea, knows it won't work and does a quick and dirty hack.
    • Why do they keep trying it? The mind boggles.

      Because we only know about the instances in which they were discovered.

  • Duh (Score:4, Informative)

    by ThePhilips ( 752041 ) on Tuesday October 13, 2009 @05:19AM (#29729649) Homepage Journal

    3DMark Vantage was never a legit benchmark. Heavily tuned for Intel CPU and nVidia GPU architectures, it never actually meant a damn thing.

    Just compare the performance of the GeForce 285/295 vs. the Radeon 4870/5870 (any review) in 3DMark and in games. In 3DMark Vantage the nVidia cards have close to a 50% advantage, while in real games the Radeons sometimes score higher.

    The statistical anomaly alone is sufficient to dismiss 3DMark Vantage results as an outlier.

  • The article isn't loading for me, but: can't they simply measure the amount of CPU used during the benchmark and use that information in the benchmark? I don't think it's basically evil to perform that kind of offloading (except in this case when the rules of 3DMark forbid using empirical data on it to optimize performance; but then again, I would imagine many other pieces of software also get this treatment without bad effects on quality or game experience), but dynamically detecting the situation would de
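
A minimal sketch of the measurement suggested above, using the third-party psutil package to sample system-wide CPU utilization while a benchmark runs; the benchmark command is a placeholder.

```python
# Minimal sketch of measuring CPU usage alongside a benchmark run, so that
# CPU-assisted "GPU" scores would at least be visible in the report.
# Requires the third-party psutil package; the command is a placeholder.
import subprocess

import psutil


def run_with_cpu_sampling(cmd: list) -> float:
    proc = subprocess.Popen(cmd)
    samples = []
    while proc.poll() is None:
        # System-wide CPU utilization over the last second, in percent.
        samples.append(psutil.cpu_percent(interval=1.0))
    return sum(samples) / len(samples) if samples else 0.0


if __name__ == "__main__":
    avg = run_with_cpu_sampling([r"C:\path\to\benchmark.exe"])
    print(f"Average CPU utilization during the run: {avg:.1f}%")
```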

  • "We have engineered intelligence into our 4 series graphics driver such that when a workload saturates graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMarkVantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hard

  • Two solutions come to mind immediately for this.

    First off, here is the offending list of apps:
    ***
    [Enable3DContexts_CTG_AddSwSettings]

    HKR,, ~3DMark03.exe, %REG_DWORD%, 1
    HKR,, ~3DMark06.exe, %REG_DWORD%, 1
    HKR,, ~dreamfall.exe, %REG_DWORD%, 1
    HKR,, ~FEAR.exe, %REG_DWORD%, 1
    HKR,, ~FEARMP.exe, %REG_DWORD%, 1
    HKR,, ~HL2.exe, %REG_DWORD%, 1
    HKR,, ~LEGOIndy.exe, %REG_DWORD%, 1
    HKR,, ~RelicCOH.exe, %REG_DWORD%, 1
    HKR,, ~Sam2.exe, %REG_DWORD%, 1
    HKR,, ~SporeApp.exe, %REG_DWORD%, 1
    HKR,, ~witcher.exe, %REG_DWORD%, 1
    HKR,, ~Wo
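
For anyone wanting to follow up on the earlier question about adding other games to this list, here is a small sketch that extracts the whitelist from a driver .inf. The file path is a placeholder, and whether appending a new entry and reinstalling the driver actually enables the optimizations for another game is exactly the untested part.

```python
# Small sketch that extracts the executable whitelist from the section
# quoted above. The .inf path is a placeholder; whether appending a new
# "HKR,, ~YourGame.exe, %REG_DWORD%, 1" line and reinstalling the driver
# actually enables the optimizations for another game remains an open question.
import re
from pathlib import Path

SECTION = "[enable3dcontexts_ctg_addswsettings]"
ENTRY = re.compile(r"^HKR,,\s*~(?P<exe>[^,\s]+)\s*,", re.IGNORECASE)


def whitelisted_exes(inf_path: str) -> list:
    exes, in_section = [], False
    for raw in Path(inf_path).read_text(errors="ignore").splitlines():
        line = raw.strip()
        if line.startswith("["):
            in_section = line.lower() == SECTION
            continue
        if in_section:
            match = ENTRY.match(line)
            if match:
                exes.append(match.group("exe"))
    return exes


if __name__ == "__main__":
    names = whitelisted_exes(r"C:\path\to\driver.inf")  # placeholder path
    print("Targeted executables:", names)
```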

  • I hope new Apple systems don't get stuck with this crap video plus a dual-core CPU. The new iMac is thinner than ever, with an Intel Core i3 CPU and video half as fast, starting at $1200. To get a real video card, the starting price is $1800.

    The Mac mini, with the slowest Core i3 and 2GB of RAM, starts at $500-$600.

    APPLE, if you plan to pull that crap, at least offer a real desktop at $800-$1500+.
