Intel Graphics Technology

Intel Caught Cheating In 3DMark Benchmark

EconolineCrush writes "3DMark Vantage developer Futuremark has clear guidelines for what sort of driver optimizations are permitted with its graphics benchmark. Intel's current Windows 7 drivers appear to be in direct violation, offloading the graphics workload onto the CPU to artificially inflate scores for the company's integrated graphics chipsets. The Tech Report lays out the evidence, along with Intel's response, and illustrates that 3DMark scores don't necessarily track with game performance, anyway."
  • by Shadow of Eternity ( 795165 ) on Tuesday October 13, 2009 @12:05AM (#29728355)

    Thanks for telling all of us that the best measure of hardware's in-game performance is... to benchmark it with a game.

  • Re:Eh? (Score:3, Insightful)

    by Anonymous Coward on Tuesday October 13, 2009 @12:23AM (#29728489)

    While it makes some sense, triggering the behavior using certain filenames is peculiar to say the least.

    I suppose, considering that the 3DMark tests are intended to test a hardware solution's peak performance, there is some rationale behind identifying the test executable on a list of "heavy" applications. Yes, the guidelines in which 3DMark explicitly forbids that sort of thing are clear. But in a sense the "spirit" of those guidelines is that they don't want companies cheating by designing driver features or modes for the test that aren't usable in actual gameplay.

    Since these optimizations are (apparently) in use for actual games, it might not be such a heinous violation. Whether the other entries on their list are simply there to lend cover and invite exactly the kind of doubt I've raised in this post, who can say?

    Still a pretty daft thing to do, but maybe it is a simple mistake rather than intentional deception.

  • by cjfs ( 1253208 ) on Tuesday October 13, 2009 @12:36AM (#29728545) Homepage Journal

    It seems entirely reasonable to me for them to optimize the driver to run particular programs faster if at all possible.

    Perhaps, but you definitely don't do it for the benchmark. The article quotes the 3DMark Vantage guidelines which are perfectly clear.

    With the exception of configuring the correct rendering mode on multi-GPU systems, it is prohibited for the driver to detect the launch of 3DMark Vantage executable and to alter, replace or override any quality parameters or parts of the benchmark workload based on the detection. Optimizations in the driver that utilize empirical data of 3DMark Vantage workloads are prohibited.

    So yes, SLI and Crossfire are a different case.

  • Re:Eh? (Score:5, Insightful)

    by parallel_prankster ( 1455313 ) on Tuesday October 13, 2009 @12:40AM (#29728565)
    Effectively dividing tasks among CPUs is not the issue here. They want to benchmark the GPU, and they want to make sure you don't enable optimizations targeted specifically at the benchmark, which is exactly what Intel was shamelessly doing.
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Tuesday October 13, 2009 @12:49AM (#29728599) Journal

    That was my first thought, too.

    Here's the thing, though: They took 3DMarkVantage.exe and renamed it to 3DMarkVintage.exe, and much of that offloading was dropped. So this isn't a general-purpose optimization, which would make sense -- it's a targeted optimization, aimed at and enabled specifically for a benchmark, in order to get higher scores in said benchmark.

    It reminds me of the days when Quake3.exe would give you higher benchmarks, but worse video, than Quack3.exe.
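
    A rough sketch of that rename test, in Python. The score-parsing regex and the idea that the benchmark prints a "Score:" line are illustrative assumptions, not 3DMark's actual automation interface:

        import re
        import shutil
        import subprocess

        def run_and_score(exe_path):
            # Launch the benchmark and pull a numeric score out of its output.
            # The "Score:" format is a stand-in for whatever the tool reports.
            out = subprocess.run([exe_path], capture_output=True, text=True).stdout
            match = re.search(r"Score:\s*(\d+)", out)
            return int(match.group(1)) if match else None

        baseline = run_and_score("3DMarkVantage.exe")
        shutil.copy("3DMarkVantage.exe", "3DMarkVintage.exe")  # same bits, new name
        renamed = run_and_score("3DMarkVintage.exe")

        # A large gap between the two scores suggests the driver keys on the filename.
        print("default name:", baseline, "renamed:", renamed)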

  • Re:Eh? (Score:5, Insightful)

    by jamesh ( 87723 ) on Tuesday October 13, 2009 @12:49AM (#29728603)

    Well, if the GPU becomes saturated, I could imagine the rest of the load spilling over to the CPU (one or many cores). Obviously the GPU is more efficient at video tasks, but if the video task is priority for the user, why not offload to the CPU as well? Makes sense to me.

    If you do that for a benchmark app then you are not really testing (just) the performance of the graphics hardware, so turning on that optimization without disclosing it is not really a fair comparison of the hardware. To make it 'fair' you really need to make the benchmark app aware of the feature, able to turn it on or off under software control, or at least able to tell whether it is enabled. I wonder if similar optimisations could be made to any 3D video driver...

    In the real world, if the user wants high graphics performance and there are CPU cores doing nothing then like you said, offloading to them makes perfect sense.
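
    A minimal sketch of that spill-over idea, under the assumption of toy utilization numbers; everything here is hypothetical, and a real driver would read engine load from hardware counters instead:

        class Engine:
            """Toy stand-in for a GPU or a pool of CPU cores."""
            def __init__(self, name):
                self.name = name
                self.queue = []
            def utilization(self):
                return min(1.0, len(self.queue) / 10.0)  # toy saturation model
            def submit(self, batch):
                self.queue.append(batch)

        def dispatch(batch, gpu, cpu, gpu_busy=0.95, cpu_idle=0.50):
            # Prefer the GPU; spill to idle CPU cores only once the GPU saturates.
            if gpu.utilization() < gpu_busy:
                gpu.submit(batch)
            elif cpu.utilization() < cpu_idle:
                cpu.submit(batch)   # software geometry path on spare cores
            else:
                gpu.submit(batch)   # both busy: queue on the GPU anyway

        gpu, cpu = Engine("gpu"), Engine("cpu")
        for batch in range(20):
            dispatch(batch, gpu, cpu)
        print(len(gpu.queue), "batches on GPU,", len(cpu.queue), "on CPU")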

  • Re:Why not? (Score:4, Insightful)

    by rm999 ( 775449 ) on Tuesday October 13, 2009 @12:51AM (#29728607)

    "they should be encouraged to release hand coded or special drivers to improve performance in specific games."

    Games, sure - but it defeats the point of benchmarks by introducing a new useless variable: how optimized the driver is for that benchmark. I mean, why should 3dMarkVintage.exe be 30% slower than 3dMarkVantage.exe? How does this help anyone except Intel?

  • Mod Parent Up (Score:4, Insightful)

    by causality ( 777677 ) on Tuesday October 13, 2009 @12:52AM (#29728615)

    Effectively dividing tasks among CPUs is not the issue here. They want to benchmark the GPU, and they want to make sure you don't enable optimizations targeted specifically at the benchmark, which is exactly what Intel was shamelessly doing.

    Please mod this up; it really is that simple.

  • by Idiomatick ( 976696 ) on Tuesday October 13, 2009 @12:57AM (#29728655)
    Optimizing for games or rendering software makes sense. Optimizing for benchmarks seems like a pretty clear violation of the rules.

    It does point out a weakness of synthetic benchmarks compared to in-game tests, though. If a company spends all of its time optimizing for specific applications, it will score lower in a synthetic benchmark than it performs in real life. But it isn't fair to apply those optimizations to the benchmarks themselves. This lends more credence to the 'top 5 games' benchmarks that Tom's Hardware or whoever uses.
  • They all cheat (Score:5, Insightful)

    by GF678 ( 1453005 ) on Tuesday October 13, 2009 @01:03AM (#29728679)

    I'm not defending Intel at all, but...

    ATI's done it: http://www.xbitlabs.com/news/video/display/20030526040035.html [xbitlabs.com]

    NVIDIA's done it: http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177 [theinquirer.net]

    They've probably done it several times in the past with other benchmarking software as well.

    They're all dishonest. Don't trust anyone!

  • by Eil ( 82413 ) on Tuesday October 13, 2009 @01:08AM (#29728697) Homepage Journal

    But see also Intel's response on page 2:

    We have engineered intelligence into our 4 series graphics driver such that when a workload saturates graphics engine with pixel and vertex processing, the CPU can assist with DX10 geometry processing to enhance overall performance. 3DMarkVantage is one of those workloads, as are Call of Juarez, Crysis, Lost Planet: Extreme Conditions, and Company of Heroes. We have used similar techniques with DX9 in previous products and drivers. The benefit to users is optimized performance based on best use of the hardware available in the system. Our driver is currently in the certification process with Futuremark and we fully expect it will pass their certification as did our previous DX9 drivers.

    And the rest of page 2 indicates that offloading some of the work to the CPU does, for certain games, improve performance significantly. Offhand, this doesn't necessarily seem like a bad thing: Intel is just trying to make the most of the hardware in the whole machine. One would also do well to bear in mind that the GPU in question is an integrated graphics chipset: it isn't out to compete against a modern gaming video adapter, so Intel has little incentive to pump its numbers in a synthetic benchmark. Nobody buys a motherboard based on the capabilities of the integrated graphics.

    The question that should be asked is: What is the technical reason for the drivers singling out only a handful of games and one benchmark utility instead of performing these optimizations on all 3D scenes that the chipset renders?
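
    Reduced to a sketch, the behaviour the article reports amounts to a name lookup like the one below. This is a reconstruction for illustration, not Intel's actual driver code, and the exact executable filenames are guesses based on the titles Intel listed:

        # Hypothetical reconstruction of filename-keyed offloading.
        OFFLOAD_WHITELIST = {
            "3dmarkvantage.exe",
            "crysis.exe",
            "lostplanet.exe",
        }

        def should_offload(process_name):
            # Keyed purely on the executable's name: rename the binary and
            # the "optimization" silently disappears.
            return process_name.lower() in OFFLOAD_WHITELIST

        print(should_offload("3DMarkVantage.exe"))  # True
        print(should_offload("3DMarkVintage.exe"))  # False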

  • by Anonymous Coward on Tuesday October 13, 2009 @01:16AM (#29728727)

    Here's the thing, though: They took 3DMarkVantage.exe and renamed it to 3DMarkVintage.exe, and much of that offloading was dropped. So this isn't a general-purpose optimization, which would make sense -- it's a targeted optimization, aimed at and enabled specifically for a benchmark, in order to get higher scores in said benchmark.

    A practice which is explicitly forbidden by the guidelines. I know lots of Slashdotters don't read the article, but I am really beginning to wonder what part of that is so hard to understand. Or maybe it is easy to understand, and it's just that people can assert things that clearly didn't happen and you'll find it convincing as long as they do it with confidence.

    I'll (re)summarize the article. Intel quite obviously cheated by trying to artificially inflate a benchmark score, and did so in a way that was not permitted by the guidelines of the benchmark. The motive is quite clear, as such benchmarks often influence buying decisions. There's nothing ambiguous about it according to the story.

    Reading some of the "debates" below, you'd think this were some complex, nuanced issue. It's amusing and kinda pathetic at the same time.

    I don't really know if you can blame this one on the public schools, but you probably can as it seems to be all about the general lack of critical thinking.

  • Re:Why not? (Score:4, Insightful)

    by Grishnakh ( 216268 ) on Tuesday October 13, 2009 @01:16AM (#29728731)

    Exactly. If they want to offload GPU processing to the CPUs, then they should do that for ALL programs, not just certain ones in a list.

  • Did you actually read the article? The driver was shown to drop those optimizations when the benchmark executable was renamed. This isn't about real optimizations as far as the GPU is concerned; it's about inflating results by quietly using the CPU instead.

    Why couldn't you be bothered to do your research before replying?
  • by Grishnakh ( 216268 ) on Tuesday October 13, 2009 @01:26AM (#29728789)

    Is Intel the 500 lb. gorilla in chipsets? Sure, and they got there by 'cheating.' Which is winning.

    To be fair, I'm pretty sure that Intel has made the highest-performing chipsets for Intel processors for quite a long time now, occasionally competing with Nvidia (who recently gave up). The other makers like VIA and SiS never had chipsets that worked as well as Intel's. Of course, this is for chipsets which don't have built-in graphics.

    Intel's entries into the 3D graphics market have never been very good: they're a "better than nothing" solution for low-end and corporate desktops where customers don't want a relatively expensive add-on graphics card but do want to run very basic 3D applications, such as Google Earth. The cost premium for Intel's built-in graphics over a motherboard with a non-graphics chipset is nominal, and far less than a separate Nvidia or ATI graphics card, especially when you multiply that difference across thousands of corporate desktop systems. But why they even bother trying to rig benchmarks like this is beyond me. No one who's serious about graphics performance would use Intel's built-in video.

  • Re:Why not? (Score:3, Insightful)

    by causality ( 777677 ) on Tuesday October 13, 2009 @01:26AM (#29728795)

    It's not special drivers for specific games. It's regular drivers with exceptions coded in to make them appear faster on "standardised" tests that are meant to be an all-purpose benchmark, one that helps consumers identify the sort of card they need and compare competing cards. This is cheating to increase sales among the early-adopter/benchmarker crowd, impress marketing types, and get more units on shelves, and it generally comes at the cost of the consumer.

    No need for a car analogy on this one. So it's like what happens when the public schools teach a generation or two in such a way that they are optimized for performance on standardized tests, and when those students eventually enter the working world, they don't know how to make change without a cash register or other calculator of some sort? The way they don't know how to deconstruct an argument? Let alone understand the importance of things like living within your means?

  • by Anonymous Coward on Tuesday October 13, 2009 @01:37AM (#29728835)

    Sorry, but I remember that all too clearly. That IS cheating. Quake 3 was used as a major benchmark at the time, and ATI didn't optimize for anything else unless there was a flat-out issue with the drivers.

    So the lesson learned is: it's OK to optimize your drivers to make hardware run applications better, but it's flat-out cheating the consumer when that optimization leads them to believe that because your hardware runs that one app better, it'll run just about every other app better too.

    THAT'S WHAT BENCHMARKING GPUs WITH GAMES IS FOR.

  • by zullnero ( 833754 ) on Tuesday October 13, 2009 @01:48AM (#29728873) Homepage
    You'd think you'd have logic in the GPU that could determine when a certain load was being achieved, certain 3D functionality was being called, etc., and offload some work to a multicore CPU if it was hitting a certain performance threshold (as long as the CPU itself wasn't being pounded...but most games are mainly picking on the GPU and hardly taking full advantage of a quad core CPU or whatever). That makes a degree of sense...using your resources more effectively is a good thing. If that improves your performance scores, well...so what? It measures the fact that your drivers are better than the other card's drivers. That seems like fair play, from a consumer's standpoint. If the competitors can't be bothered to write drivers that work efficiently, that's their problem. Great card + bad drivers = bad investment, as far as I'm concerned. That's the real point of these benchmarking tests, anyway. It's just product marketing.

    But trapping a particular binary name to fix the results? That's being dishonest to customers. They're deliberately trying to trick gamers who just look at the 3DMark benchmarks into buying their hardware, but giving them hardware that won't necessarily perform at the expected level of quality. I generally stick up for Intel, having worked there in the past as a contractor and generally liking the company and people...but this is seriously bad form on their behalf. I'm surprised this stuff got through their validation process...I know I'd have probably choked on my coffee laughing if I were on that team and could see this in their driver code.
  • SOP (Score:3, Insightful)

    by OverflowingBitBucket ( 464177 ) on Tuesday October 13, 2009 @01:59AM (#29728913) Homepage Journal

    Hasn't every chipset maker, ever, been busted for fudging benchmark results at some point? Usually multiple times?

    And then they get caught out by the old exe-renaming technique.

    Why do they keep trying it? The mind boggles.

    I would have thought by now that a standard tool in the benchmarker's repertoire would be one that copies each benchmark exe to a different name and location and launches that, followed by a launch under the default name; and that the more popular benchmarks would offer options to tweak the test ordering and methodology slightly to make application profiling difficult. Something like the sketch below would do for the first part.
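
    As a rough illustration in Python; the launch details are placeholders, not any real harness:

        import shutil
        import subprocess
        import tempfile
        import uuid
        from pathlib import Path

        def paranoid_launch(exe_path):
            # Copy the benchmark to a random name in a fresh directory, run the
            # copy, then run the original; a score gap between the two runs
            # flags filename-based detection in the driver.
            workdir = Path(tempfile.mkdtemp())
            decoy = workdir / (uuid.uuid4().hex + ".exe")
            shutil.copy(exe_path, decoy)      # same bits, unrecognizable name
            subprocess.run([str(decoy)])      # profile-proof run
            subprocess.run([exe_path])        # default-name run for comparison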

  • by mac1235 ( 962716 ) on Tuesday October 13, 2009 @02:18AM (#29728949)
    Marketing execs change all the time. Each one says "Hey! I have an idea...." The programmer who is asked to put in the cheat is not wildly enthusiastic about the idea, knows it won't work and does a quick and dirty hack.
  • by edmudama ( 155475 ) on Tuesday October 13, 2009 @02:45AM (#29729065)

    That's not interesting. How do you plan to connect a non-Intel CPU to an Intel chipset with integrated graphics?

  • Re:Hmm... (Score:3, Insightful)

    by Jeppe Salvesen ( 101622 ) on Tuesday October 13, 2009 @04:08AM (#29729429)

    Nobody buys Intel integrated chips because of how they do on 3D mark. Nobody thinks they are any serious kind of performance. Hell, most people are amazed to find out that these days they are good enough that you can, in fact, play some games on them (though not near as well as dedicated hardware). So I can't imagine they are gaining lots of sales out of this. Remember these are chips on the board itself. You either got a board with one or didn't. You don't pick one up later because you liked the numbers.

    That's incorrect, I'm afraid, because the vast majority of buyers are not clued-in. Consequently, they are led to believe that the system is ready for gaming when it may not be. This is the core of the issue: 3DMark is supposed to inform consumers about performance without making them read up on the relative merits of the 15 different chipset families out there. When someone cheats on 3DMark, they make life harder for consumers trying to make informed decisions with minimal effort.

  • by DJRumpy ( 1345787 ) on Tuesday October 13, 2009 @07:37AM (#29730245)

    Both ATI and nVidia have been caught cheating [extremetech.com] (and by cheating I mean specifically targeting the FutureMark benchmarks to make their products look better than they actually are). The above link is only a single instance. A quick google will net you a good sampling over the last decade or two.

    Optimizing a driver for a specific game is not cheating as long as it doesn't affect quality. Optimizing your driver to get inflated scores specifically in a benchmark is cheating.

  • Correct. (Score:3, Insightful)

    by InvisiBill ( 706958 ) on Tuesday October 13, 2009 @08:00AM (#29730351) Homepage

    The behaviour their driver has in the benchmark is also used in several games... ie Crysis Warhead. RTFA.

    The issue is that the driver treats different games differently, based on filename. Some get this boost and some don't. Whether you put 3DMark into the boosted or unboosted category, its results will be indicative of some games and not of others.

  • by poetmatt ( 793785 ) on Tuesday October 13, 2009 @09:03AM (#29730765) Journal

    Maybe you don't care, but lots of sales are linked to good benchmark scores. Where the Intel product does worse, they're making it look as if it's better. A polished turd is still a turd.

    Think of it like Comcast's SpeedBoost: 50Mb/s downstream sounds great, except you only get it for the first 30 seconds, so real downloads see little benefit.

    "Wow, we're at 50Mb/s!" and so on.

  • by bluefoxlucid ( 723572 ) on Tuesday October 13, 2009 @10:28AM (#29731599) Homepage Journal

    In any case application specific optimizations are a great tool. They got an extra 18% speed out of the chip with just application specific tweaks. That's a pretty damn significant increase. Ignoring that would be a terrible decision. All graphics drivers should use this and update the drivers every few months as new games come out.

    The app-specific optimizations actually made Crysis look like shit and ate more CPU power (you need an extra core to play Crysis), and the damn thing was still smashed by an equivalent AMD chip that could play Crysis at twice the frame rate (30fps, rather than an unusable 15fps). The benchmark showed Intel's part as about 30% faster than AMD's offering, when in real-life use AMD's was actually twice as fast as Intel's.
