Intel Caught Cheating In 3DMark Benchmark 216
EconolineCrush writes "3DMark Vantage developer Futuremark has clear guidelines for what sort of driver optimizations are permitted with its graphics benchmark. Intel's current Windows 7 drivers appear to be in direct violation, offloading the graphics workload onto the CPU to artificially inflate scores for the company's integrated graphics chipsets. The Tech Report lays out the evidence, along with Intel's response, and illustrates that 3DMark scores don't necessarily track with game performance, anyway."
Good reporting there Ric (Score:5, Insightful)
Thanks for telling all of us that the best measure of hardware's in-game performance is... to benchmark it with a game.
Re:Eh? (Score:3, Insightful)
While it makes some sense, triggering the behavior using certain filenames is peculiar to say the least.
I suppose that since the 3DMark tests are intended to measure a hardware solution's peak performance, there is some rationale for putting the test executable on a list of "heavy" applications. Yes, the guidelines in which 3DMark explicitly forbids that sort of thing are clear. But in a sense, the "spirit" of those guidelines is that they don't want companies cheating by designing driver features/modes for the test that aren't usable in actual gameplay.
Since these optimizations are (apparently) in use for actual games, it might not be such a heinous violation. Whether the other entries on that list are genuinely there for games, or placed there with sinister intent just to raise the kind of doubt I've voiced in this post, who can say?
Still a pretty daft thing to do, but maybe it is a simple mistake rather than intentional deception.
Re:If you're too lazy to RTFA... (Score:5, Insightful)
It seems entirely reasonable to me for them to optimize the driver to run particular programs faster if at all possible.
Perhaps, but you definitely don't do it for the benchmark. The article quotes the 3DMark Vantage guidelines which are perfectly clear.
With the exception of configuring the correct rendering mode on multi-GPU systems, it is prohibited for the driver to detect the launch of 3DMark Vantage executable and to alter, replace or override any quality parameters or parts of the benchmark workload based on the detection. Optimizations in the driver that utilize empirical data of 3DMark Vantage workloads are prohibited.
So yes, SLI and Crossfire are a different case.
Re:Eh? (Score:5, Insightful)
Which would make sense... (Score:4, Insightful)
That was my first thought, too.
Here's the thing, though: They took 3DMarkVantage.exe and renamed it to 3DMarkVintage.exe, and much of that offloading was dropped. So this isn't a general-purpose optimization, which would make sense -- it's a targeted optimization, aimed at and enabled specifically for a benchmark, in order to get higher scores in said benchmark.
It reminds me of the days when Quake3.exe would give you higher benchmarks, but worse video, than Quack3.exe.
Re:Eh? (Score:5, Insightful)
Well, if the GPU becomes saturated, I could imagine the rest of the load spilling over to the CPU (one or many cores). Obviously the GPU is more efficient at video tasks, but if the video task is priority for the user, why not offload to the CPU as well? Makes sense to me.
If you do that for a benchmark app, then you are not really testing (just) the performance of the graphics hardware, so turning on that optimization without disclosing it is probably not a fair comparison of the hardware. To make it 'fair', you really need to make the benchmark app aware of the feature and able to turn it on or off under software control, or at least report whether it is enabled. I wonder if similar optimizations could be made in any 3D video driver...
In the real world, if the user wants high graphics performance and there are CPU cores doing nothing then like you said, offloading to them makes perfect sense.
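The disclosure idea above could look something like this in practice. The capability-query dict here is entirely hypothetical, since no current driver API is known to expose such a flag; it just shows the shape of "report the setting next to the score":

```python
def report_score(score, driver_caps):
    """Attach the driver's CPU-offload state to the reported score, so
    two runs can't be compared without seeing the setting. `driver_caps`
    stands in for a real driver capability query, which is an assumption,
    not an existing API."""
    offload = driver_caps.get("cpu_vertex_offload", "unknown")
    return f"score={score} (cpu_vertex_offload={offload})"
```

A reviewer comparing two chipsets would then see at a glance whether one score was earned with the CPU quietly doing part of the GPU's job.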
Re:Why not? (Score:4, Insightful)
"they should be encouraged to release hand coded or special drivers to improve performance in specific games."
Games, sure - but it defeats the point of benchmarks by introducing a useless new variable: how well optimized the driver is for that particular benchmark. I mean, why should 3DMarkVintage.exe be 30% slower than 3DMarkVantage.exe? How does that help anyone except Intel?
Mod Parent Up (Score:4, Insightful)
Effectively dividing tasks among CPUs is not the issue here. They want to benchmark the GPU, and they want to make sure you don't enable optimizations targeted specifically at the benchmark, which is exactly what Intel was shamelessly doing.
Please mod this up; it really is that simple.
Re:If you're too lazy to RTFA... (Score:3, Insightful)
It does point out a weakness of synthetic benchmarks relative to in-game tests, though. If a company spends all of its time optimizing for specific applications, then it will score lower in a benchmark than it would in real life. But it isn't fair to apply those optimizations to benchmarks. This lends more credence to the 'top 5 games' benchmarks that Tom's Hardware or whoever uses.
They all cheat (Score:5, Insightful)
I'm not defending Intel at all, but...
ATI's done it: http://www.xbitlabs.com/news/video/display/20030526040035.html [xbitlabs.com]
NVIDIA's done it: http://www.theinquirer.net/inquirer/news/1048824/nvidia-cheats-3dmark-177 [theinquirer.net]
They've probably done it several times in the past with other benchmarking software as well.
They're all dishonest. Don't trust anyone!
Re:If you're too lazy to RTFA... (Score:4, Insightful)
But see also Intel's response on page 2:
And the rest of page 2 indicates that offloading some of the work to the CPU does, for certain games, improve performance significantly. Offhand, this doesn't necessarily seem like a bad thing: Intel is just trying to make the most of the whole machine's hardware. One would also do well to bear in mind that the GPU in question is an integrated graphics chipset: Intel isn't out to compete with a modern gaming video adapter, and thus has little incentive to pump its numbers in a synthetic benchmark. Nobody buys a motherboard based on the capabilities of the integrated graphics.
The question that should be asked is: What is the technical reason for the drivers singling out only a handful of games and one benchmark utility instead of performing these optimizations on all 3D scenes that the chipset renders?
Re:Which would make sense... (Score:3, Insightful)
A practice which is explicitly forbidden by the guidelines. I know lots of Slashdotters don't read the article, but I am really beginning to wonder what part of that is so hard to understand. Or maybe it's easy to understand: people can assert things that clearly didn't happen, and you'll find it convincing as long as they do it with confidence.
I'll (re)summarize the article. Intel quite obviously cheated by trying to artificially inflate a benchmark score, and did so in a way that was not permitted by the guidelines of the benchmark. The motive is quite clear, as such benchmarks often influence buying decisions. There's nothing ambiguous about it according to the story.
Reading some of the "debates" below, you'd think this were some complex, nuanced issue. It's amusing and kinda pathetic at the same time.
I don't really know if you can blame this one on the public schools, but you probably can, since it seems to come down to a general lack of critical thinking.
Re:Why not? (Score:4, Insightful)
Exactly. If they want to offload GPU processing to the CPUs, then they should do that for ALL programs, not just certain ones in a list.
Re:Good reporting there Ric (Score:3, Insightful)
Why couldn't you be bothered to do your research before replying?
Re:White Goodman Would be Proud (Score:3, Insightful)
Is Intel the 500 lb. gorilla in chipsets? Sure, and they got there by 'cheating.' Which is winning.
To be fair, I'm pretty sure that Intel has made the highest-performing chipsets for Intel processors for quite a long time now, occasionally competing with Nvidia (who recently gave up). The other makers like VIA and SiS never had chipsets that worked as well as Intel's. Of course, this is for chipsets which don't have built-in graphics.
Intel's entries into the 3D graphics market have never been very good; they're only a "better than nothing" solution for low-end and corporate desktops whose buyers don't want a relatively expensive add-on graphics card but do want to run very basic 3D applications, such as Google Earth. The cost difference between a motherboard with a non-graphics chipset and one with Intel's built-in graphics is nominal, and much cheaper than a separate Nvidia or ATI graphics card, especially when you multiply that difference by thousands of desktop systems as deployed in corporations. But why they even bother trying to rig benchmarks like this is beyond me. No one who's serious about graphics performance would use Intel's built-in video.
Re:Why not? (Score:3, Insightful)
It's not special drivers for specific games. It's regular drivers with exceptions coded in to make them appear faster on "standardised" tests, which are meant to be an all-purpose benchmark to help consumers identify the sort of card they need (and to compare competing cards). This is cheating to increase sales among the early adopter/benchmarker crowd, impress marketing types and get more units on shelves, and is generally at the cost of the consumer.
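To illustrate what "exceptions coded in" might look like, here is a purely hypothetical sketch of filename-keyed driver behavior. The real driver is native code, and its actual application list and setting names are Intel's, not these:

```python
# Hypothetical app-detection table; entries and setting names are
# invented for illustration, not taken from Intel's driver.
APP_PROFILE = {
    "3dmarkvantage.exe": {"offload_vertex_work_to_cpu": True},
    "crysis.exe":        {"offload_vertex_work_to_cpu": True},
}

def settings_for(process_name):
    """Look up per-application overrides by executable name. Anything
    not on the list gets the default path, which is exactly why
    renaming the binary changes the score."""
    defaults = {"offload_vertex_work_to_cpu": False}
    overrides = APP_PROFILE.get(process_name.lower(), {})
    return {**defaults, **overrides}
```

The point of the sketch: the lookup keys on a name, not on what the workload actually needs, so a byte-identical binary under a different name silently falls back to the slow path.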
No need for a car analogy on this one. It's like what happens when the public schools spend a generation or two teaching students to be optimized for performance on standardized tests, and those students enter the working world unable to make change without a cash register or some other calculator, let alone deconstruct an argument or grasp the importance of living within their means.
Re:Wonder if AMD plays fair? (Score:1, Insightful)
Sorry, but I remember that all too clearly. That IS cheating. Quake 3 was used as a major benchmark at the time, and ATI didn't optimize for anything else unless there was an outright issue with the drivers.
So, the lesson learned is: it's OK to optimize your drivers to make hardware run an application better, but it's flat-out cheating the consumer when that optimization leads them to believe that because your hardware runs that app better, it'll run just about every other app better too.
THAT'S WHAT BENCHMARKING GPUs WITH GAMES IS FOR.
If you were improving the GPU for gaming... (Score:4, Insightful)
But trapping a particular binary name to rig the results? That's being dishonest with customers. They're deliberately trying to trick gamers who only look at the 3DMark numbers into buying their hardware, which then won't necessarily perform at the expected level of quality. I generally stick up for Intel, having worked there in the past as a contractor and generally liking the company and its people... but this is seriously bad form on their part. I'm surprised this stuff got through their validation process... I know I'd probably have choked on my coffee laughing if I were on that team and saw this in the driver code.
SOP (Score:3, Insightful)
Hasn't every chipset maker, ever, been busted for fudging benchmark results at some point? Usually multiple times?
And then they get caught out by the old exe-renaming technique.
Why do they keep trying it? The mind boggles.
I would have thought that by now a standard tool in the benchmarker's repertoire would be one that copies each benchmark exe to a different name and location and launches that, followed by a launch under the default name; and that the more popular benchmarks would have options to tweak the test ordering and methodology slightly, to make application profiling difficult.
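That anti-profiling harness is straightforward to sketch. Everything here (the random naming, the temp-directory handling, the `runner` callback standing in for however a real harness invokes one subtest) is illustrative rather than taken from any actual benchmark suite:

```python
import os
import random
import shutil
import string
import tempfile

def anti_detection_run(benchmark_exe, subtests, runner):
    """Run a benchmark in a way that frustrates driver app-profiling:
    launch a randomly named copy from a fresh directory, with the
    subtests in shuffled order. `runner(exe_path, subtest)` is a
    hypothetical callback that executes one subtest and returns its
    result."""
    workdir = tempfile.mkdtemp()
    alias = "".join(random.choices(string.ascii_lowercase, k=12)) + ".exe"
    copy_path = os.path.join(workdir, alias)
    shutil.copy2(benchmark_exe, copy_path)
    order = subtests[:]
    random.shuffle(order)           # vary ordering between runs
    return {name: runner(copy_path, name) for name in order}
```

Comparing these results against a run under the default name and ordering would expose exactly the kind of filename-triggered behavior the article describes.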
No organizational memory. (Score:2, Insightful)
Re:Do the optimizations work for anything else? (Score:5, Insightful)
That's not interesting. How do you plan to connect a non-Intel CPU to an Intel chipset with integrated graphics?
Re:Hmm... (Score:3, Insightful)
That's incorrect, I'm afraid, because the vast majority of buyers are not clued-in. Consequently, they are led to believe that the system is ready for gaming when it may not be. This is the core of the issue: 3DMark is supposed to inform consumers about performance without their having to read up on the relative merits of the 15 different chipset families out there. When someone cheats on 3DMark, they make life harder for consumers trying to make informed decisions with a minimal amount of effort.
They are all guilty of cheating at some point (Score:3, Insightful)
Both ATI and nVidia have been caught cheating [extremetech.com] (and by cheating I mean specifically targeting the FutureMark benchmarks to make their products look better than they actually are). The above link is only a single instance. A quick google will net you a good sampling over the last decade or two.
Optimizing a driver for a specific game is not cheating as long as it doesn't affect quality. Optimizing your driver to get inflated scores specifically in a benchmark is cheating.
Correct. (Score:3, Insightful)
The behaviour their driver exhibits in the benchmark is also used in several games, e.g. Crysis Warhead. RTFA.
The issue is that the driver treats different games differently, based on filename. Some get this boost and some don't. Whether you put 3DMark into the boosted or unboosted category, its results will be indicative of some games and not of others.
Re:Wonder if AMD plays fair? (Score:3, Insightful)
Maybe you don't care, but lots of sales are linked to good benchmark scores. Where the Intel product does worse, they're making it look as if it's better. A polished turd is still a turd.
Think of this like Comcast's SpeedBoost. It sounds great to be at 50MB/s downstream, except you only get to do it for 30 seconds, so real downloads see no benefit.
"Wow, we're at 50MB/s!" and so on.
Re:Wonder if AMD plays fair? (Score:4, Insightful)
In any case, application-specific optimizations are a great tool. They got an extra 18% speed out of the chip with application-specific tweaks alone. That's a pretty damn significant increase, and ignoring it would be a terrible decision. All graphics drivers should do this and ship updated drivers every few months as new games come out.
The app-specific optimizations actually made Crysis look like shit and ate more CPU power (you need an extra core to play Crysis), and the damn thing was still smashed by an equivalent AMD chip that could play Crysis at twice the frame rate (30fps, rather than an unusable 15fps). The benchmark showed Intel's part as about 30% faster than AMD's, when in real-life use AMD's was actually twice as fast.