More on Futuremark and nVidia
AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. "Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat."" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
Yeah, right (Score:3, Insightful)
Futuremark shoots self in foot. (Score:5, Insightful)
And now Futuremark has totally invalidated their own benchmark software by declaring it "open season" for hardware manufacturers to distort the "tests" in any way, shape, or form they desire to make the numbers higher.
N.
Futuremark scared? (Score:3, Insightful)
Cheating is relative. (Score:1, Insightful)
Re:riiiiight... (Score:5, Insightful)
That is the way it sounds, isn't it?
"Application-specific optimization". . . In other words, "We're not cheating, we're just adding code to our driver to make sure our card works really well with benchmarking software." Of course, if it works better with benchmarking software than it does with real-world applications, that is cheating, isn't it?
It actually reminded me of the old line, "That's not a bug, it's a feature!"
Stack Creep (Score:2, Insightful)
WE DON'T CARE. Just use games for benchmarks! (Score:5, Insightful)
Short on details, but raises questions... (Score:4, Insightful)
So the question in my mind is: did Futuremark learn something from the discussions? Is there something it was ignoring in its tests?
I'm trying not to be a cynic and assume a big fat envelope was passed under the table; I'd rather believe that what Nvidia did was legitimate.
Big quality loss (Score:4, Insightful)
"This card is optimized for quake as long as you follow the left trail, the right trail will just look like crap but nobody follows it anyway".
What's the point of a benchmark? (Score:5, Insightful)
I call this bullshit.... (Score:5, Insightful)
The benchmark is meant to reflect performance in the actual game; the reason it takes the same path is merely to make the results comparable. What ATI did was something the game *could* have achieved in-game, if the operations were properly sequenced. What Nvidia did was fake performance it can't actually deliver if a person followed the exact same path in the game. That is cheating.
It is pathetic by Nvidia, and it's pathetic by Futuremark to present this press statement. Get some backbone and integrity.
Kjella
So much for FutureMark (Score:5, Insightful)
If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.
Re:It is NOT a cheat. (Score:5, Insightful)
More tellingly, the driver deliberately flouts the D3D spec by omitting buffer clears, mucking about with clip planes, etc.
As others have said, Futuremark's statement is just covering their legal arse. If someone modifies their code to get better scores in some benchmarks while introducing deliberate bugs (i.e. incorrect rendering), it's a cheat in my book.
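A minimal sketch of what that kind of "optimization" amounts to, assuming a driver that matches on the executable name. Everything here is hypothetical, including the detection string; this is not NVIDIA's actual code:

```cpp
#include <cstring>

// Stand-in for the hardware performing the work the application asked for.
void HardwareClear() { /* issue the real clear command */ }

// The detection the commenters describe: match on the process name.
bool IsKnownBenchmark(const char* exeName) {
    return std::strcmp(exeName, "3DMark03.exe") == 0;  // hypothetical string
}

// The D3D spec says: if the app requests a clear, the driver performs it.
void DriverClear(const char* exeName, bool appRequestedClear) {
    if (appRequestedClear && !IsKnownBenchmark(exeName)) {
        HardwareClear();
    }
    // In the benchmark, the clear is silently skipped. That is only
    // "safe" because the fixed camera path guarantees every pixel is
    // overdrawn, which is exactly the knowledge a real game never has.
}
```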
Should benchmarks allow optimizations? (Score:2, Insightful)
There's a line from the story:
"...However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source. Futuremark will consider whether this approach is needed in its future benchmarks."
I'm concerned because I feel that allowing video card manufacturers to put code specifically targeting certain benchmarks into their product (making their product look better in that benchmark) may not be reflective of real-world performance.
However, the benchmark is useless if it doesn't measure real-world performance, so I do believe that NVidia could put stuff in their product that makes the benchmark run faster and would also benefit real applications, so I'm torn.
Some game manufacturers make optimized versions of their code to work with certain video cards, but the normal case is going through the standard driver path (DirectX...), and I believe using the generic path is more representative of what you'll get when you use the video card.
It seems that NVidia is arguing that they should be allowed to put optimizations into their code specifically for the benchmark because they do the same thing with some other popular applications.
Re:It is NOT a cheat. (Score:2, Insightful)
And you say these are not cheats as long as nvidia uses them in games? Are you completely lobotomized?
The numbers... (Score:5, Insightful)
But I found a really nice German benchmark site, 3dcenter.org [3dcenter.org], with some of the best benchmarks I've ever seen; they actually run the games on each card and list the fps.
Looks like the FX 5200 and the GF4 4200 are about the same, as are the 4600 and the 5600 (non-Ultra). And the ATI 9700 Pro/9800 are faster than the 5800 Ultra.
After reading these benchmarks, you can really tell nvidia tweaked the SHIT out of its drivers for futuremark...
Re:Cheating is relative. (Score:3, Insightful)
There are a lot of "optimizations" you can use if you know in advance what you'll need to draw. You don't even need a 3D engine; you could just pre-render everything and put an MPEG into the driver. I bet that'd be very fast.
It wouldn't make the actual game run any faster, though.
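As a sketch of that reductio (names are illustrative, not from any real driver): once the input sequence is fixed, "rendering" degenerates into playback, which says nothing about the general case.

```cpp
#include <cstdint>
#include <vector>

// The honest path: actually compute the frame.
std::vector<uint8_t> RenderFrame(int frame) {
    return std::vector<uint8_t>(640 * 480 * 3);  // pretend this is real work
}

// The "MPEG in the driver": look up a canned frame.
std::vector<uint8_t> PrecomputedFrame(int frame) {
    return std::vector<uint8_t>(640 * 480 * 3);  // pretend this was pre-rendered
}

std::vector<uint8_t> GetFrame(int frame, bool inputIsFixed) {
    return inputIsFixed ? PrecomputedFrame(frame)  // fast, and meaningless
                        : RenderFrame(frame);      // what a game actually needs
}
```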
Re:Great! (Score:3, Insightful)
Having such a logo labeling program might be a revenue opportunity for FutureMark.
Another revenue center for FutureMark might be to sell benchmark result coefficients. Each card's results are biased by some coefficient, whose value can be purchased according to a tiered pricing model. This ensures uniformly fair bias according to what each video card vendor is willing to spend.
On the video card side of the fence, couldn't the ROM in a video card put up a boot-time splash screen that sports advertising? Such an ad would be flashed into the card by the card's drivers once the OS is up and connected to the net. The video card driver could also take measures to be sure you see the boot-time ads sufficiently frequently.
Re:riiiiight... (Score:2, Insightful)
Alas, not legit...... (Score:4, Insightful)
Re:GREAT With Me (Score:5, Insightful)
Argh (Score:4, Insightful)
The point of a benchmark is to test dissimilar systems against common references to get an idea of how they perform against each other in such a way that you have an apples to apples comparison.
If 3DMark writes their program in a way that allows optimization paths for a specific GPU, then it is no longer a benchmark.
You no longer have an idea of how fast the card REALLY runs, as there is no guarantee that game writers will use GPU-specific optimizations. It's the same thing as MMX... nobody sees the benefits if it's not hardcoded into the program, so what's the point of being uberfast in a benchmark if you won't necessarily see the same results in the real world?
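A quick sketch of the MMX point, with a stubbed-out feature check (CpuHasMmx() is a stand-in for a real CPUID probe, not any particular library): the fast path only exists where a developer explicitly wrote one.

```cpp
#include <cstddef>
#include <cstdint>

bool CpuHasMmx() { return false; }  // stub: imagine a CPUID probe here

void AddSaturate(uint8_t* dst, const uint8_t* src, size_t n) {
    if (CpuHasMmx()) {
        // The vendor-specific path only helps if someone hardcoded it
        // into the program, which is the commenter's point exactly.
        // ... 8-bytes-at-a-time MMX code would go here (and return) ...
    }
    // The generic path every program gets when nobody targets the feature.
    for (size_t i = 0; i < n; ++i) {
        unsigned s = unsigned(dst[i]) + unsigned(src[i]);
        dst[i] = s > 255 ? 255 : uint8_t(s);
    }
}
```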
Re:riiiiight... (Score:3, Insightful)
I guess you just need to buy another card at that point.
In the spirit of George Bush (Score:5, Insightful)
That seems a bit more appropriate to the story, doesn't it?
Re:I call this bullshit.... (Score:1, Insightful)
I made this observation the last time (Score:5, Insightful)
But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines often get optimized for directly, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.
That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a good bet that if two cards perform similarly and acceptably, the two cards should be able to run almost any DX8/DX9 game off the shelf acceptably.
The fact that Nvidia's unoptimized drivers perform significantly behind ATI's unoptimized drivers in 3dmark 2003 raises a significant question:
We all know how well the 5900 does in Quake III, Serious Sam 2, UT2003, etc, but how does it do in *insert random DX8 game here*?
I want to know that if I take *insert random DX8 game here* home to play, IT WILL PERFORM WELL. That is the entire point of having a benchmark like 3dmark. To do application-specific optimizations for it is to nullify the entire point of the benchmark.
Re:NVIDIA convinced them to change the rules (Score:4, Insightful)
It's more like saying "What's 7+7+7+x+7+7?" For the benchmark program, x happens to be 7, leading to 7*6, but the general case is actually 7*5+x. No attempt is made to check an arbitrary game to see if x is 7. It only applies the optimization if the executable is a particular benchmark.
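A worked version of that analogy (function names are illustrative): the shortcut is only correct because the benchmark always supplies the same input.

```cpp
#include <cassert>

// The general case: the answer genuinely depends on x.
int SumGeneral(int x) {
    return 7 + 7 + 7 + x + 7 + 7;  // 7*5 + x, correct for any x
}

// The benchmark-only path: hardcodes the known input.
int SumBenchmarkPath() {
    return 7 * 6;  // 42, right only because the benchmark fixes x == 7
}

int main() {
    assert(SumGeneral(7) == SumBenchmarkPath());  // matches on the benchmark's input
    assert(SumGeneral(3) != SumBenchmarkPath());  // diverges for any other "game"
    return 0;
}
```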
Re:This is not a cheat--its a good thing. (Score:3, Insightful)
Re:NVIDIA convinced them to change the rules (Score:3, Insightful)
That wouldn't help. The optimization is applied only to the benchmark program. In this case, x represents the direction the camera is facing at a particular time. In a game, this is unpredictable and non-optimizable. In the demo, it was set. The optimization does not translate to any gains in any game, no matter what the game developers do.
optimization vs. cheat (Score:2, Insightful)
Secondly, and this is a new train of thought for me: if nVidia had made the benchmark run faster without sacrificing image quality, I think it should be allowed to detect that the benchmark was running and have a code path optimized in the driver for it. This could be used to show exactly how fast the hardware is capable of running when optimized to the hilt by the driver developers. It could actually have the benefit of showing game developers how they should code their software: that sequencing instructions in this particular order, or using certain architecture-specific instructions, is THAT much faster. Sort of like how 3DNow-enhanced Quake showed off how much faster the K6 could be. Unfortunately, this is not what they did; they were caught red-handed, and now they're just throwing their weight around. SHAME ON nVidia.
nVidia and ATI different? (Score:2, Insightful)
So I say that both nVidia and ATI should be ashamed of themselves for such unethical practices.
Re:Bullshit (Score:2, Insightful)
I disagree.
The type of customer that I'm thinking of is the one who walks into the store and says "I want SuperDuper XXX with a 32bit frigmataz and the blue sticker on the box." In other words, a kid who has no idea of what the terminology actually means, but wants what was recommended to him by his buddy George who goes to school in the next town and has a super-cool copy of some hacker tool that, well, we don't know what it does but it has a cool name, and his sister is kind of cute too, you know, but of course we can't admit that.
George says, "Look at this. This has a bigger number than that one!" and Little Johnny plonks down his cash.
Re:GREAT With Me (Score:3, Insightful)
3DMark03 is designed as an un-optimized DirectX test and it provides performance comparisons accordingly.
ATI/Nvidia optimized shaders, not cheating.
Nvidia optimized for rendering in a benchmark, cheating.
You can optimize, as long as you don't know beforehand what is going to be used in the benchmark.
Re:NVIDIA convinced them to change the rules (Score:5, Insightful)
Actually, WE care. (Score:3, Insightful)
It's no use benchmarking on the latest and greatest games, because, as developers, we try to avoid releasing games that run horribly (slowly or with obvious bugs) on certain cards. Sometimes we can persuade the manufacturers to fix their bugs, but the timescales can be tight and sometimes they're quite happy not to fix the bugs, especially if they know their competition runs the codepath you're looking at faster than them, and we get forced to drop the feature. So instead of games pushing up hardware quality, games are held back by shoddy hardware or (more usually) drivers. You're just benchmarking on what the manufacturers already know works. Zzzz. Futuremark's job is to stress-test in advance what's coming up.
And even if Futuremark does things that aren't always what you'd do in games, they are trying to push the cards to the limits to see if they do what the manufacturers claim, or whether they only achieve their claimed performance "in controlled tests".
So some of us talk to the Futuremark guys and say things like, "We're looking into using [technology X] in our next game, but the drivers on cards [A and B] are screwed, works on [C] though. Could you put a section into 3DMark 2004 that uses [X]?". Then, when their card performs miserably at [X] (even though the card's hardware can handle it -- it's just that they've been slack on the drivers) they get shamed into improving their quality at those features. A bit like WHQL [microsoft.com] for games.
Once the driver bugs for the features are fixed, we can write code that uses them.
Except NVidia decided to stop playing nice when it turned out the latest tests make their cards look quite poor, and noticeably slower than ATI's. So they took their ball and went home (dropped out of Futuremark's beta programme), which is why they didn't know their cheating would be discovered.
And of course, this problem is compounded by people like yourself, Mr Sonic, who see big numbers in Quake benchmarks (you do realise 3D card mfgs "optimise" those too, right?), pop wood, and rush off to buy the latest hovercraft [pcworld.com] no matter if it's not really "all that".
Incidentally, ATI's optimisations were exactly that: optimisations. Essentially, they were reordering instructions in a shader -- exactly like a compiler optimising instruction order for Intel or AMD processors' particular quirks. The meaning and, more importantly, on-screen output of the code was not altered.
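A toy illustration of that kind of reordering, with ordinary C++ standing in for shader code (the scheduling benefit is the claim being illustrated, not measured): independent operations get interleaved between dependent ones so a pipelined unit stalls less, and the output is bit-for-bit identical.

```cpp
// Original schedule: t2 depends on t1, so a pipelined ALU may stall.
float ShadeOriginal(float a, float b, float c, float d) {
    float t1 = a * b;
    float t2 = t1 + c;   // waits on t1
    float t3 = c * d;
    return t2 + t3;
}

// Reordered schedule: the independent multiply is hoisted between the
// dependent pair. Same operations, same operands, same result.
float ShadeReordered(float a, float b, float c, float d) {
    float t1 = a * b;
    float t3 = c * d;    // independent work fills the stall slot
    float t2 = t1 + c;
    return t2 + t3;
}
```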
Whereas it's clear from the screenshots in the original exposé article that NVidia were not optimising, but actually not running code, causing the on-screen output to look wrong. As developers, we don't want gamers returning our games to the shops "because it goes wrong when you do X". Nor do we want to sweat blood trying to invent ways to avoid their driver bugs.
Oh... someone else who cares about 3DMarks? OEMs. When it comes around to picking what cards to put inside big-name off-the-shelf PCs or, eventually, which chips to surface-mount on the all-in-one motherboard, they're looking for price-performance, and 3DMark is a part of that equation.