More on Futuremark and nVidia
AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. "Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat."" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
Fine With Me (Score:4, Interesting)
It's just another piece of information to keep in mind when selecting a new card.
It is NOT a cheat. (Score:0, Interesting)
I'm not going to complain if they can optimize the drivers to get 10% better performance in Battlefield 1942 with no degradation in quality, would you?
The only way I would call this a cheat would be if Nvidia never optimized for any other applications.
Re:Cheat? (Score:1, Interesting)
So in effect, the optimizations make 3DMark a *better* predictor of in-game performance than it otherwise would have been.
Bullshit (Score:5, Interesting)
There was no need for this nicey-nice statement other than NVidia threatening lawsuits and Futuremark wanting to protect what assets they have.
Futuremark had every right to call NVidia on their selfish claim and unbelievable hacks. To say that they weren't liable for their own blunder is to say that Futuremark's reputation has been replaced by corporatespeak and a lack of respect almost unparalleled.
What's worse is that I really thought "Yeah, this time the bad guy gets his due" and that NVidia should've known better.
But of course, a few weeks later we've got to put on the nice face again for the public at large.
What a complete waste of time. I know there isn't much respect left in corporate America, but hell, if you can't call a spade a spade, why even bother with the benchmarks when someone can just rewrite an ENTIRE SHADER and only keep the picture clear while the demo is on rails?
Re:Sure sounds fair to me (Score:0, Interesting)
Quality is determined by REAL use. (Score:3, Interesting)
Re:Cheat? (Score:5, Interesting)
Specifically designing your product to work better in a test than in real life should be considered cheating.
This could be avoided if Futuremark released different methods of testing the video cards each year... or if one could download updates for 3DMark2003 that would block any driver-specific optimizations.
I usually look at the latest and greatest fps benchmark for the latest and greatest game anyway.
Well, actually... my current Nvidia video card laughs at my little CPU anyway. Until I can find some more CPU to drive my screaming video card... I am not going to see any performance increase.
Davak
Looking closer... (Score:2, Interesting)
NVIDIA convinced them to change the rules (Score:5, Interesting)
However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the source code. Futuremark will consider whether this approach is needed in its future benchmarks.
I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX. It's also like telling your math teacher, "no, it would be easier for my equations if you made 1+1=3. Now do it because I'm your star student."
Re:Fine With Me (Score:1, Interesting)
Granted, you shouldn't draw conclusions from a single benchmark, but being led to use false information to base choices on is WRONG.
goons...hired goons (Score:3, Interesting)
Nvidia: Knock knock
Futuremark: "Who's there?"
Nvidia: "Goons...hired goons."
Futuremark: "Oh...haha...um...Nvidia is actually in the business of application optimisation! Our mistake. Won't happen again."
Seriously folks, this is Nvidia using big bad lawyers to scare Futuremark into capitulating. They might have held their ground, until ATI was proven to be doing the same thing, albeit to a much lesser degree.
Unfortunately, the only person who loses in this scenario is the consumer.
Re:Futuremark shoots self in foot. (Score:3, Interesting)
Of course, each hack would or would not work with any particular game, but trial and error can be used to detect the "best set of hacks" for any particular game on any particular card. And we all know how geeks love tweaking things to the metal; just look at Gentoo's current popularity.
Then FutureMark would get themselves a name as *the* benchmark software to run on end users' machines to test the hacks.
Or maybe I'm just dreaming.
Highly illogical posters (Score:5, Interesting)
When you're looking for a video card, you -should- rely on a capable and untainted/unoptimized benchmark for comparison, simply because you can't predict what the software companies that make the actual games are going to do. Will they support -your- chosen card, or will some other GPU maker offer a better bribe to the developer? You may know that kind of info about games shipping RSN, or already on the shelves, but what about next year's?
Getting the card based simply on one or two games instead of looking at some kind of objective benchmark does no good whatsoever. It's just a way to rope yourself into upgrading the card faster.
Re:GREAT With Me (Score:0, Interesting)
From what I could glean, Nvidia changed their software in such a way that visuals outside of walls, where you're not supposed to be anyway, were not rendered correctly. I don't see how this is a cheat if it even marginally improves game rendering. Mind you, it may have made an even bigger impact on 3DMark, or what have you, than on the game. But that only means that 3DMark must adjust to the card, and that their software is not able to give a good diagnostic. Thus, they've since talked to Nvidia, and will adjust for the driver changes.
What people aren't realizing is that the first games, 2D games with scrolling screens, did the same thing. The way they could make those games actually perform is that they only rendered a few blocks outside of what is actually shown on the screen (a rough sketch of the idea is below). Rendering the entire map in Duke Nukem 1, 2, 3 (the 2-D versions) would have been suicide on an 80386. My god people, it's engineering: doing the bare minimum to get the job done faster and cheaper.
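For the curious, here is a minimal sketch of that old trick (in Python for readability; the margin, screen size, and tile size are all made up for illustration). You compute which map tiles overlap the visible screen plus a small margin, and draw only those:

    MARGIN = 2  # extra tiles rendered beyond each screen edge

    def visible_tiles(cam_x, cam_y, screen_w, screen_h, tile_size):
        """Yield (tx, ty) coordinates of the tiles worth drawing this frame."""
        first_x = cam_x // tile_size - MARGIN
        first_y = cam_y // tile_size - MARGIN
        last_x = (cam_x + screen_w) // tile_size + MARGIN
        last_y = (cam_y + screen_h) // tile_size + MARGIN
        for ty in range(first_y, last_y + 1):
            for tx in range(first_x, last_x + 1):
                yield (tx, ty)

    # Example: a 320x200 screen with 16-pixel tiles and the camera at (100, 50)
    # touches a 25x17 tile window, a tiny fraction of a whole level map.
    print(len(list(visible_tiles(100, 50, 320, 200, 16))))  # 425

Everything off-screen simply never gets drawn, which is exactly the "bare minimum" engineering being described.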
Re:Cheat? (Score:1, Interesting)
Benchmarks and games (Score:2, Interesting)
When it comes to games, if they want to tweak for a specific game, I'm all for it. But if you want to tweak a benchmark, that's rather unfair.
But then I think ATI does much of the same thing; they just haven't been caught.
Re:Sure sounds fair to me (Score:2, Interesting)
You bet.
Wank?
On what grounds? I think there's more than enough evidence in the world to conclude that BG is a POS, whether you say it in a funny way or not. So get off the "Being anti-MS is so old" routine. It will be old once the people who run the company stop being the richest people in the world and get more than a slap on the wrist in court.
Having said that, I think the parallels between nVidia/FutureMark and MS/DOJ are pretty straightforward.
Re:riiiiight... (Score:1, Interesting)
He really didn't say it, though. It's a misattributed quote, just like "there's a sucker born every minute."
Re:NVIDIA convinced them to change the rules (Score:4, Interesting)
Futuremark should (Score:3, Interesting)
It's sort of akin to walking around with a backpack full of cinderblocks. That way, when you put down those cinderblocks (i.e., the benchmark), you'll notice how much stronger you got.
Perhaps they should use .NET for their next benchmark. Or Java. That'll be the true test of a video card :-)
Re:I call this bullshit.... (Score:3, Interesting)
What if they optimized not just for Q3, but for Q3 Level 1 while you're using player model X on a sunny day with a BFG and 13 bots? Would you cry foul? Are they cheating to make Q3 run faster? Isn't that their job? Where is the line? And, if 90% of the best selling games today run at that same "optimized" speed, how good of a benchmark can it be if it isn't optimized? Isn't it then showing results that are artificially low?
If you want to use the video card to play optimized games, why would you care about the results of an unoptimized test?
Re:Cheat? (Score:2, Interesting)
Re:riiiiight... (Score:1, Interesting)
Nor do they have any way of knowing the code for future versions of 3DMark, which would make it harder to implement "cheating" benchmark optimizations. The same is true for next-gen games. Isn't that why they constantly release new drivers, to add optimizations to make the latest & greatest games work & look better on your existing video card? I think so.
What they're really saying... (Score:4, Interesting)
They are saying that the boards have become different enough that game writers are coding differently for them. Not too surprising, really. That's the way it's always been.
This makes writing a synthetic benchmark extraordinarily difficult, needless to say. I don't know if it's even possible in this case. Perhaps rather than try to come up with one number that specifies how fast a board is, you can come up with a series of metrics for each capability.
While I'm sure that FutureMark has had some pressure applied to them to make this statement, it's not an unreasonable statement on its face. It's just the path they took to get there that is questionable.
Bogus (Score:4, Interesting)
Sorry, but that's garbage, pure and simple. Or are you not aware that PS 1.4 support is _required_ for DX9 cards with PS2.0 support? Your complaint may be valid when comparing a GF4 against a Radeon 8500, but is totally bogus when comparing two DX9 cards.
"And their "DX9" onyl test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky."
Gee, one minute you're complaining that they use PS1.4 instructions, and now you're complaining that they don't use PS2.0 instructions. PS1.4 instructions _are_ effectively DX9 instructions since other than ATI, no other DX8 chips use them: you need a DX9 chip to run PS1.4 shaders.
And it would appear to be real lucky for nvidia that they don't use many PS2.0 instructions: the results of their shader test, once the nvidia "optimization" of throwing away the shader and running a completely different one was fixed, show them running PS2.0 shaders at about half the speed of a Radeon 9800. The low performance of PS2.0 shaders on the FX card seems to be the reason why nvidia hated 3DMark03 so much; there was no way to get a good score without redesigning the chip or "optimizing".
Optimizations, Cheats, and Objectivity. (Score:5, Interesting)
First off, for those of you wondering what the big deal with 3DMark 2003 is, and why you might use it in place of "real games" to benchmark 3D performance, here you go:
3DMark is a test application to benchmark next-generation performance, so that you can get an idea how your video card might handle games that will be out this time _next_ year. Specifically, some aspects of 3DMark are geared toward testing DX9 functionality and its pixel and vertex shaders. No game currently on the market uses these features (at least not that I am aware of).
Secondly, the difference between a cheat and an optimization is a fine one. If a given function continually produces the same output for the same inputs and takes 1 second to do so, and another function can produce the same results given the same inputs but only takes 1/2 a second, it can be said to be functionally equivalent. However, it has been optimized. It's entirely possible, even desirable, to replace pixel shaders and vertex shaders with routines which are optimized for your hardware. In much the same way that compilers schedule instructions optimally for the underlying CPU architecture, so too can instructions be re-ordered in a pixel shader routine... It's an optimization.
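To make that distinction concrete, here is a trivial sketch in Python (the functions are invented purely as an analogy, not anything from a real driver): both routines return identical output for every input, one just gets there faster, which is the sense in which replacing a shader with an equivalent routine is legitimate.

    def sum_naive(n):
        """Sum 1..n with a loop: O(n) additions."""
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    def sum_optimized(n):
        """Sum 1..n with the closed form n*(n+1)/2: O(1)."""
        return n * (n + 1) // 2

    # Same inputs, same outputs: an optimization, not a cheat.
    assert all(sum_naive(n) == sum_optimized(n) for n in range(1000))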
Cheating occurs when people start making approximations (analogies to bringing a cheat-sheet to a test are not valid), or when they fail to render (in the case of video cards) the same visual fidelity and detail that was intended. By example:
A> Reducing texture bit-depth.
B> Reducing geometry detail (merging 2 or more polygons).
This is only cheating if it's not the intent of the original application developer (not driver developer).
A driver developer could make the following optimizations, since they don't affect the intent of the application developer:
A> Pre-calc tables. A classic demo optimization would be to precalc a SIN function table to some level of precision, as looking up a value was faster than calculating it on the fly (see the sketch after this list).
B> Replacing various pixel/vertex shader routines with functionally equivalent, but faster ones.
C> Reordering data and textures (keeping detail and fidelity) into more optimal chunks for your hardware architecture.
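As a concrete illustration of optimization A, here is the precalc trick sketched in Python (the table size and names are arbitrary): pay the cost of computing the SIN values once up front, and every later call becomes a cheap table lookup.

    import math

    TABLE_SIZE = 4096  # precision vs. memory trade-off, chosen arbitrarily here
    SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

    def fast_sin(angle):
        """Approximate sin(angle) by nearest-entry table lookup."""
        index = int(angle / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
        return SIN_TABLE[index]

The result is only as precise as the table is large, which is why this counts as an optimization only while the lost precision stays below what anyone can perceive.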
Those aren't cheats - they are optimizations. Of course, the only way you can tell this is if you have an objective standard to gauge against. 3DMark 2003 doesn't seem to provide this. In order to do so they would need the following:
A> A software renderer for their demo.
B> Timed snapshots of the demo saving uncompressed images from the software renderer to disk.
C> The ability to re-run the demo using a hardware renderer (3D Card and drivers).
D> The ability to take the same snapshots and save them, uncompressed to disk.
E> The ability to do a histogram, per-pixel comparison to the software renders...
This would enable you to arrive at some objective comparison of visual fidelity, instead of the occasionally subjective "I think screenshot X looks better than screenshot Y." Without the intent of the 3DMark developers being known, we really can't know how true the hardware vendors and their drivers are to the original vision.
Anything less than 3% difference is highly likely to be indistinguishable from the intent of the developers in this case. 5% to 10% may be visible, but acceptable (i.e. tweaks for speed in place of quality). Over 10% and you're playing with fire.
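Here is one crude way such a comparison could be scored, sketched in Python (pure Python over raw RGB byte buffers to stay self-contained; a real tool would load the uncompressed snapshots from disk, and all names here are invented):

    def percent_difference(reference, candidate):
        """Mean absolute per-channel difference as a percentage of full scale."""
        if len(reference) != len(candidate):
            raise ValueError("snapshots must have identical dimensions")
        total_error = sum(abs(r - c) for r, c in zip(reference, candidate))
        return 100.0 * total_error / (255 * len(reference))

    def judge(reference, candidate):
        """Apply the 3% / 10% rule of thumb suggested above."""
        diff = percent_difference(reference, candidate)
        if diff < 3.0:
            return "indistinguishable (%.2f%%)" % diff
        if diff <= 10.0:
            return "visible but acceptable (%.2f%%)" % diff
        return "playing with fire (%.2f%%)" % diff

A real fidelity metric would be smarter about perception (histograms, per-region weighting), but even something this simple would turn "I think screenshot X looks better" into a number.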
Re:But we need FutureMark (Score:2, Interesting)
Well, how can you use 3DMark to get 'real world' performance out of a video card? Answer: you can't because it is a synthetic benchmark designed to test out features of cards that also have just come out.
It is not testing real world performance, just some synthetic made-up graphics demo. Shouldn't it also try to emulate an application that is optimized as much as possible, so as to get the highest possible performance instead of the lowest score? It's not real world, so the highest possible score should be what Futuremark should be aiming for. This means optimizing their programs for every card: getting down to the metal and making sure that everything runs as smoothly as possible on every card they want to test on, so as to get the best possible performance.
I can already tell you what the worst possible performance of any future video card on any possible software will be: less than one frame per second. This we already know. What we should be using programs like 3DMark to find out is how fast a computer with a card CAN run, because 3DMark software is about the possibilities that a card has, not its REAL WORLD performance.
So I say that 3DMark should let nVidia, Ati and etc create their drivers with special code paths. As long as it doesn't impact performance with normal programs, why not? 3DMark and other synthetic benchmarks should welcome the 'cheating' as it only validates the fact that people actually care about their benchmark. Then we might actually know how fast our cards can go given the right conditions.
MOD PARENT DOWN (Score:3, Interesting)
All is fair in love and war (Score:2, Interesting)
For the record, I have one personal experience with this from ATI as well. My roommate at the time had a Gateway with a GF2 that went bad, and they sent him an early ATI Radeon to replace it. The card was flawless at 100fps in Q3, but if you loaded up Half-Life it ran like crap, never getting over 10fps. ATI coded the original Radeons to look great in Q3 because that was the hot benchmarking game out there at the time. My buddy eventually beat Gateway about the head and neck until they sent him another Nvidia card, and then he got his 60+fps in both Q3 and HL.
Just another case of the pot calling the kettle black in my eyes.
Re:NVIDIA convinced them to change the rules (Score:3, Interesting)
It's odd they would make their benchmark useless, and that's what it has become now IMHO: it doesn't tell you how fast a random game is likely to run, it tells you how good their hackers are at hand-tuning a meaningless benchmark.
Re:Highly illogical posters (Score:0, Interesting)
So, rather than using one or two real-world datapoints, you should base your decision on no real-world datapoints?
Re:3DMark is a terrible benchmark anyway (Score:3, Interesting)
What Futuremark is currently doing is akin to taking a Dodge Ram pickup and a Ford Mustang and saying that Dodge got 5427 Marks and Ford had 5621 Marks. What does that tell you? How can you truly compare the two? The Ford is better in some way, but without the base numbers, the result is useless.
Re:I Will Just Continue To Buy ATI (Score:3, Interesting)
If you're concerned about nVidia's ethics, perhaps you should check out ATi's background [tech-report.com].