More on Futuremark and nVidia 429
AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. "Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat."" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
riiiiight... (Score:5, Funny)
Re:riiiiight... (Score:5, Insightful)
That is the way it sounds, isn't it?
"Application-specific optimization". . . In other words, "We're not cheating, we're just adding code to our driver to make sure our card works really well with benchmarking software." Of course, if it works better with benchmarking software than it does with real-world applications, that is cheating, isn't it?
It actually reminded me of the axiom, "That's not a bug, it's a feature!"
NVIDIA convinced them to change the rules (Score:5, Interesting)
However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source. Futuremark will consider whether this approach is needed in its future benchmarks.
I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX. It's also like telling your math teacher, "no, it would be easier for my equations if you made 1+1=3. Now do it because I'm your star student."
Re:NVIDIA convinced them to change the rules (Score:4, Interesting)
Re:NVIDIA convinced them to change the rules (Score:5, Insightful)
Re:NVIDIA convinced them to change the rules (Score:4, Informative)
Shrug... welcome to reality. DirectX, OpenGL, etc. don't properly model the hardware in some cases leading to much worse performance than should be available.
It's not like saying 1+1 = 3. It's more like saying what's 7+7+7+7+7+7? Well, it's the same as 7*6, but guess which one is faster to calculate?
And it's not quite like that either, I know, because the bit that FutureMark is tentatively agreeing with is Nvidia changing the shader precision, which can lead to a loss of quality (so maybe it is 1+1 = 1.999999999998).
ATI did pretty much the same thing with their drivers, leading to a much slimmer 1.9% improvement. Of course, it's unclear how much of Nvidia's improvement was from the shader changes (which FM is considering) versus from the other modifications they made (like clipping issues). The latter points are not in question by FM - they are cheating.
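The parent's "1+1 = 1.999999999998" aside can be made concrete. Here's a hypothetical Python sketch (not anything from an actual driver) using the `struct` module's IEEE half-precision format to show how running the same arithmetic at reduced shader precision perturbs the result:

```python
import struct

def to_half(x):
    """Round a Python float through IEEE 754 half precision (16-bit),
    the kind of reduced precision a shader might be dropped to."""
    return struct.unpack('e', struct.pack('e', x))[0]

# At full precision 0.1 + 0.9 is as close to 1.0 as a double can get;
# at half precision the rounded operands no longer sum to exactly 1.0.
result = to_half(0.1) + to_half(0.9)
print(result)  # slightly off from 1.0
```

The error is tiny per operation, which is exactly why a precision drop can be hard to spot in a screenshot while still not computing what the application asked for.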
Re:NVIDIA convinced them to change the rules (Score:5, Informative)
Re:NVIDIA convinced them to change the rules (Score:4, Insightful)
It's more like saying "What's 7+7+7+x+7+7?" For the benchmark program, x happens to be 7, leading to 7*6, but the general case is actually 7*5+x. No attempt is made to check an arbitrary game to see if x is 7. It only applies the optimization if the executable is a particular benchmark.
Re:NVIDIA convinced them to change the rules (Score:3, Insightful)
That wouldn't help. The optimization is applied only to the benchmark program. In this case, x represents the direction the camera is facing at a particular time. In a game, this is unpredictable and non-optimizable. In the demo, it was set. The optimization does not translate to any gains in any game, no matter what the game developers do.
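The parent's analogy can be sketched in code. This is a hypothetical Python illustration (the executable name and function are invented for the example, not real driver logic): the general path actually evaluates every term, while the "optimized" path substitutes a precomputed answer that is only valid for the benchmark's fixed, known inputs.

```python
# Hypothetical application-specific "optimization": the driver recognizes
# one known benchmark executable and hardwires the assumption that x == 7,
# which only holds for the benchmark's canned, on-rails demo.
KNOWN_BENCHMARK = "3dmark03.exe"  # invented name, for illustration only

def render_cost(terms, executable):
    if executable == KNOWN_BENCHMARK:
        # Shortcut: assume every term is 7, as it always is in the demo.
        return 7 * len(terms)
    # General case: actually look at every term, including the unknown x.
    return sum(terms)

# The benchmark happens to feed all 7s, so the shortcut looks correct:
print(render_cost([7, 7, 7, 7, 7, 7], KNOWN_BENCHMARK))   # 42
# A game feeds an unpredictable x, so the shortcut would be wrong there:
print(render_cost([7, 7, 7, 5, 7, 7], "some_game.exe"))   # 40
```

The shortcut never speeds up the general path; it only inflates the score of the one program it checks for.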
Re:NVIDIA convinced them to change the rules (Score:3, Interesting)
It's odd they would make their benchmark useless, and that's what it has become now IMHO - it doesn't tell you how fast a random game is likely to run, it tells you how good their hackers are at hand tuning a meaningless benchmark.
Re:riiiiight... (Score:4, Funny)
Come to think of it, why doesn't nVidia just optimize their software for games instead of benchmarking software...?
Re:riiiiight... (Score:3, Insightful)
I guess you just need to buy another card at that point.
Re:riiiiight... (Score:3, Informative)
> with real-world applications, that is cheating, isn't it?
I like it when card manufacturers optimize their driver to achieve high
Quake III Arena frame rates, because coincidently Quake III Arena is my
favourite game (and actually the ONLY game I play). I don't care if the
drivers are good by thoughtful design, or by re-engineering the Q3A code
path and then constructing a driver that is an exact fit (possibly with
penalties for othe
Re:riiiiight... (Score:3, Funny)
Re:riiiiight... (Score:4, Funny)
"Officer, I wasn't speeding. I was driving in a manner consistent with the road, conditions and the huge motor in my car!"
"It was creative accounting sir! Not an attempt to 'cook the books'."
"We are only writing software with the features our users want. This isn't code bloat, and I never made that remark about 640k being enough for anyone!"
More?
Re:riiiiight... (Score:3, Informative)
Re:riiiiight... (Score:5, Funny)
Fine With Me (Score:4, Interesting)
It's just another piece of information to keep in mind when selecting a new card.
Futuremark should (Score:3, Interesting)
It's sort of akin to walking around with a backpack full of cinderblocks. That way, when you put down those cinderblocks (i.e. the benchmark), you'll notice how much stronger you got.
Perhaps they should use .NET for their next benchmark. Or
Re:GREAT With Me (Score:5, Insightful)
Re:GREAT With Me (Score:3, Funny)
Re:GREAT With Me (Score:3, Informative)
Not clearing the back buffer and other "on rails" cheats are still classified as cheats. It's merely the shader changes that are being considered as possibly OK. Which, btw, is similar to what ATI did. But I don't know that
Re:GREAT With Me (Score:3, Insightful)
3DMark03 is designed as an un-optimized DirectX test and it provides performance comparisons accordingly.
ATI/Nvidia optimized shaders, not cheating.
Nvidia optimized for rendering in a benchmark, cheating.
You can optimize, as long as you don't know beforehand what is going to be used in the benchmark.
Cheat? (Score:3, Funny)
Errrr... that seems like a cheat to me!
Davak
Re:Cheat? (Score:5, Interesting)
Specifically designing your product to work better in a test than in real life should be considered cheating.
This could be avoided if 3DMark2003 would release different methods of testing the video cards each year... or if one could download updates from 3DMark2003 that would block any driver-specific optimizations.
I usually look at the latest and greatest fps benchmark for the latest and greatest game anyway.
Well, actually... my current Nvidia video card laughs at my little CPU anyway. Until I can find some more CPU to drive my screaming video card... I am not going to find any performance increase.
Davak
Re:Cheat? (Score:5, Funny)
That is right, this is not a cheat.. we are just redesigning the arrow and repainting the target so they match;)
Re:Cheat? (Score:2)
However, if it is designed to make a benchmark run faster... well, that's just not nice.
Davak
Re:Cheat? (Score:2)
That was an editor's comment, hence not in italics like the rest of the blurb. And it was posted by Michael so it probably wasn't meant to be sarcastic.
Get a clue, fuctape
Sure sounds fair to me (Score:5, Funny)
whee (Score:5, Funny)
Yeah, right (Score:3, Insightful)
Re:Yeah, right (Score:3, Funny)
optimize
verb. optimized, optimizing, optimizes
See cheat.
1. Jimmy optimized his test score when the teacher wasn't looking.
Futuremark shoots self in foot. (Score:5, Insightful)
And now Futuremark has totally invalidated their own benchmark software by declaring it "open season" for hardware manufacturers to distort the "tests" in any way shape or form they desire to make the numbers higher.
N.
Re:Futuremark shoots self in foot. (Score:2)
Re:Futuremark shoots self in foot. (Score:2)
Now the last 2 months of PC ads are worthless if based on Futuremark scores.
Hey, 0-60mph, in 2 hours...
Re:Futuremark shoots self in foot. (Score:3, Interesting)
Of course, each hack would or would not work with any particular game, but trial and error can be used to detect the "best set of hacks" for any particular game on any particular card. And we all know how geeks love tweaking things t
Futuremark scared? (Score:3, Insightful)
Re:Futuremark scared? (Score:5, Funny)
Ha, yes and we all know how well that works, don't we? You mark my words, we'll see nVidia's tanks rolling over Poland by Christmas...
In the spirit of Bill Clinton (Score:4, Funny)
Oh, I did get caught?
No, I didn't. Let's move on, shall we?
In the spirit of George Bush (Score:5, Insightful)
That seems a bit more appropriate to the story, doesn't it?
Bullshit (Score:5, Interesting)
There was no need for this nicey-nice statement other than NVidia threatening lawsuits and Futuremark wanting to protect what assets they have.
Futuremark had every right to call NVidia on their selfish claim and unbelievable hacks. To say that they weren't liable for their own blunder is to say that Futuremark's reputation has been replaced by corporatespeak and a lack of respect almost unparalleled.
What's worse is that I really thought "Yeah, this time the bad guy gets his due" and that NVidia should've known better.
But of course, a few weeks later we've got to put on the nice face again for the public at large.
What a complete waste of time. I know there isn't much respect left in corporate America, but hell, if you can't call a spade a spade, why even bother with the benchmarks when someone can just rewrite an ENTIRE SHADER and only keep a picture clear while the demo is on rails?
Re:Bullshit (Score:2)
You're calling bullshit? Okay, you can have it. I call the last piece of pie.
Why would anyone call bullshit? It tastes like, well you know.
Re:Bullshit (Score:3, Funny)
This is politics at its worst, and I'm calling bullshit.
You know, maybe that was the idea? Imagine the scenario: "Okay, folks, we need to rephrase the statement that 'NVidia cheated' so that they won't sue our pants off our asses. What can we come up with?"
'I know! Let's call it an application-specific enhancement! Their lawyers will stare blankly, but any geek shopping for a video card will read right through it!'
Who knows? Coulda happened that way. :)
Re:Bullshit (Score:3, Funny)
I'm sorry, Bullshit isn't here to answer your call.
Please leave a message after the tone.
*BEEP*
Re:Bullshit (Score:3, Informative)
Stack Creep (Score:2, Insightful)
Great! (Score:5, Funny)
Re:Great! (Score:3, Insightful)
Having such a logo labeling program might be a revenue opportunity for FutureMark.
Another revenue center for FutureMark might be to sell benchmark result coefficients. Each different card's results are biased by some coefficient, whose value can be purchased according to a tiered pricing model. This ensures uniformly fair bias according to what each video card vendor is willing to spend.
On the video card side of the fence, cou
WE DONT CARE. Just use games for benchmarks! (Score:5, Insightful)
Testing new technologies... (Score:3, Informative)
I think the idea was to test new technologies that haven't been implemented yet in Quake 3 or Unreal Tournament 2003 (like in the upcoming DOOM III).
A quote of a quote in their 10/26/98 press release:
Actually, WE care. (Score:3, Insightful)
It's no use benchmarking on the latest and greatest games -- because, as developers, we try and avoid releasing games that run horribly (slowly or with obvious bugs) on certain cards. Sometimes we can persuade the manufacturers to f
In Other News ... (Score:2, Funny)
... FutureMark and NVidia stated that they were proud to announce that former President Bill Clinton had joined their boards and had assumed management responsibilities.
Re:In Other News ... (Score:2, Funny)
Quality is determined by REAL use. (Score:3, Interesting)
Re:Quality is determined by REAL use. (Score:2)
The problem is that card manufacturers are optimizing for this year's games rather than for the standards that would optimize their performance on next year's games.
--G
Re:Quality is determined by REAL use. (Score:2)
Someone set up us the cheat! (Score:3, Funny)
Damn, I'm impressed. (Score:2)
In 2101, this joke will probably still get modded Funny. . . :)
Short on details, but raises questions... (Score:4, Insightful)
So the question in my mind is did Futuremark learn something from the discussions? Is there something it was ignoring in its tests?
I'm trying not to be a cynic and assume a big fat envelope was passed under the table, and to believe that what Nvidia did was legitimate.
Alas, not legit...... (Score:4, Insightful)
Big quality loss (Score:4, Insightful)
"This card is optimized for quake as long as you follow the left trail, the right trail will just look like crap but nobody follows it anyway".
What's the point of a benchmark? (Score:5, Insightful)
Looking closer... (Score:2, Interesting)
I call this bullshit.... (Score:5, Insightful)
The benchmark is meant to reflect performance in the actual game; the reason it takes the same path is merely to make the results comparable. What ATI did was something the game *could* have achieved in game, if the operations were properly sequenced. What Nvidia did is to fake a performance it can't actually give if a person had followed the exact same path in the game. That is cheating.
It is pathetic by Nvidia, and it's pathetic by Futuremark to present this press statement. Get some backbone and integrity.
Kjella
Re:I call this bullshit.... (Score:3, Interesting)
What if they optimized not just for Q3, but for Q3 Level 1 while you're using player model X on a sunny day with a BFG and 13 bots? Would you cry foul? Are they cheating to make Q3 run faster? Isn't that their job? Where is the line? And, if 90% of the best selling games today run at that same "optimized" speed, how good
So much for FutureMark (Score:5, Insightful)
If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.
But we need FutureMark (Score:5, Informative)
Problem is, NVIDIA didn't just optimize. Their application specific "optimization" made the images look worse. And when you couldn't notice it, it was because they were clipping outside the camera angle (because they knew exactly where the camera was, something they can't do when you're playing UT2003)
Like the original statement by Futuremark said, optimization is great. But when you change the image intended by the software designer in order to make it go faster, that's not an optimization. For god's sake, I can turn all the details to low on UT2003 and get it to go faster, but that's beside the point
The reason game-specific benchmarks don't fly for me (although now that Futuremark has issued this statement, I'm sure ATI will start cheating as well, making the whole thing useless) is that synthetic benchmarks can test features new to the cards that games may not yet have implemented. So I have an idea how a card will perform with future games.
Re:But we need FutureMark (Score:3, Informative)
Sure...but my point was that by just benchmarking with current games, you can't get real world performance of features that haven't been implemented in that game. A good 3dmark score generally means a good gaming experience (unless the drivers are cheating in the 3dmark score).
Shouldn't it also try and emulate an application that is optimized as much as possible so as to get the highest possible performance instead of
Well thank you for the summary ... (Score:2, Funny)
Editorial comments like this are wonderful. They make the best shine out in all of us, leaving us feeling profoundly enlightened. Michael, I thank you for taking the time to summarize two press releases by stating the completely obvious; if not for you I might just have been forced to click those god-awful annoying hyper-links.
Since you've obviously missed the "concern" over this whole issue let me help you out i
Re:Well thank you for the summary ... (Score:2)
To quote President Clinton's lawyers... (Score:2)
It's not a cheat (Score:2)
I don't know about you, but I'd rather have my nVidia card optimized for, say, Quake3 or Battlefield 1942. Someone call ATI and tell them they've got their new marketing campaign.
NVidias meeting with futuremark (Score:5, Funny)
*shakes hundred dollar bill side to side, speaks in high tone out of side of mouth*
"Sure is, Boss!"
interesting... (Score:2)
I can understand that if there are things in a GPU that can be optimized for an application then you should go ahead and do it, but of course that raises the question: how do you truly, evenly compare the performance of one piece of hardware (now with tons of customizable software) versus another?
Futuremark has a tough time ahead now getting people to believe they add
goons...hired goons (Score:3, Interesting)
Nvidia: Knock knock
Futuremark: "Who's there?"
Nvidia: "Goons...hired goons."
Futuremark: "Oh...haha...um...Nvidia is actually in the business of application optimisation! Our mistake. Won't happen again."
Seriously folks, this is Nvidia using big bad lawyers to scare Futuremark into capitulating. They might have held their ground, until ATI was proven to be doing the same thing, albeit to a much lesser degree.
Unfortunately, the only person who loses in this scenario is the consumer.
Act I, scene 2: Goons (Score:3, Funny)
*knock* *knock* *knock*
nVidia: Who is it?
*Futuremark goons enter stage left*
Futuremark: You's late wit you's beta program "membership dues". You know what happens wit da peoples dat don't pay they's dues, right?
nVidia: Piss off! We're not paying this year.
Futuremark: Dat's a pretty benchmark score you got there. Be a shame if somethin' BAD happened to it...
*Futuremark pushes nVidia's benchmark trophy over, shattering it*
Futuremark: oops. Looks like
Should benchmarks allow optimizations? (Score:2, Insightful)
There's a line from the story:
"...However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path opt
The numbers... (Score:5, Insightful)
But I found a really nice German benchmark site, 3dcenter.org [3dcenter.org], with some of the best benchmarks I've ever seen; they actually run the games on each card and list the fps.
Looks like the FX/GF4 5200/4200 4600/5600 (non ultra) are the same. And the ATI 9700PRO/9800 are faster than the 5800 Ultra.
After reading these benchmarks, you can really tell nvidia tweaked the SHIT out its drivers for futuremark...
Highly illogical posters (Score:5, Interesting)
When you're looking for a video card, you -should- rely on a capable and untainted, un-optimized benchmark for comparison simply because you can't predict what the software companies that make the actual games are going to do. Will they support -your- chosen card, or will some other GPU maker offer a better bribe to the developer? You may know that kind of info about games shipping RSN, or already on the shelves, but what about next year's?
Getting the card based simply on one or two games instead of looking at some kind of objective benchmark does no good whatsoever. It's just a way to rope yourself into upgrading the card faster.
3DMark is a terrible benchmark anyway (Score:4, Informative)
Also, PS 1.4 shaders don't always translate 'up' to PS 2.0 hardware very well, which is why (IMHO) Nvidia started all this hub-bub in the first place.
The only vendor that natively supports PS 1.4 is ATI.
They should have created PS 1.1 shaders for the masses, and then if 2.0 hardware is detected, had 2.0 shaders for everything.
And their "DX9"-only test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky. Big whoop. No branching in the VS, no two-sided stencil, nothing cool.
It's sad the OEMs put a lot of stock in 3Dmark; they don't seem to realize that gamers play games all day, not benchmarks.
Bogus (Score:4, Interesting)
Sorry, but that's garbage, pure and simple. Or are you not aware that PS 1.4 support is _required_ for DX9 cards with PS2.0 support. Your complaint may be valid when comparing a GF4 against a Radeon 8500, but is totally bogus when comparing two DX9 cards.
"And their "DX9" onyl test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky."
Gee, one minute you're complaining that they use PS1.4 instructions, and now you're complaining that they don't use PS2.0 instructions. PS1.4 instructions _are_ effectively DX9 instructions since other than ATI, no other DX8 chips use them: you need a DX9 chip to run PS1.4 shaders.
And it would appear to be real lucky for nvidia that they don't use many PS2.0 instructions: once the nvidia "optimization" of throwing away the shader and running a completely different one was fixed, the shader test showed them running PS2.0 shaders at about half the speed of a Radeon 9800. The low performance of PS2.0 shaders on the FX card seems to be the reason why nvidia hated 3DMark03 so much; there was no way to get a good score without redesigning the chip or "optimizing".
MOD PARENT DOWN (Score:3, Interesting)
You don't need to "analyze the data" to figure this out. Futuremark themselves stated how much PS 1.4/2.0 shaders are used. And that "3x more geometry" figure if you don't support PS1.4 is correct - guess why? If you do a
Re:3DMark is a terrible benchmark anyway (Score:3, Interesting)
What Futuremark is currently doing is akin to taking a Dodge Ram picku
I love corporate honesty! (Score:3, Funny)
And I suppose people who cheat at online MMORPGs are just using undocumented program calls and extended functionality. It's all so clear now...
What would be great...
If someone was to reverse engineer the drivers, remove the "Optimisation", recompile and compare results. See what percent the "Optimisations" fudged the results.
Re:I love corporate honesty! (Score:4, Informative)
If someone was to reverse engineer the drivers, remove the "Optimisation", recompile and compare results. See what percent the "Optimisations" fudged the results.
Don't have to. In the previous story, they stated how they simply removed the condition that the driver used to switch on this optimisation and, as you said, saw what percent the optimisation fudged the result.
Argh (Score:4, Insightful)
The point of a benchmark is to test dissimilar systems against common references to get an idea of how they perform against each other in such a way that you have an apples to apples comparison.
If 3DMark writes their program in a way that allows optimization paths for a specific GPU, then it is no longer a benchmark.
You now no longer have an idea of how fast the card REALLY runs, as there is no guarantee that game writers will use GPU-specific optimizations. It's the same thing as MMX... Nobody sees the benefits if it's not hardcoded into the program, so what's the point of being uberfast in a benchmark if you won't necessarily see the same results in the real world?
What they're really saying... (Score:4, Interesting)
They are saying that the boards have become different enough that game writers are coding differently for them. Not too surprising, really. That's the way it's always been.
This makes writing a synthetic benchmark extraordinarily difficult, needless to say. I don't know if it's even possible in this case. Perhaps rather than try to come up with one number that specifies how fast a board is, you can come up with a series of metrics for each capability.
While I'm sure that FutureMark has had some pressure applied to them to make this statement, it's not an unreasonable statement on its face. It's just the path they took to get there that is questionable.
I made this observation the last time (Score:5, Insightful)
But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines are often optimized-for, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.
That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a good bet that if two cards perform similarly and acceptably, the two cards should be able to run almost any DX8/DX9 game off the shelf acceptably.
The fact that Nvidia's unoptimized drivers perform significantly behind ATI's unoptimized drivers in 3dmark 2003 raises a significant question:
We all know how well the 5900 does in Quake III, Serious Sam 2, UT2003, etc, but how does it do in *insert random DX8 game here*?
I want to know that if I take *insert random DX8 game here* home to play, IT WILL PERFORM WELL. That is the entire point of having a benchmark like 3dmark. To do application-specific optimizations for it is to nullify the entire point of the benchmark.
Optimizations, Cheats, and Objectivity. (Score:5, Interesting)
first off, for those of you wondering what the big deal with 3DMark 2003 is - and why you might use it in place of "real games" to benchmark 3D performance - here you go:
3DMark is a test application to benchmark next-generation performance, so that you can get an idea how your video card might handle games that will be out this time _next_ year. Specifically, some aspects of 3DMark are geared toward testing DX9 functionality and its pixel and vertex shaders. No game currently on the market uses these features (at least not that I am aware of).
Secondly, the difference between a cheat and optimization is a fine one. If a given function continually produces the same output for the same inputs, and it takes 1 second to do so, and another function can produce the same results given the same inputs, but only takes 1/2 a second - it can be said to be functionally equivalent. However, it has been optimized. It's entirely possible, even desirable to replace pixel shaders and vertex shaders with routines which are optimized for your hardware. In much the same way that compilers schedule instructions optimally for the underlying CPU architecture, so too can instructions be re-ordered in a pixel shader routine... It's an optimization.
Cheating occurs when people start making approximations (analogies to bringing a cheat-sheet to a test are not valid), or when (in the case of video cards) the driver fails to render with the visual fidelity and detail that was intended. By example:
A> Reducing texture bit-depth.
B> Reducing geometry detail (merging 2 or more polygons).
This is only cheating if it's not the intent of the original application developer (not driver developer).
A driver developer could make the following optimizations, since they don't affect the intent of the application developer:
A> Pre-calc tables. A classic demo optimization would be to precalc a SIN function table to some level of precision as looking up a value was faster than calculating it on the fly.
B> Replacing various pixel/vertex shader routines with functionally equivalent, but faster ones.
C> Reordering data and textures (keeping detail and fidelity) into more optimal chunks for your hardware architecture.
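Item A above, the classic demo-scene trick, can be sketched as follows. This is an illustrative Python version (names and table size are my own, not from any driver): trade a bounded amount of precision for a fast table lookup, which the caller accepts up front.

```python
import math

# Precalculated sine table: tabulate sin() once at a fixed resolution,
# then answer queries by rounding to the nearest table entry.
TABLE_SIZE = 4096
SIN_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def fast_sin(theta):
    # Error is bounded by the table resolution (about 2*pi/4096 radians),
    # a known, accepted trade of precision for speed.
    i = round(theta / (2 * math.pi) * TABLE_SIZE) % TABLE_SIZE
    return SIN_TABLE[i]

# The lookup agrees with math.sin to within the table's precision:
print(abs(fast_sin(1.0) - math.sin(1.0)) < 1e-3)  # True
```

The key property that makes this an optimization rather than a cheat is that the precision loss is fixed, documented, and independent of which application is calling.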
Those aren't cheats - they are optimizations. Of course, the only way you can tell this is if you have an objective standard to gauge against. 3DMark 2003 doesn't seem to provide this. In order to do so they would need the following:
A> A software renderer for their demo.
B> Timed snapshots of the demo saving uncompressed images from the software renderer to disk.
C> The ability to re-run the demo using a hardware renderer (3D Card and drivers).
D> The ability to take the same snapshots and save them, uncompressed to disk.
E> The ability to do a histogram, per-pixel comparison to the software renders...
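Step E of the pipeline above could look something like this. A hypothetical Python sketch under the stated assumptions: both snapshots are already loaded as equal-length lists of (r, g, b) tuples, and we report what fraction of pixels drift beyond a tolerance from the software reference.

```python
def percent_different(reference, hardware, tolerance=4):
    """Percentage (0-100) of pixels whose channels differ by more than
    `tolerance` between the software reference and the hardware render."""
    assert len(reference) == len(hardware)
    differing = sum(
        1
        for ref_px, hw_px in zip(reference, hardware)
        if any(abs(a - b) > tolerance for a, b in zip(ref_px, hw_px))
    )
    return 100.0 * differing / len(reference)

# Toy 4-pixel frames: one pixel off by more than the tolerance -> 25%.
ref = [(10, 10, 10), (200, 0, 0), (0, 200, 0), (0, 0, 200)]
hw  = [(10, 10, 10), (200, 0, 0), (0, 180, 0), (0, 0, 200)]
print(percent_different(ref, hw))  # 25.0
```

A real harness would load the uncompressed snapshots from disk and likely use per-channel histograms as well, but even this crude per-pixel count would give the objective fidelity number the thresholds below refer to.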
This would enable you to arrive at some objective comparison of visual fidelity - instead of the occasionally subjective "I think screenshot X looks better than screenshot Y". Without the intent of the 3DMark developers being known, we really can't know how true the hardware vendors and their drivers are to the original vision.
Anything less than 3% difference is highly likely to be indistinguishable from the intent of the developers in this case. 5% to 10% may be visible, but acceptable (i.e. tweaks for speed in place of quality). Over 10% and you're playing with fire.
Re:It is NOT a cheat. (Score:5, Insightful)
More tellingly, the driver deliberately flouts the D3D spec by omitting buffer clears, mucking about with clip planes, etc.
As others have said, Futuremark's statement is just covering their legal arse. If someone modifies their code to get better scores in some benchmarks while introducing deliberate bugs (i.e. incorrect rendering), it's a cheat in my book.
Re:It is NOT a cheat. (Score:3, Informative)
That's the whole issue...
Re:It is NOT a cheat. (Score:2, Insightful)
And you say these are not cheats as long as nvidia uses them in games? are you completely lobotomized?
Re:Cheating is relative. (Score:3, Insightful)
There are a lot of "optimizations" you can use if you know in advance what you'll need to draw. You don't even need a 3d engine, you could just pre-render everything and put an mpeg into the driver. I bet that'd be very fast.
It wouldn't make the actual game run any faster, though.
Re:If it's a cheat, benchmark is, too! (Score:2, Informative)
Re:Article text: (Score:4, Funny)
You have got to be the only troll to get modded to +5 practically every time you post. Quite a feat, actually.
Article Text Interpreted (Score:5, Funny)
Paragraph 1: We're making a statement.
Paragraph 2: nVidia didn't really cheat.
Paragraph 3: Most computer games cheat.
Paragraph 4: We don't allow companies to cheat in their code.
Paragraph 5: Therefore, we should cheat in ours.
Nvidia Statement: They should have worked with us for a better cheat.
Joint Statement:We should all cheat together.
Footer 1: Futuremark rocks.
Footer 2: Don't steal our IP.
Footer 3: Nvidia rocks.
Footer 4: Really it does.
Footer 5: Of course, we could be lying.
Footer 6: Don't steal our IP.
Re:futuremark needs a new strategy (Score:5, Informative)
The inherent value of a benchmark is the notional "apples to apples" comparison, and you've taken even that away. There would now be no reason at all to use 3DMark.
Re:This is not a cheat--its a good thing. (Score:3, Insightful)
Re:I Will Just Continue To Buy ATI (Score:3, Interesting)
If you're concerned about nVidia's ethics, perhaps you should check out ATi's background [tech-report.com].
Re:Ati 9800 is faster than the 5900. (Score:3, Informative)
Future Mark build 320 vs. 330. (330 doesn't have the nvidia cheat...)
3,215 - ATI Radeon 9800 Pro 256MB w/ 3DMark03 Build 330
2,821 - Nvidia GeForce FX 5900 Ultra w/ 3DMark03 Build 330
Nvidia using the cheat had - 3,458.
The ATI 9800 Pro is faster.