More on Futuremark and nVidia

AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 benchmark: 'Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat.'" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
Comments Filter:
  • Yeah, right (Score:3, Insightful)

    by jabbadabbadoo ( 599681 ) on Tuesday June 03, 2003 @02:19PM (#6107408)
    Time to update Webster's. "Cheat" just got new semantics.
  • by Nogami_Saeko ( 466595 ) on Tuesday June 03, 2003 @02:20PM (#6107414)
    I think [H]ardOCP stated it best as "Futuremark didn't want to get sued by Nvidia". Nvidia has the legal and financial resources to totally ruin Futuremark and they know it.

    And now Futuremark has totally invalidated their own benchmark software by declaring it "open season" for hardware manufacturers to distort the "tests" in any way shape or form they desire to make the numbers higher.

    N.
  • Futuremark scared? (Score:3, Insightful)

    by steveit_is ( 650459 ) on Tuesday June 03, 2003 @02:21PM (#6107422) Homepage
    Looks like someone is scared of somebody else's lawyers. Yuck! This is obviously Futuremark trying to appease Nvidia.
  • by dopaz ( 148229 ) on Tuesday June 03, 2003 @02:22PM (#6107425) Homepage
    If nVidia intends to include driver optimizations for many popular applications, then is it really cheating? Lots of games are built upon the Quake3 engine, and I'm sure the Doom3 engine will be used for some great titles. If nVidia will optimize the drivers for specific games then I'm all for it.
  • Re:riiiiight... (Score:5, Insightful)

    by PhxBlue ( 562201 ) on Tuesday June 03, 2003 @02:22PM (#6107434) Homepage Journal

    That is the way it sounds, isn't it?

    "Application-specific optimization". . . In other words, "We're not cheating, we're just adding code to our driver to make sure our card works really well with benchmarking software." Of course, if it works better with benchmarking software than it does with real-world applications, that is cheating, isn't it?

    It actually reminded me of the axiom, "That's not a bug, it's a feature!"

  • Stack Creep (Score:2, Insightful)

    by netolder ( 655766 ) on Tuesday June 03, 2003 @02:22PM (#6107436)
    This sort of outcome is inevitable as drivers move "up the stack" into the application layer. To get better and better optimizations, the drivers need to know more and more about the application that is requesting the services - thus, we end up violating the strict separation between application and driver.
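To make the mechanism described above concrete, here is a minimal, entirely hypothetical sketch of application-specific dispatch in a driver: the executable name selects a special code path. None of this is real driver code from either vendor; every name is invented for illustration.

    // Hypothetical sketch: a driver keys special-case behavior off the
    // application's executable name. Invented names, not real driver code.
    #include <string>

    struct RenderSettings {
        bool useStaticClipPlanes = false;  // precomputed for a known camera path
        bool skipRedundantClears = false;  // only "redundant" if that path is fixed
    };

    RenderSettings settingsFor(const std::string& exeName) {
        RenderSettings s;
        if (exeName == "3DMark03.exe") {   // pattern-match the application...
            s.useStaticClipPlanes = true;  // ...and switch on path-specific shortcuts
            s.skipRedundantClears = true;
        }
        return s;                          // every other program gets the general path
    }

Note the fragility: as soon as the benchmark's fingerprint changes (as it did in Futuremark's patched build), the match fails and the shortcuts silently disappear.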
  • by Viewsonic ( 584922 ) on Tuesday June 03, 2003 @02:23PM (#6107448)
    Cripes already. No one even BOTHERS with 3DMark anymore, and after this fiasco no one is ever going to bother with them again. Gamers will use REAL EVERYDAY GAMES to see what runs the fastest again. Watching some goofy simulation app come up with scores, people buying into the company, and vendors tricking drivers for particular tests is just crappy, and it makes 3DMark 100% invalid to any of my concerns in the future. I will only trust reviews that benchmark the latest and greatest games that I will be buying these cards for; whoever can run them fastest at that particular time IS WHAT I'M GOING TO BUY. Period. Enough of this 3DMark BS.
  • by MyNameIsFred ( 543994 ) on Tuesday June 03, 2003 @02:26PM (#6107483)
    The press release is short on details. But I think it raises two points. First, Futuremark is no longer calling it cheating. Second, Futuremark is considering changes to the way it benchmarks cards.

    So the question in my mind is did Futuremark learn something from the discussions? Is there something it was ignoring in its tests?

    I'm trying not to be a cynic who assumes a big fat envelope was passed under the table, and to believe that what Nvidia did was legitimate.

  • Big quality loss (Score:4, Insightful)

    by 1001011010110101 ( 305349 ) on Tuesday June 03, 2003 @02:27PM (#6107491)
    Those following this should check the pictures in the previous article. The quality of the nVidia "optimized" version sucked (it showed big artifacts). That's no optimization; it only counts as an optimization if there's no image quality loss.
    "This card is optimized for quake as long as you follow the left trail, the right trail will just look like crap but nobody follows it anyway".
  • by pjwhite ( 18503 ) on Tuesday June 03, 2003 @02:28PM (#6107504) Homepage
    If a benchmark doesn't measure performance related to real-world applications, what's the point? If a driver is optimized to run a benchmark faster, that SHOULD mean that the real world apps should run faster, too. If not, the benchmark is useless.
  • by Kjella ( 173770 ) on Tuesday June 03, 2003 @02:29PM (#6107523) Homepage
    If you want to, you can prerender the whole fucking test, stick it in your driver, and just play it back instead of actually rendering whenever Futuremark is running. That would be an "application-specific optimization" too.

    The benchmark is meant to reflect performance in the actual game; the reason it takes the same path every time is merely to make the results comparable. What ATI did was something the game *could* have achieved in-game, if the operations were properly sequenced. What Nvidia did was fake a level of performance the card can't actually deliver if a player follows the exact same path in the game. That is cheating.

    It's pathetic of Nvidia, and it's pathetic of Futuremark to put out this press statement. Get some backbone and integrity.

    Kjella
  • by corebreech ( 469871 ) on Tuesday June 03, 2003 @02:30PM (#6107531) Journal
    Now that all the latest games have benchmarking modes, what do we need FutureMark for?

    If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.
  • by HalfFlat ( 121672 ) on Tuesday June 03, 2003 @02:30PM (#6107534)
    It's not an optimization if it does not produce the same results! Recall that the shader code that the driver used did not produce the same visual results as the shader code it replaced.

    More tellingly, the driver deliberately flouts the D3D spec by omitting buffer clears, mucking about with clip planes, etc., based purely on application-specific pattern matching, which by its very nature is fragile, as was demonstrated so aptly by the "off the rails" mode in Futuremark. This isn't an accidental bug: such mechanisms are obviously highly fragile, and are almost certain to cause bad rendering in these applications when they are modified in small ways.

    As others have said, Futuremark's statement is just covering their legal arse. If someone modifies their code to get better scores in some benchmarks while introducing deliberate bugs (i.e. incorrect rendering), it's a cheat in my book.
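The "omitting buffer clears" failure mode is easy to demonstrate in miniature. A toy sketch, with plain C++ stand-ins rather than any real graphics API: skipping the clear is invisible while the image never changes, and smears the moment it does.

    #include <array>
    #include <cstdio>

    std::array<char, 8> frame;                 // a tiny 1-D "framebuffer"

    void clear() { frame.fill('.'); }
    void drawStar(int x) { frame[x] = '*'; }

    void renderFrame(int starPos, bool skipClear) {
        if (!skipClear) clear();               // the step the cheat omits
        drawStar(starPos);
    }

    int main() {
        clear();                               // suppose the last frame was cleared properly
        for (int pos : {2, 3, 4}) {            // the "camera" drifts off the fixed path
            renderFrame(pos, /*skipClear=*/true);
            for (char c : frame) std::putchar(c);
            std::putchar('\n');                // ..*..... then ..**.... then ..***...
        }
        // Old pixels are never erased, so the stars "smear", exactly the
        // artifact reported when reviewers turned the benchmark camera around.
    }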
  • by fazzumar ( 574187 ) on Tuesday June 03, 2003 @02:37PM (#6107620)
    This has been discussed before. Other companies have tried to modify their drivers to produce better results for certain benchmarks. They've always been thrown out as invalid before. I wonder why Futuremark seems to be considering allowing NVidia's enhancement to stand.
    There's a line from the story:
    "...However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source. Futuremark will consider whether this approach is needed in its future benchmarks."
    I'm concerned because I feel that allowing video card manufacturers to put benchmark-specific code into their product (making the product look better in that benchmark) may not be reflective of real-world performance.
    However, the benchmark is useless if it doesn't measure real-world performance, so I do believe that NVidia could put things in their product that make the benchmark run faster and would also benefit real applications. So I'm torn.
    Some game manufacturers make versions of their code optimized for certain video cards, but the normal case is the generic operating system driver (DirectX...), and I believe using the generic driver is more representative of what you'll get when you use the video card.
    It seems that NVidia is arguing that they should be allowed to put benchmark-specific optimizations into their code because they do the same thing with some other popular applications.
  • by Anonymous Coward on Tuesday June 03, 2003 @02:38PM (#6107628)
    Have you even read what nvidia's drivers did to "optimize" 3DMark03? Hardcoded clipping planes (not applicable in games!), replacing shaders with versions that produce inferior output, lowering rendering precision to FX12/FP16 (DX9 calls for FP24 minimum!), resulting in visible image quality degradation.

    And you say these are not cheats as long as nvidia uses them in games? Are you completely lobotomized?
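For a feel of what dropping from FP24 to FP16 costs, here is a crude model in C++: it rounds a value to a 10-bit mantissa (roughly FP16) and to a 16-bit mantissa (roughly FP24's layout). The real formats' rounding differs in detail; this only illustrates the order of magnitude of the error.

    #include <cmath>
    #include <cstdio>

    // Round x to 'bits' mantissa bits -- a crude model of lower-precision
    // shader math (actual FX12/FP16/FP24 behavior differs in detail).
    float quantize(float x, int bits) {
        int exp;
        float m = std::frexp(x, &exp);          // x = m * 2^exp, 0.5 <= |m| < 1
        float scale = std::ldexp(1.0f, bits);   // 2^bits
        return std::ldexp(std::round(m * scale) / scale, exp);
    }

    int main() {
        float v = 0.7071068f;  // a typical normalized-vector component
        std::printf("10-bit (FP16-ish) error: %g\n", v - quantize(v, 10));
        std::printf("16-bit (FP24-ish) error: %g\n", v - quantize(v, 16));
        // Roughly 8e-5 versus 1e-6: about two orders of magnitude, the kind
        // of gap that shows up as banding once per-pixel errors accumulate.
    }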
  • The numbers... (Score:5, Insightful)

    by BrookHarty ( 9119 ) on Tuesday June 03, 2003 @02:38PM (#6107630) Journal
    Well, after seeing the Futuremark scores, it looked to me as if nVidia's FX chips were blowing away ATI and nVidia's own GF4 line.

    But I found a really nice German benchmark site, 3dcenter.org [3dcenter.org], with what have to be the best benchmarks I've ever seen: they actually run the games on each card and list the FPS.

    Looks like the FX 5200/5600 (non-Ultra) and the GF4 Ti 4200/4600 perform about the same. And the ATI 9700 Pro/9800 are faster than the 5800 Ultra.

    After reading these benchmarks, you can really tell nvidia tweaked the SHIT out of its drivers for Futuremark...
  • by Pulzar ( 81031 ) on Tuesday June 03, 2003 @02:40PM (#6107653)
    Yes, but if they'd put in the optimization only to make the Doom 3 *demo* faster, because they knew it was used as a benchmark, you wouldn't support it anymore.

    There are a lot of "optimizations" you can use if you know in advance what you'll need to draw. You don't even need a 3d engine, you could just pre-render everything and put an mpeg into the driver. I bet that'd be very fast.

    It wouldn't make the actual game run any faster, though.
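Taking the parent's reductio literally, here is a hypothetical sketch of the "canned playback" endpoint. No one claims shipping drivers do this; the point is only that a fixed, known camera path makes it possible in principle.

    #include <string>
    #include <vector>

    struct Frame { std::vector<unsigned char> pixels; };

    Frame renderHonestly(int frameIndex) {
        return Frame{};                        // stand-in for the real renderer
    }

    Frame renderFrame(const std::string& exeName, int frameIndex,
                      const std::vector<Frame>& canned) {
        if (exeName == "3DMark03.exe")         // benchmark detected...
            return canned[frameIndex];         // ...play back precomputed frames, almost free
        return renderHonestly(frameIndex);     // every real game pays full price
    }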

  • Re:Great! (Score:3, Insightful)

    by DickBreath ( 207180 ) on Tuesday June 03, 2003 @02:42PM (#6107682) Homepage
    now they can put 'Designed to run 3Dmark2003' on Nvidia product boxes!

    Having such a logo labeling program might be a revenue opportunity for FutureMark.

    Another revenue center for FutureMark might be to sell benchmark result coefficients. Each card's results are biased by some coefficient, whose value can be purchased according to a tiered pricing model. This ensures uniformly fair bias according to what each video card vendor is willing to spend.

    On the video card side of the fence, couldn't the rom in a video card put up a boot time splash screen that sports advertising? Such an ad would be flashed into the card by the card's drivers once the OS is up and connected to the net. The video card driver could also take measures to be sure you see the boot-time ads sufficiently frequently.
  • Re:riiiiight... (Score:2, Insightful)

    by MindStalker ( 22827 ) <mindstalker@@@gmail...com> on Tuesday June 03, 2003 @02:49PM (#6107759) Journal
    They do; nVidia and ATI both put game-specific optimizations in their drivers for all major games. That's why it's not a cheat to optimize for benchmarks: it reflects the real-world optimizations the benchmark would have gotten had it been a game nVidia considered important enough to optimize for. Of course, image quality can in no way be reduced by these optimizations; if it is, then yes, it's a cheat.
  • by OmniGeek ( 72743 ) on Tuesday June 03, 2003 @02:50PM (#6107761)
    NVidia did things that were clearly NOT legitimate, and FutureMark caught them at it. There's a PDF report [futuremark.com] on FutureMark's Web site (assuming it hasn't met with an "accident" by now) detailing the dirty deeds. Chief among them, IMHO, was a trick where the driver was supposed to draw and update positions of stars in a night sky (involving clearing the background) as one moved along a 3D path; if one stays on the exact preprogrammed track of the demo, it looks OK. BUT... if you turn around (possible in the beta mode of the benchmark) you see that the driver SKIPPED clearing the background; the stars smear like mad. There is NO POSSIBLE WAY their driver was behaving legitimately. (Especially since changing the benchmark's fingerprint oh-so-slightly caused all these quirks to vanish; they were detecting the demo and screwing with things if it was being run...) The rest is just fear-of-pissing-off-the-800-pound-gorilla. A FutureMark developer admitted as much in a newsgroup posting. Sigh...
  • Re:GREAT With Me (Score:5, Insightful)

    by damiam ( 409504 ) on Tuesday June 03, 2003 @02:52PM (#6107792)
    All video card drivers have algorithms to keep from rendering unnecessary portions of the map. What NVidia did, I believe, was to bypass those algorithms and hard-code into the driver the portions that needed to be rendered. That wouldn't work in a real game, where the card must decide what to render at run-time based on user input. Therefore, it's cheating, no different from including an MPEG of the entire 3Dmark demo and showing it in lieu of actually rendering it.
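A sketch of the contrast being drawn here, with invented types and numbers: honest visibility is computed per frame from wherever the camera actually points, while the alleged cheat bakes in the answer for the benchmark's fixed path.

    #include <vector>

    struct Object { float x, y, z; };
    struct Camera { float x, y, z, yaw; };

    // Toy visibility test standing in for real frustum culling.
    bool inViewFrustum(const Object& o, const Camera& c) { return o.z > c.z; }

    // Honest: recomputed every frame from the actual camera.
    std::vector<int> visibleHonest(const std::vector<Object>& scene, const Camera& cam) {
        std::vector<int> out;
        for (int i = 0; i < (int)scene.size(); ++i)
            if (inViewFrustum(scene[i], cam)) out.push_back(i);
        return out;
    }

    // Alleged cheat: the visible set for each frame was worked out offline,
    // assuming the camera stays on the demo's rail. Turn around and it's wrong.
    std::vector<int> visibleOnRails(int frameIndex) {
        static const std::vector<std::vector<int>> precomputed = {{0, 3}, {3, 7}};
        return precomputed[frameIndex % precomputed.size()];
    }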
  • Argh (Score:4, Insightful)

    by retro128 ( 318602 ) on Tuesday June 03, 2003 @02:52PM (#6107793)
    What a load of crap. This is one of those things where, if you think about it too much, a bunch of false lines of logic get drawn and you come up with a nonsensical answer. Either that, or Futuremark is trying to avoid a lawsuit from nVidia, which has no doubt been threatened.

    The point of a benchmark is to test dissimilar systems against common references to get an idea of how they perform against each other in such a way that you have an apples to apples comparison.

    If 3DMark writes their program in a way that allows optimization paths for a specific GPU, then it is no longer a benchmark.

    You now no longer have any idea of how fast the card REALLY runs, as there is no guarantee that game writers will use GPU-specific optimizations. It's the same thing as MMX... nobody sees the benefits if it's not hardcoded into the program, so what's the point of being uberfast in a benchmark if you won't necessarily see the same results in the real world?
  • Re:riiiiight... (Score:3, Insightful)

    by residieu ( 577863 ) on Tuesday June 03, 2003 @02:58PM (#6107847)
    But that means your card will be good for current games, and maybe games currently in production, but you have no way of knowing how well it will perform on the next generation of games. If you had real benchmarks that couldn't be optimized for, you could see which cards are better general-purpose cards and which are just better optimized for the current crop of games.

    I guess you just need to buy another card at that point.

  • by IPFreely ( 47576 ) <mark@mwiley.org> on Tuesday June 03, 2003 @03:28PM (#6108110) Homepage Journal
    It's not cheating if you intimidate your accuser into recanting the accusation.

    That seems a bit more appropriate to the story, doesn't it?

  • by Anonymous Coward on Tuesday June 03, 2003 @03:40PM (#6108266)
    Coding for a particular path (i.e., on the rail) is like playing a computer game where everything happens exactly the same way each time: you go left, 3 guys appear, you shoot one, get hit by a bullet, shoot another, run down an alleyway and see one more, shoot, jump, shoot again, get ammo and then die. Nvidia's cheat assumed that a particular set of events was going to happen exactly that way, and that is why certain portions of the screen were not rendered correctly. It is impossible for any computer game to render a scene exactly the same way many times during actual gameplay. This is a cheat and not an optimization. An optimization like ATI's can still render all games correctly, but enhances the performance of a particular game at the possibility of decreased performance in other games.
  • by default luser ( 529332 ) on Tuesday June 03, 2003 @03:42PM (#6108285) Journal
    Don't get me wrong, there is nothing wrong with application-specific optimizations.

    But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines are often optimized-for, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.

    That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a good bet that if two cards perform similarly and acceptably, both should be able to run almost any DX8/DX9 game off the shelf acceptably.

    The fact that Nvidia's unoptimized drivers perform significantly behind ATI's unoptimized drivers in 3dmark 2003 raises a significant question:

    We all know how well the 5900 does in Quake III, Serious Sam 2, UT2003, etc., but how does it do in [insert random DX8 game here]?

    I want to know that if I take *insert random DX8 game here* home to play, IT WILL PERFORM WELL. That is the entire point of having a benchmark like 3dmark. To do application-specific optimizations for it is to nullify the entire point of the benchmark.
  • by Happy Monkey ( 183927 ) on Tuesday June 03, 2003 @03:50PM (#6108378) Homepage
    It's not like saying 1+1 = 3. It's more like saying what's 7+7+7+7+7+7? Well, it's the same as 7*6, but guess which one is faster to calculate?

    It's more like saying "What's 7+7+7+x+7+7?" For the benchmark program, x happens to be 7, leading to 7*6, but the general case is actually 7*5+x. No attempt is made to check an arbitrary game to see if x is 7. It only applies the optimization if the executable is a particular benchmark.
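The arithmetic analogy above translates almost directly into code. A hypothetical sketch: the "optimization" is just a canned answer behind an input check, so it can never fire for a real game's unpredictable x.

    #include <array>
    #include <numeric>

    constexpr std::array<int, 6> kBenchmarkInput = {7, 7, 7, 7, 7, 7};

    int sum(const std::array<int, 6>& v) {
        if (v == kBenchmarkInput)    // "detect the benchmark"
            return 7 * 6;            // canned answer, valid only when x == 7
        return std::accumulate(v.begin(), v.end(), 0);  // honest general case
    }

The speedup is real but unreachable for anyone whose input isn't the benchmark's, which is the whole objection.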
  • by be-fan ( 61476 ) on Tuesday June 03, 2003 @04:05PM (#6108534)
    While I don't agree with everything you say, I must make one point: if someone offered me a Radeon 9800 Pro to replace my GeForce4 MX, I wouldn't take it -- couldn't. I run Linux. The ATI Linux drivers blow compared to the ATI Windows drivers. Far more important, to me, than any cheating NVIDIA might or might not have done is that NVIDIA has a history of releasing quality products, at decent prices, with good driver support. Further, they were the first company to release OpenGL ICDs for consumer-level cards that were conformant enough to perform well with pro-level apps. They were also the first consumer graphics card company whose Linux drivers matched their Windows drivers in performance. ATI's driver situation has gotten better, but until ATI can equal NVIDIA's driver prowess, I'm one loyal NVIDIA customer.
  • by Happy Monkey ( 183927 ) on Tuesday June 03, 2003 @04:16PM (#6108632) Homepage
    Yes, but the point is that most developers will change the value of x anyway.

    That wouldn't help. The optimization is applied only to the benchmark program. In this case, x represents the direction the camera is facing at a particular time. In a game, this is unpredictable and non-optimizable. In the demo, it was set. The optimization does not translate to any gains in any game, no matter what the game developers do.
  • by Naito ( 667851 ) on Tuesday June 03, 2003 @04:19PM (#6108683)
    First let me state that I'm thoroughly disgusted with how nVidia and Futuremark have handled this situation. In a vague kind of way, it sort of relates to the government line of "if you don't support this war, you are a traitor". No one is allowed to voice a bad opinion anymore.

    Secondly, and this is a new train of thought for me, if nVidia had made the benchmark run faster without sacrificing image quality, I think it should be allowed to detect the benchmark was running and have a code path optimized in the driver for it. This could be used to show exactly how fast the hardware is capable of running optimized to the hilt by the driver developers. It could actually have a benefit of showing game developers how they should code their software, that sequencing instructions in this particular order or using certain architecture specific instructions are THAT much faster. Sort of like how 3DNow enhanced Quake showed off how much faster the K6 could be. Unfortunately this was not what they did, they were caught red-handed, and now they're just throwing their weight around. SHAME ON nVidia.
  • by Kedanoth ( 591243 ) on Tuesday June 03, 2003 @04:23PM (#6108737)
    I just wonder why people are so quick to assert that nVidia is horrible for altering their code specifically to get better scores on the benchmark (I do believe it's a deplorable act myself), yet no one says the same things about ATI. True, ATI didn't make such drastic alterations to their code, but they still admitted to making specific changes to affect benchmark scores. In my book, that puts them under the same blame as nVidia.

    So I say that both nVidia and ATI should be ashamed of themselves for such unethical practices.

  • Re:Bullshit (Score:2, Insightful)

    by innocent_white_lamb ( 151825 ) on Tuesday June 03, 2003 @04:31PM (#6108867)
    in other words, that type of customer isn't in the market for a high-end card, doesn't read benchmarks, and won't be affected anyway. The people who do research benchmarks will see--actually, have seen--through the legalspeak and understand that nothing about FutureMark's original announcement has really changed.

    I disagree.

    The type of customer that I'm thinking of is the one who walks into the store and says "I want SuperDuper XXX with a 32bit frigmataz and the blue sticker on the box." In other words, a kid who has no idea of what the terminology actually means, but wants what was recommended to him by his buddy George who goes to school in the next town and has a super-cool copy of some hacker tool that, well, we don't know what it does but it has a cool name, and his sister is kind of cute too, you know, but of course we can't admit that.

    George says, "Look at this. This has a bigger number than that one!" and Little Johnny plonks down his cash.
  • Re:GREAT With Me (Score:3, Insightful)

    by BrookHarty ( 9119 ) on Tuesday June 03, 2003 @04:44PM (#6109045) Journal
    Exactly, as Futuremark put it.

    3DMark03 is designed as an un-optimized DirectX test and it provides performance comparisons accordingly.

    ATI/Nvidia optimized shaders, not cheating.
    Nvidia optimized for rendering in a benchmark, cheating.

    You can optimize, as long as you don't know beforehand what is going to be used in the benchmark.
  • by daVinci1980 ( 73174 ) on Tuesday June 03, 2003 @04:54PM (#6109142) Homepage
    I would mod this down, but I'd rather argue (so much more fun! ;-) ). As someone who has worked on a very large game (team of >50, sold 500K copies so far), someone who works at a small studio now (20 people), and someone who develops at home on the side, (whew) I can say that size and clout has little to do with how much attention video card manufacturers are willing to give you. All that really matters is that they see an interesting prospect, and a way for their card to look "better."
  • Actually, WE care. (Score:3, Insightful)

    by Canis ( 9360 ) on Tuesday June 03, 2003 @05:59PM (#6109779)
    Actually, we game developers care. We want to make use of new features of graphics cards to increase performance and/or visual quality. But also, we want our games to run on all (relatively-recent) cards, without having to write complex hacks to work around the bugs of each one.

    It's no use benchmarking on the latest and greatest games -- because, as developers, we try and avoid releasing games that run horribly (slowly or with obvious bugs) on certain cards. Sometimes we can persuade the manufacturers to fix their bugs, but the timescales can be tight and sometimes they're quite happy to not fix the bugs, especially if they know their competition runs the codepath you're looking at faster than them, and we get forced to drop the feature. So instead of games pushing up hardware quality, games are held back by shoddy hardware or (more usually) drivers. You're just benchmarking on what the manufacturers already know works. Zzzz. Futuremark's job is to stress-test in advance what's coming up.

    And even if Futuremark does things that aren't always what you'd do in games, they are trying to push the cards to the limits to see if they do what the manufacturers claim, or whether they only achieve their claimed performance "in controlled tests".

    So some of us talk to the Futuremark guys and say things like, "We're looking into using [technology X] in our next game, but the drivers on cards [A and B] are screwed, works on [C] though. Could you put a section into 3DMark 2004 that uses [X]?". Then, when their card performs miserably at [X] (even though the card's hardware can handle it -- it's just that they've been slack on the drivers) they get shamed into improving their quality at those features. A bit like WHQL [microsoft.com] for games.

    Once the driver bugs for the features are fixed, we can write code that uses them.

    Except NVidia decided to stop playing nice when it turned out the latest tests make their cards look quite poor, and noticeably slower than ATI's. So they took their ball and went home (dropped out of Futuremark's beta programme). This is why they didn't know their cheating would be discovered.

    And of course, this problem is compounded by people like yourself, Mr Sonic, who see big numbers in Quake benchmarks (you do realise 3D card mfgs "optimise" those too, right?), pop wood, and rush off to buy the latest hovercraft [pcworld.com] no matter if it's not really "all that".

    Incidentally, ATI's optimisations were exactly that: optimisations. Essentially, they were reordering instructions in a shader -- exactly like a compiler optimising instruction order for Intel or AMD processors' particular quirks. The meaning and, more importantly, the on-screen output of the code were not altered. (A sketch of this kind of reordering follows this comment.)

    Whereas it's clear to see from the screenshots in the original exposé article that NVidia were not optimising but actually not running code, causing the on-screen output to look wrong. As developers, we don't want gamers returning our games to the shops "because it goes wrong when you do X". Nor do we want to sweat blood trying to invent ways to avoid their driver bugs.

    Oh... someone else who cares about 3DMarks? OEMs. When it comes around to picking what cards to put inside big-name off-the-shelf PCs or, eventually, which chips to surface-mount on the all-in-one motherboard, they're looking for price-performance, and 3DMark is a part of that equation.
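For contrast, here is what a semantics-preserving reorder of the kind attributed to ATI can look like, written in C++ rather than shader assembly (hypothetical; real shader scheduling is hardware-specific). The output is bit-identical; only the issue order changes.

    // Before: the add must wait on the multiply immediately above it.
    float shadeBefore(float t, float u) {
        float a = t * t;      // op 1
        float b = a + 1.0f;   // op 2, depends on op 1
        float c = u * 2.0f;   // op 3, independent
        return b * c;
    }

    // After: the independent op is hoisted between the dependent pair,
    // hiding the multiply's latency. Same math, same on-screen result.
    float shadeAfter(float t, float u) {
        float a = t * t;      // op 1
        float c = u * 2.0f;   // op 3, issued while op 1 is in flight
        float b = a + 1.0f;   // op 2
        return b * c;
    }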
