More on Futuremark and nVidia

AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. 'Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat.'" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
  • by gazuga ( 128955 ) on Tuesday June 03, 2003 @02:35PM (#6107591) Homepage
    Yes, there is an increase in speed, but there was also a degradation in quality in the test. See here [extremetech.com].

    That's the whole issue...
  • by Anonymous Coward on Tuesday June 03, 2003 @02:41PM (#6107673)
    Total BS. 3DMark2003 is already geared toward specific vendors. I've personally analyzed the data sent to the driver (since I'm writing one), and they totally favor ATI with the heavy use of PS 1.4 shaders. In fact, the data changes completely if PS 1.4 support isn't claimed (3x more geometry is sent).
    Also, PS 1.4 shaders don't always translate 'up' to PS 2.0 hardware very well, which is why (IMHO) NVIDIA started all this hubbub in the first place.

    The only vendor that natively supports PS 1.4 is ATI.

    They should have created PS 1.1 shaders for the masses, and then, if 2.0 hardware is detected, used 2.0 shaders for everything (roughly the caps check sketched after this comment).

    And their "DX9-only" test is a piece of crap too. They use one or two new instructions in the VS, and PS 2.0 is only used for the sky. Big whoop. No branching in the VS, no two-sided stencil, nothing cool.

    It's sad that OEMs put a lot of stock in 3DMark; they don't seem to realize that gamers play games all day, not benchmarks.
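
    A minimal sketch of the caps-based fallback suggested above, assuming a Direct3D 9 device; the path names are placeholders, not Futuremark's or any vendor's actual code:

    ```cpp
    // Sketch: pick a pixel shader path from the reported device caps (Direct3D 9).
    #include <d3d9.h>

    enum ShaderPath { PATH_PS_1_1, PATH_PS_1_4, PATH_PS_2_0 };

    ShaderPath ChooseShaderPath(IDirect3DDevice9* device)
    {
        D3DCAPS9 caps = {};
        device->GetDeviceCaps(&caps);

        // PixelShaderVersion encodes the highest pixel shader model the driver claims.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            return PATH_PS_2_0;   // full DX9 path
        if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            return PATH_PS_1_4;   // only ATI parts expose this natively
        return PATH_PS_1_1;       // lowest common denominator for DX8-class cards
    }
    ```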

  • by Anonymous Coward on Tuesday June 03, 2003 @02:43PM (#6107692)
    ...and that's not a cheat.

    Well, they're right: it's not a cheat. Futuremark, however, has just admitted that, as a performance benchmark that might indicate suitability for other tasks, their rating sucks.

  • by cens0r ( 655208 ) on Tuesday June 03, 2003 @02:45PM (#6107727) Homepage
    I think you don't understand what was done. Nvidia cheated on the benchmark because they knew exactly what needed to be rendered. This kind of "optimization" won't help any other application, but that doesn't mean the benchmark is bogus. The same thing could be done by putting optimizations in for a timedemo (or something similar) in Quake: it would help that specific demo, but it wouldn't help Quake.
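
    For illustration, the general technique described here (keying special-case behaviour off which executable is running) looks roughly like the sketch below; the executable name and flag are hypothetical, not any real driver's code:

    ```cpp
    // Sketch: code loaded into a process (e.g. a driver DLL) can detect the host
    // executable and switch on behaviour that only helps that one application.
    #include <windows.h>
    #include <algorithm>
    #include <cctype>
    #include <string>

    static bool g_useBenchmarkShortcuts = false;

    bool HostExecutableContains(const std::string& needle)
    {
        char path[MAX_PATH] = {};
        GetModuleFileNameA(nullptr, path, MAX_PATH);   // full path of the running .exe
        std::string exe(path);
        std::transform(exe.begin(), exe.end(), exe.begin(),
                       [](unsigned char c) { return static_cast<char>(std::tolower(c)); });
        return exe.find(needle) != std::string::npos;
    }

    void ConfigureRenderPath()
    {
        // Shortcuts tuned to a fixed, known camera path help this one application
        // and nothing else -- the same problem as optimizing for a single timedemo.
        g_useBenchmarkShortcuts = HostExecutableContains("3dmark03");
    }
    ```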
  • by Zone5 ( 179243 ) on Tuesday June 03, 2003 @02:51PM (#6107774)
    That would be worse than useless. No two test runs could be usefully compared, so the utility of the benchmark itself is called into question. You'd be reduced to aggregation of test results in the hope that they statistically settle out towards some sort of 'true' average for a given card/driver combo.

    The inherent value of a benchmark is the notional "apples to apples" comparison, and you've taken even that away. There would now be no reason at all to use 3DMark.
  • by micq ( 266015 ) on Tuesday June 03, 2003 @02:56PM (#6107835)
    What would be great...
    If someone were to reverse engineer the drivers, remove the "optimisation", recompile, and compare results, we could see by what percentage the "optimisations" fudged the results.


    You don't have to. In the previous story, they described how they simply removed the condition the driver used to switch on this optimisation and, as you said, saw by what percentage the optimisation fudged the result.
  • Re:Bullshit (Score:3, Informative)

    by Trepalium ( 109107 ) on Tuesday June 03, 2003 @03:04PM (#6107898)
    But did you also not read that ATI CHEATS? Oh, that's right -- ATI and FutureMark are on good terms. There are three kinds of lies in this world: lies, damned lies, and benchmarks.
  • by msimm ( 580077 ) on Tuesday June 03, 2003 @03:08PM (#6107930) Homepage
    Posted in response to this [slashdot.org] initially, but this is such a popular misconception. ;-)

    I think the idea was to test new technologies that haven't been implemented yet in Quake 3 or Unreal Tournament 2003 (like in the upcoming DOOM III).

    A quote of a quote in their 10/26/98 press release:
    "3DMark sets a long awaited standard for testing actual game performance for titles like Unreal as well as the future technologies. I support it one hundred percent."

    -- Tim Sweeney, Unreal Programmer, Epic MegaGames
  • Re:riiiiight... (Score:3, Informative)

    by Surak ( 18578 ) * <surakNO@SPAMmailblocks.com> on Tuesday June 03, 2003 @03:19PM (#6108027) Homepage Journal
    Actually, Gates did say something similar, but the "640K ought to be enough for anyone!" quote itself is a misquote; I could cite a source if I had the book with me. Stephen Manes and Paul Andrews' "Gates" has the actual quote in it (if someone has the book, could they please post it? Mine's packed away, thanks). And in all fairness, the quote is taken out of context: it was said around 1980 or 1981, when people were coming off CP/M-80 machines with 64K of memory onto the brand spankin' new 16-bit 8088-based IBM PCs.
  • by Zathrus ( 232140 ) on Tuesday June 03, 2003 @03:25PM (#6108089) Homepage
    I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX

    Shrug... welcome to reality. DirectX, OpenGL, etc. don't properly model the hardware in some cases, leading to much worse performance than should be available.

    It's not like saying 1+1 = 3. It's more like saying what's 7+7+7+7+7+7? Well, it's the same as 7*6, but guess which one is faster to calculate?

    And it's not quite like that either, I know, because the bit that FutureMark is tentatively agreeing with is Nvidia changing the shader precision, which can lead to a loss of quality (so maybe it is 1+1 = 1.999999999998).

    ATI did pretty much the same thing with their drivers, leading to a much slimmer 1.9% improvement. Of course, it's unclear how much of NVIDIA's improvement came from the shader changes (which FM is reconsidering) versus the other modifications they made (like the clipping issues). FM is not reconsidering those latter points; they are cheating.
  • Re:GREAT With Me (Score:3, Informative)

    by Zathrus ( 232140 ) on Tuesday June 03, 2003 @03:29PM (#6108124) Homepage
    That's not in question though... what's in question is NVidia's changing of the shader precision from FP32 to INT12. It's entirely reasonable that this is a tradeoff that would be made to improve speed at a slight cost to render quality -- since the speed improvement can be substantial.

    Not clearing the back buffer and other "on rails" cheats are still classified as cheats. It's merely the shader changes that are being considered as possibly OK. Which, btw, is similar to what ATI did. But I don't know that ATI's changes sacrificed visual quality at all. (Not that they haven't done that in the past...)
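
    As a rough illustration of the FP32-to-INT12 tradeoff discussed above (not NVIDIA's shader code), rounding a value through a 12-bit fixed-point representation loses precision like this:

    ```cpp
    // Sketch: quantize a [0,1] value to 12-bit fixed point and measure the error.
    #include <cmath>
    #include <cstdio>

    float QuantizeTo12BitFixed(float x)
    {
        const float scale = 4095.0f;   // 2^12 - 1 steps across [0, 1]
        return std::round(x * scale) / scale;
    }

    int main()
    {
        const float original = 0.123456789f;
        const float rounded  = QuantizeTo12BitFixed(original);
        std::printf("fp32: %.9f  int12: %.9f  error: %.9f\n",
                    original, rounded, original - rounded);
        // A single pass loses on the order of 1e-4; long chains of shader math
        // accumulate the error, which is where visible banding can come from.
        return 0;
    }
    ```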
  • by be-fan ( 61476 ) on Tuesday June 03, 2003 @03:41PM (#6108276)
    One key point, though: NVIDIA's shader precision at its highest setting is much higher than ATI's, while NVIDIA's middle precision is lower than ATI's maximum. This means that comparing the performance of the two is something of a crapshoot, because you can't configure them to use the same precision.
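
    Assuming the commonly cited layouts for these formats (FP16: 10 mantissa bits, ATI's FP24: 16, FP32: 23), the gap in relative precision can be sketched as follows:

    ```cpp
    // Sketch: relative precision (machine epsilon) implied by each mantissa width.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        struct Format { const char* name; int mantissaBits; };
        const Format formats[] = {
            {"FP16 (NVIDIA partial precision)", 10},
            {"FP24 (ATI pipeline)",             16},
            {"FP32 (NVIDIA full precision)",    23},
        };
        for (const Format& f : formats)
            std::printf("%-32s epsilon ~ %g\n", f.name, std::pow(2.0, -f.mantissaBits));
        // FP16 is noticeably coarser than FP24, and FP32 noticeably finer, so there
        // is no setting at which the two vendors compute at identical precision.
        return 0;
    }
    ```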
  • by TrekkieGod ( 627867 ) on Tuesday June 03, 2003 @03:47PM (#6108354) Homepage Journal
    If NVidia wants to do application-specific optimizations that make UT2003 go faster, then that would be great. That's what they should be doing. Those are optimizations that genuinely benefit the user.

    Problem is, NVIDIA didn't just optimize. Their application-specific "optimization" made the images look worse. And when you couldn't notice it, it was because they were clipping outside the camera angle (because they knew exactly where the camera was, something they can't do when you're playing UT2003).

    Like the original statement by Futuremark said, optimization is great. But when you change the image intended by the software designer in order to make it go faster, that's not an optimization. For god's sake, I can turn all the details down to low in UT2003 and get it to go faster, but that's beside the point.

    The reason game-specific benchmarks don't fly for me (although now that Futuremark has issued this statement, I'm sure ATI will start cheating as well, making the whole thing useless) is that synthetic benchmarks can test features new to the cards that games may not have implemented yet. So I get an idea of how a card will perform with future games.

  • by TrekkieGod ( 627867 ) on Tuesday June 03, 2003 @04:29PM (#6108843) Homepage Journal
    The whole point of benchmarks is to get real-world performance of something, right?

    Sure... but my point was that by benchmarking only with current games, you can't get real-world performance for features that haven't been implemented in any game yet. A good 3DMark score generally means a good gaming experience (unless the drivers are cheating on the 3DMark score).

    Shouldn't it also try to emulate an application that is optimized as much as possible, so as to get the highest possible performance instead of the lowest score?

    Nope. They're testing the graphics cards and their respective drivers, not the skills of game programmers.

    I can already tell you what the worst possible performance of all future video cards on any possible software will be: less than one frame per second. That we already know. What we should be using programs like 3DMark to find out is how fast a computer with a given card CAN run, because 3DMark is about the possibilities a card has, not its REAL WORLD performance.

    Hmm... what? You're NOT trying to find out how fast a computer with a card can run. You're trying to compare video cards so that you know which one to buy; that's why NVIDIA wants to cheat, so that you choose theirs instead of the ATI equivalent. To do that accurately, you tax them both equally, as hard as you can, and you *definitely* test all the new features.

    Game benchmarking is good: it tells you how well a card will run current games. Synthetic benchmarks are also good: they tell you how well your card will run games that aren't out yet. If you get a card that scores well on 3DMark, chances are you won't need to replace it as early as one that does well with current games but completely flops on the new features.

  • by afidel ( 530433 ) on Tuesday June 03, 2003 @04:33PM (#6108892)
    And he also said flat out that NVIDIA's "optimization" was a cheat. By throwing away data in a way that could not be derived from the data supplied by the engine, the NVIDIA engineers cheated; a real-world application would never make those optimizations, because as soon as the viewpoint changes the optimization becomes worthless and the resulting images are incredibly horrible.
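
    To picture why viewpoint-dependent clipping cannot be a general optimization, here is an illustrative cull test against a hard-coded plane; the geometry and numbers are made up for the example:

    ```cpp
    // Sketch: a cull test against a plane is only valid if the plane is derived
    // from the live camera; baking in planes for a scripted camera path breaks
    // the moment the viewpoint moves.
    #include <cstdio>

    struct Vec3  { float x, y, z; };
    struct Plane { Vec3 n; float d; };   // points with dot(n, p) + d < 0 are culled

    float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    bool Culled(const Plane& plane, const Vec3& p)
    {
        return Dot(plane.n, p) + plane.d < 0.0f;
    }

    int main()
    {
        // A plane that happens to be safe for one scripted camera position...
        const Plane baked  = {{0.0f, 0.0f, 1.0f}, -5.0f};
        const Vec3  object = {0.0f, 0.0f, 3.0f};
        std::printf("culled on the scripted path: %d\n", Culled(baked, object));
        // ...but move the camera so the object should be visible and it is simply
        // missing from the frame, because the plane no longer matches the view.
        return 0;
    }
    ```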
  • Re:riiiiight... (Score:3, Informative)

    by jetmarc ( 592741 ) on Tuesday June 03, 2003 @05:51PM (#6109716)
    > Of course, if it works better with benchmarking software than it does
    > with real-world applications, that is cheating, isn't it?

    I like it when card manufacturers optimize their driver to achieve high Quake III Arena frame rates, because coincidentally Quake III Arena is my favourite game (and actually the ONLY game I play). I don't care whether the drivers are good by thoughtful design, or by re-engineering the Q3A code path and then constructing a driver that is an exact fit (possibly with penalties for other games).

    While your favourite game is probably not "Futuremark", I can see that a lot of people (including buyers) are happy when their card performs well with that application. So I won't consider this cheating.

    Take advantage of this race: promote your own favourite game until it is popular enough to be included in magazine benchmarks. Then the card manufacturers will optimize their drivers for YOUR game too... to such a breathtaking degree that others will bash them for "cheating".

    Marc

    PS: Hey, this FPS shooter cheats, too. It's not modeled down to the molecule, but rather builds the characters from triangles. Bah, CHEATING! :-)
  • by Kris_J ( 10111 ) on Tuesday June 03, 2003 @07:51PM (#6110691) Homepage Journal
    Not true.

    An "application specific optimisation" still displays a picture of the advertised quality and still crunches all the numbers. A "cheat" drops the quality to bump up the frame rate when it sees a particular application running, without reporting the drop in quality.

    Nvidia's card produces the advertised quality at all times; it just understands what 3DMark2003 is trying to do and knows how to do it particularly well. It's a fine line they didn't cross.

  • by BrookHarty ( 9119 ) on Tuesday June 03, 2003 @09:29PM (#6111196) Journal
    Guess you should read the entire article.

    Futuremark build 320 vs. 330 (build 330 doesn't have the NVIDIA cheat):

    3,215 - ATI Radeon 9800 Pro 256MB w/ 3DMark03 Build 330
    2,821 - Nvidia GeForce FX 5900 Ultra w/ 3DMark03 Build 330

    Using the cheat, Nvidia scored 3,458.

    The ATI 9800 Pro is faster.

"Experience has proved that some people indeed know everything." -- Russell Baker

Working...