Futuremark Replies to Nvidia's Claims

Nathan writes "Tero Sarkkinen, Executive Vice President of Sales and Marketing at Futuremark, has commented on the claims by Nvidia that 3DMark2003 intentionally puts the GeForce FX in a bad light, made after Nvidia declined to become a member of Futuremark's beta program. This issue looks like it will get worse before it gets better." ATI also seems to be guilty of tweaking their drivers to recognize 3DMark.
  • by Anonymous Coward on Tuesday May 27, 2003 @11:31AM (#6048860)
    It won't be fast enough next year.
    • You're backwards (Score:4, Informative)

      by Anonymous Brave Guy ( 457657 ) on Tuesday May 27, 2003 @02:34PM (#6050614)

      Nothing will even use the new kit to its fullest for that long.

      I'm rapidly coming to the conclusion that the way to go with video cards is to buy one a year old. It's much cheaper, and typically handles all current and near future games perfectly well. The new gizmos, and speed boosts, on these cards rarely provide worthwhile bang for your buck these days.

      Use the money you save to buy a faster processor, more RAM, a RAID array or something else that provides a useful improvement in performance outside of the theoretical. Or if you're buying/upgrading card + monitor together, get an extra couple of inches of screen real estate or go for a nice flat panel. The difference in price really is of that order, yet the difference in ability is irrelevant for almost all real applications.

  • by wowbagger ( 69688 ) * on Tuesday May 27, 2003 @11:34AM (#6048893) Homepage Journal
    nVidia: "Well, for this program they will never step off the rail, so we can fake it so it looks good from the rail only."

    ATI: "Well, this shader program isn't optimally coded - here is a more optimally coded shader that does the exact same thing but more quickly."

    nVidia: "Well, you caught us, but we have to cheat because you have it in for us!"

    ATI: "Well, you caught us, and although we were doing the exact same thing (only faster), we will remove that code ASAP."

    • by KillerHamster ( 645942 ) on Tuesday May 27, 2003 @11:37AM (#6048930) Homepage
      SCO: Hey, we have a patent on cheating, pay us money!
      • by Anonymous Coward
        "SCO: Hey, we have a patent on cheating, pay us money!"

        Wouldn't that be:

        SCO: Hey, we have a patent on SOMETHING you did, pay us money!
    • Re:nVidia vs. ATI (Score:5, Interesting)

      by asdkrht ( 655186 ) on Tuesday May 27, 2003 @11:44AM (#6049017)
      From what I've heard, Nvidia totally replaced the shader programs with ones that they wrote. All ATI did was reorder some of the instructions in the shaders to "optimize" them. The optimized and the original shader programs were functionally equivalent. Sort of what happens when a compiler optimizes code. The same can't be said for what Nvidia did.
      • Re:nVidia vs. ATI (Score:5, Insightful)

        by stratjakt ( 596332 ) on Tuesday May 27, 2003 @12:13PM (#6049284) Journal
        If it was a generic optimization (and it probably should have been), there'd be no issue.

        ATI recognized the 3dmark executable and special cased for it. Which is misleading and wrong. The performance is enhanced for 3DMark and 3DMark alone.

        • Re:nVidia vs. ATI (Score:4, Insightful)

          by tha_mink ( 518151 ) on Tuesday May 27, 2003 @01:00PM (#6049760)
          But the key point is...it's only a fucking benchmark. Who cares anyways. Just another reason to never trust benchmark programs. I don't care how well a card performs on a benchmark since I don't PLAY benchmarks.
        • Re:nVidia vs. ATI (Score:2, Interesting)

          by dnoyeb ( 547705 )
          You are absolutely correct. Especially considering that ATI is part of the beta program. So this means that other cards will not benefit from the optimizations to "FutureMark03" that ATI made. The correct procedure for ATI would have been to tell Futuremark that they need to optimize at a certain point.

          It's kind of like ATI finding a performance bug and working around it, but not telling anyone else about it. It's more opportunistic cheating. It's not blatant. hehe, I just don't feel as bad about th
      • Re:nVidia vs. ATI (Score:2, Insightful)

        by Anonymous Coward
        Neither is problematic unless it removes generality from the code-piece in question or reduces quality. There's a fine line: If the code substitution produces the exact same output for all possible inputs, not just the inputs which occur in the benchmark, then it's an optimization (which could possibly be done in the same way for other programs). If the code-substitution only works when certain parameters are limited by the benchmark, then it's a cheat. AFAIK NVidia's modifications do not remove generality
  • by Frederique Coq-Bloqu ( 628621 ) on Tuesday May 27, 2003 @11:34AM (#6048895) Journal
    3DMark will look totally sweet because it's *optimised* for both cards.
  • by Cannelbrae ( 157237 ) on Tuesday May 27, 2003 @11:35AM (#6048906)
    It's about the OEMs as much or more than the consumer market. They watch the benchmarks closely -- and make decisions based on results.

    This is where the money really is, and what is worth fighting for.
    • by anonymous loser ( 58627 ) on Tuesday May 27, 2003 @01:23PM (#6049967)
      OEMs make decisions based on cost, feature set and possibly name recognition, but only if it adds value to their product. They care about the business relationships they have with the vendors, and whether they can get price breaks, and whether the vendor's product integrates easily with their own.

      Benchmarks that differ by a couple of percent depending on which test is run are not going to make a big difference in the overall decision process. If they made decisions based on benchmarks then ATI would have closed its doors many years ago, since until very recently they were consistently outclassed by their competitors performance-wise. However, ATI has done VERY well in the OEM market during this time not due to better performance, but due to the factors I listed.

      • That's odd. ATI has always been known for the best TV-out in the industry. I would say their "performance" IS the direct reason why they have won out in the OEM market for so long. Remember, OEMs barely care about 3D performance. And I am talking about laptops with built-in graphics here, and desktops for big corporations. Any 3D work historically went to SGI or Sun anyway.
        • ATI Rage Mobility was the de facto standard 3D chipset on many platforms. It had nothing to do with TV-out, as most machines using the ATI chipset didn't even offer that feature. That is what kept ATI in business, not selling a few consumer-level cards with TV-out. And SGI has been a has-been in 3D for YEARS. They decided to restructure around high-performance computing. Maybe you missed the part where they bought Cray and made that huge announcement. Cray, BTW has subsequently split off again as a separate
  • by leeroybrown ( 624767 ) on Tuesday May 27, 2003 @11:35AM (#6048908)
    I suppose you have to expect some poor practices, considering that the top 3DMark card will be considered by many gamers to be the best to buy. It's a massive temptation for such a big industry. I find ATI's decision to remove code which it claims boosts overall performance quite funny.
    • by default luser ( 529332 ) on Tuesday May 27, 2003 @01:17PM (#6049914) Journal
      ATI, suddenly finding themselves in a corner, made a very smart decision under pressure.

      Point is, they can come out of this wearing the white hat, because they were the first to be such good guys about the issue.

      The fact is, even with all Nvidia's optimizations in place, their high-end card will just barely edge out a 9800 Pro without optimizations. Add to this the fact that ATI, 3dmark and the community will hound them and discount Nvidia's optimizations until they are removed, and you've got an all-out win for ATI.

      Remember folks: everyone cheats. Fools take things too far and get caught. ATI has played the fool before, Nvidia plays it now; that is the game.
  • by Loie ( 603717 ) on Tuesday May 27, 2003 @11:36AM (#6048913)
    ATi's tweak yields a 1.9% gain by rearranging the instructions 3DMark issues to its hardware. Anyone familiar with assembly language knows that properly arranging your instructions prevents stalls; the end result, however, is exactly as intended. It sounds to me like this is what ATi did. nVidia, on the other hand... 40% gains with very obvious visual errors is... well, wrong.
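
    (A toy illustration of the scheduling idea above, in C++ rather than shader assembly; purely hypothetical and not any vendor's actual code. Interleaving two independent chains of dependent multiplies gives a pipelined unit work to do while earlier results are still in flight, and the final values are identical either way.)

        #include <cstdio>

        // Dependent chain: each multiply needs the previous result, so a
        // deeply pipelined unit stalls between instructions.
        float chained(float x) {
            float a = x * 1.1f;
            float b = a * 1.2f;   // must wait for a
            return b * 1.3f;      // must wait for b
        }

        // The same math for two inputs, interleaved: the chains are
        // independent, so one can fill the latency of the other. The
        // results are identical to calling chained() twice.
        void interleaved(float x, float y, float* ox, float* oy) {
            float ax = x * 1.1f;
            float ay = y * 1.1f;
            float bx = ax * 1.2f;
            float by = ay * 1.2f;
            *ox = bx * 1.3f;
            *oy = by * 1.3f;
        }

        int main() {
            float ox, oy;
            interleaved(2.0f, 3.0f, &ox, &oy);
            std::printf("%f==%f  %f==%f\n", chained(2.0f), ox, chained(3.0f), oy);
            return 0;
        }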
    • by homer_ca ( 144738 )
      In one sense ATI's tweak is not as bad because they're still rendering the scene with full image quality, where NVIDIA is rendering with reduced quality. However, it's still deceptive because it's optimizing for the special case of a benchmark, and real games (or a renamed 3dmark executable) will run slower.
      • No, renaming the executable won't help, as they use a different method of detecting 3DMark.
        I guess they learned a bit from ATI and their quack.exe debacle... not enough, though :/
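
        (For readers wondering what "detecting 3DMark" even means at the driver level: below is a minimal, hypothetical sketch of the naive name-based check, the kind a renamed executable defeats. Real drivers can instead fingerprint the shaders or command stream, which is the point being made here; nothing in this sketch is real driver code.)

            #include <windows.h>
            #include <cctype>
            #include <cstring>

            // Hypothetical: decide whether the current process "looks like" a known
            // benchmark based only on its executable name. Renaming the .exe is
            // enough to defeat this particular check.
            bool looks_like_known_app() {
                char path[MAX_PATH] = {0};
                if (GetModuleFileNameA(nullptr, path, MAX_PATH) == 0)
                    return false;
                for (char* p = path; *p; ++p)   // lower-case for a case-insensitive match
                    *p = static_cast<char>(std::tolower(static_cast<unsigned char>(*p)));
                return std::strstr(path, "3dmark") != nullptr
                    || std::strstr(path, "quake3") != nullptr;
            }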
      • and real games (or a renamed 3dmark executable) will run slower

        Not entirely true, as ATI and nVidia both work closely with big-name game studios to make sure that optimizations such as these are in the game. Obviously the benchmark didn't use the optimizations they asked for, so they took it into their own hands. Sneaky, yes, but it is reflective of performance in real games (at least big-name ones -snicker-)

    • The jury is still out on what ATI did. They were caught during the NVIDIA investigation. Futuremark has yet to start an investigation into ATI's drivers (though they say they soon will). So ATI is hoping that by pulling this out, they can get Futuremark to NOT investigate their drivers any further.

      Any modifications are wrong. If you cheat a little, you will cheat a lot. I can not imagine anyone at ATI agreeing to ONLY a 1.9% cheat. That is only asking to be caught with no benefit to the cheat. I am sure we will fi
  • "3DMark03 was developed strictly according to DirectX9 standard in very close cooperation with Microsoft and other BETA members. If hardware performs well 3DMark03, it performs well in all applications that use DirectX 9. Note that since 3DMark is designed to be an objective evaluation tool, it does _not_ include manufacturer-specific optimizations. This is why it is exceptionally well suitable for objective performance measurement. "

    Does this guy [slashdot.org] work for NVidia?
    • by aliens ( 90441 ) on Tuesday May 27, 2003 @11:44AM (#6049014) Homepage Journal
      Hardocp [hardocp.com]

      They do a good job of dissecting the benchmark, and I'd have to agree that as a DX9 benchmark it fails.

      Whatever, it's still just a synthetic mark and nothing more.

    • by Anonymous Coward on Tuesday May 27, 2003 @11:48AM (#6049066)
      This quote is misleading. "DirectX9" alone means nothing.

      We need to look at the new shader features offered by DirectX9, these are:
      - Pixel and Vertex shaders 2.0 (supported by ATI R3xx line and GeForceFX)
      - extended Pixel and Vertex shaders 2.0 (supported only by GeForceFX)
      - Pixel and Vertex shaders 3.0 (no support until R400/NV40)

      Now let's look at the features which are used by 3DMark03:
      - Game 1: no shaders at all, only static T&L
      - Game 2: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
      - Game 3: vertex shader 1.1 and pixel shader 1.4 (which isn't natively supported by NVIDIA cards)
      - Game 4: vertex shader 2.0 and pixel shader 1.4+2.0

      This means that:
      -DirectX9 offers three new different shaders.
      -Three of four 3DMark03 demos don't use new DirectX9 shaders at all
      -Three of four 3DMark03 demos use Pixel Shader 1.4 which was introduced with DirectX8.1 and isn't natively supported by NVIDIA cards
      -Only one set of new DirectX9 shaders is partially used, in one 3DMark03 demo

      Thus 3DMark03 shouldn't be called a "DirectX9" benchmark. The quote "If hardware performs well 3DMark03, it performs well in all applications that use DirectX 9" should be changed to: "If hardware performs well 3DMark03, it performs well in all applications that use Pixel Shader 1.4"
      • by 0123456 ( 636235 ) on Tuesday May 27, 2003 @12:53PM (#6049697)
        "Three of four 3DMark03 demos don't use new DirectX9 shaders at all"

        No, but they use shaders which are generally only supported on DX9 cards and a few older ATI cards. Just because you have a PS2.0 card that doesn't mean you have to use PS2.0 if PS1.4 can do the same: why deliberately make more work for yourself by not supporting older cards?

        "Three of four 3DMark03 demos use Pixel Shader 1.4 which was introduced with DirectX8.1 and isn't natively supported by NVIDIA cards"

        Support for PS1.4 is a requirement of DX9, so if the GF FX is a DX9 card then it supports PS1.4, and your claim is therefore bogus. If it doesn't support PS1.4, then it's not a real DX9 card.
        • He's right on the money. PS 2.0 is not always needed; sometimes 1.4 is enough, so there's no point in using 2.0. And in other cases 2.0 is needed because 1.4 just doesn't cut it. 3DMark03 uses 1.4 where it's the smart thing to do, and 2.0 when that's the smart thing to do. All DX9 vid-cards support 1.4, since one of the requirements of Direct3D compatibility is backward compatibility.
      • by GarfBond ( 565331 ) on Tuesday May 27, 2003 @02:32PM (#6050588)
        PS1.4 isn't natively supported by *most* nvidia cards. The spec for PS2.0 is such that it's all-encompassing. If you support PS2.0 you support PS 1.4 and PS 1.1. If you support PS1.4 you support PS 1.1, etc.

        So this is how it should look, properly:
        - Game 1: no shaders at all, only static T&L (DX7-class effects, given comparatively little weighting in overall score)
        - Game 2: vertex shader 1.1 and pixel shader 1.4 (natively supported by GFFX, ATI Radeon 8500 and above)
        - Game 3: vertex shader 1.1 and pixel shader 1.4 (natively supported by GFFX, ATI Radeon 8500 and above)
        - Game 4: vertex shader 2.0 and pixel shader 1.4+2.0 (DX9 cards only, Radeon 9x00 and GFFX)

        Nvidia's lack of support for PS1.4 is their own design choice, and now they have to live with it. The GF4 was released after DX8.1 came out, which contained the PS1.4 spec, but they chose not to support it. The ATI Radeon 8500 and above have no problem with this because they supported DX8.1 from the get-go, but nvidia did not change and continued their 8.0 support. As was previously mentioned in the article, nvidia was participating in the developer beta until Dec 2002, well into the development period for 3dm03 and a month after they paper-launched the GFFX, so they knew what was going on with the benchmark for a long time beforehand and didn't change their stance for a while. Presumably, as a beta member up until Dec 2002, if they didn't like the extensive use of PS 1.4, they could have said something earlier.

        The key to 3dm03 is its goal as a forward-looking benchmark. Both DX8 games and DX9 games are currently in development, and many DX7 games are still in existence (remember, HL2 doesn't require anything above a DX6 card), so in this respect 3DM03 is still fair in its test design.
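
        (As a rough sketch of how a Direct3D 9 application picks a shader path, and why "supports 2.0" implies the 1.4 and 1.1 paths also work: the capability query below is ordinary D3D9 usage, though the reporting around it is purely illustrative and assumes an IDirect3D9 object created elsewhere.)

            #include <d3d9.h>
            #include <cstdio>

            // Query the default adapter's pixel shader support and pick a path.
            // Assumes a valid IDirect3D9* obtained from Direct3DCreate9().
            void choose_pixel_shader_path(IDirect3D9* d3d) {
                D3DCAPS9 caps;
                if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
                    return;

                if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                    std::printf("PS 2.0 path (also implies 1.4 and 1.1)\n");
                else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
                    std::printf("PS 1.4 path (DX8.1-class hardware)\n");
                else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
                    std::printf("PS 1.1 path (DX8.0-class hardware)\n");
                else
                    std::printf("No pixel shaders; fixed-function fallback\n");
            }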
    • by sjelkjd ( 541324 ) on Tuesday May 27, 2003 @11:55AM (#6049122)
      3dMark isn't a standard. It's a business that makes money by charging hardware developers LOTS of money to be included in their "BETA" program. In real life(TM), manufacturer-specific optimizations matter. Many games will look better and run faster if they use vendor-specific OpenGL extensions, for instance. A gamer looking to buy the fastest card to run his favorite game should look for benchmarks of that game. FutureMark is trying to make a business out of predicting the behavior of games that aren't out. Well, either the game you want to play is out or it isn't. If it's out, buy your card based on benchmarks for it. If it's not, wait until it's out before you spend your money. There is no guarantee that 3dMark is a good predictor of DirectX 9 performance.
      • 3dMark isn't a standard. It's a business, who makes money by charging hardware developers LOTS of money to be included in their "BETA" program.


        Membership in the beta-program costs about 5000 dollars. That's peanuts for companies like Ati and NVIDIA.
  • open source (Score:5, Insightful)

    by phre4k ( 627961 ) <esbenp AT cs DOT aau DOT dk> on Tuesday May 27, 2003 @11:36AM (#6048923)
    People keep begging nvidia to release their drivers under an open license. Well, I guess we now know why they don't. /Esben
  • by Poofat ( 675020 ) on Tuesday May 27, 2003 @11:37AM (#6048933)
    This is why you need many forms of evaluations to properly test something. Just running one program to show you pretty pictures is not going to give any meaningful result. You need to stress test the card in other ways.

    And, since one of the main reasons people will buy this is to play flashy and pretty games, ignoring the performance in those games is ridiculous.
    • And, since one of the main reasons people will buy this is to play flashy and pretty games, ignoring the performance in those games is ridiculous.

      Which is one of the reasons most people would look at publications that provide multiple types of benchmarks including performance in various popular games or game engines.
  • by cmburns69 ( 169686 ) on Tuesday May 27, 2003 @11:38AM (#6048946) Homepage Journal
    Back when nvidia acquired 3dfx, they began to merge their development teams. The FX is the first card by nvidia to be developed by engineers from both the nvidia and 3dfx groups.

    Of course it will work better when you do it their way; It was 3dfx's strength in the beginning, and its downfall in the end.

    But I believe that their current development team has yet to hit its stride, and future offerings will see the trophy going back to nvidia... but who knows! I'm no fortune teller...
  • driver tweaking (Score:5, Informative)

    by erikdotla ( 609033 ) on Tuesday May 27, 2003 @11:42AM (#6048987)
    ATI also is crafty at tweaking their drivers to suck. They should be working on decent drivers instead of cheating on stupid benchmarks to get +1.9%.

    I bought a Radeon 9700 Pro. The driver issues almost make it not worth the increased FPS over my Ti4400.

    The refresh rate problem in XP is annoying as hell. ATI handles it even worse than NVidia, where you set your "maximum" refresh rate and your "highest" resolution, and it assumes that all lower resolutions can only handle that refresh rate.

    There's no way to tell your ATI card, "My monitor can do 1280x1024 @ 85hz, but 1600x1200 @ only 75hz." You either get 75hz in everything if you want 1600x1200, or you get 85hz up to 1280x1024, and have to avoid 1600x1200 use lest ye risk getting "frequency over range".

    NV handles it better with the driver, allowing you to set maximum refresh rates for every resolution individually.

    These refresh rate tweaking programs don't help either, since half the games out there choke when you use them. PC 3d is in a bit of a sorry state right now, and I'm tired of it.

    • Re:driver tweaking (Score:3, Informative)

      by JDevers ( 83155 )
      You forget that both of those paths are workarounds for a problem with the operating system. If you want to complain about the problem, complain about Microsoft.

      I have a 9700 and don't have ANY driver problems, what sort of issues are you having?
    • Re:driver tweaking (Score:5, Informative)

      by UberLord ( 631313 ) on Tuesday May 27, 2003 @11:59AM (#6049166) Homepage
      Try using Refresh Force [pagehosting.co.uk] which directly alters the windows monitor information in the registry and removes every mode but the one you specify for each resolution. This allows games to run in any screen size at the requested refresh rate and not cause them to choke or crash. It's worked fine on all my cards so far (GeForces, Radeons, Kyros)

      Besides, this is more of a windows quirk than a driver thing, as MS requires the driver to behave like this to pass its WHQL tests.
  • What a mess! (Score:5, Informative)

    by georgep77 ( 97111 ) on Tuesday May 27, 2003 @11:43AM (#6048996) Homepage Journal
    This whole episode has turned into a big mess. NVDA seems to be the bad guy in all of this. Their DX-9 product was delayed and their existing products were only DX 8.0*. The benchmark heavily favours DX-9 parts, and NVDA's existing lineup was/is getting smoked in the benchmark by its main (only) competitor. They decided to go on the offensive and try to kill off this benchmark. The 30-person company that produces 3DMark has stood its ground against the multi-billion-dollar NVDA. NVDA, instead of admitting that their pixel shaders are quite slow when running against the 2.0 spec, tries to deceive and FUD their way out of it. Looks like they got more than just some patents when they purchased 3DFX...
    Now they have painted themselves into a corner and how this will turn out is anyone's guess.

    *DX8.1 has PS 1.4 which is actually much closer (AFAIK) to PS 2.0 than PS 1.3 (DX8).
    • Re:What a mess! (Score:3, Informative)

      by htmlboy ( 31265 )
      The benchmark heavily favours DX-9 parts, and NVDA's existing lineup was/is getting smoked in the benchmark by its main (only) competitor. They decided to go on the offensive and try to kill off this benchmark. The 30-person company that produces 3DMark has stood its ground against the multi-billion-dollar NVDA. NVDA, instead of admitting that their pixel shaders are quite slow when running against the 2.0 spec, tries to deceive and FUD their way out of it.

      other people have mentioned this, but take
  • by LegendOfLink ( 574790 ) on Tuesday May 27, 2003 @11:43AM (#6048998) Homepage
    What!? Two giant corporations actually doing something MS-like to make themselves more appealing?! That's unheard of! Why, one might think this is a ploy to increase marketshare! Corporations are our friends, they would never manipulate the people. Damn the man!
  • by EXTomar ( 78739 ) on Tuesday May 27, 2003 @11:43AM (#6049000)
    The problem isn't that benchmarks lie. We all know they do. The problem is we don't know how they lie. Creating open source benchmark applications can show how the driver is exercised, so everyone who wants to can learn where cards and drivers are strong and weak. Everyone is on the level if everyone can look at the code that came up with the numbers. Not to mention there are things to learn from code in benchmarks that exercises the fringe elements of graphics cards and drivers.

    The alternative is what we have now: hand-waving voodoo. Not only do we have to take the vendor's word that they aren't monkeying around with the driver to match the execution of the benchmark, but now we also have to question where the allegiance of the benchmark makers lies.
    • by UberLord ( 631313 ) on Tuesday May 27, 2003 @11:55AM (#6049129) Homepage
      Open Source benchmarks are a bad idea because Closed Source drivers can still be used which may or may not contain these "cheats".

      Better to have open source drivers so we can inspect the driver for cheating/optimisations.

      In fact, open source them all if the hard numbers are that important!
    • Once Again A Call For Open Source Benchmarks

      Well, that's all well and good, but what does it accomplish? How do we decide who is allowed to work on the standard, when virtually everyone with sufficient skills and clout will have an angle? Do we let hardware developers do it? No, that's like having the fox guard the proverbial henhouse. Game developers? Maybe, although they're tainted too - for one, because they partner with hardware makers, and second, there are instances where they might want to

    • Hmm... wouldn't that only make situations like this worse and easier to accomplish? It would make the benchmarking even more predictable to the card manufacturers, who could then cheat even better.
  • by jtilak ( 596402 ) on Tuesday May 27, 2003 @11:44AM (#6049010) Journal
    AFAIK, ATI displays the graphics on screen properly; the drivers are just optimized for the benchmark. One could still consider this cheating. NVIDIA, however, does not display the graphics properly; it really does cut corners (literally) to get higher scores. ATI got an extra 3% from cheating. NVIDIA got a whopping 24% higher score from cheating! Take a look at the ExtremeTech screenshots:

    http://www.extremetech.com/print_article/0,3998,a=41574,00.asp
    • Don't get me wrong, there is nothing wrong with application-specific optimizations.

      But this misses the whole point of 3dmark 2003. Different developers stress the pixel and triangle pipelines in different ways to produce a whole boatload of effects. While major games and engines are often optimized-for, there is no guarantee that ATI or Nvidia will sit down and optimize for the game you just bought.

      That said, 3dmark 2003 should be considered a relational tool for generic performance. Consider it a goo
  • by rimcrazy ( 146022 ) on Tuesday May 27, 2003 @11:45AM (#6049021)
    I've worked in the PC industry more years than I care to think about. All graphics card vendors tweak their drivers and BIOS to make their cards look better. If people didn't put so much emphasis on benchmarks for buying decisions then there would not be much reason to tweak things, but the reality of the world is they do.

    On a side note, my team and I, many, many years ago, designed what was at the time one of the fastest chipsets for the blindingly fast new 12 MHz 386 PC. We had discovered that the Norton SI program everyone was using to benchmark PCs based most of its performance on a small 64 byte (yes, that is not a typo, 64 BYTE) loop. We had considered putting a 64 byte cache in our memory controller chip, but our ethos won out at the end of the day, as clearly what we would have done would have been discovered and the benchmark rewritten. Had we done it, however, for our 15 minutes of fame our benchmarks would have been something crazy like 10x or 100x better than anything out there.

    • Comment removed based on user account deletion
      • I think not, but it has been so long. I was design manager at VLSI Technology. We made a 12 MHz chipset, which was our first; I thought it was for the 386 but it may have been for the 286. Our chipset was used by IBM for their "re-entry" back into ISA-bus PCs after their big Micro Channel effort resulted in them losing half of their market share by dropping the ISA box for a while.

        The cache comment is correct, regardless of the CPU it was going to work with. We reverse engineered the Norton SI ben
  • from the what-exactly-are-kid-gloves dept.

    Get the answer at Straight Dope [straightdope.com]
  • by Performer Guy ( 69820 ) on Tuesday May 27, 2003 @11:47AM (#6049053)
    Look, there is a clear difference between what NVIDIA and ATI have done here. ATI are not cheating: they look at a sequence of instructions and reorder them if they fit a pattern, but they do exactly the same thing as before. This is central to the kinds of things optimizing compilers and CPUs do. Maybe you think it's too narrow a path, but it's a minor infraction at best compared to the blatant cheats of NVIDIA, who not only rewrote shaders but did several other really heinous things, like disabling screen clears and adding their own hidden clip planes.

    It's a real shame that The Register obscured the truth here with an article that attacks ATI for conservatively removing optimizations while giving the real miscreant a free pass. ATI should leave their optimizations in, IMHO, but maybe you disagree because their mathematically equivalent optimization is not general enough; it's a close call, but they don't deserve the distorted treatment given by The Register.
    • The problem is that ATI's "optimization" applies ONLY for this particular benchmark. An actual game which used these shaders would not benefit from ATI's "optimization". Since the optimization affects only the performance of the benchmark, and not real-world performance, it is a cheat.

      You're losing sight of what 3DMark03 is trying to achieve here; the objective of a benchmark is to provide two or more systems with the same input, and therefore the same workload, and see which system performs more favora
        • No, that is NOT TRUE. The ATI optimization applies generically to any shader with similar instructions. Futuremark made a CHANGE TO THE SHADER CODE to uncover this. I do think the acceptability depends upon just how generic the optimization was; there is some wiggle room, unlike the blatant NVIDIA cheat.

        If ATI had triggered on the shader name, application name or modified the results of the shader to be functionally different, I'd be right with you calling it a cheat, but they didn't.
  • Confused (Score:3, Insightful)

    by Otter ( 3800 ) on Tuesday May 27, 2003 @11:48AM (#6049060) Journal
    ATI came a cropper the same way. Futuremark saw an eight per cent decrease in the score of one benchmark, Game Test 4, when it conducted the test with a renamed executable rather than correctly titled code. ATI's fix, said Futuremark, contributed to an improvement of just under two per cent in the overall 3DMark 03 score.

    I'm confused about what this means. Is the 1.9% difference in ATI performance between Game Test 4 with correct and modified names, or between the current driver and an older version?

    Most people here seem to think it's the latter, and I'd agree that they did nothing wrong if that's the case. But it's not obvious to me that they're not accused of the same thing as NVIDIA.

  • by jridley ( 9305 ) on Tuesday May 27, 2003 @11:49AM (#6049069)
    I remember video card companies cheating on benchmarks 10 or more years ago. This was when PCI was the latest thing on the block, and was competing with VESA-local bus. They wrote drivers specifically to detect when they were being called by the PC-Magazine benchmark program, and they'd do some stuff like just returning from every other call, since the prog was just calling the same thing 10000 times.
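
    (A hypothetical sketch of that old trick; the function name and structure are invented for illustration. Because the benchmark just timed thousands of identical calls, silently doing the work only half the time roughly doubles the reported score.)

        // Invented driver entry point that "cheats" a tight benchmark loop by
        // doing real work only on every other invocation.
        static unsigned g_call_count = 0;

        void draw_benchmark_primitive() {
            if ((g_call_count++ & 1u) != 0)
                return;                 // skip odd-numbered calls entirely
            // ... real drawing work would happen here ...
        }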
  • by Genjurosan ( 601032 ) on Tuesday May 27, 2003 @11:55AM (#6049127)
    When Quake III runs at 300 FPS on my system under my 9700 Pro with 4x AA, I could care less about 3DMark and what ATI or Nvidia tweak. If the games run smooth and they look good, then go with it. Truth is, the ATI looks better than the Nvidia card under QIII, WCII, JKII, and pretty much everything else I've been playing.

    The issue with low FPS is a game problem 9 out of 10 times. The faster the video card, the less the game development houses work to streamline and improve their framerate.
    • The issue with low FPS is a game problem 9 out of 10 times. The faster the video card, the less the game development houses work to streamline and improve their framerate.

      Word.

      Lots of development houses are focusing exclusively on the high-high end graphics card market and are forgetting that their LOD rendering engine *could* display the characters as a virtua-fighters esque 50 shaded polygon mess. I was personally blown away when the Half-Life 2 development team decided that the onboard i-810 would be
  • Cheating??? (Score:3, Insightful)

    by JDevers ( 83155 ) on Tuesday May 27, 2003 @11:56AM (#6049139)
    In what way is Ati cheating, really? If you think about it, virtually every modern processor does some minor instruction rescheduling, right? Basically, Ati is doing this in the driver and not on-chip; that's the only difference. I'm sure in the next few generations of GPUs we'll see the introduction of hardware features like this, once the pixel/vertex shaders get ironed out pretty well and a lot of people use them. Right now very few games really make use of them, and they spend most of their time emulating hardcoded T&L, which is again a part of the driver.

    Nvidia is cheating and acting like a child, er, large corporation...but that isn't at all what Ati is doing.
    • Re:Cheating??? (Score:3, Insightful)

      by Anonymous Coward
      There is a difference between out-of-order execution as found on modern CPUs and what ATI is doing. The rearranging of CPU instructions is done on-chip, and is done no matter what application is being executed. What ATI did was hard code a rearrangement of instructions into their driver. Something like if(app=3dmark) swap(instruction1,instruction2), swap(instruction3,instruction4)... If the app being run isn't 3dmark03 then no performance gain will be had. Now if ATI came up with a general algorithm to r
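
      (A sketch of the distinction being drawn, with made-up instruction names. The generic pass inspects the instruction stream itself and fires wherever the pattern occurs in any shader; the app-keyed variant applies the identical transformation but only when a particular application has been identified, which is what critics call a cheat. Nothing here is real driver code.)

          #include <string>
          #include <utility>
          #include <vector>

          // Toy shader instruction; the fields and opcodes are invented.
          struct Instr { std::string op; int dst; int src; };

          // Generic peephole pass: hoist a long-latency texture load above an
          // adjacent, independent multiply so its latency is hidden. The swap only
          // happens when the two instructions touch disjoint registers, so the
          // program's results are unchanged. Applies to ANY shader with this shape.
          void generic_reorder(std::vector<Instr>& prog) {
              for (size_t i = 0; i + 1 < prog.size(); ++i) {
                  const Instr& a = prog[i];
                  const Instr& b = prog[i + 1];
                  if (a.op == "mul" && b.op == "texld" &&
                      b.dst != a.dst && b.dst != a.src && b.src != a.dst) {
                      std::swap(prog[i], prog[i + 1]);
                  }
              }
          }

          // App-keyed variant: the same swap, but only when a specific benchmark
          // has been detected -- no other program ever benefits.
          void app_keyed_reorder(std::vector<Instr>& prog, const std::string& app) {
              if (app == "3dmark03")
                  generic_reorder(prog);
          }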
    • I agree Ati is not cheating: if they were doing this for a game, nobody would complain.

      Still, I own a Radeon 9800 and I'm pissed off at ATI: there are still bugs in their driver (SplinterCell, and IL2 Forgotten Battles) and they used some developers' time to optimise for 3DMark instead of debugging the driver!!

  • by 222 ( 551054 ) <stormseeker@@@gmail...com> on Tuesday May 27, 2003 @12:00PM (#6049176) Homepage
    Hey, I want to know the truth just as much as the next guy, but seriously... doesn't it seem odd that this happens right when Nvidia opts out of this hundred-thousand-dollar beta program?
    I've read 5-6 reviews of the FX 5900 and everyone seems to think it's great, and rightly gives Nvidia the 3D crown (especially concerning Doom ]|[ :)
    If you read the interview, it's even brought up that the 5900 seems to do just fine in all other benchmarks; only Futuremark seems to give it a hard time, and I'm not buying that crap about Doom 3 benchmarks not being readily available.
    If I remember, Tom's had a good review of that....
  • by bhsx ( 458600 ) on Tuesday May 27, 2003 @12:03PM (#6049196)
    Is it coincidence or some sort of nVidia inside joke that changing the name of the Dawn executable (fairy.exe iirc) http://www.nvidia.com/view.asp?IO=demo_dawn
    to quake3.exe removes those pesky leaves, revealing her subtle nature, and that renaming it to 3dmark2003.exe removes the leaves and her wings? Is the inside joke that they leave "certain things out" of quake3 and 3dmark? Does the government know of the existence of aliens and wormhole portals to other worlds?
    • What's going on is that the driver is applying the wrong optimizations to the graphics command stream. Since these hidden tweaks were designed to work on known code, they are very dependent on it. Whatever is used to render the leaves is completely broken by whatever changes are performed to boost Q3 scores, and the leaves and wings both break when it thinks it's running 3DM2K3.
  • Coincidence? (Score:4, Insightful)

    by Throatwarbler Mangro ( 584565 ) <delisle42@yah[ ]com ['oo.' in gap]> on Tuesday May 27, 2003 @12:04PM (#6049203) Homepage
    What fortune should I happen to get at the bottom of the comments page?

    What upsets me is not that you lied to me, but that from now on I can no longer believe you. -- Nietzsche

    Classic.

  • by onyxruby ( 118189 ) <onyxruby&comcast,net> on Tuesday May 27, 2003 @12:05PM (#6049214)
    The less cooperation between the testing companies and the tested companies the better. The last thing this industry needs is to become like so many other industries where the test standards lose all merit because the testers and testees are in bed together. Test results are only of merit if they are produced completely independently of the manufacturer during the entire test process.


    Think of it this way: when's the last time you saw PC World roast a product that truly deserved it? How many review sites gave WinMe a thumbs up when it's widely viewed in the industry as MS's worst OS to date? We (the public) simply aren't being served if the test companies are cooperating with the companies they're testing. Look, if a testing company, review site or whatever other lab doesn't occasionally come out and just say "this sucks" then you know they aren't credible. There's too much out there that sucks, and too few reviewers willing to let the public know before they waste their money.


    It's the same reasoning that dictates why Consumer Reports will buy their cars anonymously from dealers using third parties instead of getting "special" delivery directly from the manufacturer. What we should really see from the behaviour we're observing so far is an impetus to develop an open source benchmark application. By doing this we would ensure that the results can't be bought, the way they can be in so many other industries.


  • SHOCKED! to find that there is optimization going on here.

    (Alphonse enters.)

    Your SPECmarks, sir.

    Thank you.
  • by LostCluster ( 625375 ) on Tuesday May 27, 2003 @12:14PM (#6049300)
    This happens so often in grade school that I'm surprised the computer industry hasn't caught on to it yet. If you give students a copy of the exam the night before the exam, the only material they are going to bother to study is the question-answer pairs on that exam, and they may just remember what the answer to #6 is rather than even try to understand the question.

    In order for a driver benchmark to be useful at all, it needs to be kept absolutely secret from the chip manufacturers before the test, and then once it is used and revealed that benchmark needs to be retired, because the next generation of testing should be designed to concentrate on the new features that the graphic card developers are expected to put in their next generation of cards that will be used in the next generation of games.

    In short, the best benchmark will always be based on "that sure-to-hit game that's just about to come out."
  • In one test NVDA's performance sucked, and they cheated in their drivers to make it look similar to their performance on other tests.

    In Doom ]|[, the most advanced graphics game out there currently, NVDA smokes ATI. And that was on test equipment for which NVDA provided only the card.

    Am I going to "play" 3DMark, or am I going to play DOOM 3? For all of you who will be playing 3DMark more than Doom, go ahead and get the ATI card. I'll make my decision based upon the stellar DOOM 3 performance.

    • You mean the stellar performance of a game that won't be available till the end of the year at the earliest?

      By that time newer, faster cards will have been released. If you intend to buy a new card now then look at what games you currently play as the market will certainly change when DOOM 3 hits the shelves.
  • I never put much stock into benchmarks. One test says one thing, another something else.

    All this proves is that a benchmark is a highly isolated incident of observable performance.

    For example, most charts I see rate the P4 as "faster than the Athlon" at the same clock rate. Yet when I benchmarked my bignum math code I found that the Athlon completely kicked the P4s ass

    [K7]
    http://iahu.ca:8080/ltm_log/k7/index.html

    [P4]
    http://iahu.ca:8080/ltm_log/p4/index.html

    Does this single test prove the athlon
  • So many of the drivers are special casing the test based on the executable name. It would seem that all you have to do is rename the EXE to test the true nature of the driver.

    It would be easy enough for ALL of the testing labs to simply rename the benchmark and game application EXE to something random before starting the tests as a matter of course. If they state this fact up front for all to see, it would make special casing like this extinct overnight.

    It wouldn't prevent cheating. Data profiling could p

  • First, as we all know, there are lies, damn lies and benchmarks. But a company cheating here can really be damaging to the industry.

    In this day and age, the top-of-the-line video cards have more than 100 million transistors, and have become increasingly "intelligent" in sorting and rearranging data and instructions and choosing algorithms to get the best performance out of a given engine. Some engines perform better than others under different circumstances, making benchmarks even more subjective than ever.
  • by brunes69 ( 86786 ) <slashdot@keir[ ]ad.org ['ste' in gap]> on Tuesday May 27, 2003 @12:34PM (#6049484)

    See here [slashdot.org] for the original /. story describing the Quack / Quake 3 cheat ATI had a while back. MUCH worse than the current NVidia cheat IMO.

    Regardless of whether you think it is worse, the point is that BOTH companies have cheated in benchmarks, so there is NO point in "glorifying" ATI at NVidia's expense. They are just as bad if not worse (and their drivers blow ass).

  • If the drivers were open source - they don't have to be "free" software, just open - this could all go away, couldn't it?
  • Maybe they'll get more honest results renaming the benchmark executable to "3DLark03" or something. After all, something like this worked once. ;-)
  • by Call Me Black Cloud ( 616282 ) on Tuesday May 27, 2003 @01:07PM (#6049827)

    I like nvidia but I'm disappointed that the reply sounds like a justification. From Derek Perez (dperez@nvidia.com):

    Since NVIDIA is not part in the FutureMark beta program (a program which costs of hundreds of thousands of dollars to participate in) we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer.

    We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad. This is obvious since our relative performance on games like Unreal Tournament 2003 and Doom3 shows that The GeForce FX 5900 is by far the fastest graphics on the market today.

    dp
  • by crivens ( 112213 ) on Tuesday May 27, 2003 @01:16PM (#6049907)
    Who really cares about benchmarks? Personally I don't (that was a rhetorical question). So what if one card can do 10 more mega-flip-flops per microsecond than another one. I don't really care about getting an extra 0.00001fps from Doom3.

    I don't believe claims anyway; ATI says their card is faster than NVidia's. NVidia says theirs is faster than ATI's. Bleh....
  • What I'd like to see is something more along the lines of "live" demos. Not these flash-stuff-by, fixed-POV, fixed-scene demos, but something more akin to a game or whatever. The "Dawn" demo is cool, and would be a neat thing to have in testing as you can see the differences and/or notice any graphical distortions or lag.
    A better thing would be to have a demo with a simple multi-room house or whatnot. Various lighting effects, a few characters perhaps, window panes, maybe the outside is blocked by lava or something so that yo
  • by xaoslaad ( 590527 ) on Tuesday May 27, 2003 @01:25PM (#6049994)
    We all know ATI did it in the past. We all know Nvidia is doing it now. And even better we know that ATI is up to its old tricks again as well.

    And how have people figured this out time and time again? Oh, they renamed the executable...

    Why does the benchmarking software not rename the executable to some-254-character-long-file-name-random-string.exe? Use some kind of encryption to prevent the driver software from snooping in on the rename process and oooh no more cheating....

    I'm sure that there is some way that Nvidia and ATI could get around even this, but what are they gonna do, make a 75MB driver in retaliation to what the benchmark companies do?
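
    (A minimal sketch of the renaming idea, assuming a hypothetical benchmark.exe in the working directory; as noted elsewhere in the thread, this only defeats name-based detection, not drivers that fingerprint the shaders themselves.)

        #include <cstdlib>
        #include <filesystem>
        #include <random>
        #include <string>

        namespace fs = std::filesystem;

        // Copy the benchmark to a randomly named executable and run the copy, so
        // a driver keying on the well-known file name sees nothing familiar.
        // "benchmark.exe" is a placeholder, not a real product name.
        int run_with_random_name(const fs::path& original) {
            std::mt19937_64 rng{std::random_device{}()};
            fs::path disguised = original.parent_path() /
                                 ("bench_" + std::to_string(rng()) + ".exe");

            fs::copy_file(original, disguised, fs::copy_options::overwrite_existing);
            int rc = std::system(disguised.string().c_str());   // run the disguised copy
            fs::remove(disguised);
            return rc;
        }

        int main() {
            return run_with_random_name("benchmark.exe");
        }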
  • There was a comment that you can do certain operations in different ways and come up with the same result in the end. It sounds like what nVidia might have been doing was re-arranging the operations to run faster, in other words performing a simple optimization of Futuremark's code. If there was no VISUAL degradation, then I don't really see much of a cause for complaint.

    Obviously any professional game engine is going to have optimization profiles for the major cards, so I don't see this as a big deal.
  • Now that they have "patched" 3DMark so that the NVIDIA cards don't run it as fast, I wonder why I would want to use this program on anything but an ATI card.

    Their comment that the benchmark ran great on the released version but showed errors on the unreleased private developer version is like saying "Well it may get 120fps in QuakeIII but if you run it on Super Secret unreleased QuakeIII beta 2 it has errors"

    If I have an NVIDIA, S3, Matrox, etc. and they did not pay for the FutureMark beta program then
  • by WndrBr3d ( 219963 ) * on Tuesday May 27, 2003 @03:22PM (#6051122) Homepage Journal
    I mean, seriously now folks. I think benchmarking these days is nothing compared to what it used to mean. When the Voodoo3s rolled out to tackle the TNT2s, there was a considerable gap between the two. When nVidia introduced the GeForce GPU, the whole game world was changed, and the 3DMark score of a GeForce box was 3x that of a non-GeForce box.

    I think benchmarking these days is almost trivial. The scoring difference between a 9800 Pro and a 5900 Ultra, in the end, will only mean about a 15fps difference in your gaming experience. Honestly now, does playing UT2003 at 60fps vs. 45fps really pump your nuts? if so, then you can go ahead and moderate this as flamebait.

    And as far as 'optimizing code' for benchmarks, it's industry wide. Intel releases custom compilers just so programs will run faster on P4 chips! Is that cheating? Not really. The programs still run the same, just better on the hardware they chose. Same with nVidia in this situation: the picture still LOOKED the same (unless you enabled the free viewing). So who cares what happens in the background?

    My point is, people need to make decisions on their own when it comes to purchasing hardware. It all boils down to personal choice. Some people are hardcore ATI fans no matter what the benchmarks say, others are nVidia fans until the bitter end.

    Personally, I choose nVidia because of hardware compatibility issues in the past with several chipsets i used to have, now it's just habitual. People who are on the fence and really don't have their feet in the water when it comes to hardware might be sold by the gold PCB.

    In the end, well, it boils down to this. You know what they say about opinions ;-)
  • Driver strategies (Score:5, Insightful)

    by John Carmack ( 101025 ) on Tuesday May 27, 2003 @03:31PM (#6051216)
    Rewriting shaders behind an application's back in a way that changes the output under non-controlled circumstances is absolutely, positively wrong and indefensible.

    Rewriting a shader so that it does exactly the same thing, but in a more efficient way, is generally acceptable compiler optimization, but there is a range of defensibility from completely generic instruction scheduling that helps almost everyone, to exact shader comparisons that only help one specific application. Full shader comparisons are morally grungy, but not deeply evil.

    The significant issue that clouds current ATI / Nvidia comparisons is fragment shader precision. Nvidia can work at 12 bit integer, 16 bit float, and 32 bit float. ATI works only at 24 bit float. There isn't actually a mode where they can be exactly compared. DX9 and ARB_fragment_program assume 32 bit float operation, and ATI just converts everything to 24 bit. For just about any given set of operations, the Nvidia card operating at 16 bit float will be faster than the ATI, while the Nvidia operating at 32 bit float will be slower. When DOOM runs the NV30 specific fragment shader, it is faster than the ATI, while if they both run the ARB2 shader, the ATI is faster.

    When the output goes to a normal 32 bit framebuffer, as all current tests do, it is possible for Nvidia to analyze data flow from textures, constants, and attributes, and change many 32 bit operations to 16 or even 12 bit operations with absolutely no loss of quality or functionality. This is completely acceptable, and will benefit all applications, but will almost certainly induce hard to find bugs in the shader compiler. You can really go overboard with this -- if you wanted every last possible precision savings, you would need to examine texture dimensions and track vertex buffer data ranges for each shader binding. That would be a really poor architectural decision, but benchmark pressure pushes vendors to such lengths if they avoid outright cheating. If really aggressive compiler optimizations are implemented, I hope they include a hint or pragma for "debug mode" that skips all the optimizations.

    John Carmack
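
    (A small illustration of the precision point above, under the assumption of an ordinary 8-bits-per-channel framebuffer: two internal results that differ by less than one 8-bit step store the identical pixel value, which is why lowering intermediate precision can be invisible in the output. The numbers are made up.)

        #include <cmath>
        #include <cstdint>
        #include <cstdio>

        // Quantize a 0..1 shader result to an 8-bit framebuffer channel.
        std::uint8_t to_channel(float v) {
            return static_cast<std::uint8_t>(std::lround(v * 255.0f));
        }

        int main() {
            float full    = 0.7300f;  // e.g. computed at 32-bit precision
            float reduced = 0.7310f;  // same math at a lower precision (illustrative)
            // Both print 186: the difference vanishes at 8 bits per channel.
            std::printf("%u %u\n", static_cast<unsigned>(to_channel(full)),
                                   static_cast<unsigned>(to_channel(reduced)));
            return 0;
        }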
    • Perhaps if you spent less time on Slashdot and more time on Duke Nukem Forever, it would be on the shelves. I don't mean to be insulting. After all I'm a huge fan of your work with Half Life and Unreal, but I've been waiting *forever* for DNF and its just vaporware. Do you know - for certain this time - when it will be released?
  • by gotan ( 60103 ) on Wednesday May 28, 2003 @12:28AM (#6054513) Homepage
    I found it interesting that ATI claimed they simply "optimized" the code by shuffling around a few instructions while still getting the same results. That may even be so, but obviously they made this optimisation only for the very specific case of one of 3DMark's tests, or it would have worked with the slightly altered benchmark as well. Will their engineers look at each and every game out there and optimize that code too, or will they come over to my house and write a new driver on the fly when I need one? No? Well then, I don't benefit from their optimisation in any real-world scenario, and it only serves to boost their score a little. In the real world their graphics card will have to deal with suboptimal code as well; if they want to improve the situation they should give game developers a few guidelines on how to write a fast engine.

    This isn't about Nvidia vs. ATI or about defending Nvidia; what NV did with the clip planes was even worse. It's just that there is no justification for cheating on the benchmarks, even if the graphical results are the same. The benchmarks should be an indicator of how the card will perform in real-world scenarios (i.e. games), and any driver tweaks that are benchmark-specific but don't help performance otherwise are just cheating and make-believe.
