More on Futuremark and nVidia

AzrealAO writes "Futuremark and nVidia have released statements regarding the controversy over nVidia driver optimizations and the FutureMark 2003 Benchmark. 'Futuremark now has a deeper understanding of the situation and NVIDIA's optimization strategy. In the light of this, Futuremark now states that NVIDIA's driver design is an application specific optimization and not a cheat.'" So nVidia's drivers are optimized specifically to run 3DMark2003... and that's not a cheat.
  • Fine With Me (Score:4, Interesting)

    by HeelToe ( 615905 ) on Tuesday June 03, 2003 @02:19PM (#6107395) Homepage
    Though I do prefer that they make application-specific optimizations that mean better gameplay.

    It's just another piece of information to keep in mind when selecting a new card.
  • It is NOT a cheat. (Score:0, Interesting)

    by Niles_Stonne ( 105949 ) on Tuesday June 03, 2003 @02:21PM (#6107420) Homepage
    I don't think this is a cheat, as long as the energy taken to increase the performance in the test is similar to what is used to optimize for particular games.

    I'm not going to complain if they can optimize the drivers to get 10% better performance in Battlefield 1942 with no degradation in quality, would you?

    The only way I would call this a cheat would be if Nvidia never optimized for any other applications.

  • Re:Cheat? (Score:1, Interesting)

    by Anonymous Coward on Tuesday June 03, 2003 @02:21PM (#6107421)
    Only if those modifications are not matched by equivalent optimizations for big-name games - but they are.

    So in effect, the optimizations make 3DMark a *better* predictor of in-game performance than it otherwise would have been.
  • Bullshit (Score:5, Interesting)

    by Obiwan Kenobi ( 32807 ) * <evan@misterFORTR ... m minus language> on Tuesday June 03, 2003 @02:22PM (#6107429) Homepage
    This is politics at its worst, and I'm calling bullshit.

    There was no need for this nicey-nice statement other than NVidia threatening lawsuits and Futuremark wanting to protect what assets they have.

    Futuremark had every right to call NVidia on their selfish claim and unbelievable hacks. To say that they weren't liable for their own blunder is to say that Futuremark's reputation has been replaced by corporatespeak and a lack of respect that is almost unparalleled.

    What's worse is that I really thought "Yeah, this time the bad guy gets his due" and that NVidia should've known better.

    But of course, a few weeks later we've got to put on the nice face again for the public at large.

    What a complete waste of time. I know there isn't much respect left in corporate America, but hell, if you can't call a spade a spade, why even bother with the benchmarks when someone can just rewrite an ENTIRE SHADER and only keep a picture clear while the demo is on rails?
  • by Doomrat ( 615771 ) on Tuesday June 03, 2003 @02:23PM (#6107443) Homepage
    +82382383, Anti-MS Wank.
  • by MongooseCN ( 139203 ) on Tuesday June 03, 2003 @02:25PM (#6107472) Homepage
    Run Quake3 with the video card and check out the frame rate and image quality. Run it under UT and every other 3D game you can, too. Then compare the framerates and image quality. Who cares what its 3DMark score is? Did you buy the card specifically to run 3DMark? Most people buy these cards for playing games, so comparing how the card runs actual games against other cards is the only meaningful measurement.
  • Re:Cheat? (Score:5, Interesting)

    by Davak ( 526912 ) on Tuesday June 03, 2003 @02:26PM (#6107477) Homepage
    Maybe I should elaborate...

    Specifically designing your product to work better in a test than in real life should be considered cheating.

    This could be avoided if 3DMark2003 would release different methods of testing the video cards each year... or if one could download updates from 3DMark2003 that would block any driver-specific optimizations.

    I usually look at the latest and greatest fps benchmark for the latest and greatest game anyway.

    Well, actually... my current Nvidia video card laughs at my little CPU anyway. Until I can find some more CPU to drive my screaming video card... I am not going to see any performance increase.

    Davak
  • Looking closer... (Score:2, Interesting)

    by Infernon ( 460398 ) <infernon@gmail. c o m> on Tuesday June 03, 2003 @02:29PM (#6107516)
    While nVidia has made great cards for some time now (I still use and love my GeForce 3), could it be possible that they're not able to keep up? A lot of the reviews that I've been reading tend to favor the Radeon cards over anything that nVidia has put out lately. While I doubt that nVidia will become another 3Dfx, because they're involved in other markets and I've read about them having US government contracts for this project or that, I would propose that they will not be the huge player that they once were in the vid card market.
  • by YetAnotherAnonymousC ( 594097 ) on Tuesday June 03, 2003 @02:32PM (#6107559)
    I think the money quote is:

    However, recent developments in the graphics industry and game development suggest that a different approach for game performance benchmarking might be needed, where manufacturer-specific code path optimization is directly in the code source. Futuremark will consider whether this approach is needed in its future benchmarks.

    I can sort of see the argument here, but it basically ruins the point of having a standard interface like DirectX. It's also like telling your math teacher, "no, it would be easier for my equations if you made 1+1=3. Now do it because I'm your star student."
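    As a rough illustration of what that kind of manufacturer-specific code path might look like inside a game's own source, here is a minimal C++ sketch; the vendor strings, enum, and function name are assumptions made up for the example, not taken from any real engine or from Futuremark.

    ```cpp
    #include <string>

    // Hypothetical sketch: the game itself picks a render path per vendor
    // instead of relying on a single generic DirectX path.
    enum class RenderPath { Generic, NvOptimized, AtiOptimized };

    RenderPath choosePath(const std::string& adapterDescription) {
        if (adapterDescription.find("NVIDIA") != std::string::npos)
            return RenderPath::NvOptimized;   // e.g. shaders arranged for NV3x
        if (adapterDescription.find("ATI") != std::string::npos)
            return RenderPath::AtiOptimized;  // e.g. shaders arranged for R3xx
        return RenderPath::Generic;           // plain standard path for everyone else
    }
    ```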
  • Re:Fine With Me (Score:1, Interesting)

    by floodo1 ( 246910 ) <floodo1&garfias,org> on Tuesday June 03, 2003 @02:34PM (#6107578) Journal
    More like it's a lie: Nvidia was trying to convince consumers that their product is better than it actually is.

    Granted, you shouldn't draw conclusions from a single benchmark, but being led to base your choices on false information is WRONG.
  • goons...hired goons (Score:3, Interesting)

    by Ubergrendle ( 531719 ) on Tuesday June 03, 2003 @02:36PM (#6107613) Journal
    Futuremark: "NVidia is cheating! Not as much as ATI, but they're cheating!"

    Nvidia: Knock knock

    Futuremark: "Who's there?"

    Nvidia: "Goons...hired goons."

    Futuremark: "Oh...haha...um...Nvidia is actually in the business of application optimisation! Our mistake. Won't happen again."

    Seriously folks, this is Nvidia using big bad lawyers to scare Futuremark into capitulating. They might have held their ground, until ATI was proven to be doing the same thing, albeit to a much lesser degree.

    Unfortunately, the only person who loses in this scenario is the consumer.
  • by koh ( 124962 ) on Tuesday June 03, 2003 @02:37PM (#6107624) Journal
    Not really, IMHO the smart move here would be for nVidia to adapt its "optimization strategy" to release a driver with various hacks available (and accessible to the end user) that can be used on many games to increase rendering speed while sacrificing quality or some features.

    Of course, each hack would or would not work with any particular game, but trial and error can be used to detect the "best set of hacks" for any particular game on any particular card. And we all know how geeks love tweaking things to the metal, just look at gentoo's current popularity.

    Then FutureMark would get themselves a name as *the* benchmark software to run on end users' machines to test the hacks.

    Or maybe I'm just dreaming.

  • by TellarHK ( 159748 ) <tellarhk@NOSPam.hotmail.com> on Tuesday June 03, 2003 @02:38PM (#6107632) Homepage Journal
    (blunt)The problem with a lot of the reasoning I see here, with people saying they just want the card that plays the game they're interested in quickly, is that it's completely stupid.(/blunt)

    When you're looking for a video card, you -should- rely on a capable, untainted benchmark for comparison, simply because you can't predict what the software companies that make the actual games are going to do. Will they support -your- chosen card, or will some other GPU maker offer a better bribe to the developer? You may know that kind of info about games shipping RSN, or already on the shelves, but what about next year's?

    Getting the card based simply on one or two games instead of looking at some kind of objective benchmark does no good whatsoever. It's just a way to rope yourself into upgrading the card faster.
  • Re:GREAT With Me (Score:0, Interesting)

    by gerf ( 532474 ) on Tuesday June 03, 2003 @02:43PM (#6107691) Journal

    From what I could glean, Nvidia changed their software in such a way that visuals outside of walls, where you're not supposed to be anyway, were not rendered correctly. I don't see how this is a cheat if it even marginally improves game rendering. Mind you, it may have made an even bigger impact on 3DMark, or what have you, than on games. But that only means that 3DMark must adjust to the card, and that their software is not able to give a good diagnostic. Thus, they've since talked to Nvidia, and will adjust for the driver changes.

    What people aren't realizing is that the first games, 2D games with scrolling screens, did the same thing. The way they could make those games actually perform is that they only rendered a few blocks outside of what is actually shown on the screen. Rendering an entire map in Duke Nukem 1, 2, 3 (the 2-D version) would have been suicide on an 80386. My god, people, it's engineering: doing the bare minimum to get the job done faster and cheaper.
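    For anyone who hasn't seen the trick, here is a minimal C++ sketch of the 2D tile culling described above; the tile size, margin, and drawTile() stub are assumptions for illustration, not code from any actual game.

    ```cpp
    #include <algorithm>

    const int TILE = 16, MARGIN = 2;  // assumed tile size and off-screen margin

    void drawTile(int /*col*/, int /*row*/) { /* stub standing in for the engine's blit */ }

    // Only the tiles inside (or just past) the visible window are drawn;
    // the rest of the map is never touched.
    void drawVisibleTiles(int camX, int camY, int screenW, int screenH,
                          int mapCols, int mapRows) {
        int firstCol = std::max(0, camX / TILE - MARGIN);
        int lastCol  = std::min(mapCols - 1, (camX + screenW) / TILE + MARGIN);
        int firstRow = std::max(0, camY / TILE - MARGIN);
        int lastRow  = std::min(mapRows - 1, (camY + screenH) / TILE + MARGIN);

        for (int row = firstRow; row <= lastRow; ++row)
            for (int col = firstCol; col <= lastCol; ++col)
                drawTile(col, row);   // everything outside this range is skipped
    }
    ```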

  • Re:Cheat? (Score:1, Interesting)

    by Anonymous Coward on Tuesday June 03, 2003 @02:46PM (#6107731)
    3DMark is not designed to be a "predictor of big-name game performance" though. If people want that they test using demos for big-name games. 3DMark is (supposedly) for providing a level playing field. The results should allow people to predict a card's performance in ALL games as well as any other consumer-level 3D application. This includes software that is still under development and not available to anyone yet.
  • Benchmarks and games (Score:2, Interesting)

    by Datasage ( 214357 ) <Datasage AT theworldisgrey DOT com> on Tuesday June 03, 2003 @02:46PM (#6107733) Homepage Journal
    Driver optimization for games is one thing, but driver optimization for benchmarks is indeed cheating. Benchmarks are there to be a good, overall, unbiased measurement of performance based on a standard approach. Now, if Nvidia is allowed to tweak for a specific application as opposed to tweaking overall, it will bias the results.

    When it comes to games, if they want to tweak for a specific game, I'm all for it. But if you want to tweak for a benchmark, that's rather unfair.

    But then, I think ATI does much the same thing; they just haven't been caught.
  • by MerryGoByeBye ( 447358 ) on Tuesday June 03, 2003 @02:52PM (#6107785) Journal
    Anti-MS?
    You bet.

    Wank?
    On what grounds? I think there's more than enough evidence in the world to conclude that BG is a POS, whether you say it in a funny way or not. So get off the "Being anti-MS is so old" routine. It will be old once the people who run the company stop being the richest people in the world and get more than a slap on the wrist in court.

    Having said that, I think the parallels between nVidia/FutureMark and MS/DOJ are pretty straightforward.
  • Re:riiiiight... (Score:1, Interesting)

    by Overly Critical Guy ( 663429 ) on Tuesday June 03, 2003 @02:54PM (#6107808)
    "I never made that remark about 640k being enough for anyone!"

    He really didn't say it, though. It's a misattributed quote, just like "there's a sucker born every minute."
  • by TopShelf ( 92521 ) on Tuesday June 03, 2003 @02:54PM (#6107811) Homepage Journal
    This is a pretty disturbing point, in that it makes it even more difficult for independent game developers to make headway. Basically, the videocard manufacturers offer to assist in optimizing games to work with their hardware. But of course that assistance will vary with the size and clout of the developer, leaving smaller outfits with the task of trying to optimize for various cards on their own, or suboptimizing the features in their products... either way it's a mess.
  • Futuremark should (Score:3, Interesting)

    by Achoi77 ( 669484 ) on Tuesday June 03, 2003 @03:04PM (#6107888)
    ...pump out the most ugly-coded, cycle-wasting benchmark. I don't mean one that showcases all the newest rendering techniques, but rather one that strains to put out a simple rotating triangle. Then just have Nvidia pull out all the stops to try to make their card work faster.

    It's sort of akin to walking around with a backpack full of cinderblocks. That way, when you put down those cinderblocks (i.e., the benchmark), you'll notice how much stronger you got.

    Perhaps they should use .NET for their next benchmark. Or Java. That'll be the true test of a video card :-)

  • by syle ( 638903 ) * <syle.waygate@org> on Tuesday June 03, 2003 @03:10PM (#6107947) Homepage
    The question is to what extent it was "optimized." Nvidia also optimizes for Q3 and UT, so is it not valid to ask how these optimized performances compare to the optimized benchmark?

    What if they optimized not just for Q3, but for Q3 Level 1 while you're using player model X on a sunny day with a BFG and 13 bots? Would you cry foul? Are they cheating to make Q3 run faster? Isn't that their job? Where is the line? And, if 90% of the best selling games today run at that same "optimized" speed, how good of a benchmark can it be if it isn't optimized? Isn't it then showing results that are artificially low?

    If you want to use the video card to play optimized games, why would you care about the results of an unoptimized test?

  • Re:Cheat? (Score:2, Interesting)

    by obsid1an ( 665888 ) <{moc.ishcm} {ta} {naidisbo}> on Tuesday June 03, 2003 @03:12PM (#6107974)
    Application specific optimizations are out there for a lot of the highly demanding games. You don't think Doom3 is going to be optimized as much as possible by both ATI and NVIDIA? What makes it cheating is degrading QUALITY for PERFORMANCE. The drivers NVIDIA put forth increased performance without decreasing quality. That's why they are not considered cheating.
  • Re:riiiiight... (Score:1, Interesting)

    by violent.ed ( 656912 ) on Tuesday June 03, 2003 @03:23PM (#6108072)
    but you have no way of knowing how well it will perform on the next generation of games.

    Nor do they have any way of knowing the code for future versions of 3DMark, which would make it harder to implement "cheating" benchmark optimizations. The same is true for next-gen games. Isn't that why they constantly release new drivers, to add optimizations that make the latest and greatest games work and look better on your existing video card? I think so :)
  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Tuesday June 03, 2003 @03:26PM (#6108094) Journal
    I believe the correct interpretation of what FutureMark is saying is that the game writers are building their games differently for the different boards that are out there. That's what they mean by "manufacturer-specific code path optimization is directly in the code source." The source code they are referring to is that of games like UT and Q3.

    They are saying that the boards have become different enough that game writers are coding differently for them. Not too surprising, really. That's the way it's always been.

    This makes writing a synthetic benchmark extraordinarily difficult, needless to say. I don't know if it's even possible in this case. Perhaps rather than try to come up with one number that specifies how fast a board is, you can come up with a series of metrics for each capability.

    While I'm sure that FutureMark has had some pressure applied to them to make this statement, it's not an unreasonable statement on its face. It's just the path they took to get there that is questionable.
  • Bogus (Score:4, Interesting)

    by 0123456 ( 636235 ) on Tuesday June 03, 2003 @03:45PM (#6108328)
    "The only vendor that natively supports PS 1.4 is ATI."

    Sorry, but that's garbage, pure and simple. Or are you not aware that PS 1.4 support is _required_ for DX9 cards with PS2.0 support? Your complaint may be valid when comparing a GF4 against a Radeon 8500, but it is totally bogus when comparing two DX9 cards.

    "And their "DX9" onyl test is a piece of crap too. They use one or two new instructions in the VS, and PS2.0 is only used for the sky."

    Gee, one minute you're complaining that they use PS1.4 instructions, and now you're complaining that they don't use PS2.0 instructions. PS1.4 instructions _are_ effectively DX9 instructions since other than ATI, no other DX8 chips use them: you need a DX9 chip to run PS1.4 shaders.

    And it would appear to be real lucky for nvidia that they don't use many PS2.0 instructions: once the nvidia "optimization" of throwing away the shader and running a completely different shader was fixed, the results of the shader test show them running PS2.0 shaders at about half the speed of a Radeon 9800. The low performance of PS2.0 shaders on the FX card seems to be the reason why nvidia hated 3DMark03 so much; there was no way to get a good score without redesigning the chip or "optimizing".
  • by Woodie ( 8139 ) on Tuesday June 03, 2003 @04:01PM (#6108495) Homepage
    OK -

    first off, for those of you wondering what the big deal with 3DMark 2003 is - and why you might use it in place of "real games" to benchmark 3D performance - here you go:

    3DMark is a test application to benchmark next-generation performance, so that you can get an idea how your video card might handle games that will be out this time _next_ year. Specifically, some aspects of 3DMark are geared toward testing DX9 functionality and its Pixel and Vertex shaders. No game currently on the market uses these features (at least none that I am aware of).

    Secondly, the difference between a cheat and an optimization is a fine one. If a given function continually produces the same output for the same inputs and takes 1 second to do so, and another function produces the same results for the same inputs but takes only half a second, the second can be said to be functionally equivalent. However, it has been optimized. It's entirely possible, even desirable, to replace pixel shaders and vertex shaders with routines that are optimized for your hardware. In much the same way that compilers schedule instructions optimally for the underlying CPU architecture, so too can instructions be re-ordered in a pixel shader routine... It's an optimization.

    Cheating occurs when people start making approximations (analogies to bringing a cheat sheet to a test are not valid), or when they fail to render (in the case of video cards) the same visual fidelity and detail that was intended. For example:

    A> Reducing texture bit-depth.
    B> Reducing geometry detail (merging 2 or more polygons).

    This is only cheating if it's not the intent of the original application developer (not driver developer).

    A driver developer could make the following optimizations, since they don't affect the intent of the application developer:

    A> Pre-calc tables. A classic demo optimization was to precalculate a SIN function table to some level of precision, since looking up a value was faster than calculating it on the fly (see the sketch after this list).
    B> Replacing various pixel/vertex shader routines with functionally equivalent, but faster ones.
    C> Reordering data and textures (keeping detail and fidelity) into more optimal chunks for your hardware architecture.
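    To make item A concrete, here is a minimal sketch of a precalculated sine table in C++; the table size is an arbitrary precision choice for illustration.

    ```cpp
    #include <cmath>
    #include <vector>

    const int    STEPS  = 4096;               // assumed table resolution
    const double TWO_PI = 6.283185307179586;

    // Pay the cost of sin() once, up front.
    std::vector<float> buildSinTable() {
        std::vector<float> table(STEPS);
        for (int i = 0; i < STEPS; ++i)
            table[i] = static_cast<float>(std::sin(TWO_PI * i / STEPS));
        return table;
    }

    // The lookup replaces a per-call sin() with a multiply and an array read,
    // returning the same value to within the table's precision.
    float fastSin(const std::vector<float>& table, float radians) {
        int idx = static_cast<int>(radians / TWO_PI * STEPS) % STEPS;
        if (idx < 0) idx += STEPS;            // wrap negative angles into range
        return table[idx];
    }
    ```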

    Those aren't cheats - they are optimizations. Of course, the only way you can tell this is if you have an objective standard to gauge against. 3DMark 2003 doesn't seem to provide this. In order to do so they would need the following:

    A> A software renderer for their demo.
    B> Timed snapshots of the demo saving uncompressed images from the software renderer to disk.
    C> The ability to re-run the demo using a hardware renderer (3D Card and drivers).
    D> The ability to take the same snapshots and save them, uncompressed to disk.
    E> The ability to do a histogram, per-pixel comparison to the software renders...

    This would enable you to arrive at some objective comparison of visual fidelity - instead of the occasionally subjective "I think screenshot X looks better than screenshot Y." Without knowing the intent of the 3DMark developers, we really can't know how true the hardware vendors and their drivers are to the original vision.

    Anything less than 3% difference is highly likely to be indistinguishable from the intent of the developers in this case. 5% to 10% may be visible, but acceptable (i.e. tweaks for speed in place of quality). Over 10% and you're playing with fire.
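    For what the comparison step in item E might look like, here is a rough C++ sketch assuming both renders were saved as raw 8-bit RGB buffers of identical dimensions; the per-channel tolerance and the "percent of pixels differing" metric are illustrative choices, not anything Futuremark has specified.

    ```cpp
    #include <algorithm>
    #include <cstdint>
    #include <cstdlib>
    #include <vector>

    // Returns the percentage of pixels whose worst colour channel differs
    // from the software-rendered reference by more than the tolerance.
    double percentPixelsDiffering(const std::vector<std::uint8_t>& software,
                                  const std::vector<std::uint8_t>& hardware,
                                  int width, int height, int tolerance = 4) {
        int differing = 0;
        for (int p = 0; p < width * height; ++p) {
            int base = p * 3;                            // R, G, B per pixel
            int maxDelta = 0;
            for (int c = 0; c < 3; ++c)
                maxDelta = std::max(maxDelta,
                    std::abs(int(software[base + c]) - int(hardware[base + c])));
            if (maxDelta > tolerance) ++differing;       // this pixel drifted too far
        }
        return 100.0 * differing / (width * height);
    }
    ```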
  • by Steveftoth ( 78419 ) on Tuesday June 03, 2003 @04:07PM (#6108551) Homepage
    The whole point of benchmarks is to get real world performance of something right?

    Well, how can you use 3DMark to get 'real world' performance out of a video card? Answer: you can't because it is a synthetic benchmark designed to test out features of cards that also have just come out.

    It is not testing real-world performance, just some synthetic, made-up graphics demo. Shouldn't it also try to emulate an application that is optimized as much as possible, so as to get the highest possible performance instead of the lowest score? It's not real world, so the highest possible score should be what Futuremark is aiming for. This means optimizing their programs for every card: getting down to the metal and making sure that everything runs as smoothly as possible on every card they want to test, so as to get the best possible performance.

    I can already tell you what the worst possible performance on any future video card with any possible software will be: less than one frame per second. This we already know. What we should be using programs like 3DMark to find out is how fast a computer with a card CAN run, because 3DMark software is about the possibilities that a card has, not its REAL WORLD performance.

    So I say that 3DMark should let nVidia, ATI, etc. create their drivers with special code paths. As long as it doesn't impact performance in normal programs, why not? 3DMark and other synthetic benchmarks should welcome the 'cheating', as it only validates the fact that people actually care about their benchmark. Then we might actually know how fast our cards can go, given the right conditions.
  • MOD PARENT DOWN (Score:3, Interesting)

    by mczak ( 575986 ) on Tuesday June 03, 2003 @04:11PM (#6108594)
    Please mod that troll down - it's just nvidia PR.
    "I've personally analyzed the data from the driver (since I'm writing one), and they totally favor ATI with the heavy use of PS 1.4 shaders. In fact, the data changes completely if PS 1.4 support isn't claimed. (3x more geometry is sent)"
    You don't need to "analyze the data" to figure this out; Futuremark themselves stated how much the PS 1.4/2.0 shaders are used. And that 3x more geometry figure if you don't support PS1.4 is correct - guess why? If you do a fallback to PS1.1 (which 3DMark03 does if your hardware doesn't support 1.4) you need multiple passes, so the geometry data needs to be sent 3 times...
    "Also, PS 1.4 shaders don't always translate 'up' to PS 2.0 hardware very well"
    Bull****. PS1.4 and PS2.0 shaders are actually very similar; PS2.0 shaders can just be much longer and support some things PS1.4 shaders don't.
    "The only vendor that natively supports PS 1.4 is ATI."
    Well, PS1.4 is a feature of DX8.1. And the GeForceFX supports PS 1.4 just as natively as the Radeon 9500 and up - neither of them has dedicated hardware for PS1.4. The only cards which benefit from that are the Radeon 8500/9000/9100/9200, and regardless of that, the GeForce4 Ti cards, which do not support PS1.4 (nvidia's decision - why blame Futuremark?), are still faster than those.
    "They should have created PS 1.1 shaders for the masses, and then if 2.0 hardware is detected, had 2.0 shaders for everything."
    They DO have PS1.1 for the masses (the fallback from the PS1.4); you just can't do the effects in a single pass with PS1.1, which is why there are PS1.4 shaders. And converting the PS1.4 shaders to PS2.0 wouldn't change the speed they'd run at on the FX (or the Radeon 9500 and up) anyway, but would just make it impossible to run them on older hardware.
    "It's sad the OEMs put a lot of stock in 3DMark, they don't seem to realize that gamers play games all day, not benchmarks."
    A valid argument, but unfortunately no game today comes even close to really depending (performance-wise) on the DX9 features of the newest graphics cards available. Not even the upcoming Doom III will really depend on those features (all it requires is the PS1.4 equivalent - remember, it's OpenGL - to do everything in a single pass).
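    To illustrate the multi-pass arithmetic behind that "3x more geometry" point, here is a toy C++ sketch; the per-pass op counts are invented for the example and are not 3DMark03's real shader workloads.

    ```cpp
    #include <cstdio>

    int main() {
        const int effectOps    = 6;  // assumed per-pixel ops the effect needs
        const int opsPerPass14 = 6;  // assumed: whole effect fits in one PS1.4 pass
        const int opsPerPass11 = 2;  // assumed: far fewer ops fit per PS1.1 pass

        // Round up: passes needed = ceil(effectOps / opsPerPass)
        int passes14 = (effectOps + opsPerPass14 - 1) / opsPerPass14;  // = 1
        int passes11 = (effectOps + opsPerPass11 - 1) / opsPerPass11;  // = 3

        // Each extra pass resubmits the same vertices, so geometry traffic
        // scales with the pass count.
        std::printf("PS1.4: %d pass, PS1.1: %d passes -> geometry sent %dx as often\n",
                    passes14, passes11, passes11 / passes14);
        return 0;
    }
    ```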
  • by Remlik ( 654872 ) on Tuesday June 03, 2003 @04:16PM (#6108635) Homepage
    I think it's fairly safe to say the only people Nvidia cheated were themselves. If you create a card that only looks good in benchmarks and then performs poorly in real-world games, your customers will take notice. Especially since there are now 4.3*10^26 hardware review sites in every language. All of those sites will run side-by-side game evals and show the truth. On top of that, most people buy these days by word of mouth. If HardOCP says the FX is the shit then everyone buys; if they like ATI then ATI sells (obligatory sheep sound here).

    For the record, I have one personal experience with this from ATI as well. My roommate at the time had a Gateway with a GF2 that went bad; they sent him an early ATI Radeon to replace it. The card was flawless at 100fps in Q3, but if you loaded up Half-Life it ran like crap, never getting over 10fps. ATI coded the original Radeons to look great in Q3 because that was the hot benchmarking game out there at the time. My buddy eventually beat Gateway about the head and neck until they sent him another Nvidia card, and then he got his 60+fps in both Q3 and HL.

    Just another case of the pot calling the kettle black in my eyes.

  • by Alan Cox ( 27532 ) on Tuesday June 03, 2003 @04:17PM (#6108652) Homepage
    Am I the only one wondering whether the discussion in question was of the "we'll sue you out of existence" variety?

    It's odd they would make their benchmark useless, and that's what it has become now, IMHO - it doesn't tell you how fast a random game is likely to run, it tells you how good their hackers are at hand-tuning a meaningless benchmark.
  • by Piquan ( 49943 ) on Tuesday June 03, 2003 @04:25PM (#6108779)

    So, rather than using one or two real-world datapoints, you should base your decision on no real-world datapoints?

  • by Torgo's Pizza ( 547926 ) on Tuesday June 03, 2003 @05:04PM (#6109252) Homepage Journal
    You are pretty much on target. However, the real reason that 3DMark is a bad benchmark [videogamestumpers.com] is that the 3DMark score is an aggregate number. You lose all the detail, which hides whatever "cheats" the hardware makers want. That's why, whenever people compare hardware in the real world (cars, televisions, computers), they don't use some made-up figure. They measure using metrics like horsepower, torque, size, weight and speed.

    What Futuremark is currently doing is akin to taking a Dodge Ram pickup and a Ford Mustang and saying that Dodge got 5427 Marks and Ford had 5621 Marks. What does that tell you? How can you truly compare the two? The Ford is better in some way, but without the base numbers, the result is useless.
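    As a toy illustration of how an aggregate mark can hide detail, here is a short C++ example; the weights and sub-test scores are invented for the example and have nothing to do with 3DMark's actual scoring formula.

    ```cpp
    #include <cstdio>

    int main() {
        //                           fill rate, geometry, pixel shaders
        const double weights[3] = { 0.4,       0.4,      0.2 };
        const double cardA[3]   = { 5800,      5600,     4200 };
        const double cardB[3]   = { 5400,      5500,     5300 };

        double scoreA = 0, scoreB = 0;
        for (int i = 0; i < 3; ++i) {
            scoreA += weights[i] * cardA[i];
            scoreB += weights[i] * cardB[i];
        }
        // Prints 5400 vs 5420: nearly identical aggregates, despite a ~25%
        // gap between the two cards on the shader sub-test.
        std::printf("Card A: %.0f  Card B: %.0f\n", scoreA, scoreB);
        return 0;
    }
    ```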

  • by DeathPenguin ( 449875 ) * on Tuesday June 03, 2003 @08:53PM (#6111004)
    Define "convince." Anand's review [anandtech.com] shows that the FX5900 is indeed a worthy competitor for the latest ATi offering, and the superior Linux drivers put nVidia over the top for Linux users.

    If you're concerned about nVidia's ethics, perhaps you should check out ATi's background [tech-report.com].
