NVidia Accused of Inflating Benchmarks

Junky191 writes "With the NVidia GeForce FX 5900 recently released, this new high-end card seems to beat out ATI's 9800 pro, yet things are not as they appear. NVidia seems to be cheating on their drivers, inflating benchmark scores by cutting corners and causing scenes to be rendered improperly. Check out the ExtremeTech test results (especially their screenshots of garbled frames)."
  • by SRCR ( 671581 ) on Thursday May 15, 2003 @09:07AM (#5963232) Homepage
    Too bad Nvidia has to resort to these things to keep selling their cards. They used to be great, but now I have my doubts.
  • Hmmmm (Score:2, Interesting)

    by the-dude-man ( 629634 ) on Thursday May 15, 2003 @09:09AM (#5963245)
    Well, they got caught... they obviously aren't too good at it, since, after all, they did get caught.

    I don't know why anyone ever cheats on benchmarks... how could you ever get away with it? Do you really think no one is going to run their own benchmark? Come on. This is probably one of the most retarded things I have ever seen a company do.

    Oh well, Nvidia is getting to the point where they are going to have to beat out ATI at some point if they want to survive.
  • Yeah well... (Score:3, Interesting)

    by IpsissimusMarr ( 672940 ) * on Thursday May 15, 2003 @09:11AM (#5963256) Journal
    Read this article: NVIDIA's Back with NV35 - GeForceFX 5900 Ultra [anandtech.com]

    3DMark03 may be inflated, but what counts is real-world game benchmarking, and the FX 5900 wins over ATI in everything but Comanche 4.

    Interesting, eh?
  • by diesel_jackass ( 534880 ) <travis...hardiman@@@gmail...com> on Thursday May 15, 2003 @09:12AM (#5963265) Homepage Journal
    I know, I thought this was common practice across the board in the video card industry. NVidia has always had the shadiest marketing (remember what the 256 stood for in the GeForce 256?) so I don't really think anyone would be surprised by this.
  • by mahdi13 ( 660205 ) <icarus.lnx@gmail.com> on Thursday May 15, 2003 @09:13AM (#5963271) Journal
    nVidia has been one of the more customer-friendly video card makers ever. They have full support for all platforms, from Windows to Macs to Linux, and that makes them, to me, one of the best companies around.
    So now they are falling into the power trap of "we need to be better and faster than the others," which is only going to have them end up like 3DFX in the end. Cutting corners is NOT the way to gain consumer support.

    As I look at it, it doesn't matter whether you're the fastest or not... it's the wide variety of platform support that has made them the best. ATi does make better hardware, but their software (drivers) is terrible and not very well supported. If ATi offered the support that nVidia has been giving for the last few years, I would switch to ATi hands down... It's the platform support that I require, not speed.
  • Very old practice. (Score:5, Interesting)

    by shippo ( 166521 ) on Thursday May 15, 2003 @09:17AM (#5963307)
    I recall that about 10 years ago one of the video adaptor manufacturers optimised their accelerated Windows 3.1 video drivers to give the best possible performance with the benchmark program Ziff-Davis used for their reviews.

    One test involved continuously writing a text string in a particular font to the screen. This text string was encoded directly in the driver for speed. Similarly, one of the polygon-drawing routines was optimised for the particular polygons used in the benchmark.
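    A hypothetical sketch of that kind of special-casing, with illustrative names only and nothing taken from any actual driver: the text routine recognises the benchmark's exact string and blits a cached, pre-rendered bitmap instead of rasterising the glyphs.

        #include <algorithm>
        #include <cstddef>
        #include <cstdint>
        #include <string>
        #include <vector>

        // Normal (slow) path: lay out and rasterise each glyph (real rasterisation elided).
        void rasterize_text(const std::string& s, std::vector<std::uint32_t>& framebuffer) {
            (void)s;
            (void)framebuffer;
        }

        // The exact string the benchmark is known to draw, baked into the driver,
        // along with a pre-rendered bitmap of it. Both are placeholders here.
        static const std::string kBenchmarkString =
            "The quick brown fox jumps over the lazy dog";
        static const std::vector<std::uint32_t> kPrerendered(640 * 16, 0xFFFFFFFFu);

        void draw_text(const std::string& s, std::vector<std::uint32_t>& framebuffer) {
            if (s == kBenchmarkString) {
                // Benchmark-only shortcut: copy the cached pixels straight in.
                const std::size_t n = std::min(kPrerendered.size(), framebuffer.size());
                std::copy(kPrerendered.begin(), kPrerendered.begin() + n,
                          framebuffer.begin());
                return;
            }
            rasterize_text(s, framebuffer);  // everyone else gets the real code path
        }

        int main() {
            std::vector<std::uint32_t> framebuffer(640 * 16, 0u);
            draw_text(kBenchmarkString, framebuffer);    // takes the shortcut
            draw_text("any other string", framebuffer);  // takes the real path
        }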
  • Re:I don't know (Score:2, Interesting)

    by eddy ( 18759 ) on Thursday May 15, 2003 @09:23AM (#5963346) Homepage Journal

    Read the article. The cheating does not directly affect quality. Then how is it cheating, I hear you ask? Because it only increases performance in the _specific_ scene and path rendered in the benchmark.

    This is similar to claiming to have the world's fastest _calculator_ of the decimals of Phi, only to have it revealed that you're simply doing std::cout << phi_string << std::endl; (see the sketch below).

    ATI, Trident [spodesabode.com] and now nVidia. I really hoped nVidia would stand above this kind of lying.
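    A minimal sketch of that analogy, assuming nothing beyond the analogy itself: the "cheat" path prints a hardcoded string, while the honest path actually does the work being timed.

        #include <chrono>
        #include <iostream>
        #include <string>

        // Honest path: iterate the continued fraction x -> 1 + 1/x, which converges to Phi.
        double compute_phi(long iterations) {
            double x = 1.0;
            for (long i = 0; i < iterations; ++i)
                x = 1.0 + 1.0 / x;
            return x;
        }

        int main() {
            // "Cheat" path: the digits are hardcoded, so the timed work is just I/O.
            const std::string phi_string = "1.6180339887498948";

            auto t0 = std::chrono::steady_clock::now();
            std::cout << phi_string << std::endl;
            auto t1 = std::chrono::steady_clock::now();

            std::cout << compute_phi(50000000L) << std::endl;
            auto t2 = std::chrono::steady_clock::now();

            using ms = std::chrono::duration<double, std::milli>;
            std::cout << "hardcoded: " << ms(t1 - t0).count() << " ms, "
                      << "computed: " << ms(t2 - t1).count() << " ms" << std::endl;
        }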

  • by TopShelf ( 92521 ) on Thursday May 15, 2003 @09:31AM (#5963401) Homepage Journal
    In a way, it's a symptom of the importance that these benchmarks have assumed in reviews. Now, cards are tweaked towards improved performance within a particular benchmark, rather than improving overall.
  • by Anonymous Coward on Thursday May 15, 2003 @09:31AM (#5963406)
    Posting anonymously because I used to work for a graphics card company.

    I've seen a video card driver where about half the performance-related source code was put in specifically for benchmarks (WinBench, Quake3, and some CAD-related benchmarks), and that code was ONLY used when the user was running said benchmark. This is one of the MAJOR consumer cards, people.

    So many programming hours went into marketing's request to optimize the drivers for particular benchmarks. It makes me sick to think that we could have been improving the driver's OVERALL performance and adding more features instead! One of the reasons I left...
  • by pecosdave ( 536896 ) on Thursday May 15, 2003 @09:33AM (#5963427) Homepage Journal
    I bought the first 64MB DDR Radeon right after it came out. I held on to the card for months waiting for ATI to release a driver; it didn't happen. I heard of people having success getting 3D acceleration to work, but I could never duplicate that success.

    Finally, after months of waiting, I traded my Radeon to my roommate for a GeForce 2 Pro with 64MB of DDR. It runs beautifully on Linux; I even play UT2K3 with it on an Athlon 850. Then, after having the GeForce2 for about four months, I happened across a site that told me how to make 3D acceleration work on the Radeon. Too late now: I'm happy with my GeForce, and UT2K3 seems to only really want to work with nVidia anyway.

    I don't think drivers are the best way to defend ATI, considering they tend to shrug off other OSes while nVidia has committed to supporting alternate OSes.
  • by op51n ( 544058 ) on Thursday May 15, 2003 @09:38AM (#5963458)
    The article even states that nVidia doesn't have access to the current version of 3DMark03 (they're not on the beta team), so there can be mismatches between the drivers and 3DMark's code that they don't know about. This is the kind of thing that can happen, and it will take a driver update to fix, but it does not necessarily mean they are doing anything wrong.
    As someone who has always been impressed by nVidia's driver updates and the benefits they bring each time, I am going to wait and see whether they really are doing something bad deliberately before changing my opinion of them.

    There is, at the moment, no real evidence in anyone's favour.
  • by Anonymous Coward on Thursday May 15, 2003 @09:41AM (#5963490)
    It would be really funny, if true, since nVidia was slamming 3DMark for not being a real-world indicator of their NV30's performance.

    Remember, those great Doom III numbers were obtained on machines that nVidia supplied to reviewers. Those numbers should also be suspect. If this is true, they had to know it would not look good. If nVidia did cheat like this, it can only mean the 5900 DOES NOT BEAT the ATI card. Desperate times indeed at nVidia.
  • Re:whatever (Score:5, Interesting)

    by GarfBond ( 565331 ) on Thursday May 15, 2003 @09:45AM (#5963522)
    Because these rendering errors only occur when you go off the timedemo camera track. If you stayed on the normal track (as you would if you were just running the standard demo) you would not notice it. Go off the track and the card ceases to render properly. It's an optimization that is too specific and too coincidental for the "driver bug" excuse to work. It's not the first time nVidia has been seen to 'optimize' for 3DMark either (there was a driver set, 42.xx or 43.xx, I can't remember which, where it didn't even render things like explosions and smoke in game test 1 of 3DMark03).
  • by Ed Avis ( 5917 ) <ed@membled.com> on Thursday May 15, 2003 @09:47AM (#5963550) Homepage
    Why is it that people assess the performance of cards by running the same narrow set of benchmarks each time? Of _course_ if you do that, performance optimization will be narrowly focused on those benchmarks. Not just at the level of blatant cheating (hardcoding a particular text string or clipping plane) but more subtle things, like only optimizing the one code path the benchmark exercises.

    More importantly, why is any benchmark rendering the exact same scene each time? Nobody would test an FPU based on how many times per second it could take the square root of seven. You need to generate thousands, even millions, of different scenes and render them all. The benchmark could generate the scenes at random, saving the random seed so the results are reproducible and can be compared.
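    A sketch of that reproducible-randomness idea, with hypothetical types standing in for real scene data: scenes are generated from a seeded PRNG, and publishing the seed lets anyone re-run the exact same workload while keeping the workload unpredictable ahead of time.

        #include <cstdint>
        #include <iostream>
        #include <random>
        #include <vector>

        // Hypothetical scene description; a real benchmark would build full geometry.
        struct Scene {
            float camera_x, camera_y, camera_z;
            int object_count;
        };

        Scene generate_scene(std::mt19937_64& rng) {
            std::uniform_real_distribution<float> pos(-100.0f, 100.0f);
            std::uniform_int_distribution<int> count(100, 10000);
            return Scene{pos(rng), pos(rng), pos(rng), count(rng)};
        }

        int main() {
            // Publish the seed with the results so any reviewer can reproduce the run;
            // a driver can't special-case scenes it has never seen before.
            const std::uint64_t seed = 0xC0FFEE;
            std::mt19937_64 rng(seed);

            std::vector<Scene> scenes;
            for (int i = 0; i < 1000; ++i)
                scenes.push_back(generate_scene(rng));

            std::cout << "seed " << seed << ": generated " << scenes.size()
                      << " scenes; first has " << scenes[0].object_count << " objects\n";
            // ... render each scene here and accumulate frame times ...
        }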
  • Enough of that... (Score:3, Interesting)

    by Dwedit ( 232252 ) on Thursday May 15, 2003 @09:47AM (#5963555) Homepage
    Show me the Quack 3 Arena benchmarks! Then we'll decide which card is the best!
  • by Anonymous Coward on Thursday May 15, 2003 @09:48AM (#5963559)
    Now we know why there is no chance of open sourcing the NVidia drivers on linux.
  • by D3 ( 31029 ) <`moc.liamg' `ta' `gninnehddivad'> on Thursday May 15, 2003 @09:49AM (#5963569) Journal
    Damn, a few years ago ATI did a similar thing to the drivers with the Xpert@play cards. The cards got good benchmarks that never held up once people actually played the games. They got beat up pretty bad for it at the time. Now it looks like nVidia's turn.
  • Possible solutions. (Score:2, Interesting)

    by eddy ( 18759 ) on Thursday May 15, 2003 @09:55AM (#5963616) Homepage Journal

    The article talks about possible solutions to the problem of "repeatability" while still avoiding the kind of cheating alleged here. I don't remember it mentioning this possible solution, though: what if the camera were controlled by a mathematical function of a hand-entered seed, the way you'd seed a PRNG?

    That way you could repeat a benchmark by giving the same seed. Generate a default seed at each new install (to ensure clueless reviewers get a fresh one), and make it easy to enter a new seed or generate a random one.

    The explosion of possible views (if implemented correctly) would make it all but impossible to cheat in the way alleged, no?
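    One way that seeded-camera idea might look, purely as an illustration: the seed fixes the coefficients of a smooth parametric path, so the same seed always reproduces the same flythrough, while a fresh seed gives a path no driver could have special-cased.

        #include <cmath>
        #include <cstdint>
        #include <iostream>
        #include <random>

        struct CameraPose { float x, y, z, yaw; };

        // The seed fixes the path's coefficients; t in [0,1) sweeps the flythrough.
        CameraPose camera_at(std::uint64_t seed, float t) {
            std::mt19937_64 rng(seed);
            std::uniform_real_distribution<float> coeff(0.5f, 3.0f);
            const float a = coeff(rng), b = coeff(rng), c = coeff(rng);

            const float tau = 6.2831853f;
            return CameraPose{
                50.0f * std::sin(a * tau * t),   // x
                10.0f * std::sin(c * tau * t),   // y (gentle bob)
                50.0f * std::cos(b * tau * t),   // z
                a * tau * t                      // yaw
            };
        }

        int main() {
            const std::uint64_t seed = 20030515;  // printed alongside the score for repeatability
            for (int frame = 0; frame < 3; ++frame) {
                CameraPose p = camera_at(seed, frame / 1000.0f);
                std::cout << "frame " << frame << ": (" << p.x << ", " << p.y << ", "
                          << p.z << ") yaw " << p.yaw << "\n";
            }
        }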

  • Re:I don't know (Score:3, Interesting)

    by tedgyz ( 515156 ) on Thursday May 15, 2003 @09:56AM (#5963621) Homepage
    I absolutely agree with the drive towards real-world benchmarks. Who the hell cares if some stupid artificial benchmark can be derailed? Show me problems with a real game or app and then I'll care.
  • by BenjyD ( 316700 ) on Thursday May 15, 2003 @09:57AM (#5963629)
    If Nvidia GPLed their drivers, no other company could directly incorporate code from them without also releasing their own drivers under the GPL. So NVidia would find out just as much about ATI as ATI would about them.

    GPLing the drivers would give NVidia:

    1) Thousands of developers willing to submit detailed bug reports, port drivers, improve performance on 'alternative' operating systems etc.
    2) Protection from this kind of cheating accusation
    3) Better relationship with game developers - optimising for an NVidia card when you've got details of exactly how the drivers work is going to be much easier than for a competitor card.
    4) A huge popularity boost amongst the geek community, who spend a lot on hardware every year.

    NVidia is, first and foremost, a hardware company. In the same way that Sun, IBM etc. contribute to open-source projects in order to make their hardware or other services more appealing, NVidia stand to gain a lot too.

    And as for rogue drivers? I suppose you're worried about rogue versions of the Linux kernel destroying your processor?
  • Database Vendors (Score:3, Interesting)

    by CaptainZapp ( 182233 ) on Thursday May 15, 2003 @10:00AM (#5963660) Homepage
    DB vendors absolutely love benchmarks, especially when they can rig them themselves. My take is that it looks good to management-type geezers. Something along the lines of:

    20 zillion transactions per second, provided you have a massively parallel Alpha with 1024 processors and 256 TB of physical memory, for just $23.99 per transaction, assuming you found your massively parallel Alpha on a heap of scrap metal.

  • by YE ( 23647 ) on Thursday May 15, 2003 @10:03AM (#5963700)
    The 3DMark03 benchmark is cheating in the first place, implementing stencil shadows in two of the game tests in a manner so braindead that no sane programmer would put it in an actual game.

    It also uses ATI-only pixel shaders 1.4, and reverts to dual-pass rendering on other cards.

    Why all this?

    NVIDIA isn't on the 3dmark03 beta program (read: didn't pay FutureMark a hefty lump of greenbacks).
  • 5) Liability. Though it doesn't Make Sense (tm), if someone downloaded an "optimized driver" from superoptimizedrivers.com that in turn melted their chip or corrupted their video card's RAM in some way, there would be repercussions.

    Realize, in a society in which people sue others over dogs barking too loudly, NVidia would definitely hear from a very small but very vocal group about it.

    6) NVidia's Programmers Don't Want This. Why? Let's say they GPL'd just the Linux reference driver, and in less than two weeks a new optimized version came out that was TWICE as fast as the one before. This makes the programmers look foolish. I know this is pure ego, but it is a concern, I'm sure, for a programmer with a wife and kids.

    I know this all sounds goofy and trivial. But politics and Common Sense do not mesh. Again, I think your intentions are great, and in a perfect world there would be thousands working on making the best, most optimized driver out there.

    But if such a community were to exist (and you know it would), why bother paying a league of great programmers rather than just sending out a few test boards to those most active in that new community, who are more than willing to do the work for free (as in beer)?

    Just something to think about.
  • Mod parent up (Score:3, Interesting)

    by Gizzmonic ( 412910 ) on Thursday May 15, 2003 @10:57AM (#5964161) Homepage Journal
    It's rude, but also true.

    Benchmarks, even so-called 'real-world' benchmarks, are a poor indicator of system performance. Sites like Tom's Hardware and Anandtech exist as a kind of group therapy for hardcore gamers and 'performance enthusiasts'. You can tell from their "technical" articles that they understand about as much about the inner workings of a computer as the rice-rocket driver with the huge spoiler and chrome wheel covers understands about his car's engine.

    These sites always have an incestuous relationship with their advertisers, and they don't know anything about statistics, the scientific method, or how valid data is gathered and analyzed.

    Even ArsTechnica has tons of articles that pass off conjecture as fact (case in point: the latest PPC970 article). While their writers seem more technically knowledgeable, it's still deceit.

    Benchmark and "performance enthusiast" sites are a con job, plain and simple. They should be treated as what they are: the "EZ WEIGHT LOSS PLAN!!!!" scams of the geek community.
  • by doinky ( 633328 ) on Thursday May 15, 2003 @11:23AM (#5964459)
    Actually, the software can, in most cases, check what is rendered to the screen; the "correctness" benchmarks do this. The problem is that it is slow. The WHQL DCT tests do it: in most tests you get two windows, one drawn using the reference rasterizer and one drawn using the graphics card, and believe me, they do compare pixel by pixel. PC Magazine's benchmark did something similar, but again, it's not factored into the benchmark score. And obviously they don't test enough things if they got fooled here, but that's an argument for expanding the "correctness" suite.
  • by Maul ( 83993 ) on Thursday May 15, 2003 @11:36AM (#5964587) Journal
    Companies always tweak their code, insist on tests optimized for their hardware, etc., in order to get an edge in benchmarks. This is probably especially true where the competition is as neck-and-neck as it seems to be in the video card industry. It seems these companies will do anything to show they can get even two or three more FPS than the competition. It is hard to take any benchmark seriously because of this.

    At the same time, I'm debating what my next video card should be. Even though ATI's hardware might be slightly better this round, the differences will probably be negligible to all but the most extreme gamers. Meanwhile, NVidia has proven to me that they have a history of writing good drivers, and they still provide significantly better support to the Linux community than ATI does.

    For this reason I'm still siding with the GeForce family of video cards.
  • Just a note (Score:3, Interesting)

    by Sycraft-fu ( 314770 ) on Thursday May 15, 2003 @12:06PM (#5964914)
    On the question of whether the whole scene is rendered correctly:

    It is perfectly possible to read the graphics data back from the card and write it to a file, such as a TIFF. In fact, I've seen some benchmarking programs that do. Then what you can do, for DirectX at any rate, is compare against a reference renderer. The development version of DX has a full software renderer built in that can do everything. It is slow as hell, being a pure software implementation, but also 100% 'correct', in that it is how DirectX intends for things to be rendered.

    Well, if you have a benchmark that includes images from the reference renderer, you can then compare those to the current renderer's output. Aside from just looking at them, you can do mathematical calculations on the images to see where and how they differ. A simple one would be a straight XOR of all the pixels. If the current renderer got the same result as the reference renderer, you get black (since anything XORed with itself is 0). Any time there is a difference, it shows up as a coloured pixel, and the more colour, the bigger the difference. I've seen a benchmark do this, but I don't remember which one.

    Not saying that this is the perfect, end-all solution for graphics cards, but there ARE ways that they can be tested versus some kind of reference.
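    A minimal sketch of that per-pixel comparison, using tiny stand-in buffers rather than anything read back from a real card or from the DirectX reference rasterizer: XOR the reference output against the card's output; identical pixels go to zero, and anything non-zero marks a difference.

        #include <cstddef>
        #include <cstdint>
        #include <iostream>
        #include <vector>

        // Each pixel is a packed 0xAARRGGBB value captured from the two renderers.
        std::vector<std::uint32_t> diff_images(const std::vector<std::uint32_t>& reference,
                                               const std::vector<std::uint32_t>& actual,
                                               std::size_t& mismatches) {
            std::vector<std::uint32_t> diff(reference.size());
            mismatches = 0;
            for (std::size_t i = 0; i < reference.size(); ++i) {
                diff[i] = reference[i] ^ actual[i];  // identical pixels XOR to 0 (black)
                if (diff[i] != 0)
                    ++mismatches;
            }
            return diff;  // could be written out as an image to *see* where they differ
        }

        int main() {
            // Stand-in framebuffers; a real harness would read these back from the
            // card and from the reference renderer.
            std::vector<std::uint32_t> reference = {0xFF0000FF, 0xFF00FF00, 0xFFFF0000};
            std::vector<std::uint32_t> actual    = {0xFF0000FF, 0xFF00FF00, 0xFFFE0000};

            std::size_t mismatches = 0;
            auto diff = diff_images(reference, actual, mismatches);
            std::cout << mismatches << " of " << reference.size() << " pixels differ; "
                      << "diff value 0x" << std::hex << diff[2] << "\n";
        }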
  • by aksansai ( 56788 ) <aksansai@gmEEEail.com minus threevowels> on Thursday May 15, 2003 @12:24PM (#5965115)
    Companies adopted the fundamental "open-source" philosophy long before Linux and what I call the modern open source movement caught on. Often a company would have a nice product and license the code to a sub-company (who would modify/repackage/etc. the original product). The license agreement stipulated that all modifications would 1) be reviewable by the company without restriction from the sub-company and 2) have to be approved by the company.

    Take for instance the relationship between Microsoft and IBM during the OS/2 era. The two companies working on the same code base produced OS/2 and, eventually, the NT kernel.

    Or, more recently - the brilliant strategy of Netscape Communications Corporation - the birth of the Mozilla project. To the open source community - take our browser, modify it like hell, make it a better project. You have, of course, Mozilla as the browser - but Netscape (Navigator) still exists (as a repackaged, "enhanced" Mozilla).

    nVidia's source code release would have two major impacts as far as their performance goes.

    1) ATI (et al.) would find the actual software-based enhancements they could also incorporate into their own driver to improve their product.

    2) nVidia could capture the many brilliant software developers that happen to be a part of the whole nVidia "cult" - this could lead to significant advancements to their driver quality (and overall product quality).

    My guess is that the lid is kept so tightly shut on nVidia's drivers because the complex software driver lets them keep their chips relatively simple. ATI, perhaps, has the technical edge in the hardware arena, but does not have nVidia's finesse for enhancing performance in software through its drivers.
  • Re:Sigh... (Score:3, Interesting)

    by cgenman ( 325138 ) on Thursday May 15, 2003 @12:34PM (#5965198) Homepage
    Good point, but I think the larger point is this:

    No one has ever held onto the #1 spot in the graphics card industry. No one.

    Perhaps it is because, when they are competing against a monolith, the up-and-comers can convince their engineers to give up their hobbies and work 12-hour days. Perhaps it is because the leader must be conservative in its movements to please the shareholders. Perhaps it is because, with 10 other companies gunning for your head, one of them will gamble on the right combination of technologies maturing in time to release a winning card.

    Anyone remember when the Hercules was the be-all-end-all? Where are they now?

    nVidia will go down. ATI will go down. What will not go down is the graphics card industry. Despite our multi-hundred dollar investment in one particular company, our allegiance should be to good gaming in general, and not to any specific manufacturer.

    And yes, I'm sick of synthetic benchmarking. We should find ways to compare across games... for example running two graphics cards simultaneously in a system on retail games, and slowly upping the framerate until one then the other cannot keep up.

  • They did it before (Score:3, Interesting)

    by kwiqsilver ( 585008 ) on Thursday May 15, 2003 @01:12PM (#5965585)
    With the Riva128, back when I had a 3Dfx Voodoo (or Voodoo2).
    They garbled texture maps to achieve a higher transfer rate and frame rate. Then they went legit for the TNT line.
    I guess the belief "if you can't win, cheat" is still there at nvidia.
    I wonder if ATi makes a good Linux driver...
  • by Animats ( 122034 ) on Thursday May 15, 2003 @01:50PM (#5965907) Homepage
    This problem came up in compiler benchmarks years ago, and a solution was developed. Someone wrote a benchmark suite which consisted of widely used benchmarks plus slightly modified versions of them. Honest compilers performed the same on both; compilers that were recognizing the benchmarks performed quite differently. The results were presented as a row of bar graphs: a straight line indicated the compiler was honest, while peaks indicated a cheat.

    Some compilers even miscompiled the modified benchmarks, because they recognized the code as the standard benchmark even though it wasn't exactly the same.

    (Anybody have a reference for this? I heard the author give a talk at Stanford years ago.)
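    A rough illustration of that detection idea (my own sketch, not the suite recalled above): time a well-known kernel and a trivially perturbed variant that does equivalent work; an honest toolchain lands near a ratio of 1, while a large spike on only the famous version suggests the benchmark is being recognized rather than compiled.

        #include <chrono>
        #include <iostream>

        // The "famous" kernel, as a benchmark might publish it.
        double kernel_original(int n) {
            double s = 0.0;
            for (int i = 1; i <= n; ++i)
                s += 1.0 / (static_cast<double>(i) * i);
            return s;
        }

        // The same work, trivially reshuffled so it no longer matches the published source.
        double kernel_perturbed(int n) {
            double s = 0.0;
            for (int i = n; i >= 1; --i) {
                const double d = static_cast<double>(i) * i;
                s += 1.0 / d;
            }
            return s;
        }

        template <typename F>
        double time_ms(F f, int n, volatile double& sink) {
            const auto t0 = std::chrono::steady_clock::now();
            sink = f(n);
            const auto t1 = std::chrono::steady_clock::now();
            return std::chrono::duration<double, std::milli>(t1 - t0).count();
        }

        int main() {
            volatile double sink = 0.0;
            const int n = 50000000;
            const double a = time_ms(kernel_original, n, sink);
            const double b = time_ms(kernel_perturbed, n, sink);
            // A ratio far from 1 across a suite of such pairs would show up as the
            // "peaks" in the bar graphs described above.
            std::cout << "original " << a << " ms, perturbed " << b
                      << " ms, ratio " << a / b << "\n";
        }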

  • Voodoo economics (Score:3, Interesting)

    by Charcharodon ( 611187 ) on Thursday May 15, 2003 @02:04PM (#5966050)
    Nvidia's current problems sound familiar, don't they? 3DFX started floundering once they made it to the top and started worrying more about profit margin and market share than about putting out the best video cards. If Nvidia keeps this behavior up, I give it two years before ATI starts looking at buying them out.
  • actually it is 80 GB (Score:3, Interesting)

    by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Thursday May 15, 2003 @02:12PM (#5966130)
    giga = 10^9, so an 80 GB hard drive has 80 x 10^9 (80 billion) bytes. This is standard notation that has been in use for a very long time. Perhaps what you're looking for is 80 GiB, which the hard drives are not advertised as.

    This is standard in most other parts of computing too (anything engineering-oriented especially). For example, that 128kbps MP3 you downloaded is 128,000 bits/second, not 128*1024 bits/second.
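    The arithmetic, spelled out (nothing assumed beyond the SI and binary prefixes themselves):

        #include <cstdint>
        #include <iostream>

        int main() {
            const std::uint64_t drive_bytes = 80ULL * 1000000000ULL;      // 80 GB (SI giga)
            const double gib = drive_bytes / (1024.0 * 1024.0 * 1024.0);  // binary gibibytes
            std::cout << "80 GB = " << gib << " GiB\n";                   // roughly 74.5 GiB

            const int bitrate = 128 * 1000;                               // 128 kbps (SI kilo)
            std::cout << "128 kbps = " << bitrate << " bits/second\n";    // not 131072
        }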
  • It's about time. (Score:2, Interesting)

    by jaritsu ( 543231 ) on Thursday May 15, 2003 @02:39PM (#5966369) Homepage
    NVidia has always stood silent in the race to win benchmarks. Fact: every video card manufacturer tweaks drivers specifically for benchmarks. ATI scammed people into a supposed 50% performance increase years ago with a new set of drivers; it was, of course, completely false.

    Fact: NVidia is probably the last company to join this race. When they denounced the use of Futuremark's programs after 3DMark03 showed undeserved favorability towards ATI's driver set, they were ostracized for not being a big player. It seems to me that they finally said, "fuck it, if the public wants bullshit drivers that inflate their benchmarks, then we will give it to them!"

    Good for NVidia. They always have been, and for the foreseeable future always will be, the no-compromise 3D gaming solution.

    The funniest part of all this: in UT2K3 I personally have seen a 160/80 flyby/botmatch score jump to 220/103 on an FX 5800-based AMD 1700+ system. So the drivers are not complete bullshit, unlike ATI, who were chastised in the past for having lower game scores after the fact.
  • by tuxlove ( 316502 ) on Thursday May 15, 2003 @02:53PM (#5966495)
    I had an nVidia GeForce 2 card in my previous PIII Win2k box. It wasn't the fastest card around, but it did the job. And it seemed to render game video correctly and true to the game vendor's intent. Eventually I was unable to play certain games with big CPU/video requirements, and bought one of those little Shuttle Mini ATX boxes with a nice P4 3Ghz CPU and an ATI Radeon 9700 AIW Pro. The box screams, and I can play games at untold resolutions now. But that Radeon just isn't quite right.

    Playing "Jedi Outcast", it seems to omit the sky in outdoor scenes, which is completely lame. Sometimes you can see through the corner of walls, as if there is a crack. It displays severe performance problems in 32-bit mode, as well as some other behavioral quirks. And in at least one popular game (Raven Shield) you have to completely turn off antialiasing or the mouse doesn't work properly (go figure!).

    My nVidia card didn't seem to have any issues at all, at least none that I could detect, and certainly nothing as plain as what I've found with my Radeon 9700. I would not be surprised if nVidia has some problems with their latest card, nor would I be surprised if they were consciously cutting corners. But there are enough issues with the products their competitors put out that I have to wonder why nVidia is being singled out here.
