NVidia Accused of Inflating Benchmarks
Junky191 writes "With the NVidia GeForce FX 5900 recently released, this new high-end card seems to beat out ATI's 9800 Pro, yet things are not as they appear. NVidia seems to be cheating in their drivers, inflating benchmark scores by cutting corners and causing scenes to be rendered improperly. Check out the ExtremeTech test results (especially their screenshots of garbled frames)."
Giving themselves a bad name (Score:3, Interesting)
Hmmmm (Score:2, Interesting)
I don't know why anyone ever cheats on benchmarks... how could you ever get away with it? Do you really think no one is going to do their own benchmark? Come on. This is probably one of the most retarded things I have ever seen a company do.
Oh well, Nvidia is getting to the point where they are going to have to beat out ATI at some point if they want to survive.
Yeah well... (Score:3, Interesting)
3DMark03 may be inflated, but what counts is real-world game benchmarking. And the FX 5900 wins over ATI in all but Comanche 4.
Interesting, eh?
Re:What's the big news? (Score:3, Interesting)
As the mighty start to fall... (Score:5, Interesting)
So now they are falling into the power trap of "we need to be better and faster than the others," which is only going to have them end up like 3dfx in the end. Cutting corners is NOT the way to gain consumer support.
As I look at it, it doesn't matter if you're the fastest or not... it's the wide variety of platform support that has made them the best. ATi does make better hardware, but their software (drivers) is terrible and not very well supported. If ATi would give the kind of support that nVidia has been giving for the last few years, I would start using ATi hands down... It's the platform support that I require, not speed.
Very old practice. (Score:5, Interesting)
One test involved continuously writing a text string in a particular font to the screen. This text string was encoded directly in the driver for speed. Similarly, one of the polygon-drawing routines was optimised for the particular polygons used in this benchmark.
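For illustration only, here is a tiny C++ sketch of that kind of shortcut; the string, function names, and behaviour are all invented for the example and do not come from any real driver.

    #include <cstring>
    #include <iostream>

    // Hypothetical sketch: recognise the benchmark's exact input and take a canned path.
    static const char* kBenchmarkString = "The quick brown fox jumps over the lazy dog";

    void blit_precomputed_bitmap() {            // pre-rendered copy "baked into" the driver
        std::cout << "[fast path] blitting cached bitmap\n";
    }

    void render_glyphs(const char* s) {         // the normal, slower text path
        std::cout << "[slow path] rendering glyphs for: " << s << "\n";
    }

    void draw_text(const char* s) {
        if (std::strcmp(s, kBenchmarkString) == 0)
            blit_precomputed_bitmap();          // exactly the benchmark's string: skip the real work
        else
            render_glyphs(s);                   // everyone else gets the honest code path
    }

    int main() {
        draw_text(kBenchmarkString);            // what the benchmark measures
        draw_text("any other text");            // what real applications see
    }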
Re:I don't know (Score:2, Interesting)
Read the article. The cheating does not directly affect quality. Then how is it cheating, I hear you ask? Because it only increases performance in the _specific_ scene and path rendered in the benchmark.
This is similar to claiming to have the world's fastest _calculator_ of decimals of phi, only to have it revealed that you're simply doing std::cout << phi_string << std::endl;
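To make the analogy concrete, a toy comparison (the digit string and names here are just for illustration):

    #include <cmath>
    #include <iomanip>
    #include <iostream>
    #include <string>

    int main() {
        // Honest (if low-precision) computation: phi = (1 + sqrt(5)) / 2
        double phi = (1.0 + std::sqrt(5.0)) / 2.0;
        std::cout << std::setprecision(16) << phi << "\n";

        // The "world's fastest calculator": no computation at all, just a stored constant.
        const std::string phi_string = "1.6180339887498948482045868343656381177203";
        std::cout << phi_string << std::endl;
    }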
ATI, Trident [spodesabode.com] and now nVidia. I really hoped nVidia would stand above this kind of lying.
Re:What's the big news? (Score:3, Interesting)
Re:What's the big news? (Score:5, Interesting)
I've seen a video card driver where about half the performance-related source code was put in specifically for benchmarks (WinBench, Quake3, and some CAD-related benchmarks), and that code was ONLY used when the user was running said benchmark. This is one of the MAJOR consumer cards, people.
So many programming hours were put into marketing's request to optimize the drivers for a particular benchmark. It makes me sick to think that we could have been improving the driver's OVERALL performance and adding more features instead! One of the reasons I left...
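The gating itself can be trivial. A purely hypothetical sketch (the executable names and the detection method are invented here, not taken from any shipping driver):

    #include <iostream>
    #include <string>

    enum class CodePath { General, BenchmarkSpecial };

    // Hypothetical: pick a code path based on which application is running.
    CodePath choose_code_path(const std::string& exe_name) {
        // Only the benchmarks ever see the special path; real games get the general one.
        if (exe_name == "winbench.exe" || exe_name == "quake3.exe")
            return CodePath::BenchmarkSpecial;
        return CodePath::General;
    }

    int main() {
        std::cout << (choose_code_path("quake3.exe") == CodePath::BenchmarkSpecial) << "\n";  // 1
        std::cout << (choose_code_path("mygame.exe") == CodePath::BenchmarkSpecial) << "\n";  // 0
    }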
Re:They have all done it (Score:4, Interesting)
Finally, after months of waiting, I traded my Radeon to my roommate and got a GeForce 2 Pro with 64MB of DDR. It runs beautifully on Linux; I even play UT2K3 with it on an Athlon 850. After having the GeForce2 for about four months, I happened across a site that told me how to make 3D acceleration work on the Radeon. Too late now; I'm happy with my GeForce, and UT2K3 seems to only really want to work with nVidia anyway.
I don't think drivers are the best way to defend ATI, considering they tend to shrug off other OSes, while nVidia has committed itself to supporting alternative OSes.
Re: seems misleading.. (Score:3, Interesting)
As someone who has always been impressed by nVidia's driver updates and the benefits they can give each time, I am going to wait to see if it really is something bad they are doing deliberately before changing my opinion of them.
There is, at the moment, no real evidence in anyone's favour.
Don’t trust the Doom III benchmarks (Score:1, Interesting)
Remember, those great Doom III numbers were obtained on machines that nVidia supplied to reviewers. These numbers should also be suspect. If this is true, they had to know it would not look good. If nVidia did cheat like this, it can only mean the 5900 DOES NOT BEAT the ATI card. Desperate times indeed at nVidia.
Re:whatever (Score:5, Interesting)
Problem is the benchmarks themselves (Score:5, Interesting)
More importantly, why is any benchmark rendering the exact same scene each time? Nobody would test an FPU based on how many times per second it could take the square root of seven. You need to generate thousands, even millions, of different scenes and render them all. Optionally, the benchmark could generate the scenes at random, saving the random seed so the results are reproducible and can be compared.
Enough of that... (Score:3, Interesting)
Re:What's the big news? (Score:2, Interesting)
The circle is complete (Score:3, Interesting)
Possible solutions. (Score:2, Interesting)
The article talks about possible solutions to the problem of "repeatability" while still avoiding the kind of cheating alleged here. I don't remember it mentioning this possible solution, though: what if the camera were controlled by a mathematical function of a hand-entered seed, the way you'd seed a PRNG?
This way you could repeat a benchmark by giving the same seed. Generate a 'default' one at each new install (to ensure clueless reviewers get a fresh seed), and make it easy to enter a new one or generate a random one.
The explosion of possible views (if implemented correctly) would make it all but impossible to cheat in the way alleged, no?
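A rough sketch of the idea, assuming the benchmark drives a free camera (names and parameters are made up; a real benchmark would ship its own fixed PRNG so results match across platforms):

    #include <cstdint>
    #include <iostream>
    #include <random>

    struct Camera { float x, y, z, yaw, pitch; };

    // Every camera parameter is a pure function of (seed, frame): the same seed
    // reproduces the whole run, while a fresh seed gives an unpredictable path.
    Camera camera_at(std::uint32_t seed, int frame) {
        std::mt19937 rng(seed + static_cast<std::uint32_t>(frame));
        std::uniform_real_distribution<float> pos(-100.f, 100.f);
        std::uniform_real_distribution<float> ang(0.f, 360.f);
        return { pos(rng), pos(rng), pos(rng), ang(rng), ang(rng) };
    }

    int main() {
        std::uint32_t seed = 0xC0FFEE;   // shown to the reviewer, can be re-entered to repeat a run
        for (int frame = 0; frame < 3; ++frame) {
            Camera c = camera_at(seed, frame);
            std::cout << "frame " << frame << ": " << c.x << " " << c.y << " " << c.z << "\n";
        }
    }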
Re:I don't know (Score:3, Interesting)
Re:Another reason to open-source drivers (Score:4, Interesting)
GPLing the drivers would give NVidia:
1) Thousands of developers willing to submit detailed bug reports, port drivers, improve performance on 'alternative' operating systems etc.
2) Protection from these kinds of cheating accusations
3) Better relationship with game developers - optimising for an NVidia card when you've got details of exactly how the drivers work is going to be much easier than for a competitor card.
4) A huge popularity boost amongst the geek community, who spend a lot on hardware every year.
NVidia is, first and foremost, a hardware company. In the same way that Sun, IBM, etc. contribute to open-source projects in order to make their hardware or other services more appealing, NVidia stands to gain a lot too.
And as for rogue drivers? I suppose you're worried about rogue versions of the Linux kernel destroying your processor?
Database Vendors (Score:3, Interesting)
20 zillion transactions per second, provided you have a massively parallel Alpha with 1024 processors and 256 TB of physical memory, for just $23.99 per transaction, assuming you found your massively parallel Alpha on a heap of scrap metal.
The Kettle or the Pot? (Score:2, Interesting)
It also uses the ATI-only Pixel Shader 1.4, and reverts to dual-pass rendering on other cards.
Why all this?
NVIDIA isn't in the 3DMark03 beta program (read: didn't pay Futuremark a hefty lump of greenbacks).
Re:Another reason to open-source drivers (Score:5, Interesting)
Realize, in a society in which people sue others over dogs barking too loud, NVidia would definitely hear from a very small but very vocal group about it.
6) NVidia's programmers don't want this. Why? Let's say they GPL'd just the Linux reference driver, and in less than two weeks a new optimized version came out that was TWICE as fast as the one before. This makes the programmers look foolish. I know this is pure ego, but it is a concern, I'm sure, for a programmer with a wife and kids.
I know this all sounds goofy and trivial, but politics and common sense do not mesh. Again, I think your intentions are great, and in a perfect world there would be thousands working on making the best, most optimized driver out there.
But if such a community were to exist (and you know it would), why bother paying a league of great programmers instead of just sending out a few test boards to those most active in that new community, who are more than willing to do the work for free (as in beer)?
Just something to think about.
Mod parent up (Score:3, Interesting)
Benchmarks, even so-called 'real-world' benchmarks, are a poor indicator of system performance. Sites like Tom's Hardware and Anandtech exist as a kind of group therapy for hardcore gamers and 'performance enthusiasts'. You know, if you read their "technical" articles, that they understand as much about the inner workings of a computer as the rice-rocket driver with the huge spoiler and chrome wheel covers understands about his car's engine.
These sites always have an incestuous relationship with their advertisers, and they don't know anything about statistics, the scientific method, or how valid data is gleaned and collected.
Even Ars Technica has tons of articles that pass off conjecture as fact (case in point: the latest PPC970 article). While their writers seem more technically knowledgeable, it's still deceit.
Benchmark and "performance enthusiast" sites are a con job, plain and simple. They should be treated as what they are, the "EZ WEIGHT LOSS PLAN!!!!" scams of the geek community.
Re:Problem is the benchmarks themselves (Score:2, Interesting)
Everyone seems to mess with benchmarks. (Score:5, Interesting)
At the same time, I'm debating what my next video card should be. Even though ATI's hardware might be slightly better this round, the differences will probably be negligible to all but the most extreme gamers. Meanwhile, NVidia has proven to me that they have a history of writing good drivers, and they still provide significantly better support to the Linux community than ATI does.
For this reason I'm still siding with the GeForce family of video cards.
Just a note (Score:3, Interesting)
It is perfectly possible to read the graphics data from the card and write it to a file, like a TIFF. In fact, I've seen some benchmarking programs that do. Then what you can do, for DirectX at any rate, is compare against a reference renderer. The development version of DX has a full software renderer built in that can do everything. It is slow as hell, being a pure software implementation, but also 100% 'correct', being that it is how DirectX intends for things to be rendered.
Well, if you have a benchmark that includes images from the reference renderer, you can then compare those to the current renderer. Aside from just looking at them, you can do mathematical calculations on the images to see where and how they differ. A simple one would be a straight XOR on all the pixels. If the current renderer got the same result as the reference renderer, you'll get black as a result (since anything XORed with itself is 0). Any time there is a difference, it will show up as a coloured pixel, and the more colour, the greater the difference. I've seen a benchmark do this, but I don't remember which one.
Not saying that this is the perfect, end-all solution for graphics cards, but there ARE ways that they can be tested versus some kind of reference.
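A rough illustration, assuming both frames have been dumped to raw 8-bit RGBA buffers of the same size (the data here is made up):

    #include <cstdint>
    #include <iostream>
    #include <vector>

    // XOR each byte of the test image against the reference image.
    // Identical pixels XOR to 0 (black); any non-zero byte marks a difference.
    std::vector<std::uint8_t> xor_diff(const std::vector<std::uint8_t>& reference,
                                       const std::vector<std::uint8_t>& actual) {
        std::vector<std::uint8_t> diff(reference.size());
        for (std::size_t i = 0; i < reference.size(); ++i)
            diff[i] = reference[i] ^ actual[i];
        return diff;
    }

    int main() {
        std::vector<std::uint8_t> ref    = {255, 0, 0, 255,  0, 255, 0, 255};  // two RGBA pixels
        std::vector<std::uint8_t> actual = {255, 0, 0, 255,  0, 250, 0, 255};  // second pixel is off

        std::size_t nonzero = 0;
        for (auto v : xor_diff(ref, actual)) if (v) ++nonzero;
        std::cout << nonzero << " differing byte(s)\n";   // anything non-zero would show up as colour
    }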
Reason for open-source - period. (Score:4, Interesting)
Take for instance the relationship between Microsoft and IBM during the OS/2 era. The two companies working on the same code base produced OS/2 and, eventually, the NT kernel.
Or, more recently, the brilliant strategy of Netscape Communications Corporation: the birth of the Mozilla project. To the open source community: take our browser, modify it like hell, make it a better project. You have, of course, Mozilla as the browser, but Netscape (Navigator) still exists (as a repackaged, "enhanced" Mozilla).
nVidia's source code release would have two major impacts as far as their performance goes.
1) ATI (et al.) would find the actual software-based enhancements they could also incorporate into their own driver to improve their product.
2) nVidia could capture the many brilliant software developers that happen to be a part of the whole nVidia "cult" - this could lead to significant advancements to their driver quality (and overall product quality).
My guess is that the lid is kept so tightly shut on nVidia's drivers because they can keep their chips relatively simple through their complex software driver. ATI, perhaps, has the technical edge in the hardware arena, but does not have the same finesse for enhancing performance through software drivers that nVidia does.
Re:Sigh... (Score:3, Interesting)
No one has ever held onto the #1 spot in the graphics card industry. No one.
Perhaps it is because, when competing against a monolith, the up-and-comers can convince their engineers to give up their hobbies and work 12-hour days. Perhaps it is because the leadership of the #1 company must be conservative in its movements to please the shareholders. Perhaps it is because, with 10 other companies gunning for your head, one of them will be gambling on the right combination of technologies to mature in time for them to release their winning card.
Anyone remember when the Hercules was the be-all-end-all? Where are they now?
nVidia will go down. ATI will go down. What will not go down is the graphics card industry. Despite our multi-hundred dollar investment in one particular company, our allegiance should be to good gaming in general, and not to any specific manufacturer.
And yes, I'm sick of synthetic benchmarking. We should find ways to compare across games... for example, running two graphics cards simultaneously in a system on retail games and slowly upping the framerate until first one and then the other cannot keep up.
They did it before (Score:3, Interesting)
They garbled texture maps to achieve a higher transfer rate and frame rate. Then they went legit for the TNT line.
I guess the belief "if you can't win, cheat" is still there at nVidia.
I wonder if ATi makes a good Linux driver...
Benchmarks for catching cheating vendors (Score:3, Interesting)
Some compilers miscompiled the modified benchmark because they recognized the code as the standard benchmark, even though it wasn't exactly the same.
(Anybody have a reference for this? I heard the author give a talk at Stanford years ago.)
Voodoo economics (Score:3, Interesting)
actually it is 80 GB (Score:3, Interesting)
This is standard even in most other parts of computing (anything engineering-oriented especially). For example, that 128kbps mp3 you downloaded is 128000 bits/second, not 128*1024 bits/second.
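To put numbers on it: an "80 GB" drive holds 80 x 10^9 = 80,000,000,000 bytes, which an OS that counts in powers of two reports as roughly 80e9 / 2^30, or about 74.5 "GB". Likewise, 128 kbps is 128,000 bits/second, whereas 128 x 1024 would be 131,072.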
It's about time. (Score:2, Interesting)
Fact: NVidia is probably the last company to join in this race. When they denounced the use of Futuremark's programs after 3DMark03 showed undeserved favorability towards ATI's driver set, they were ostracized for not being a big player. It seems to me that they finally said, "Fuck it, if the public wants bullshit drivers that inflate their benchmarks, then we will give it to them!"
Good for NVidia. They always have been, and for the foreseeable future always will be, the no-compromise 3D gaming solution.
The funniest part of all this is that in UT2K3 I have personally seen a 160/80 flyby/botmatch score jump up to 220/103 on an FX 5800-based AMD 1700+ system. So the drivers are not complete bullshit. Unlike ATI, who was chastised in the past for having lower game scores after the fact.
I have doubts about ATI, not nVidia (Score:1, Interesting)
Playing "Jedi Outcast", it seems to omit the sky in outdoor scenes, which is completely lame. Sometimes you can see through the corner of walls, as if there is a crack. It displays severe performance problems in 32-bit mode, as well as some other behavioral quirks. And in at least one popular game (Raven Shield) you have to completely turn off antialiasing or the mouse doesn't work properly (go figure!).
My nVidia card didn't seem to have any issues at all, at least none that I could detect. Certainly nothing as plain as what I've found with my Radeon 9700. I would not be surprised if nVidia has some problems with their latest card, nor would I be surprised if they were consciously cutting corners. But there are enough issues with the products their competitors put out that I have to wonder why nVidia is being singled out here.