Comprehensive Video Benchmarks
Crusader writes: "Matt Matthews has produced an extensive series of benchmarks which examine four separate games' performance on the Voodoo5, the Rage 128 Pro, the G400 Max, and the GeForce 2 3D graphics cards under Linux. Performance under Windows 98 is also included." We also received: driveitlikeyoustoleit writes, "3dfx, NVIDIA, and ATI's best are all pitted against each other in a high-end 3D video card roundup. The authors pit six GeForce2 GTS-based cards (from ASUS, Creative, ELSA, and Hercules) against an ATI Radeon 256 and a 3dfx Voodoo5 5500. For a change, performance isn't the only criterion in question (although the end scores are somewhat weighted in favor of fps); full-scene anti-aliasing, image quality, and DVD performance/quality are also examined critically. The screenshots page showing off FSAA comparisons is a great visual indicator of what the cards can do."
Sharky is flaky (Score:2)
For as long as I can remember, ATI's chips have performed far better in 32-bit color depth tests than in 16-bit tests. Yet Sharky doesn't seem to show any charts comparing the cards in 32-bit, except for the Re-Volt benchmark, which they admit is outdated. However, they do state on page 6 [sharkyextreme.com] of their review that the GeForce-2 cards rule in both 16-bit and 32-bit.
Did I miss something?
Anyways, it seems that the Radeon is only a few FPS behind the GeForce-2 cards, and I imagine that difference is humanly imperceptible except for super-humans. Meanwhile, you gain better DVD playback and other huge multimedia offerings, especially if you look into the always-a-pleasure All-in-Wonder line from ATI.
So, why did Sharky need to use so many pages to get these points across?
A comparison without the page flipping difference (Score:2)
As far as the display driver is concerned, doing a page flip on a non-fullscreen surface is impossible. That should eliminate the difference produced by the blitting drivers for Linux.
Besides, issues like efficient use of the AGP bus, DMA, eliminating bad polygons during clipping, and synchronization can all have a great impact on performance; blitting vs. flipping is a minor issue at all but the highest resolutions.
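For anyone wondering what "flipping vs. blitting" actually looks like from an application's point of view, here is a minimal sketch using the SDL 1.2 API (purely an illustration of the two presentation paths; the driver internals being discussed obviously aren't SDL):

    /* Sketch: flip vs. blit presentation with SDL 1.2. */
    #include "SDL.h"

    int main(int argc, char *argv[])
    {
        SDL_Init(SDL_INIT_VIDEO);

        /* Ask for a fullscreen, hardware, double-buffered surface.  If granted,
         * SDL_Flip() can be a true page flip: the card simply scans out the
         * other buffer, and no pixels are copied. */
        SDL_Surface *screen = SDL_SetVideoMode(1024, 768, 32,
                                  SDL_FULLSCREEN | SDL_HWSURFACE | SDL_DOUBLEBUF);
        if (screen == NULL)
            return 1;

        /* Software back buffer for the windowed/blit fallback. */
        SDL_Surface *back = SDL_CreateRGBSurface(SDL_SWSURFACE, 1024, 768, 32,
                                                 0, 0, 0, 0);

        for (int frame = 0; frame < 600; ++frame) {
            if (screen->flags & SDL_DOUBLEBUF) {
                /* Flip path: draw straight into the hardware back buffer... */
                SDL_FillRect(screen, NULL, (Uint32)frame); /* stand-in for rendering */
                SDL_Flip(screen);                          /* ...then swap buffers   */
            } else {
                /* Blit path (e.g. a window on a desktop): draw off-screen, then
                 * copy every pixel to the visible surface.  This copy is the
                 * extra memory traffic the thread is arguing about. */
                SDL_FillRect(back, NULL, (Uint32)frame);
                SDL_BlitSurface(back, NULL, screen, NULL);
                SDL_UpdateRect(screen, 0, 0, 0, 0);
            }
        }

        SDL_FreeSurface(back);
        SDL_Quit();
        return 0;
    }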
This is certainly interesting.. (Score:1)
In Windows, I'm running the Detonator 3 6.31 drivers. Before, I was using the 5.16a drivers (came on the CD) and was pulling about 45-50fps on timedemos at 1152x864 with 32-bit color on and all the other settings jacked (sound on during the timedemos too). Now, with those same settings and the DET3 6.31 drivers, I get about 70fps. Sure, I probably can't truly see 70fps, but it's WAY smoother than 45-50fps.
I'd just like to see how Linux could handle my card and Quake3. Oh yeah, my box is an Athlon 750 / 128MB RAM / 3D Prophet II GF2 GTS 64MB / of course more stuff, but nothing else worth mentioning that's essential to gaming. So what I would like to know is whether anyone else with similar hardware can run tests under Linux and Windows and see what difference they get. I did read that article a while back comparing Linux to Windows with 3D games, but I'd rather see what a user gets, not some lab.
-PovRayMan
A related question...building a workstation. (Score:1)
Thanks.
Re:Weighted statistics and reviewer bias (Score:1)
Does the 2D look any good on my monitor? I spend far more time using 2D than playing 3D games.
VIA chipsets? (Score:2)
On the whole, I've never had problems with VIA chipsets and AGP; this is the first unstable driver I've ever had, and I've been using VIA-based mobos (an Epox Super7 for a K6-2 and now an Athlon KT133-based board) for three years.
Re:Sharky is flaky (Score:1)
Anyways, back to your comments... The Radeon has been quite surprising to the industry. It came in just behind all of the GeForce-2 cards in Sharky's benchmarks and well above the Voodoo5. My point of contention with them is that their tests seem to be limited to 16-bit color-depth.
As for the stability of their drivers, I don't know where you're coming from. I've personally had no problems with their drivers. Hanging out in comp.sys.ibm.pc.hardware.video [pc.hardware.video], I've seen a pretty much uniform distribution of complaints across all card manufacturers.
Finally, as I'm not a Linux user, I can't comment on the drivers available for Linux. I've found limited BeOS [be.com] drivers available, though, and they're stable. Perhaps by your logic BeOS developers are better than Linux developers? Perhaps ATI doesn't forecast enough profit in the Linux sector to justify making in-house drivers? But that opens a whole 'nother can of worms.
Stability? (Score:3)
We run Linux because it's stable, even in cases where we know that we lose performance because of that stability.
Well, all of these reviews basically say, "If you can afford it, buy NVIDIA! They're fast!".
So what if your card is fast but your machine crashes in the middle of a game? NVIDIA's drivers just plain suck in the stability arena - my machine crashes on a regular basis in both Q3 and UT with those crap drivers. Even Windows looks rock-solid compared to Linux when using those accursed drivers! And all this, most likely, because the only difference between the Quadro and the GeForce is a PCI/AGP device ID and a few flags the driver sets based on that ID - it's the only plausible reason for being the only vendor not to release source/specs. (Check out http://www.geocities.com/tnaw_xtennis/)
Re:simple graphics techie questions (Score:1)
So, what I'm hearing is that there is no attempt at, or standard way of, lowering the quality of each frame's graphics to lighten the load and sustain a high framerate? That would seem like a better solution for the player.
Driver stability more important than speed (Score:2)
Linux still don't cut it for gaming. (Score:2)
Actually, I think I need to cogitate more. The NVIDIA Linux drivers are pulled from more or less the same source as the Windows drivers. This means that, code-wise, the two are fairly even in terms of performance tweaks. Now, the Linux drivers will gain a little as NVIDIA tweaks the glue between the drivers and the OS, but I doubt the speed will ever reach that of Windows. Page flipping could be one of the things holding back the Linux drivers, but note that even in the lower-res tests, Linux is still behind. All this ignores one point: shouldn't Linux be undeniably FASTER? Why use the OS if it is only ALMOST as fast as Windows? Windows is a piece of junk. Why can't Linux beat it?
Why 6 GF2's? (Score:1)
It doesn't take a rocket scientist to tell you that the same card will perform similarly. One of the definitions of insanity is repeating the same action and expecting a different result. Now, I guess the silkscreening of the brand name might cause some molecular differences, leading to a performance boost/loss. If you were to perform some simple statistics on the results from the various GF2s, you could write off most, if not all, of the differences as normal variation.
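For instance, something as simple as this would tell you whether the brand-to-brand differences are any bigger than run-to-run noise (the fps numbers below are invented for illustration, not Sharky's actual figures):

    /* Mean and sample standard deviation of six hypothetical GF2 results. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double fps[] = { 98.2, 99.1, 97.8, 98.9, 98.4, 99.3 };   /* made up */
        int n = sizeof fps / sizeof fps[0];

        double mean = 0.0, var = 0.0;
        for (int i = 0; i < n; ++i) mean += fps[i];
        mean /= n;
        for (int i = 0; i < n; ++i) var += (fps[i] - mean) * (fps[i] - mean);
        var /= (n - 1);                                    /* sample variance */

        printf("mean %.1f fps, std dev %.2f fps\n", mean, sqrt(var));
        /* If the per-brand gaps are within a standard deviation or two,
         * they're indistinguishable from normal variation. */
        return 0;
    }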
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:1)
Re:Linux still don't cut it for gaming. (Score:1)
>>>>>>>
A 20fps difference because of page flipping? I highly doubt it. At most, you're sucking up around 200MB/s of bandwidth because of the blitting. That's a drop in the bucket for the GF2. Not to mention the fact that even at low res, the Linux drivers are STILL slower. No, the evidence suggests that something other than the blitting is holding back the GF2.
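The back-of-the-envelope math, assuming 1024x768 at 32-bit color, roughly 60 fps, and about 5.3 GB/s of memory bandwidth on a GF2 GTS (my assumptions, not measured figures):

    /* Rough estimate of how much memory bandwidth a full-screen blit eats. */
    #include <stdio.h>

    int main(void)
    {
        const double width  = 1024.0;
        const double height = 768.0;
        const double bytes_per_pixel = 4.0;      /* 32-bit color              */
        const double fps    = 60.0;
        const double card_bandwidth = 5.3e9;     /* ~GeForce2 GTS, bytes/sec  */

        double per_frame = width * height * bytes_per_pixel;  /* one frame copy */
        double per_sec   = per_frame * fps;                   /* ~189 MB/s      */

        printf("blit traffic: %.0f MB/s (%.1f%% of card bandwidth)\n",
               per_sec / 1e6, 100.0 * per_sec / card_bandwidth);
        /* Even doubling that for read + write keeps it well under 10%. */
        return 0;
    }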
Because most Linux users routinely have tens of processes running in the background while using their systems?
>>>>>
If those idiots conducted benchmarking with tens of processes running in the background, then it is their own damn fault. Not to mention the fact that any half-intelligent user tweaks their machine, cutting out unneeded processes. Of course, half of the people using RH6 are running Sendmail in the background (which is loaded by default on those systems).
Because, as good as the NVIDIA drivers are, they are still new (labeled as Beta, even)?
>>>>>>>
They are still beta, but they are based on rock-solid code. What about the shared cross-platform codebase don't you understand? Any bugs are in the driver/OS glue layer, and I seriously doubt that any major performance bugs could be hiding in there.
Because a good Linux driver should perform some minimal input checking to enforce security (I don't know if NVIDIA does this--more of a DRI-ish
thing)?
>>>>>>>
If security involves a 20-30% performance hit, then I say "welcome, crackers!" On a game machine, security at that level is absolutely and totally useless and should be turned off. For a home machine, the only security should be in the network server. There is no need on a desktop to protect the user from local processes.
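For what it's worth, the "minimal input checking" being argued about usually means validating user-supplied command buffers before they're handed to the card. A toy sketch of that sort of check (the structure and field names here are invented; this is not NVIDIA's or DRI's actual code):

    /* Toy validation a security-conscious driver might do before queueing DMA. */
    #include <stdint.h>
    #include <errno.h>

    struct cmd {
        uint32_t dst_offset;   /* where in video memory the command writes */
        uint32_t length;       /* how many bytes it will touch             */
    };

    /* Reject any command that would write outside the region the client owns.
     * Skipping checks like this is exactly the speed-vs-security trade-off
     * being debated above. */
    static int validate_cmd(const struct cmd *c, uint32_t base, uint32_t size)
    {
        if (c->length > size)
            return -EINVAL;
        if (c->dst_offset < base)
            return -EINVAL;
        if (c->dst_offset - base > size - c->length)
            return -EINVAL;            /* overflow-safe upper-bound check */
        return 0;                      /* safe to hand to the card        */
    }

    int main(void)
    {
        struct cmd ok  = { 4096, 256 };
        struct cmd bad = { 4096, 1 << 20 };
        return (validate_cmd(&ok,  4096, 65536) == 0 &&
                validate_cmd(&bad, 4096, 65536) != 0) ? 0 : 1;
    }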
simple graphics techie questions (Score:1)
Is it easy to describe or is there a place that has a good explanation? Oh, and what are the most useful/critical accelerations?
Weighted statistics and reviewer bias (Score:2)
Well, this review really is going to be a factor in the purchase of my next card when it admits that it has been "weighted" in favour of certain types of applications. Is it really too much to expect to be given reviews in terms of raw performance? Considering the amount of journalistic integrity seen so far on the net, probably.
Then again, this isn't really surprising from a website that somehow manages to fit more banner ads onto each page (and there are, for some reason, a lot of pages) than there is actual content. With the amount of corporate whoring they're managing to achieve in their page layouts, is it any wonder that their reviews feature skewed statistics which practically invalidate their purpose? It also makes you wonder where else corporate $$$ comes into the equation in these kinds of reviews.
I'd much rather that we saw more reviews from sources that don't appear to be pandering for cash from commercial sources. Whenever you see a banner ad, you can't trust the information you're being given. Hmm, now what site does that bring to mind?
Bottom Line (Score:1)
- With the exception of NVIDIA, Linux drivers substantially trailed the Windows drivers on any given card.
- The disparity between Windows and Linux performance gets bigger at higher resolutions and texture sizes.
- Certain cards (omitted to prevent flame wars) aren't worth bothering with.
Re:Weighted statistics and reviewer bias (Score:1)
How well will it run with Windows 2000 and Linux?
How well will it play Unreal Tournament and Quake?
How much does it cost?
I don't give a flying hoot how many floating-point mega-textured shaded pixel snagget things it can draw in one bazillionth of a second. I wanna play games and have KDE look funky in 1024x768 mode. Benchmarks don't impress me, I'm afraid. I honestly just care about how well the card will run on my machine and play the games I like.
And this review answered exactly that. It is almost precisely the review I have been waiting for. The NVIDIA kicked serious butt, and is now my #1 choice.
Re:Weighted statistics and reviewer bias (Score:1)
Re:simple graphics techie questions (Score:1)
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:2)
I don't. NVidia's drivers are unstable and crash my system all the time.
#nvidia on irc.openprojects.net/irc.linux.com (Score:3)
Re:More (Score:2)
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:2)
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:2)
B) (Much more important) NVIDIA has no reason to help out Matrox and the others. If you get your head out of your oss ass and look around, you'll notice that all the card companies *except* NVIDIA are having problems with their OpenGL drivers. If you read last month's Maximum PC, you'll find an interview with the OpenGL driver developer at Matrox. He says that a GL driver is a lot of work, which is why the Matrox GL drivers aren't 100% yet. NVIDIA has a kick-ass OpenGL driver. A GL driver isn't an ordinary graphics driver; it is a complete implementation of the OpenGL pipeline (an ICD). Now, if your implementation of OpenGL were faster than everyone else's (who are having problems with their own implementations and would love to get their hands on yours), wouldn't YOU keep yours closed?
Re:Obviously not you (Score:1)
It's funny you mention this - it is ridiculous, especially for us loyal few who have been patiently waiting for 3dfx to get their head out of their *ss. I would be content with a Linux driver from 3dfx that was halfway decent. They're not even close to the performance of the Windows drivers, and yet NVIDIA Linux users have some pretty solid drivers. Not a real strong argument for open source, is it? My next card will be an NVIDIA, closed source or whatever...
Re:Stability? (Score:1)
As to the NVIDIA drivers, it is my experience that their stability is inversely proportional to how crappy your AGP chipset is. My work box has never had a problem running any of our games, while my flatmate's box with an ALi chipset has occasional crashes during games for no explicable reason.
As a counterpoint, the Win32 TNT setup I'm using right now incorrectly renders triangle strips for some reason...
Nvidia! (Score:1)
Re:A related question...building a workstation. (Score:1)
I want to build a Linux OpenGL (err...MESA) development system and plan to use XF86 4.0 and DRI to take advantage of windowed hardware acceleration. Can anyone recommend a solution here?
YMMV, but here goes. We recently installed SuSE 7.0 on a handful of machines in order to get a "one-stop-shop" for X 4.0 and OpenGL.
The Matrox G400 and Creative Labs Annihilator Pro GeForce cards we had lying around worked but weren't too stable (the Matrox is rock solid in 2D, though), but (fingers crossed) the Creative Labs TNT2 Ultra has seemed stable (and fast) over the last week or so.
I would like some feedback from other people who have done more than run fullscreen gaming benchmarks.
We use them for writing OpenGL and OpenInventor programs, and also Quake III.
Re:Linux still don't cut it for gaming. (Score:1)
Because a good Linux driver should perform some minimal input checking to enforce security (I don't know if NVIDIA does this--more of a DRI-ish thing)?
Etc.
m.
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:1)
And after having done that, whine to them, not to us.
Re: (Score:1)
Re:simple graphics techie questions (Score:1)
If the framerate is lower, that doesn't affect the speed of the game, because the game is framerate independent. They do this by calculating everything according to real time. Let's say you have a car moving at 3 game units per second. To find the car's position, you take the last position and, using the velocity and the time since the last calculation, you compute the new position. This has its own problems if the framerate is too low or too high. It is the same thing as aliasing: if you take few samples, the resulting model (projectile path, object movement, etc.) will be inaccurate. Example: if you throw a ball up and sample it 3 times in midair, the maximum height achieved probably won't be one of the samples, so in a game you wouldn't be able to jump as high. I hope this answered your question... I wish I were better at explaining things, sorry.
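A tiny sketch of the idea (numbers made up; it's just position += velocity * elapsed time):

    /* Frame-rate-independent movement: the car covers the same ground
     * whether the game renders 30 or 100 frames per second. */
    #include <stdio.h>

    int main(void)
    {
        const double velocity = 3.0;          /* game units per second       */
        const double fps[]    = { 30.0, 100.0 };

        for (int i = 0; i < 2; ++i) {
            double pos = 0.0;
            double dt  = 1.0 / fps[i];        /* time since the last frame   */
            int frames = (int)(2.0 * fps[i] + 0.5);   /* simulate 2 seconds  */

            for (int f = 0; f < frames; ++f)
                pos += velocity * dt;         /* same update rule at any fps */

            printf("after 2s at %3.0f fps: position = %.3f\n", fps[i], pos);
        }
        /* Both runs land at ~6 units, but with very few samples per second
         * things like the peak of a jump can fall between frames -- the
         * aliasing problem described above. */
        return 0;
    }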
My Home: Apartment6 [apartment6.org]
not much "real world" in that review (Score:4)
All this from the same crew who benchmark Q3 at 1600x1200 and spit on any card that loses that race (how many people have monitors that can do 1600x1200 at 100 Hz, anyway?).
They rated the ELSA card as SuperFantasticGetOneOrDie, yet the identical Powergene card was rated as "bleh," for "those on an extremely tight budget" (the Powergene is only $10 less than the ELSA). But according to the reviewer, "you don't get a name brand" with the Powergene, so stay away unless you are ghetto. Reality check, anyone? BOTH cards are stock reference designs, except for a possible future TV in/out module for the ELSA.
Also, reading these "shootouts," one would get the impression that Quake 3 is the only game on the market. If they benchmark some other game, it has to be a Quake clone. I play the Quake series to death, but I also play strategy games like Homeworld (which can bring a video card/CPU to its KNEES during intense battles). Where is the benchmark on some non-FPS game?
How about image quality? I personally turn on FSAA on my GeForce when playing Homeworld at 800x600, because it looks SO much better than 1024x768 without FSAA. If Sharky's reviewers played something besides FPSs, then stuff like image quality would get ranked way higher.
Anyways, that's the end of my rant. Whenever you read one of these reviews, keep in mind the biases of the reviewer, and remember that they sometimes get caught up in "reviewerland," which is not necessarily connected to the "real world."
Re:Stability? (Score:1)
Umm... that's funny, because I have a GeForce 2 and I have no stability problems whatsoever.
------
Re:This don't belong on slashdot ... (Score:1)
Re:Sharky is flaky (Score:1)
--
Re:simple graphics techie questions (Score:1)
There are several knobs that affect how much work each frame takes:
- screen resolution in pixels
- geometric complexity, i.e., the simplification of curved solids, the use of less complex representations of items in the game, etc.
- the presence/absence of special effects that require additional rendering passes before the final image goes to screen (shadows, dynamic lighting effects)
- filtering algorithms with varying computational requirements that affect the subjective quality of textured geometry and lighting.
As far as what is a better solution... well, it's a matter of personal preference. Everybody wants games to look "pretty," and much of the value of today's games seems to lie in the "wow" factor that comes with roaming a reasonably convincing, immersive environment. But in the highly competitive world of first-person 3D shooters such as the Quake series, for example, many hardcore players choose to make compromises in order to sustain high framerates and keep the game as RESPONSIVE as possible. I myself (while certainly not among the finest players of such games) maintain 3-4 different configuration files for different moods and purposes. Sometimes I feel like seeing all the eye candy (map development, benchmarking); sometimes I want the competitive edge that comes with pure speed - it's all a set of compromises.
<ramble>
Also remember, 3D games are among the most demanding applications available. Games like the original GL Quake drove the video hardware industry forward as much as they responded to available hardware, and consumers continue to demand more and more "cool stuff". OpenGL was originally formulated as a way to represent 3D geometry in a serious engineering/CAD environment, to display work prior to final rendering; if you had told the folks at Silicon Graphics that their libraries would be the basis for applications that demanded full-screen rendering 60 times a second and consumer-level hardware that could actually do it, they would have laughed in your face.
</ramble>
So anyway, yes - there are fairly standard ways to scale the quality of game graphics back in order to sustain framerate. It's a matter of personal preference. I personally can't justify my purchase of one of the original GeForce cards (I had a perfectly functional - no, actually better in terms of 2D desktop graphics - Matrox card with an older 3dfx card for games until a few months ago) EXCEPT that I wanted to play games faster and have them look cooler. And I'll eventually purchase another card so I can enjoy what I feel to be acceptable performance on the next round of games, and at the same time I'll be able to play Quake III Arena at 1600x1200 with all the options turned up and still get a more-than-usable framerate.
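If it helps make the "scale quality back to hold framerate" idea concrete, here's a rough sketch of the kind of feedback loop an engine could use (a toy example of mine, not how Quake III or any particular engine actually does it):

    /* Toy adaptive-detail loop: shed detail when frames run long, add it
     * back when there's headroom.  Purely illustrative. */
    #include <stdio.h>

    #define MAX_DETAIL 4                /* 0 = lowest geometry/texture detail */

    static double render_frame(int detail)
    {
        /* Stand-in for real rendering: pretend each level costs ~4 ms more. */
        return 5.0 + 4.0 * detail;      /* frame time in milliseconds        */
    }

    int main(void)
    {
        int detail = MAX_DETAIL;
        const double target_ms = 1000.0 / 60.0;    /* aim for 60 fps          */

        for (int frame = 0; frame < 10; ++frame) {
            double ms = render_frame(detail);

            if (ms > target_ms && detail > 0)
                detail--;                           /* too slow: shed detail   */
            else if (ms < 0.75 * target_ms && detail < MAX_DETAIL)
                detail++;                           /* lots of headroom        */

            printf("frame %2d: %5.1f ms at detail %d\n", frame, ms, detail);
        }
        return 0;
    }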
Re:Weighted statistics and reviewer bias (Score:1)
How well will it play Unreal Tournament and Quake?
How much does it cost?
Will it be stable?
Does it use an IRQ? (S3 chipsets, anyone?)
How well does the company do forward compatibility?
I've got a Diamond Stealth III S540 card that costs nothing and works pretty much OK. And as for Linux drivers, check on
http://www.s3.com/default.asp?menu=support&sub_me
Re:Nvidia Custumer Support Suckz (Score:1)
Re:Yawn. (Score:1)
FYI (Score:3)
Yawn. (Score:1)
Mr. Piccolo (probably posting anonymously since this is Mosaic)
Berlin (Score:1)
Maybe it isn't the time yet. Berlin has a long way to go...
Who actually read the article ? (Score:1)
In the end, the only people who could be bothered to read the article are hard-core gamers or someone considering upgrading.
Neither of which would be reading Slashdot. 90% of hard-core gamers don't really care about Linux performance, because all of the best games run on Windows (let's be honest).
Ah well, it is 15°C in Soho, and raining, with a minimum low temperature of 10°C.
hrrm. (Score:1)
-I- read the article.
Apparently, you at least looked at it, or you wouldn't have preached what was in it.
I think the lure of this article is the fact that it's comparing windows and linux graphics. It's important because, unfortunately (as you said), many awesome games are windows only. And articles like this show the capabilities of games/graphics under linux, as well as the community interest in having those capabilities...
*shrug*, not trying to convince you you're wrong, just another viewpoint...
Re:Berlin (Score:2)
Re:#nvidia on irc.openprojects.net/irc.linux.com (Score:2)
Yes, they were. They explained to me why the drivers weren't open source, and they had a good reason.
These days, a company has to watch its back when it comes to patents. Some little snot could sue them into oblivion over an infringed patent. Unfortunately, a product as complex as a graphics card can infringe on any number of patents, and things slip through the cracks. Apparently all their cards rely on some technology that is already patented. They didn't admit this outright, but they hinted strongly. It was an honest mistake, but they can't afford to take any chances. Opening the drivers would reveal the infringed patent, and that's a Bad Thing(tm). To make up for it, they have put a lot of time and effort into their drivers. You can see that in their performance.
Not using the DRI API, however, was a technical decision that I don't agree with. They didn't feel it was adequate. Personally, I'd rather lose 5% of performance for standards-based software, but I don't run the company, and I haven't bought a new card from them in three years because I'm so happy with the one I have. So, I don't feel that I have the authority to tell them what to do. Now, if you want them to do what you tell them to, you'd better either buy a card or buy some stock. Otherwise, you have no right.
Dave
'Round the firewall,
Out the modem,
Through the router,
Down the wire,
Obviously not you (Score:1)
It's also somewhat interesting that the card with (by far) the greatest performance under Linux has closed source drivers.
I'm neither a hard core gamer (I occasionally play Quake III) nor considering upgrading but I still found it an interesting read. If it doesn't interest you then why not just shut up and move on?
Re:Sharky is flaky (Score:1)
Nobody is interested in ATi. With the exception of their TV output, which I hear is excellent, their products are passable at best. ATi is what you throw into a cheap machine, or into a server that won't be running X (or any other GUI).
And speaking of flakiness, ATi is the king. They have awful, awful drivers for Windows, and if their open source X servers are stable, it means either that some small group of hobbyist hackers is better than ATi's programmers, or that the server uses limited acceleration features - or both.
The AiW series doesn't support Linux.
More pages = more ad impressions.
More ad impressions = More money.
--