GeforceFX (vs. Radeon 9700 Pro) Benchmarks
Obiwan Kenobi writes "Looks like the guys at Maximum PC got lucky -- they scored the first ever GeforceFX benchmarks via an Alienware prototype machine. Two 'marks to notice: the Geforce FX scored 209fps in Quake 3 (1600x1200x32) and 41fps in the 3dMark Game4 demo, while the Radeon 9700 Pro attained only 147fps in Quake 3 yet came back with 45fps in the 3dMark test. It seems that the GeforceFX is the clear leader in pure processing power, but in memory bandwidth the 9700 Pro is still king."
Finally, a decent frame rate. (Score:5, Funny)
Re:Finally, a decent frame rate. (Score:3, Interesting)
More important is how this will translate into capacity for future games. Doom 3 will take considerably more muscle than Q3 does.
Re:Finally, a decent frame rate. (Score:5, Interesting)
There's nothing in that single number to say that the higher average fps doesn't come with wildly varying peaks and valleys in performance, while the lower number could be much steadier, with relatively low variance across its peaks and valleys.
It only makes you wish that these benchmarks, especially the "real world" Quake 3 tests, had a graph of fps throughout the test to see how performance was at any particular point.
Re:Finally, a decent frame rate. (Score:2)
Maybe it would be a good idea for benchmarks to start reporting even just the standard deviation along with the average fps. I'd certainly find it useful/informative.
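For what it's worth, that's only a few lines of work. A minimal sketch (pure Python, with made-up frame times purely for illustration) of reporting the spread alongside the average:

    from statistics import mean, stdev

    # Hypothetical per-frame render times in milliseconds for one benchmark run.
    frame_times_ms = [6.8, 7.0, 6.9, 25.4, 7.1, 6.8, 7.0, 31.2, 6.9, 7.0]

    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    print("average fps  : %6.1f" % mean(fps_per_frame))
    print("std dev fps  : %6.1f" % stdev(fps_per_frame))
    # The slowest single frame is what you actually feel as a hitch.
    print("slowest frame: %6.1f fps" % (1000.0 / max(frame_times_ms)))

The standard deviation, or even just the slowest frame, says a lot more about how a run actually felt than the average alone.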
Re:Finally, a decent frame rate. (Score:2, Insightful)
Re:Finally, a decent frame rate. (Score:2)
My monitor refreshes at 72Hz. What am I getting out of 100+ FPS? Sure, the raw power is impressive, but we don't use it.
Re:Finally, a decent frame rate. (Score:3, Interesting)
There used to be, and very well may still be, archives of game demos of people demonstrating these jumps and trying to one-up each other. Almost all of the people I know who contributed to these archives believed this was true.
That said, I have never been that hardcore of a 1v1 or Team DM player to care about this and as such haven't researched its truth myself. It very well may just be some bullshit that spread because enough people blindly passed it on like I am. If someone wants to correct me, please do.
Re:Finally, a decent frame rate. (Score:5, Insightful)
However, I do question the point where much of the work for games is put into the engine, and little of it put into the gameplay itself. I realize that we're almost at the point where one company makes the engine and another company licenses that engine to make the game, so the responsibility for good gameplay is on the shoulders of the latter company. However, it seems that a lot of the games published of late focus more on realism and 3D modeling than on the playability of the game, and the continuous push to up the abilities of rendering does not seem to allow the developers of new games to step back and think about gameplay.
The other problem is that right now, with the specs we're getting on Doom 3 and other games, it sounds like another forced hardware upgrade cycle if you want to play these games reasonably. Sure, you can drop the screen resolution, and there are probably hundreds of tweaks you can apply to the engine to cut back details, but older, still viable cards will have problems. I know people don't want to develop for outdated systems, but there is a point where you have to include a reasonable amount of backwards compatibility to allow non-power gamers to play new games as well. One of the reasons that Half-Life and CS sold so well was that the game was optimized for play on a previous generation of processors/vidcards compared to the average system being sold. (HL/CS also, IMO, excels at its gameplay, as mentioned above.) I know a lot of PC game writers are of the opinion that the gaming market will only move forward when vidcard makers put new features into cards and PC game makers then follow up by using those features in predominant titles, but the PC gaming market is just not healthy right now. Making games that require the latest-and-greatest hardware will limit sales further and may push this part of the market into a slump, while console gamers will continue to see more improved titles.
Again, I'm not against improvements in 3d rendering tech and pushing polys as fast as possible; it's the game makers themselves that need to realize what the average hardware of their target audience is going to be and not just to focus on how pretty the game looks.
Re:Finally, a decent frame rate. (Score:5, Insightful)
Re:Finally, a decent frame rate. (Score:2, Interesting)
I don't see why this is bad. I personally dislike 3D games since they all look alike. If you want, you can still play great games on older hardware: the whole simulation and building-game scene, for instance. And most likely your system could even handle games such as Dark Age of Camelot [darkageofcamelot.com] or Everquest [everquest.com], which are games with a focus on gameplay and not graphics. I agree, there are a lot of crappy games out there with really stunning graphic effects, but I don't care about them (anymore). I let my friends play them, and when a diamond among them is found I consider whether it's worth the hardware upgrade. The last game I did this for was Return to Castle Wolfenstein: I had to spend $20 on a TNT2 to play it at a decent fps. Now I wonder what Doom 3 brings. Is it worth the upgrade to a Radeon/GeForce FX? I don't know. Maybe I will keep waiting for Warhammer Online [warhammeronline.com] until I upgrade. But someone will beta test for me, and until then I can stick with my Xentor TNT2/32MB [thetechzone.com] and keep playing Anno 1503 [anno1503.de] or Dark Ages [darkages.com].
Re:Finally, a decent frame rate. (Score:2)
Re:Finally, a decent frame rate. (Score:5, Funny)
Re:Finally, a decent frame rate. (Score:2)
Re:Finally, a decent frame rate. (Score:3, Funny)
Now say the eyelid weighs only a few grams - say 0.01kg. (I have no idea really, but those eyelashes seem quite heavy.) Say that we move it at constant acceleration for 1/2cm, then constant deceleration (seems a fair enough model). We need the eyelid to go from fully open to fully closed in 1cm / 70km/h, which is about 0.0005 secs.
We require that force again to slow down, and then we have to open the eye again. (I'm assuming things like gravity cancel out etc).
So a total force of 1555*4N = 6220N. This is over a period of 0.001 secs, so a total of 6220*0.001 = 6.2 Watts are used. Say it is 90% efficient (muscles aren't perfect converters, there will be friction despite the eye being very well lubricated, etc.), so you will get 0.62W in waste heat.
That's not that much heat, although you would probably need to blink a lot more, since the blinking wouldn't be as effective. Since a blink takes about (guessing) 1/2 sec, let's say you would need to do 0.5/0.001 = 500 quick-blinks in its place. Say you blink every 10 secs (I have no idea really); that would be 50 of these quick-blinks per second, so now our output heat is 50*0.62 = 31W, which would sting like a bitch.
Re:Finally, a decent frame rate. (Score:2)
(now try to find a monitor with a 170hz refresh rate)
Re:Finally, a decent frame rate. (Score:2)
I am correct that your game's displayed fps can't be any faster than the gun in your CRT, right?
Keep this in mind.. (Score:4, Insightful)
I'll wait for the final hardware and drivers before I decide which to buy.
optimized? (Score:2)
I seem to recall a long standing argument about GFX card drivers being 'optimized' to perform well in the standard performance tests e.g. Quake 3.
Couldn't find a link on google though.
Re:optimized? (Score:2)
too lazy to find a link, though.
Re:optimized? (Score:2)
But also keep in mind that optimizing for a specific game isn't necessarily a Bad Thing, so long as it doesn't hurt the visuals or quality. For example, if you know a certain game doesn't need/use certain features of the card, and by disabling them you improve performance, then why not. (ATI, however, vastly cut down on the texture quality in the game itself to get their increases - tsk tsk).
Re:Keep this in mind.. (Score:4, Interesting)
Come on, read the article.
The reluctance of NVidia to allow them to test the higher levels of AA is more telling if you ask me.
Re:Keep this in mind.. (Score:4, Informative)
Re:Keep this in mind.. (Score:2, Interesting)
Then how do you explain the substantial performance boost with new releases of Nvidia's Detonator driver package over the years? I remember one particular release improving my quake3 FPS substantially a few years ago.
Re:Keep this in mind.. (Score:2)
How do you explain the crappy performance of nVidia's most recent Linux drivers?
Dinivin
Re:Keep this in mind.. (Score:2)
The release drivers are rarely substantially different from the "early beta" drivers... unless the beta drivers had massive amounts of debugging enabled, but that's generally not true either since video card makers want developers to have an actual reference platform and can send debug mode drivers if needed.
The speed improvements for the Detonator drivers have come over time as nVidia has refined the drivers. The speed improvements rarely coincided with the release of a new card, but instead came 1-2 months afterwards, or even in between product cycles. Hardly a case of beta vs. release drivers.
Re:Keep this in mind.. (Score:5, Insightful)
The reasons for a speed increase aren't always related to the graphics card itself, but can be due to the motherboard chipset, type of RAM, BIOS, or even a specific game or app itself. These tweaks change how the card communicates with these in specific circumstances, which can vary greatly between different consumers' machines.
Since the graphics card industry is hugely competitive right now, it's in their best interest to spend a lot of time tweaking their drivers to the max.
The reason consoles don't worry too much about it is because they have a standard set of hardware (read: one graphics card - no competing card that customers can benchmark against) that ALL game developers must work with. This also simplifies game development because they know the exact config and driver set that EVERY user will be using.
Even though I'm sure they COULD tweak the drivers (forgetting the expense of distributing a firmware patch), they'd prefer to leave the tweaking in the game code. Besides, you can't easily benchmark the various consoles against each other, whereas the graphics card folks for PCs know that every performance site and magazine is going to use the exact same hardware config and same game to test their card against all others.
competition (Score:5, Interesting)
What I am surprised about though is that prices are so high for graphics cards still even with relatively good competition in the marketplace. I mean even the Parhelia debuted at like $400 didn't it?
It always seemed to me that the benefit of having AMD competing with Intel was that I could get a really good CPU pretty cheap. (Though now it seems AMD is taking it easy for a while, so that benefit may have been short-lived.) Yet I don't see the competition driving video card prices down.
There's some evil conspiracy afoot here, I know it!
Re:competition (Score:5, Insightful)
It's mostly due to high-end video chipsets costing so much, plus the added expense of the uber-fast memory that these cards require, but mostly, it's driven by ultimate demand for these products. Everyone needs a P4/Athlon XP, but only a few people need the absolute fastest display adapter out there. As a result, fewer units get produced, as fewer units will actually sell. Combine that with the already higher cost of producing core logic that's 1.5-2x the transistor count of high-end CPUs and RAM that's 2-3x faster than desktop stuff, and you've got a recipe for pricey hardware.
Also, don't forget that most products these days are priced at what the market will bear. People will pay $400 for the fastest thing on the block, so that's what they sell for. My general rule of thumb is to wait a month or two after the new, fast, whiz-bang product, then buy whatever card has the fewest problems and costs $300.
AMD have NOT lost the CPU war (Score:5, Insightful)
Furthermore, there's no app or game available on this earth, and there probably won't be for at least two years to come, where the speed difference between an AXP/2800+ and a P4/3GHz is big enough to really mean anything to anyone other than the fanatical overclocking crowd, who will spend any amount of money just to have the fastest stuff on the market, only to use it for stuff like playing Counter-Strike, which uses perhaps 20% of the total CPU and graphics card capacity. Well, if you're into that sort of stuff, sure. Get a P4 and enjoy having the fastest CPU there is
For the rest of us, who base our computer purchases on common sense, for speed, stability and price, the obvious choice is still the Athlon XP.
Besides, the Pentium IV still has a pretty fucked up design. See this page [emulators.com] if you don't know what I'm talking about. I always laugh at people who whine that Windows is poorly designed, only to praise Intel CPU's in the next breath.
Anyone care to disagree? Remember, modding me down is so much easier than posting an intelligent reply.
Re:AMD have NOT lost the CPU war (Score:3, Informative)
Re:not as many units? (Score:2, Insightful)
Even when you take into account a lot of the graphics workstations which may be running some more exotic processor, both ATI and nVidia make high-end workstation cards too (though I'm not sure who 3dlabs is owned by these days).
And yes I know there are some *really* budget PCs out there that ship with onboard graphics by companies who primarily manufacture chipsets, but these PCs I'm pretty sure make up a small number of PC sales. Usually the "budget" PC still ships with graphics by nVidia or ATI, they just package one of their lower-end cards.
Re:not as many units? (Score:2)
As for the shipping CPU type, Intel has, by far, the lion's share of the market. PC/Mac ratios run what, 50:1, and of the PC's shipped, some 80% of them are Intel? And I think I'm being conservative. The ratios there are probably much worse (for the little guy).
Re:not as many units? (Score:2)
Re:not as many units? (Score:3, Informative)
You are neglecting several other key brands of cards which are used in some cases way more than NVidia/ATI. Matrox, for instance, is used primarily for digital editing and general 2D graphics work because of its fabulous image quality. 3DLabs makes great 3D CAD/imaging (as in production rendering) cards which give all sorts of shader/GL extension benefits not seen on regular cards. Evans & Sutherland makes good CAD cards. SGI makes good rendering cards, same as Sun.
Nvidia and ATI make good gaming cards, but they are not the only manufacturers of video cards. Their cards are built for gaming. They may run your latest pirated copy of 3D Studio Max/Maya/Animation Master/LightWave/trueSpace, but that doesn't mean they're good at it. Far from it, actually.
Re:not as many units? (Score:4, Informative)
The rest of them use either Intel Extreme Graphics (which is OK; see the HP 753N and Compaq 6350US) or a lower-grade Intel part. They also use chips like the S3 ProSavage (HP 523N, Compaq 6320US).
Sony uses crap S3 (or is it SiS?) video cards in all of their desktop computers. Which is disappointing. For their laptops they use mainly ATI Radeon derivatives.
Surprisingly, you are much more likely to find an ATI or nVidia card in an HP/Compaq/Toshiba laptop than a desktop, probably because there is still some profit margin left in laptop computers (unlike desktops). Compaq's mid-range 1500 series is all ATI Mobility Radeons, as are HP's mid-range 4000 series and high-end 5000 series. Toshiba's base 1410 series uses GeForce4 420 Go chipsets, their mid-range 1900 series uses the 440, and their high-end 5200 series uses the 460.
And yes... in case you're wondering, I do currently sell computers =)
Don't wait.... (Score:2)
I wouldn't be surprised if ATI has something oodles better than the FX if/when it ever ships.
Re:Don't wait.... (Score:2)
Re:Don't wait.... (Score:4, Informative)
Re:Don't wait.... (Score:3, Informative)
nVidia vs Everyone else (Score:3, Interesting)
I'll still bet money the GF FX will be the dominant card come final release.
Re:nVidia vs Everyone else (Score:2, Interesting)
Ready for what - making some optimizations to the drivers to make them look good in some random benchmark?
Of course the drivers are beta, and the final release will probably be faster, but these figures look more realistic than the figures Nvidia gave when the card was announced!
And to be honest, unless things change drastically (they seldom do), this card will probably not be much more than 20-25% faster than a Radeon 9700 - but the Radeon will probably be a bit more than 20-25% cheaper by then!
Besides - who cares about those $400+ gfx cards? No sane person would buy them anyway, but instead go for a Ti4200 or Radeon 9500 Pro - value for money you know...
Re:nVidia vs Everyone else (Score:4, Insightful)
these figures look more realistic than the figures Nvidia told when the card was announced!
This is true. Marketing is evil. Evil evil evil.
this card will probably not be much more than 20-25% faster than a Radeon 9700
When DirectX 9 is out the door, it will not only be faster, but look better. Much better. I suppose I'll catch flack for buying into the hype, but I've been blown away every time.
Besides - who cares about those $400+ gfx cards?
You're right about that last part. For me, it's just about keeping track of who's leading the industry and what technology my next card will have, when I buy it in a year for $120 :)
DX 9 (Score:4, Interesting)
Hate to break it to you, but having just spent some days researching this, I concluded that there is nothing that the GeforceFX will support that the Radeon 9000 series won't. nVidia's web site may say differently, but that doesn't make it so.
The FX may do it faster (though this remains to be seen, of course) but it probably won't do it with better image quality. If anything, I'd say ATI cards have historically produced nicer output where there's any difference at all.
Hell, even the drivers for the Radeon 9700 are getting good reviews. I thought the season of miracles was a couple of weeks ago. ;-)
Re:nVidia vs Everyone else (Score:2)
Well maybe, but I doubt ATI will sit still. The 9700 is a damn fine card and they've still got like 3 months to cook up a 9900 or something to combat the GFFX. They're normally just leapfrogging each other, but I think nVidia dropped the ball when the GF FX didn't ship in November like it was supposed to. ATI's last 2 generations of products (8500 & 9700) have been pretty damn good, and they've gotten into the 6-month dev groove that nVidia used to have (before the GFFX). Well, competition is good, and I'm on an iBook anyway, so I can just drool.
Re:nVidia vs Everyone else (Score:2)
nVidia has one hell of a reputation to maintain; I don't think they'd let it drop for no reason.
It is entirely possible they screwed up the FX of course. If so, they're going to have to explain some devastating returns to accounting.
3dMark 2001 is not the end all be all (Score:4, Interesting)
But it is actual game performance that is important to most people, so while you may get better 3dMark scores, most people aren't running that a whole bunch to see those nifty graphics; they'd rather be running games.
Also, don't forget to mention that all these tests were run with 2xFSAA on.
Q3 Framerates are WITH 2xAA turned on (Score:4, Insightful)
It's not simply the frame rate, but what's actually being generated in that frame.
=r
Re:Q3 Framerates are WITH 2xAA turned on (Score:2)
Re:Q3 Framerates are WITH 2xAA turned on (Score:2)
I figure Nvidia did enough internal testing to know they're safe on 2xAA but want to do further improvement before going higher.
From the Article.... (Score:4, Interesting)
And if we grab your nuts and twist, you'll confirm this? And if we threaten to cut them off... I think you'll scream just about anything...
So let me guess, they know what's coming from ATI... But like they said, it's not about bandwidth, it's about GPU processing power, so how is an even bigger pipe that ATI isn't filling going to help?
Who cares (Score:5, Funny)
while the Radeon 9700 Pro attained only 147fps
So what they are saying is that even at a ridiculous resolution, either card is capable of a higher framerate than your monitor, and your eyes.
Re:Who cares (Score:4, Interesting)
With frame-rate to spare, you can achieve really silky-smooth images by syncing with the monitor's refresh. This prevents those ugly redraw lines that can occur from the next frame being drawn right as the display refreshes. So, having a rate higher than your refresh can be useful.
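As a rough sketch of the idea (this assumes pygame 2's vsync flag, which is only honored together with the SCALED or OPENGL display flags; it's not how the benchmark above was run, just an illustration):

    import pygame

    pygame.init()
    # vsync=1 asks the driver to hold each buffer flip until the next vertical
    # refresh, so a frame is never torn across two scans of the screen.
    screen = pygame.display.set_mode((800, 600), pygame.SCALED, vsync=1)
    clock = pygame.time.Clock()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        screen.fill((0, 0, 0))        # stand-in for real rendering work
        pygame.display.flip()         # waits for the refresh when vsync is honored
        clock.tick()                  # just measures the frame time here
        # clock.get_fps() should now hover near the monitor's refresh rate,
        # not near whatever the card could push with vsync off.
    pygame.quit()

With the flag off you get the tearing described in the reply below; with it on, whatever the card could render above the refresh rate simply never reaches the screen.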
Re:Who cares (Score:2)
if you actually have the card go faster (by disabling wait-for-refresh) the image will tear.
nVidia and driver performance (Score:4, Informative)
Plus, if this is the first of the GigaPixel cores, then there should definitely be more in it, and the fact that it's down on memory B/W shouldn't make much of a difference.
Re:nVidia and driver performance (Score:2)
maximumpc website (Score:2)
Re:maximumpc website (Score:2)
It *is* a nice screwdriver (Score:3, Interesting)
I actually own one of those Snap-On screwdrivers. I got it years ago as repayment for a favor I did for a mechanic friend of mine. At the time I thought the repayment was not, shall we say, commensurate with the debt owed. But then I started using it and realized that he was actually giving up something terribly valuable. It is the best screwdriver I've ever used, hands down. I've had it for like 10-12 years now and can't deal with other drivers. The grip, in particular, is what does it -- it works so well that it's very easy to strip threads and actually break screws if you use cheap hardware. But if you have to drive a deck screw into a 2x4 by hand, there's no other tool. It's wholly unsuited for PC uses, however.
The only problem I had with MaxPC recommending it was the fact that the tip is *incredibly* magnetic. Like, lots and lots for a plain old screwdriver. You can shove the driver into a bucket of screws and the thing will come out absolutely festooned with screws. It will do the Jedi force screw pickup trick from about an inch away, which is annoying until you get used to it (and then it becomes handy). It's probably got a real rare-earth magnet in the tip to make it so strong (and expensive). And it's the last tool I would use to screw a motherboard into a case. Even if the tip wasn't very magnetic, it's just not a good driver for really delicate work.
As far as MaxPC getting paid to shill them, I don't think so. Snap-On has their target audience pretty well sewn up and probably doesn't need the handful of PC owners willing to pay $100 for a tool to increase/maintain their sales. They have trucks that drive around to mechanics, and they have drivers/salespeople who know their routes and protect their customer loyalty fiercely. Because they haven't really set up their distribution model as a "normal" retail channel, courting a couple of hardcore PC geeks is definitely not their market, and doing so through a computer magazine would not be a wise decision for them to make.
Besides, I've seen MaxPC absolutely trash a product whose ad is on the facing page. They're notoriously cruel, in fact, and I think they tend to err on the side of being a little too mean (e.g., they'll ding a perfectly decent video card because it doesn't have something like a TV-out port -- forgetting that this feature might not be something everyone wants or uses, and that leaving it off brings the price of the card down). I've never seen them make an obviously bum recommendation, and I'd trust their review over those of any Ziff-Davis publication in a flat second. I was a little amazed at their recommendation at first, but not because I doubted their journalistic integrity.
-B
Hm... (Score:3, Funny)
The GeForce pushes over 200 frames per second.
My monitor refreshes 75 times a second.
Tell me again why I want a top-of-the-line graphics card?
-JDF
Re:Hm... (Score:2)
You dont. You want a new monitor + video card. (Score:2)
Re:You dont. You want a new monitor + video card. (Score:2)
Re:You dont. You want a new monitor + video card. (Score:2)
Not at all. Just because you have less-evolved vision doesn't mean the rest of us suffer from the same handicap. I see a noticeable difference between 75 and 85 Hz. That's pretty much my upper bound, though.
Try looking at the display out of the corner of your eye. The retina's persistence of vision is lower there, and it makes the difference more apparent.
Re:Hm... (Score:5, Interesting)
And, yet, even with Q1 running at 1000 fps UT2k3 only runs at 140 fps. Wonder what something with even more complexity than that would run at... oh, look, there's a benchmark that only did 41 fps.
Or go look at CodeCult.com's Codecreatures, which does a lovely 6 fps on a 2.5 GHz P4 with a Radeon 9700 at 1600x1200 with anti-aliasing and anisotropic filtering. And it still doesn't look real.
Until we have holographic imaging that's indistinguishable from reality the cards aren't there yet. If you don't need/want it, then fine, don't buy it. But whining that it's clearly beyond what's needed is, well, stupid.
Re:Hm... (Score:2)
- Spend more time doing other things.
- Make more detailed worlds.
Running the old games at such high speed doesn't make much sense, of course. But it does make it possible to create even more detailed games.
Greetings,
Re:Hm... (Score:5, Informative)
I can't talk for other games, but since the Q3 engine is widely used, that may be the case for some of them too. That partly explains the "need" for high-FPS.
Re:Hm... (Score:3, Insightful)
And does the Quake 3 client actually transmit its own private physics calculations to the server in a multiplayer game? If so, why would the server believe a client's physics over its own calculations, and why have no cheats sprung up to take advantage of this ridiculous security hole? If not, why does the "different physics" matter?
Re:Hm... (Score:3, Interesting)
However, the rate at which this is done is certainly less than 125 times per second. Given a ballistic trajectory (e.g. a player during a jump), the trajectory could be checked by the server, but the actual position occupied by the player along that trajectory is updated at the frame rate. At 125 FPS, given the standard height each player can jump in Quake, the player is actually able, during one frame, to be at the apogee of the trajectory, which is not the case at other framerates. Thus, certain items in certain maps, for example, are only reachable if your framerate is exactly 125 FPS.
Thus, the physics doesn't really change with the framerate. It's the way the "world" is sampled (trajectories, etc.) that is the problem here. And this is done client-side. You can decouple the two in single-player mode (i.e. position updated at 125 Hz, but frames rendered at 50 Hz), but in multiplayer, by default, servers do not allow this.
Sorry I can't be more precise... Do a search for "Quake 3 trickjumping" to know more about this, since many "trickjumps" in Quake necessitate the 125 FPS framerate.
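For the curious, here is a toy model of the sampling argument above. It is not the real Quake 3 movement code (which also rounds frame times to whole milliseconds, and that rounding matters too); the gravity and jump-velocity constants are just assumed, Quake-ish values. It samples the same analytic jump arc once per frame and reports how close the highest sampled position gets to the true apex:

    # Toy model: a ballistic jump, sampled once per rendered frame.
    G = 800.0        # gravity in game units/s^2 (assumed value)
    JUMP_V = 270.0   # initial upward velocity in units/s (assumed value)
    TRUE_APEX = JUMP_V ** 2 / (2.0 * G)

    def highest_sampled(fps):
        """Highest position the player ever actually occupies at this framerate."""
        dt = 1.0 / fps
        best, n = 0.0, 0
        while True:
            t = n * dt
            z = JUMP_V * t - 0.5 * G * t * t   # analytic height at frame n
            if z < 0.0 and n > 0:              # jump is over, back on the ground
                break
            best = max(best, z)
            n += 1
        return best

    print("true apex: %.3f units" % TRUE_APEX)
    for fps in (60, 76, 100, 125, 200):
        peak = highest_sampled(fps)
        print("%4d fps -> highest sampled position %.3f (short by %.3f)"
              % (fps, peak, TRUE_APEX - peak))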
Re:Hm... -- 2 words (Score:2)
it crawls at a measly 20fps on my GeForce Ti4600. When this GFX comes out, I'll get one right away, even at $500. I can afford it and it's tax-deductible. I want one 'cause I want the best.
What I miss in all these benchmarks (Score:5, Funny)
Re:What I miss in all these benchmarks (Score:4, Interesting)
I won a Number9 Imagine128 card at Comdex back in the early 90s... I distinctly remember being amazed since for the first time ever it was faster to scroll text in a window than it was full-screen.
Nowadays it's a total non-issue of course.
Oh, and I get far better FPS in Nethack. You're just a slow typist
Not very fair (Score:5, Insightful)
ATI might very well ship an improved version around the time GeForce FX ships.
4xAA (Score:5, Insightful)
If the game is running at 100 fps people are going to up the eye-candy, right?
Assuming this is the case, I seem to remember the 9700 getting very similar scores whether the card was set to no AA, 2xAA or 4xAA, i.e. the AA processing was almost (but not quite) 'free'.
I know the benchmarks are very very early and it really needs to get the full treatment from a hardware site, but the important figures IMHO are ones where the card is set to run everything maxed out...I have a feeling the NV30 is not going to be in such a prominent position in that instance...
Another thing to remember (Score:4, Informative)
So even if the GeForce FX is a bit slower for some things, those games that are using full DX9/OpenGL features will get better looking graphics thanks to the increased hardware precision. People using 3D programs like Maya with the Cg plugin will notice the biggest difference especially IMO. And at this point, NVidia's shaders are far better geared to the professional 3D graphics industry than ATi's *current* offering. This might encourage many developers to take advantage of extra GeForce FX features instead of ATi features.
(Source URL for quote: http://www.nvidia.com/content/areyouready)
word choice (Score:3, Funny)
Only?!?
4xFSAA, Anisotropic filtering? (Score:5, Interesting)
Hard to imagine a 'serious review' site would neglect to test these features. I don't give a crap about 400 average FPS in quake, but I do care if it drops to 14 with all the enhancements turned on. But then they were trying to make the GeForceFX look like it's leaps and bounds better.
I'd imagine it's still the case - the 9700 is still the bandwidth king. Personally, I don't care about faster (when it's already faster than my monitor can display and my brain can process). My next upgrade will be motivated because it will look better.
The GeforceFX isn't something that's going to leave the 9700 in the dust - it's something that should have come out 6 months ago to compete head-to-head with ATI.
At any rate, after putting together a couple of cheap flex-atx pcs with onboard S4s (shared memory - Shuttle FV25 in case anyone cares), I'm surprised at how little GPU horsepower is needed to actually play most games.
Even UT2k3 is playable on these little guys (albeit not 1600x1200 with all the goodies turned on, but playable). I'm pretty sure my "outdated" radeon 64vivo will play Doom 3 when it goes gold.
Anyhow, my point is that cards have been displaying 'fast enough' for a while - I mean, we don't measure a card's performance in polygons anymore. They need to "look better", as in more natural, smoother, more TV-like.
Re:4xFSAA, Anisotropic filtering? (Score:4, Informative)
I can still tell which arcade machines use 3dfx-derived/built chips; the graphics just look different. I want my games to look amazing; what good is moving grass if it's pixelated?
Nvidia Driver Code is the reason, not hardware (Score:3, Insightful)
Has anyone ever seen Nvidia Driver code? It is littered with benchmark/Game specific code.
So basically what Nvidia has done is do as little processing as possible when certain apps are running, or optimize for those specific apps.
So their benchmarks are good if you are running those apps, but bad if not.
www.mycal.net
Re:Nvidia Driver Code is the reason, not hardware (Score:3, Insightful)
That's a pretty serious allegation. Can you back it up? How did *you* get access to the code? Can you provide evidence? Moreover, how does the code detect that the game is running? It can't simply be the executable name, given the Quack3 fiasco that took place when ATI tried this stunt.
No disrespect intended, but a claim like that does not stand on its own.
Quite the difference... (Score:3, Funny)
...Did anyone test the cards running quack3?
dx8 vs dx9 (Score:3, Insightful)
Who cares? The 3dMark test is designed around DirectX 8, while both of these cards are designed to take advantage of DirectX 9. Wait until the next 3dMark release; then you'll have a valid test.
Re:dx8 vs dx9 (Score:4, Insightful)
Fast, but not fast enough (Score:4, Insightful)
I don't believe this...sorry... (Score:4, Interesting)
By the author's own words, this was no review. There are no 6x or 8x FSAA tests at all, although these are supposed capabilities of the GF FX. There are no screenshots for comparison -- in other words, there is absolutely nothing to prove this ever took place. There are no anisotropic filtering tests, and we don't know what system the Radeon 9700 was benchmarked on. Nothing, absolutely nothing of interest that you would normally see in a real review is present. Even if you believe the author, he says unapologetically that he was under direct duress from nVidia as far as what he was permitted to show AND SAY.
Already people on the Rage3D forums are talking about how much slower the 9700P speeds in this promotional propaganda piece are than what they themselves can get with their systems at home.
Also... what, pray tell, would Alienware be doing with an NVIDIA beta prototype? As a small OEM, I would expect that, if anything, Alienware would have an OEM beta version of the card--possibly. Certainly not an nVidia version of a prototype card! If nVidia needs Alienware to beta test its upcoming card, this must mean nVidia hasn't even finished the prototype reference design yet and nVidia's OEMs haven't even begun production!
Here's what I think it is: a paid-for promotional piece designed to deter people from going ahead and buying an ATI 9700 Pro. What it most certainly is not is an actual review of the product--by the words of the author himself. What I still can't get over is that these are the very same benchmark programs nVidia was handing out in October!
When nVidia starts sending out cards to reviewers with driver sets and saying, "Have at it--review it any way you like!" that's when I'll start listening.
Re:I don't believe this...sorry... (Score:3, Informative)
We looked at this board during the second week of December. It was a very early board, and simply didn't run a large number of applications. The situation is a pretty common one for print pubs. Since we have a lead time that ranges from 2-6 weeks between the time we write stuff and the time that magazines get to readers, we occasionally take a look at preview hardware with special terms negotiated with the vendors in advance. Unlike some other mags, we ALWAYS make it abundantly clear that this is a preview, not a full review. Furthermore, we always make it clear when a vendor specified that we run specific benchmarks in these previews. Naturally, in our full reviews, we run whatever benchmarks we please at whatever resolution we like.
Anyway, Alienware wanted nVidia to get this sample in time for our preview story, but the drivers were very raw. In order to make our deadline and get an early look at the board, we agreed to only run a small subset of benchmarks, with a big huge disclaimer that said "Hey, nVidia would only let us run these benchmarks", which we did.
An nVidia rep hand-delivered the board the same day that the Alienware system arrived, watched me install it, installed the drivers, watched me set up and run the benchmarks, then pulled the card, obliterated the drivers and went on their way. After that, we restored the pre-nVidia hard drive image and benchmarked the Radeon 9700 in the exact same machine.
We don't run other people's benchmarks in Maximum PC. If you see a number on the website or in the magazine, it was run by a staff member in our lab. Period.
I can't understand why you'd think this is a positive thing for nVidia. The overall tone of the story was that the performance is a little weak for something that will cost an arm and a leg, and take up two PCI slots. Heck, it doesn't even beat the 6 month old Radeon 9700 in the programmable shader test, which is the only one that really matters in my eyes. I don't care about 30 more fps in Quake 3. I want a card that will be fast in programmable shader games later this year, and the GeForce FX doesn't appear to be that.
Will Smith
Framerate jokes and VisSim vs. PC Games (Score:4, Insightful)
One quick point to address all the 150 fps in Quake jokes:
Frame rate consistency is what is most important (by far). A game that runs at a solid 30 fps will feel better than a game that runs at 60 fps some of the time but then bops back and forth between 30 fps and 60 fps.
The VisSim industry has done a better job of saying "we only need 60 Hz (fps), but we had better never ever see you dip below that or you are out!" This forces hardware and software to be optimized for locking at 60 fps. I can tell you that a 60 fps Air Force flight simulator will always feel like higher performance than a souped-up PC running Quake at 100 fps but dipping down to 50 fps or worse when things get hairy.
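A minimal sketch of that "never miss the deadline" discipline, in pure Python with a sleep standing in for the real per-frame work (the numbers are arbitrary; the point is that you count blown 16.7 ms budgets, not average fps):

    import time

    REFRESH_HZ = 60
    BUDGET = 1.0 / REFRESH_HZ          # 16.67 ms per frame at 60 Hz

    def simulate_and_render():
        time.sleep(0.010)              # stand-in for game logic plus rendering

    missed = 0
    next_deadline = time.perf_counter() + BUDGET
    for frame in range(300):           # roughly five seconds' worth of frames
        simulate_and_render()
        now = time.perf_counter()
        if now > next_deadline:
            missed += 1                # this frame would be a visible hitch
        else:
            time.sleep(next_deadline - now)   # idle until the (simulated) vblank
        next_deadline += BUDGET

    print("missed %d of 300 frame deadlines" % missed)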
The biggest evidence of this issue being unimportant in PC gaming is the number of people or games that run with vertical blank (vblank) synchronization turned off. This is wrong wrong wrong in my opinion, but most gamers are willing to live with enormous visual artifacts from partially completed frames to get that max fps and lowest input latency when things get tough on the system.
So, to all those that mock high fps benchmarks, I challenge you to post information on a recent 3D game, gfx card, system, and config that allows you to play with all the gfx features on (or those that are important to you), with vsync on, using a 60 Hz display and only double buffering, which locks at 60 fps solid without ever dipping below that.
That is when things have become fast enough for _that particular_ game.
Products like the NV30 and R300 help push the bar but are still not overkill. Take the above challenge and now turn on 16x multi-sampled FSAA (same as an SGI Onyx/IR), 8x anisotropic filtering (often more important than FSAA), 1600x1024 (the native resolution of my DFP), 128-bit pixel depth (which the NV30 can do before scan out), and include very complex vertex and fragment (pixel) programs. With all of that, turn vsync on (as it should be) and have this entire combination run at 60 frames per second regardless of what is going on in the game at any and every given moment.
When we can do all of that, we are finished. :-)
The problem then becomes the content creators, who continue to push the envelope. GeForce FX is launching with a demo that has Jurassic Park/Toy Story quality rendering tied to real-time dynamics and feature-film quality animation. However, it is not quite to the level of Gollum in LOTR: The Two Towers. Imagine Doom 4 with 50 characters on the screen that all look like Yoda or Gollum. My point is that there is always room and applications for higher performance.
100 of these running around locked at 60 fps is the new goal: http://notendur.centrum.is/~czar/misc/gollum.jpg
Re:What's the big deal? (Score:2, Informative)
The point isn't getting Q3 to 400fps but new generation games over 100fps AVERAGE.
Re:What's the big deal? (Score:2)
Re:What's the big deal? (Score:2, Informative)
Isn't this more complicated than just Hz? (Score:2, Informative)
Remember, XBox, PS2, Gamecube and all the other consoles are designed to output to *INTERLACED* devices (i.e., your TV). So whilst they are outputting 50 times a second, they're only outputting half the scan lines each time they scan down the screen.
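A tiny sketch of what that means, using a hypothetical 480-line frame: each pass down the screen carries only every other scan line, and two such fields make one full frame.

    # One hypothetical full frame, represented as a list of scan lines.
    frame = ["scanline %d" % i for i in range(480)]

    even_field = frame[0::2]   # lines 0, 2, 4, ... sent on one pass down the screen
    odd_field = frame[1::2]    # lines 1, 3, 5, ... sent on the next pass

    # 240 lines go out per pass; a progressive mode would send all 480 at once.
    print(len(even_field), len(odd_field))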
It's my guess that monitor designers have a hard time calibrating their monitors for the best "non-flicker" effect. A designer never really knows what frequency the monitor is going to be run at. Certainly, if they could guarantee that a monitor is always going to be run at a specific rate, they could design the phosphor so that it only begins to fade (significantly) 1/74th of a second later. I imagine that would have far more effect on flickeriness (I like the sound of that word).
I'm guessing here, but I'd guess that to convince your brain that animation is fluid, you need around 30Hz or so (similar to TV and film). I imagine convincing your brain that something is flicker-free is a combination of frequency, phosphor fade time and all sorts of other magic.
Re:Isn't this more complicated than just Hz? (Score:3, Interesting)
You're absolutely spot-on about the phosphor persistence, however. Therefore you should always run your CRT monitor at its recommended or near-maximum refresh rate - I run my Sony G400 at 100Hz, which is nice because it allows QuickTime to sync my 25fps video up every fourth frame. The interlace aspect is wrong, too. Certainly, the PS2 generates 50 (or 59.94) full frames every second; the limitations of TV mean that it can only show half of each frame, but it renders them nonetheless.
As for the X-Box and GC, I believe they have progressive output modes in addition to normal 2:1 interlace, and can therefore give you full res frames with a suitable monitor.
Re:Isn't this more complicated than just Hz? (Score:3, Interesting)
That's because people like the Director of Cinematography know what they can and cannot shoot given those 24fps.
As a counter-example, try watching Pulp Fiction again, in the theater, when they first go into Jackrabbit Slims. Tarantino does this camera move from right to left where the flicker is HORRIBLE. Most of the time, they work hard to avoid problems like this - that's why you don't normally notice them.
Also, keep in mind that a TV signal has 2 half-frames per full refresh, so effectively they get 60hz.
Re:What's the big deal? (Score:2)
Re:What's the big deal? (Score:5, Insightful)
And even in a game, having FPS over say, 70 is useful because the frame rate will vary. When there's suddenly much action on the screen the frame rate will drop...
Re:What's the big deal? (Score:4, Informative)
Re:What's the big deal? (Score:2)
This myth needs to be put to rest already. It's trivially easy to tell the difference between 30 and 60fps. Period. It has always been this way. And 60fps is much nicer for very high-speed action games (it doesn't matter in other cases). Beyond 60fps, though, diminishing returns kick in very quickly.
And realize that this is a *benchmark*, not insistence that 300fps is better than 290.
Re:What's the big deal? (Score:2)
either way, you still know it's a picket fence, and which way the camera is panning, which, in my opinion, is really all that matters.
Re:What's the big deal? (Score:2)
I also love reading that "people can't tell the difference between more than 256 shades of grey". Grrrr...
Re:Good opportunity for a little question :) (Score:2)
Re:fps (Score:3, Interesting)
The content is on film at 24 fps and the projectors double-shutter the film to have it flash at 48 fps, reducing the flashing of yester-year.
BTW, it is only when the camera pans that you REALLY notice the 24 fps content.
I hate the fact that the new digital projection standards (and HDTV for that matter with 1080-24p) are designed around this ancient frame rate.