NVIDIA GeForce 2 Review
maniack writes: "NVIDIA lifted the ban on GeForce 2 benchmarks and specs at midnight, and AnandTech right away posted an article on the card. They put it up against all of its competitors, including the Viper 2, the Rage Fury MAXX, the Voodoo 4 and 5, and several flavors of the old GeForce, including a 64 MB DDR card. The 32 MB DDR GeForce 2 GTS ripped the competition apart in almost every benchmark, including the texture-heavy Q3 Quaver. The GeForce 2 was the top performer in both high-end and low-end systems. The article also explores the performance hit caused by full-scene anti-aliasing. Sharky Extreme also has a review, as does Hot Hardware."
video card hype (Score:2)
Does anyone know if the GeForce 2 will have the same Linux support the GeForce does? If so, I'm a little wary. But you have to give them points for performance...
Oh god, the chance to first post! :) (Score:3)
I'm just not sure.
Furthermore, we've come to the point where the extra horsepower only puts more on the screen, rather than adding any incredible clarity. I've seen some nice pictures, but it's still light years away from anything I would call beautiful.
Yet the biggest limiter isn't the card anymore, but the artistry. There just aren't enough artists, and it's not possible to put enough great artists on most teams to make something spectacular. That might be the next frontier. Even if we can get life-like quality, the game will still only be as good as the artist behind it.
(I wonder if I gave up the first post by now.)
Tops (Score:1)
Linux support? (Score:1)
And I know that "Linux isn't for gamers," but the whopping 10 fps I get with my TNT2 is too damn annoying.
Re:Oh god, the chance to first post! :) (Score:4)
There is one point to consider, however: the faster 3D cards become, the more easily inexperienced people can bring creative games that are actually playable to the market.
This is made possible by the recent introduction of on-card transform and lighting (which explains all the latest hoo-hah about graphics cards).
Quake is scaled down for speed purposes; the faster the cards become, the more complex the scenes that can be generated without loss of FPS.
The GeForce was the first with the GPU (tm?), and although it's not fully utilized yet, it still bodes quite well.
Things like FSAA and motion blur may cause things to enter the realm of 'beautiful' pretty soon.
My 2c
Re:Oh god, the chance to first post! :) (Score:1)
Re:Linux support? (Score:1)
Another point... the GeForce DDR is already quicker than the soon-to-be-released parallel Voodoo 3 (aka Voodoo 5). On performance/price considerations, there is no reason to go Voodoo.
The only place where the Voodoo does well is in Unreal, and possibly only because Unreal ships with new Voodoo cards. (-ponder-)
The Voodoos are the old generation; programs are already optimized for them. Nothing even uses +/- 80% of the GeForce's capabilities yet... I can't wait... (no, really, I can't!)
Re:Oh god, the chance to first post! :) (Score:1)
Yeah, I went from a 4MB ViRGE and a PowerVR PCX2 to a TNT2u - on an otherwise identical machine, the fps at 800x600 in Forsaken went from roughly 15 to around 100
Cheers,
Tim
Re:Oh god, the chance to first post! :) (Score:1)
Re:Oh god, the chance to first post! :) (Score:1)
Maybe not, but nobody's twisting your arm. I think it's great that consumer 3D graphics advance at this speed. Too bad the target platform for most developers is in the Voodoo2/TNT to Voodoo3/TNT2 area, as they need as large a customer base as possible.
There are still a lot of people with TNTs/Voodoo2s out there, and a GeForce2 GTS is certainly a great upgrade.
A penny for your thoughts.
Re:Oh god, the chance to first post! :) (Score:1)
I only have a 100/450 MHz PIII.
But I am running at 800x600 in Lightmap mode... try that. (Also try putting stencil shadows on, and we'll see it crawl...) I also had an Ultra...
These are things that the GeForce, and the GeForce 2, will do extremely well... hence this thread...
I only need 656 fps to rail well on 160 ping, but one needs a bit of resolution to rail without zooming...
Shrug... it's much of a muchness... maybe you want to discuss this offline (not to be too offtopic).
My R0,02
Another review (Score:2)
Fast Cards, Slow Machines (Score:1)
I'm still using a Pentium Pro 200 that I bought in August 1997. The only reason it's still a viable gaming machine is the Voodoo2 card I have in it.
So what does this mean? Not much, other than that I wish PCI versions were still available ;-)
--
Re:Oh god, the chance to first post! :) (Score:1)
Oops, sorry, make that 65 fps
Re:Another review (Score:2)
Nvidia linux support (Score:1)
New drivers... (Score:2)
But keep in mind, GeForce owners, that you can download the new drivers and get a considerable performance boost for your current GeForce card! =)
Re:G400 and onwards (Score:1)
Anyway, I'm waiting for Matrox's next card out of the pipe to see what they have; they have always been good with driver support for any OS, MS or *nix, from what I gather.
Re:Oh god, the chance to first post! :) (Score:3)
You know, all these new cards kick a lot of butt (Score:5)
As for picking a video card, I'd just look for the best possible support for your OS of choice. Though NVIDIA's performance and support under Windows are terrific, their Linux support is awful, so no matter how swank the GeForce is, it's out of the running for my systems. I still dual-boot, but I'd rather not.
ATI and 3dfx do a better job of supporting Windows/Linux/Mac, so I buy mainly their cards. I'm willing to trade off a few FPS playing Quake III under Windows for that Linux and Mac support. But hey, if you don't mind Windows and you live to frag, then this GeForce 2 sounds pretty darned sweet.
- -Josh Turiel
110 fps (Score:1)
-- Andrem
NVidia vs ATI vs 3DfX (Score:2)
nvidia wins out? (Score:2)
Re:Nvidia linux support (Score:3)
While it's nice they released drivers, they could at least follow standards.
And of course, with closed source drivers, if it crashes you can't tell why, nor can you possibly fix it.
Re:110 fps (Score:1)
Whether or not they will make use of this ability is the question. I sure would, simply because it would look good.
Does it really matter (Score:1)
Getting a new graphics card is worse than buying a new PC these days; a newer/faster/better model is announced just as the one that was previously announced finally ships. So you wait for the new cards, only to find that newer cards are around the bend, and so forth and so on.
Prices, power consumption are increasing (Score:3)
The other disturbing trend is that power consumption is getting much worse. Whatever happened to the "faster, smaller, less power" mantra? The Voodoo 5, for example, needs a hard drive power cable plugged into it. The GeForce 2 is in the same ballpark, if not worse. Yes folks, hardware engineers can do whatever they want without limits on power consumption or price. Now how about getting back on track?
Another Great Review Is Up (Score:3)
Sharky Extreme [sharkyextreme.com] has a great review up too, also technical in nature. I read it, and as I recall, it was about 30 pages, pretty in-depth.
One of the biggest points is that current x86 CPUs are not fast enough to keep the graphics card fed at low resolutions. When tested with a 1 GHz Athlon and an 866 MHz P3, the card doesn't fare much better at low res than the original GeForce does. It is essentially a barrier for games, created by release dates :-)
Also of note, the business practices of NVIDIA are scrutinized, such as their 6-month release intervals, which seem to be resulting in their being king of the hill rather frequently.
Supposedly, the ATI Radeon MAXX will be the only thing remotely close to the NV15 (GeForce2 GTS). However, the only thing expected to defeat the GeForce2 (NV15) will be the NV20.
For those of you who haven't had the time to read the reviews: the NV15 is coming out VERY soon, and the 1 GHz Athlon can't keep up with it, as mentioned. At the same time there will be 128 MB versions of the original GeForce, geared towards workstations. Soon after, there will be 64 MB versions of the GeForce2. Shortly after that, we will see the mobile GeForce, the NV11, a 3D card for laptops. Six months from now, NVIDIA will introduce us to the NV20.
IMO, things are shaping up very nicely in the graphics arena. We are not just seeing more frames in our games, but many additional features, letting everyone from hardcore gamers running at 640x480 in low detail to those who desire 32-bit quality and high detail be satisfied with one card, regardless of the company producing it.
Re:nvidia wins out? (Score:1)
Don't you feel like a complete moron (Score:1)
Re:Prices, power consumption are increasing (Score:2)
Another benefit of the die shrink is that the GeForce 2 GTS consumes close to half the power of the original GeForce, putting it at 8-9 W versus 16 W for the GeForce.
Need I say more...?
-
Re:Fast Cards, Slow Machines (Score:1)
Anandtech's article indicated that the GeForce2 would come out in a PCI version ('cause NVIDIA can't stand the thought of 3dfx having a market to itself).
From AnandTech:
"And taking a page from 3dfx's book to success, NVIDIA will be offering the GeForce 2 GTS in both AGP and PCI configurations."
-
How much video power do you need? (Score:2)
I had this problem a few months back. I used to have an ATI Rage Fury w/ 32RAM. It did the job pretty well, though I never bothered to benchmark it or check my FPS.
Anyway, I kept reading how hot the TNT2 cards were, and everyone was telling me that they blew everything away. So, being the hardware geek I am, I went out and dropped $250 for one. Did it make a difference? Not an appreciable one that I could see. HL Team Fortress Classic seemed to run a little smoother, but for the most part, I couldn't tell any difference.
I managed to hold off on the GeForce cards, but now that the new Voodoos and GeForces are coming out, the temptation will be pretty great. Still, it seems to me that the gaming industry needs to be careful not to sacrifice substance for style. Take Q3 and Unreal Tournament, for example. Two of the best-looking games ever, but is the gameplay really that impressive? Yeah, single-player mode is not the focus of these games, as they're built for online play, but it's mostly just the "run and frag" variety, with a little Capture the Flag thrown in for good measure. I played the demos for both of these, and by the time the actual games came out, I was back to playing Starcraft and Alpha Centauri. Let's hope that game designers don't forget to make good games in their quest for the most impressive-looking environment.
Re:MS X-Box (Score:1)
You think wrong.
They are getting something brand new and MS have paid them $200 million in advance for R&D.
A penny for your thoughts.
Re:MS X-Box (Score:1)
Re:Prices, power consumption are increasing (Score:5)
I personally have a TNT 1 card (I bought it because it was good enough, and cost $70) in my dual 500MHz system with 256MB of RAM. I get around 20fps, 8fps if anything interesting happens. Oh, yeah, that's at 512x384. I guess I need to get some time away from school to try those new drivers, but I'm seriously considering going non-NVidia if I buy another card.
What's so bad about 6-month release intervals? (Score:2)
The driver issue - That's a different story. There's no reason not to go open-source. Maybe their hardware does some really neat stuff that might be revealed by open-source drivers. Well, that's what hardware patents are for. If they haven't patented whatever is so damn revolutionary about their card that they can't release source, then they're stupid.
Whatever secrets they have, they'll become obsolete soon anyway. In this market, the value of IP degrades pretty quickly.
Re:nvidia wins out? (Score:1)
As the proud owner of a laptop running the prestigious Trident CyberShite9397 3D-Decelerator, I can definitely say that they pose serious competition in the market. Microsoft Hellbender never looked so good.
The true heart-breaker is watching this pathetic chip struggle with Imperium Galactica II (like watching Peewee Herman trying the Ironman contest). And yes, I do intend to upgrade, once some enlightened notebook maker brings out a model with a decent 3D-chip.
One question - Nvidia are supposed to be releasing a mobile version, the NV11. Anyone have any info on this, or on the NV11 itself?
Drivers! (Score:1)
Mike
Unanonymous Coward
No one will ever need MORE than 640k?? (Score:2)
Today, that might be true. But as another poster pointed out, faster cards allow less 'optimization experienced' coders to make their ideas workable.
Maybe with built in T&L, explicit support for 3D, and the slew of 'overkill' functionality, we will see some truly remarkable new ideas develop.
Sure, the human eye can't see much difference in the ultra-high frame rates. But, when you have a whole lot of 3D shapes moving independently on the screen, mutating as they go, the lower end cards will start to chop, while the top-dogs will run smooth.
I certainly don't want my Lawnmower Man experience screwed up by BitBlt redraws. And that's exactly what such high-end hardware will make possible (or at least more likely): fully immersive VR with complete freedom of motion. Granted, on a screen it will always look crappy; the VR goggles (or whatever) are just the other side of the coin to the very shiny graphics card.
video specs (Score:1)
One thing these video card reviews never seem to talk about much is how the games look (which video cards produce a better picture)... Are they all the same?
Second, they all have their own fancy thing to make them different (anti-aliasing, T&L, bump mapping). Are these features being taken advantage of by OpenGL, or are they just useless add-ons unless games add specific support for them?
Also, how can you get 100 fps on a screen running at 85 Hz?
I'll stick to my G400 Max for a while (Score:2)
Re:Low-end? (Score:1)
I think I speak for all us 486-ownin' people when I say
=)
Re:They haven't really thought it through (Score:1)
I like the NVIDIA Geforce series (Score:1)
Re:video specs (Score:1)
I don't know for sure about bump mapping, but I'm guessing yes. Anti-aliasing and T&L have been there for ages.
Also, how can you get 100 fps on a screen running at 85 Hz?
You can't. But the point is to have enough overkill that you never drop _below_ the screen refresh rate, no matter how much the action heats up. Also, if the card can crank out 100 fps on today's games, it should be somewhat future-proof.
A penny for your thoughts.
At least /. is not favouring Tom's Hardware. :) (Score:1)
Re:video card hype (Score:1)
Keeping up the id's (Score:2)
2 points.
A) We will continue to need fast 3D for quite a while. What you said, all in the name of realism: I want my perspective-correct shadows, I want my realistic fog, my Bezier and NURBS surfaces, etc.
The world is just too complex, and today we don't have the horsepower to accurately model it, so we approximate it: badly.
B) Thankfully, photo-realism isn't the be-all and end-all. Real-time cartoon rendering is starting to pick up. Check out some of the latest issues of Game Developer.
Cheers
Re:Another Great Review Is Up (Score:2)
letting everyone from hardcore gamers running at 640x480 in low detail to those who desire 32-bit quality and high detail...
This past weekend, I had the opportunity to see what kind of difference three generations of processor and video card can mean side-by-side. At one end is my box (P3 @ 560, DDR GeForce), in the middle is a P3 500 with a Voodoo3 3000, and at the other end is a K6-2 450 with the same Voodoo card. Setting up Q3:A under Win98 on all three machines for a LAN party gave me a chance to see just how each system would run with the same settings.
My results (YMMV):
Both Voodoo machines ran the default settings (800x600) at around 42 FPS, and tinkering with textures and detail levels could move this between 38 and 45 on both systems. Clearly, the Voodoo was overmatched by even the K6-2. The GeForce machine gave 66 FPS.
Turning on all the bells and whistles (everything cranked up, 32 bit colors and textures, etc, etc), the GeForce still put out 59FPS at 1280x1024. The Voodoo machines returned 37 and 33 FPS, respectively.
Am I going to upgrade from my Elsa Erazor X2 to a GTS or Voodoo 4/5 or Radeon? Nope. Not until Q1 2001, anyways (or I land a job that pays well enough to support my technowhoring *g*). Would the other guys? Probably when the NV20 is released and the NV15 takes a price cut.
Rafe
V^^^^V
Framerates (Score:2)
Yes, it's curve of decreasing returns...
The jump from 10 fps to 30 fps is much more eye-pleasing than a jump from 60 fps to 80 fps.
Anything over 60 (to 72) and you won't be able to tell the difference.
HOWEVER, we do need obscene frame rates, so we can apply full scene anti-aliasing.
Think of the difference texture filtering makes: linear (usually software) versus bilinear filtering.
Basically, instead of needing 4x the resolution with linear filtering, we can achieve the look of a MUCH higher resolution via bilinear filtering.
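To make that concrete, here's a minimal sketch in Python (my own illustration, with a made-up 3x3 texture; the card does this in hardware) of point sampling versus bilinear filtering:

```python
# Minimal sketch (illustrative only): point sampling vs. bilinear
# filtering of a tiny hypothetical grayscale texture. Coordinates
# are in texel space, e.g. u = 1.5 is halfway between texels 1 and 2.

tex = [
    [0.0, 0.2, 0.4],
    [0.2, 0.5, 0.7],
    [0.4, 0.7, 1.0],
]

def sample_point(tex, u, v):
    # Snap to the nearest texel: cheap, but blocky up close.
    return tex[int(v + 0.5)][int(u + 0.5)]

def sample_bilinear(tex, u, v):
    # Blend the four surrounding texels, weighted by distance.
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x0 + 1] * fx
    bot = tex[y0 + 1][x0] * (1 - fx) + tex[y0 + 1][x0 + 1] * fx
    return top * (1 - fy) + bot * fy

print(sample_point(tex, 1.5, 1.5))     # 1.0   (jumps straight to one texel)
print(sample_bilinear(tex, 1.5, 1.5))  # 0.725 (smooth blend of four texels)
```

The blended sample changes smoothly as u and v move, which is why bilinear looks like far more texture resolution than is actually there.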
Cheers
Re:MS X-Box (Score:1)
This should be ironic (Score:3)
Compare this to about a year and a half ago, when the TNT came out. "Sure, they support 32-bit color and higher texture sizes, but we have more FPS! No true gamer cares about how good their games look, they just want more FPS!"
The sad thing is, I think 3dfx knew this would happen--that's why they've been pulling away from emphasizing the performance of the Voodoo 5, and instead hyping the full-screen anti-aliasing.
On a side note, it now seems that the Voodoo 4 (the single VSA-100 chip) has no hope of seeing the light of the retail market. Some OEMs _might_ pick it up, but considering how it stacks up against the Voodoo 5 5500, it might be a bit of an embarrassment to release the Voodoo 4.
~=Keelor
Re:Wrong! A libertarian perspective (Score:1)
Read the Declaration of Independence.
We, therefore, the Representatives of the united States of America.
It's uSA (lowercase u, since united is an adjective), not USians.
Re:Linux support? (Score:1)
Re:Prices, power consumption are increasing (Score:3)
~=Keelor
Re:110 fps (Score:1)
As much as your favorite game requires (Score:2)
In essence, this boils down to a matter of taste. You seem to be saying that all of the games which really require that level of graphic support aren't really your cup of tea, but for tens of thousands of people, it is precisely their cup of tea.
On the RTS front, while it was certainly a playable-as-hell game, I heard plenty of people complain that their brand new, whopping fast machine was limited to 800x600 in Starcraft, just because Starcraft couldn't go any higher. Myth was the first (fairly) recent game to really start to reverse that trend: Homeworld tried to completely stand it on its ear.
As magnificent a game as I thought Homeworld was, I really did feel that it was limited to some degree by the constraints of the technology: many strike craft + many ion cannons = big framerate losses.
So, in other words, pay attention: I'm predicting that newer RTS games will benefit from better tech more than their predecessors did.
Re:MS X-Box (Score:1)
Re:Prices, power consumption are increasing (Score:2)
Incidentally, 3dfx claims that this is because motherboard makers skimp on voltage regulator quality, so the motherboard apparently doesn't supply the right voltage. To this I say: Yeah, if you draw current over specifications, the voltage will drop, of course.
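For what it's worth, here's a back-of-the-envelope sketch of that effect; the rail voltage, impedance, and current figures are invented for illustration, not measured specs:

```python
# Back-of-the-envelope: why drawing current over spec sags the rail.
# All numbers here are assumptions for illustration only.

v_nominal = 3.3   # nominal supply rail, volts
r_source = 0.05   # assumed effective source impedance, ohms

for amps in (2.0, 6.0, 10.0):
    droop = amps * r_source  # Ohm's law: V = I * R
    print(f"{amps:4.1f} A draw -> rail sags to {v_nominal - droop:.2f} V")
```

The regulator can only hold the rail steady up to its rated current; past that, every extra amp drops the voltage further, regardless of whose fault it is.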
Re:You know, all these new cards kick a lot of but (Score:1)
Or, even better... U2W SCSI 8^) Still a sight better than ATA/66 for throughput and CPU usage. The ATA bus structure just isn't as flexible (2 drives/channel?!). Plus, easily attachable external devices.
It does cost more, but the performance boost is well worth it.
Re:Nvidia linux support (Score:1)
Hey, let's give the company a chance. This time last week, I wasn't sure we'd see drivers at all until July -- beta or not. Kudos to Jim (et al.) at NVIDIA for working hard to get yesterday's drivers out.
I expect the company will be doing more for and with the free software community... just give them a little time to adjust.
--
Re:Don't you feel like a complete moron (Score:1)
Yes. I am a moron, if that's what you were getting at.
How silly! (Score:2)
Beta drivers are beta because they haven't been qualified, certified, tested, whatever. Heck, even non-beta drivers have bugs and problems! So I'd think if NVIDIA had beta drivers for Linux, that by the very definition of beta, they haven't tested it thoroughly enough to guarantee anything under Linux.
Which goes back to poor Linux support, given how many generations of cards have come and gone under Linux now...
-AS
Re:video specs (Score:1)
32bpp? what are you smoking? (Score:1)
Also, I would be more than curious to see what kind of horsepower he has under the hood of that machine. I have a roommate who just purchased an Athlon 700 (coming from a K6-2 450), and that alone was worth way more than if he had just bought the next best card out there (he runs a V3 3000, as do I... Unreal Tournament at 1600x1200, even at 16bpp, never looked so good...).
the source.. (Score:2)
Actually, from what I understand, they can't release the code due to NDAs with other companies whose technology they use. Or something along those lines. Wonder if that'll change anytime soon. Prolly not.
Re:You know, all these new cards kick a lot of but (Score:2)
Linux is more than just x86 chips. As far as I know, the NVIDIA cards won't work on the PPC flavor or the Alpha flavor of Linux. I don't want to leave out our *BSD friends either. It can easily be argued that the games available for Linux right now are pretty much all x86 and closed source. The fact remains, though, that if any of the other platforms that Linux runs on, or the *BSD people, want to do anything with these cards, they are SOL. Would it really give competitors an advantage to release the register programming interface for their cards?
Molog
So Linus, what are we doing tonight?
Don't forget though... (Score:2)
The biggest problem, which no one has actually mentioned, is that nVidia is working so hard on getting new silicon out the door every six months that they just don't have the people to work on the drivers. This leaves people with current GeForce cards a little annoyed after shelling out over 200 quid for the card. Agreed, the card performs well and I am happy with it, but it has functionality that isn't even implemented in the drivers yet! Why do hardware houses insist on releasing their products before the drivers can exploit the hardware built into the card?
My next card will probably not be another nVidia, unless, of course, their support and drivers improve drastically. This is exactly the problem ATI faced; luckily they had a large OEM base that supported them.
To make my point more valid: I used to work for the company behind the SuperFX chip for the Super Nintendo (used in Star Fox and a few others), which progressed onto a core for the old Cyrix chip (the one with the graphics built into the chip). I know that when new silicon is being designed, the old drivers get nowhere near as much support, because people are working on the software for the new silicon.
c0rarc
Re:How silly! (Score:2)
Are you just purposely looking for any excuse you can find to bash nVidia? Perhaps looking to gain a little karma in the meantime? I have NEVER before heard ANYONE say that because something is beta, it could not possibly be stable. You know perfectly well that beta software can be very usable. Obviously, they are calling it beta because it is the first release, and you should never call your first release anything other than beta.
If that isn't enough for you, I have a GeForce 256 and I am happily getting the same framerates in Quake 3 that I normally get in Windoze. Hell, even my 2D performance has been doubled by the new drivers. And guess what? I have not had a crash yet.
------
Pot. Kettle. Black. (Score:2)
And at the same time, nVidia PR was saying, "Sure, the V3 has more FPS than the TNT, but we have 32-bit color and higher texture sizes!" When nVidia's T&L was announced, 3dfx said it would be a while before T&L was properly supported. When 3dfx announced the T-buffer, nVidia claimed that gamers would prefer the T&L speed boost to the prettier FSAA (quoth an nVidia PR rep, who asked in a public interview whether 3dfx's VSA-100 stood for "Very Slow Architecture").
These two companies have been bashing each other constantly. 3dfx uses their "PR specialist" Bubba (his real name) Wolford (sp?), while nVidia's attack dog is Derek Perez. Open your eyes: *all* corporate PR divisions are full of it; some are just a little better at convincing you of the contrary (as nVidia seems to have done to you).
Re:video specs (Score:1)
Well, sir, I disagree.
Let's say you're running at 100 Hz and by disabling v-sync you get a steady 200 fps. That means you'll get halves of two different frames in each screen refresh: 100/2 + 100/2 = 100 whole frames/sec.
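Here's a toy model of that claim in Python (idealized timing, my own sketch): with v-sync off, each half of the scanout shows whatever frame the renderer finished most recently.

```python
# Toy model (idealized, illustrative): renderer at 200 fps, monitor
# refreshing at 100 Hz, v-sync off. The scanout shows whatever frame
# was most recently finished, so the two halves of each refresh come
# from different rendered frames (hence tearing).

frame_ms = 5      # one rendered frame every 5 ms (200 fps)
refresh_ms = 10   # one scanout every 10 ms (100 Hz)

for refresh in range(3):
    t_top = refresh * refresh_ms        # scanout starts (top of screen)
    t_bottom = t_top + refresh_ms // 2  # halfway down the screen
    print(f"refresh {refresh}: top half = frame {t_top // frame_ms}, "
          f"bottom half = frame {t_bottom // frame_ms}")
# -> each refresh stitches together two consecutive frames, so the
#    100 Hz display still conveys 200 fps worth of updates
```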
A penny for your thoughts.
Re:Oh god, the chance to first post! :) (Score:1)
Anyway, my monitor runs at 85 Hz. It's pretty clear.
Re:As much as your favorite game requires (Score:1)
Well, yes, I'll admit that First Person Shooters, typically the graphic kings of the game world, aren't my favorite type of game. There is a certain thrill in running around Q3 like a madman, killing at will. But for the most part, I prefer a more strategy-filled game.
Starcraft does indeed have limited graphics, but you've got to remember that the game is approaching 3 years old. The fact that the game is still actively played by literally thousands of people, both on Battle.net and by themselves, is, I believe, testament to its superior gameplay.
Anyway, the point I was trying to make was that game designers lately seem preoccupied with making their games prettier and flashier, rather than better. Take Force Commander, for example. LucasArts tried to put the RTS genre in a 3D environment, and failed miserably. The game is filled with 3D models and backgrounds, and you can rotate your camera to every conceivable angle. These features, however, make it almost impossible to effectively issue orders to your units and follow the action. Camera control is a mess, the interface is clunky, and the actual gameplay unexciting. I've also been playing Star Trek: Armada, and while it is a far better game than Force Commander, it also suffers from the same bad camera angles and tired, repetitive gameplay.
I've not had a chance to download the Earth 2150 demo to try it out; hopefully it will be good. I've also heard good things about Homeworld and the upcoming Halo. And I guess bad games just come with the territory: there wouldn't be good games without bad ones. Still, it's frustrating to buy a game, get all the cool and neat graphic tricks in the world, and absolutely no play value.
Re:Nvidia linux support (Score:1)
I don't entirely agree with your DRI comment, either... but that's ok 8^)
Re:Oh god, the chance to first post! :) (Score:1)
Any decent monitor will support 110 Hz refresh or better at 640x480. Check out this low-end ViewSonic monitor: G655 15" [viewsonic.com]. If you want something larger, there is this very nice PS790 19" [viewsonic.com] (I have one of these) and the totally outrageous P817 21" [viewsonic.com]. Both support better than 110 Hz at 1024x768, and don't even list the refresh rate at 640x480. Of course, if you insist on getting that $200 19" monitor, you get what you pay for. There is much more to monitors than just the numbers. Better, more expensive monitors last longer and, more importantly, look better.
TV and Film have Motion Blur. (Score:3)
Newer video cards ARE beginning to incorporate motion blur, which will help enormously. But it is cheaper to simply up the framerate, at least up to a certain point. (Which I don't think we have really reached) Motion blurring sounds like a very computationally intensive thing to do.
So there are reasons to go to 100fps-- if the frames are clear, it will take many more of them to approximate the effect that motion-blurred TV or film produce at 30fps.
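As a rough sketch of the brute-force approach (my own illustration, not how any shipping card actually implements it), you can fake motion blur by rendering several sub-frames per displayed frame and averaging them, which is exactly why it's so computationally expensive:

```python
# Brute-force motion blur by temporal supersampling (illustrative
# only). Render N sub-frames per displayed frame and average them;
# the renderer effectively has to run N times faster.

def render(step):
    # Stand-in renderer: a bright "pixel" that advances one position
    # per sub-frame step along a 4-pixel scanline.
    return [1.0 if i == step % 4 else 0.0 for i in range(4)]

def motion_blurred(start_step, n=4):
    # Average n consecutive sub-frames into one displayed frame.
    subs = [render(start_step + k) for k in range(n)]
    return [sum(px) / n for px in zip(*subs)]

print(render(0))          # sharp:   [1.0, 0.0, 0.0, 0.0]
print(motion_blurred(0))  # smeared: [0.25, 0.25, 0.25, 0.25]
```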
I like my GeForce (Score:1)
Now to install some cool games and have some phun with it!
now that was a good troll (Score:1)
The sales of these cards is strictly a matter of business. What people do with them afterwards, and what they decide to do because of them is strictly a matter for the individual, rather than the "state" to decide. Any other course of action will simply be another step along the way to the iron fist of totalitarian government.
By the same reasoning, people should be able to buy thermonuclear weapons, deadly biological viruses, nerve gas, and gun add-ons that let people shoot down police helicopters (obscure Simpsons reference). Sure, whatever.
Re:They haven't really thought it through (Score:1)
Death toll caused by 3D Graphics Acceleration: in zero digits.
Case closed.
/.
but... but.. look at the screenshots.. (Score:1)
WXP's Isle of Morg [sharkyextreme.com]
Planet Moon's Giants: Citizen Kabuto [sharkyextreme.com]
Computer Artworks' Evolva [sharkyextreme.com]
i'm not quite finished wetting myself.
My kingdom for good drivers! (Score:4)
The one thing that is most important to me, however, is support. I don't mean telephone, but platform. I recently got burnt buying a Viper 2000 because they refuse to make NT drivers with any sort of hardware acceleration. Linux runs into the same problem. I was sold by their web site when I was deciding on the card for my new computer; their web site turns out to be a flat-out lie. And if there's one S3 developer out there reading this, I have a size 12 boot with your name all over it.
So now I'm incredibly leery of these game cards coming out with all these whiz-bang features that will probably only be developed for WinBlows 2000. I need drivers for NT4 because that's where the software is these days, and I need Linux drivers because that's where the graphics software is going and where I create most of my custom software. So when a company comes out saying they're going to support this and that, but doesn't have the drivers to back it up, I'm just going to wait.
This summer, I will probably just buy a GeForce I. Because now they have released the drivers for it under Linux (it isn't open source but I don't really care) and they've always had stellar NT support. I know people here like their drivers open source and their cards to be screaming fast, but I just want one that works as advertised and fits into my meager budget.
Taos
so how long befor 3d web? (Score:2)
i know there are tons of arguments against a 3d interface, but there were tons against 3d games when Ultima Underworld came out! if i can move around with absolutely no effort in Q3A or UT, why can't a Q3A level be a web site?? sure it's faster to see it all in one page, but if slashdot looked like a Q3A level (with news posted on floating billboards and sections looking like houses, buildings, huts and spaceships) would you log on? i would...
ah, but there's the matter of download speed. well, if every other person has downloaded flash plugins and realplayer, and is now actually using them, couldn't they download a set of very compressed textures, so that when you log on all you download is the wireframe file and the changing images? wouldn't that be comparable to downloading html and changing images???? c'mon programming gods, it can't be that hard, right? besides, if all you have to download is a 3d browser, which already amounts to many megs, you could ship all the basic textures with it, couldn't you?
what do you think, people?
========================
Re:I like my GeForce (Score:1)
Re:Linux support? (Score:1)
NVidia will continue to have crappy performance and unstable drivers in X until they shape up and get with the program. Whether or not they theoretically perform well under some Windows benchmark doesn't interest me in the least.
Re:Wrong! A libertarian perspective (Score:1)
First off: Boy, do I want a GeForce 2 to replace my TNT1! OK, now for the off-topic content...
"Pro-gun" goes not equal "pro-violence." That's an insulting bit of intellectualy laziness.
The FACTS of life are somewhat UNPLEASANT. The FACT is that some day, I may need to use violence to protect myself or my family. A gun is the best tool for that. Do I look FORWARD to that? No. Do I prepare for it? Yes. Isn't it wise to set yourself up to WIN a confrontation that could othewise result in your death?
(If you think that there is never a call for violence, that there is always a peaceful solution, then you don't want to argue about guns, you want to argue about violence and the right to self defense, an altogether different topic which guns are only a facet of.)
And the above poster has it right when he says that guns are an important part of protecting our freedom. I'm not saying we need an armed revolt now -- but can anyone guarantee that we won't in 100 years? 500?
Ultimate authority flows from the barrel of a gun. If the people don't have some "authority" of their own someday they'll lose big to an invasive government.
Re:Low-end? (Score:1)
Bigger, Better, Faster, More!!!!!!! Need More!!!!! (Score:1)
Need more (insert hardware here i.e. RAM, MHz, etc. etc.)!!!.
flatrabbit,
peripheral visionary
Re:Prices, power consumption are increasing (Score:1)
I also have a TNT 1 card in my single 500MHz system with 128MB of RAM, and I can tell you that this is entirely a driver issue. I average about 50fps in Q3, with a range of between 25 and 100 depending on what's going on. I'd suggest this card for hardcore windows gamers, but for a linux or dual boot system, the G400 is the best choice, and a card that will perform well under either OS.
----
Dave
Purity Of Essence
Re:Prices, power consumption are increasing (Score:1)
BTW, thanks for the cool info on the power consumption.
Christopher
Re:Linux support? (Score:1)
GeForce 2 already supported in Linux! (Score:2)
At least, that's what Nick Triantos of nVidia, the guy responsible for the Linux drivers, just told me. Apparently, the drivers released yesterday have full support for the GF2. For once, we appear to have drivers for a new product before the Windoze people! :)
If that thing really is in stores on friday... hell yeah...
------
Fix the damn drivers... SMP is still unstable (Score:3)
On top of that, Leadtek won't supply the Control Panel display settings stuff for NT that they have under Win98, so my gamma settings are too low (better than under Linux though where 3D games are unusable due to the darkness).
Come on guys, fix the drivers. The fastest card in the world isn't much good if I can't use it.
Re:Fast Cards, Slow Machines (Score:2)
Re:This should be ironic (Score:2)
As for Glide: seeing as Glide is now open source, I don't think 3dfx can really claim to have that up on nVidia.
~=Keelor
Re:so how long befor 3d web? (Score:2)
Re:Don't forget though... (Score:2)
nVidia and customers (Score:2)
The chip itself looks impressive, but then again, so do the Radeon256 and the VSA-100. ATi and nVidia have bump mapping, which is a really rad-looking effect if it's used in a game, while the VSA-100 has the T-buffer, which lets you do all sorts of video effects. I'm not sure yet which card is going to get my money. It will probably be the one that supports the platforms I'm cruising around on.
Re:Oh god, the chance to first post! :) (Score:2)
This is stupid... (Score:2)
Supported drivers are the goal, though some take that to mean binary, some take that to mean source, some take that to mean phone call support. Whatever.
So being satisfied with beta is pointless. Beta is only temporary, transitory, towards final release.
We, as consumers, should not be expecting beta stuff, in general, yet that is what we have with Windows, Netscape, and a whole raft of other software and products.
-AS