Graphics Software

NVIDIA GeForce 2 Review 129

maniack writes: "NVIDIA lifted the ban on GeForce 2 benchmarks and specs at midnight, and Anandtech right away posted an article on the card. They put it up against all of its competitors, including the Viper 2, the Rage Fury MAXX, the Voodoo 4 and 5, and several flavors of the old GeForce, including a 64 MB DDR card. The 32 MB DDR GeForce 2 GTS ripped the competition apart in almost every benchmark, including the texture-heavy Q3 Quaver. The GeForce 2 was the top performer in both high-end and low-end systems. The article also explores the performance hit caused by full scene anti-aliasing. Sharky Extreme also has a review, as does Hot Hardware. "
  • i wouldn't be surprised if now the average Joe PC Buyer had absolutely no idea which video card to buy. wow-the Radeon, the voodoo5, and the geforce 2 all in one week. i'm kind of overwhelmed with the power. hopefully someday the video card market slows down...but who am i kidding? i'm not even sure if i want it to slow down.
    does anyone know if the geforce 2 will have the same linux support the geforce does? if so, i'm a little wary. but you have to give them points for performance...
  • by Eneff ( 96967 ) on Wednesday April 26, 2000 @02:05AM (#1109940)
    Now seriously, I do believe our obsession with video cards has been a bit extreme lately. Is it great that there are cards that blow the socks off anything out even a year before? Sure. Is it worth 300 dollars a year for an extra few FPS in Quake 3?

    I'm just not sure.

    Furthermore, we've come to the point where the extra rates only put more on the screen, rather than adding incredible clarity. I've seen some nice pictures, but it's still light years away from anything I would call beautiful.

    Yet the biggest limiting factor isn't the card anymore, but the artistry. There just aren't enough artists, and it's not possible to put enough great artists on most teams to make something spectacular. That might be the next frontier. Even if we can get life-like quality, the game will still only be as good as the artist behind it.

    (I wonder if I gave up the first post by now. :)
  • I'm in the market for a new video card and I run linux and would like to continue to do so. Now then, if nVIDIA fails to release adequate linux drivers for the GeFORCE2 then this will be no competition, Voodoo5 all the way. Either way i'll wait about 2-3 weeks after the Voodoo5 is on the shelves before i make a decision.

    And I know that "Linux isn't for gamers" but the whopping 10fps I get with my TNT2 is too damn annoying.
  • by Domini ( 103836 ) on Wednesday April 26, 2000 @02:18AM (#1109943) Journal
    Agreed, Quake is already running at 110 fps on my PC (and don't tell me 30 fps is good enough... that has been discussed in the past)

    There is the following point, however: the faster 3D cards become, the more easily less experienced people can bring creative games that are actually playable to the market.

    This is made possible by the recent (explaining all the latest hoo-hah about Gfx cards) introduction of on-card transform and lighting.

    Quake is scaled down for speed purposes; the faster the cards become, the more complex the scenes that can be generated without loss of FPS.

    GeForce was the first with the GPU (tm?) and although it's not fully utilized yet, it still bodes quite well.

    Things like FSAA and Motion Blur may cause things to enter the realm of 'beautiful' pretty soon.

    My 2c
  • I agree that it is not worth an extra $300 for an extra 5-10 fps. (I can't tell the difference.) But kudos to nVidia for pushing the limits higher all the time, or we'd be stuck playing mode-x games with VGA cards! I'd get the NV20 or NV25 or whatever, since I got a GeForce. It was a big jump from the Riva128.
  • The GeForce 1, although not open-source, has adequate drivers already. I have no reason to believe that GeForce 2 won't... chances are that it may even work with the same drivers....

    Another point... The GeForce DDR is already quicker than the soon-to-be-released parallel Voodoo 3 (aka Voodoo 5). There is no reason to go Voodoo on performance/price considerations.

    The only place where the Voodoo does well is in Unreal, and possibly only because Unreal gets shipped with new Voodoo cards. (-ponder-)

    The Voodoos are old generation; programs are already optimised for them. Nothing even uses +/- 80% of the GeForce yet... I can't wait... (no, really, I can't!)

  • ...since I got GeForce. It was a big jump from Riva128.

    Yeah, I went from a 4MB ViRGE and a PowerVR PCX2 to a TNT2u - on an otherwise identical machine, the fps at 800x600 in Forsaken went from roughly 15 to around 100 :-)

    Cheers,

    Tim
  • by Anonymous Coward
    Just 110fps? Hey, if you want to seriously play Q3A, turn it back down to 640x480, 16-bit color, as little detail as possible and turn off the 3D items. I'm getting 158fps on my PIII-550/UltraTNT2 machine.
  • Sure. Is it worth 300 dollars a year for an extra few FPS in Quake 3?

    Maybe not, but nobody's twisting your arm. I think it's great that consumer 3d-graphics advances at this speed. Too bad the target platform for most developers is in the Voodoo2/TNT to Voodoo3/TNT2 area, as they need as large a customer base as possible.

    There are still a lot of people with TNT/Voodoo2's out there, and a GeForce2 GTS is certainly a great upgrade.


    A penny for your thoughts.
  • Not bad...

    I only have a 100/450 MHz PIII.

    But I am running in 800x600 in Lightmap mode... try that. (Also try putting stencil shadows on, and we'll see it crawl...) I also had an Ultra...

    These are things that the GeForce and the GeForce 2 will do extremely well... hence this thread...

    I only need 656 fps to rail well on 160 ping, but one needs a bit of resolution to rail without zooming...

    Shrug... it's much of a muchness... maybe you want to discuss this offline (not to be too offtopic)

    My R 0 , 2/100
  • Gamespot has a review also, for the less technically inclined.
  • I'm certainly impressed by the numbers the new boards are producing, but I just want to express my stance on this:

    I'm still using a Pentium Pro 200 that I bought in August 1997. The only reason why it's still a viable gaming machine is because of the Voodoo2 card I have in it.

    So what should this mean? Not much other than I only wish PCI versions were still available ;-)

    --

  • >> I only need 656 fps to rail well on 160 ping,

    Oops, sorry, make that 65 fps
  • Sorry, link is here [gamespot.com], my bad
  • You are all saying how bad Nvidia support is... Well, they just released new drivers yesterday, and I heard they were comparable to the Windows ones... So hopefully their new products will be supported that well also.
  • Fortunately nVIDIA has moved to a unified driver architecture, which makes it much easier to find the right driver and to release new drivers. Hopefully this will help in getting drivers for other OSes out faster.

    But keep in mind GeForce owners, that you can download the new drivers and get a considerable performance boost for your current GeForce card! =)

  • I agree with this. The dual head gaming machine from Penguin Computing that was at spring Comdex was very nice looking, and with XFree86 4.0's ability to have more than one video card, imagine dual dual-head cards for 4 monitors. Or would that be quad head then? :)

    Anyway, I'm waiting for Matrox's next card out the pipe to see what they have, they have always been good with driver support for any OS, MS or *nix from what I gather.

  • by ChadM ( 102789 ) on Wednesday April 26, 2000 @02:44AM (#1109957)
    My friend just got a 21 inch monitor yesterday and we loaded Q3 at 1152x864, 32bpp. It still looks killer, but the main problem with a Voodoo3 (which he has) is how high the res can be set. He is currently running it at 1280x1024 but is waiting for a GeForce2 (probably will buy one on its first day out) so we can run it at 1600x1200 and possibly higher if supported (yes, it's that big. I drooled). If I could see Quake 3 and Unreal Tournament at 1600x1200/32bpp with full screen anti-aliasing and without lag, I might just have to steal his monitor. My point being that it's not necessarily about frames per second as much as supporting a higher resolution and still having enough horsepower to render the whole thing.
  • by jht ( 5006 ) on Wednesday April 26, 2000 @02:45AM (#1109958) Homepage Journal
    So Nvidia will be king of the hill again for now, at least until the next announcement cycle. Maybe it's me, but I think that video cards have gotten so fast that at this point even the lowest-end cards (anything more potent than the integrated video in the i810 chipset) have more than enough horsepower to handle any user's typical 3D needs (including very enjoyable gaming). Right now the money for a supreme video card is arguably better spent on tons o' RAM and an ATA-66 drive/controller combo for faster performance in everyday apps.

    As for picking a video card, I'd just look for the best possible support for your OS of choice - though Nvidia's performance and support under Windows is terrific, their Linux support is awful so no matter how swank the GeForce is it's out of the running to go into my systems. I still dual-boot, but I'd rather not.

    ATI and 3Dfx do a better job of supporting Windows/Linux/Mac, so I buy mainly their cards. I'm willing to trade off a few FPS playing Quake III under Windows for that Linux and Mac support. But hey, if you don't mind Windows and you live to frag, then this GeForce 2 sounds pretty darned sweet.

    - -Josh Turiel
  • Ok, so frame rates on modern graphics cards are a little ludicrous. What games seem to need now is more polygons. I can't wait to see whether Evolva lives up to the expectations: because the landscape and creatures are computer generated, the amount of detail can be increased to match what your system can handle - rather than the upper limit being dependent on how much effort the 3D designers put in.
    -- Andrem
  • This stuff is really getting out of hand. Regardless of what performance these companies are claiming for their new cards, the fact remains that there is a bottleneck in getting all that graphical data to the screen. AGP 2X/4X has proven worthless. Developers haven't even started addressing advances in graphics cards that are two years old. As graphics chipset manufacturers continually leapfrog each other on new features, there are very few real tangible benefits for consumers. Personally I use a TNT. It offers excellent 2D and 3D performance and I see little improvement from newer cards. For business apps the Matrox Millennium II still leads on crystal clear resolution and color. I just wish these video chipset manufacturers would start producing their own games so we consumers could actually start taking advantage of all these new features. Beyond that I think consumers' interests are best served by getting a good deal on the card that was hottest two years ago. For a fraction of the price you get as good or better performance and better stability.
  • but what about trident?
  • by Microlith ( 54737 ) on Wednesday April 26, 2000 @02:53AM (#1109962)
    They're still not playing nice. They didn't release source, and the drivers don't use DRI in XFree86 4.0.

    While it's nice they released drivers, they could at least follow standards.

    And of course, with closed source drivers, if it crashes you can't tell why, nor can you possibly fix it.
  • That's what the point of the hardware T&L is. Supposedly it would allow developers to use far more polygons than previously possible.

    Whether or not they will make use of this ability is the question. I sure would, simply because it would look good.
  • Which one do you get anymore? They're all faster than you will currently ever need. I think these days it comes down to feature set, and I admit that I am interested in ATI's new board (gah, end of summer release????) with the DTV and HDTV onboard chipset.

    Getting a new graphics card is worse than buying a new PC these days; a newer/faster/better model is being announced when the one that was just announced has finally been released. So you wait for the new cards only to find that newer cards are around the bend, and so forth and so on.
  • by Junks Jerzey ( 54586 ) on Wednesday April 26, 2000 @03:12AM (#1109965)
    For a while it looked like you'd be able to run out and get any good 3D video card for $79 and that would be that. Now with these newer cards we're back to the $250 and up range. Seeing as how even cards from two generations back haven't been pushed to the limit yet, this is mostly an outlet for people who want bragging rights.

    The other disturbing trend is that power consumption is getting much worse. Whatever happened to the "faster, smaller, less power" mantra? The Voodoo 5, for example, needs to have a hard drive power cable plugged into it. The GeForce 2 is in the same ballpark, if not worse. Yes folks, hardware engineers can do whatever they want without limits on power consumption or price. Now how about getting back on track?
  • by db_cooper ( 102715 ) on Wednesday April 26, 2000 @03:16AM (#1109966)

    Sharky Extreme [sharkyextreme.com] has a great review up too, also technical in nature. I read it, and as I recall, it was about 30 pages, pretty in depth.

    One of the biggest points is that current x86 CPUs are not fast enough to outrun the graphics card at low res. When tested with a 1 GHz Athlon and an 866 MHz P3, the card doesn't fare much better at low res than the original GeForce does. It is essentially a barrier for games, created by release dates :-)

    Also of note, the business practices of NVIDIA are scrutinized, such as their 6 month release intervals, which seem to be resulting in their being king of the hill rather frequently.

    Supposedly, the ATI Radeon MAXX will be the only thing remotely close to the nv15 (GeForce2 GTS). However, the only thing expected to defeat the GeForce2 (NV15) will be the NV20.

    For those of you who haven't had the time to read the reviews, they're going to come out with the NV15 VERY soon. Oh, and the 1 GHz Athlon can't keep up with it, as mentioned. At the same time there will be 128MB versions of the original GeForce, geared towards workstations. Soon after, there will be 64MB versions of the GeForce2. Shortly after that, we will see the mobile GeForce, NV11, a 3D card for laptops. 6 months from now, NVIDIA will introduce us to the NV20.

    IMO things are shaping up very nicely in the graphics arena. We are not just seeing more frames in our games, but many additional features, thus letting everyone, from hardcore gamers running at 640x480 in low detail to those who want 32-bit quality and high detail and all that our technology can bring us, be satisfied with one card, regardless of the company producing it.

  • Trident? Are they still around? *smirk*
  • after drooling over dozens of benchmark graphs?
  • From Anandtech:

    Another benefit of the die shrink is that the GeForce 2 GTS consumes close to half of the power as the original GeForce, putting it at between 8 - 9W versus the 16W for the GeForce.

    Need I say more...? :-)


    -
  • Not much other than I only wish PCI versions were still available ;-)

    Anandtech's article indicated that the GeForce2 would come out in a PCI version (cause NVIDIA can't stand the thought of 3DFX having a market to itself ;-).

    From anandtech:
    And taking a page from 3dfx's book to success, NVIDIA will be offering the GeForce 2 GTS in both AGP and PCI configurations


    -
  • Now seriously, I do believe our obsession with video cards has been a bit extreme lately

    I had this problem a few months back. I used to have an ATI Rage Fury w/ 32MB RAM. It did the job pretty well, though I never bothered to benchmark it or check my FPS.

    Anyway, I kept reading how hot the TNT2 cards were, and everyone was telling me that they blew everything away. So, being the hardware geek I am, I went out and dropped $250 for one. Did it make a difference? Not an appreciable one that I could see. HL Team Fortress Classic seemed to run a little smoother, but for the most part, I couldn't tell any difference.

    I managed to hold off on the GeForce cards, but now that the new Voodoos and GeForces are coming out, the temptation will be pretty great. Still, it seems to me that the gaming industry needs to be careful not to sacrifice substance for style. Take Q3 and Unreal Tournament, for example. Two of the best looking games ever, but is the gameplay really that impressive? Yeah, the single-player mode is not the focus of these games, as they're built for on-line play, but it's mostly just the "run and frag" variety, with a little Capture the Flag thrown in for good measure. I played the demos for both of these, and by the time the actual games came out, I was back to playing Starcraft and Alpha Centauri. Let's hope that game designers don't forget to make good games in their quest for the most impressive looking environment.
  • I think M$ released the tech specs and they stated that the X-box will have a geforce1 chipset driving the graphics.

    You think wrong.
    They are getting something brand new and MS have paid them $200 million in advance for R&D.


    A penny for your thoughts.
  • Naaa, X-box is supposed to have an nVidia NV25, thats another generation and a half after the geforce2 (NV15). You might, however, still be right about the success of the X-box.
  • by RevRigel ( 90335 ) on Wednesday April 26, 2000 @03:33AM (#1109974)
    Actually, due to the move from a .22u to a .18u process, the GeForce2 uses half as much power as its predecessor. I've heard that NVidia will shortly be releasing the NV11 for laptops, which will use 30% less power than the GeForce2, as well. 3dfx is just going down the tubes. They did a technical seminar at my University (of Texas at Austin) two weeks ago, and had a Voodoo 5 5500 on hand. The FSAA made no noticeable difference, the speed wasn't very impressive, and they even went so far as to call the people working on Mesa 'a bunch of hippies' with no connection to OpenGL at all (i.e., they implied that Mesa was as separate from OpenGL as Glide).

    I personally have a TNT 1 card (I bought it because it was good enough, and cost $70) in my dual 500MHz system with 256MB of RAM. I get around 20fps, 8fps if anything interesting happens. Oh, yeah, that's at 512x384. I guess I need to get some time away from school to try those new drivers, but I'm seriously considering going non-NVidia if I buy another card.
  • I don't see how a 6-month release interval can be considered a dishonest business practice in any way at all, as long as the released products aren't of shoddy quality. It's simply called keeping ahead of your competition. If they can't keep up technically, then too bad for them.

    The driver issue - That's a different story. There's no reason not to go open-source. Maybe their hardware does some really neat stuff that might be revealed by open-source drivers. Well, that's what hardware patents are for. If they haven't patented whatever is so damn revolutionary about their card that they can't release source, then they're stupid.

    Whatever secrets they have, they'll become obsolete soon anyway. In this market, the value of IP degrades pretty quickly.
  • Yup, Trident are still in the game, threatening to release their CyberBlade according to this site [ixbt-labs.com] (which currently seems to be /.ed)

    As the proud owner of a laptop running the prestigious Trident CyberShite9397 3D-Decelerator, I can definitely say that they pose serious competition in the market. Microsoft Hellbender never looked so good.

    The true heart-breaker is watching this pathetic chip struggle with Imperium Galactica II (like watching Peewee Herman trying the Ironman contest). And yes, I do intend to upgrade, once some enlightened notebook maker brings out a model with a decent 3D-chip.

    One question - Nvidia are supposed to be releasing a mobile version of the NV11. Anyone any info on this, or the NV11 itself?

  • You can all quit complaining about nvidia's non-open source driver right now! Due to the way they integrate their linux and windows code base, we should be able to get drivers almost immediately for this amazing piece of hardware (I hope, cuz I can't wait to get my hands on one, I skipped the geforce generation and am still with my tnt and tnt2's.)

    Mike

    Unanonymous Coward
  • They're all faster than you currently will ever need.

    Today, that might be true. But as another poster pointed out, faster cards allow less 'optimization experienced' coders to make their ideas workable.

    Maybe with built in T&L, explicit support for 3D, and the slew of 'overkill' functionality, we will see some truly remarkable new ideas develop.

    Sure, the human eye can't see much difference in the ultra-high frame rates. But, when you have a whole lot of 3D shapes moving independently on the screen, mutating as they go, the lower end cards will start to chop, while the top-dogs will run smooth.

    I certainly don't want my Lawnmower Man experience screwed up by BitBlt redraws. And that's exactly what such high-end hardware will make possible (or at least more likely): fully immersive VR with complete freedom of motion. Granted, on a screen it will always look crappy. The VR goggles (or whatever) are just the other side of the coin to the very shiny graphics card's first side.
  • Couple of things..
    One thing these video card reviews never seem to talk about much is how the game looks (which video cards produce a better picture)... Are they all the same?

    Second thing is they all have their own fancy thing to make them different (anti-aliasing, T&L, bump mapping). Are these features being taken advantage of by OpenGL, or are they just useless add-ons unless the games add specific support for these new features?

    Also how can you get 100 fps on a screen running at 85 hz?

  • A few months back when I was looking at upgrading my system, I had a choice between several different cards. I decided that Matrox was the most clued-in company with respect to open source and Linux drivers and I voted with my wallet. The G400 Max is plenty fast for the assorted OpenGL games and apps on Linux, and I have the source to the drivers.
  • I think I speak for all us 486-ownin' people when I say

    What's AGP?

    =)

  • That makes absolutely no sense. First of all, there are lots of benign games that have high polygon counts and frame rates. What about all of your friendly neighborhood racing games? Secondly, a much more sensible solution to this problem, and one already in practice, is to have a certain age requirement to buy the games. It doesn't make any sense whatsoever to prevent innovation in hardware unless the graphics card itself does something to promote violent behavior. If anything, playing Quake III on an out-of-date graphics card would make me want to be more violent.
  • It's not better or worse than 3dfx, ATI or the others it's just different, and that's why it's so good :)
  • (anti-aliasing, T & L, bump mapping) . Are these features being taken advantage of by OpenGL, or are they just useless add-ons unless the games add specific support for these new features.

    I don't know for sure about bump mapping, but I'm guessing yes. Antialiasing and T&L have been there for ages.

    Also how can you get 100 fps on a screen running at 85 hz?

    You can't. But the issue here is to have enough overkill so that you never drop _below_ the screen refresh rate, no matter how much the action heats up. Also, if the card can crank out 100 fps on today's games, it should also be somewhat future-proof.


    A penny for your thoughts.
  • by Anonymous Coward
    the GeForce2 should be able to run on the same drivers as the GeForce. nVidia uses a unified driver to run all of their cards, TNT on up that is.
  • > That might be the next frontier. Even if we can get life-like quality.

    2 points.

    A) We will continue to need fast 3D for quite a while. What you said, all in the name of realism. I want my perspective-correct shadows, I want my realistic fog, my bezier and NURB surfaces, etc. :)
    The world is just too complex, and today we don't have the horsepower to accurately model it, so we approximate it: badly.

    B) Thankfully photo-realism isn't the end-all and be-all. Real-time cartoon rendering is starting to pick up. Check out some of the latest issues of Game Developer.

    Cheers
  • You wrote:
    thus letting people from hardcore gamers running at 640x480 in low detail, to those that desire 32bit quality and large detail.

    This past weekend, I had the opportunity to see what kind of difference three generations of processor and video card can mean side-by-side. At one end is my box (P3 @ 560, DDR GeForce), in the middle is a P3 500 with a Voodoo3 3000, and at the other end is a K6-2 450 with the same Voodoo card. Setting up Q3:A under Win98 on all three machines for a LAN party gave me a chance to see just how each system would run with the same settings.

    My results (YMMV):

    Both Voodoo machines ran the default settings (800x640) around 42FPS, and tinkering with textures and detail levels could alter this from 38-45 on both systems. Clearly, the Voodoo was overmatched by even the K6-2. The GeForce machine gave 66 FPS.

    Turning on all the bells and whistles (everything cranked up, 32 bit colors and textures, etc, etc), the GeForce still put out 59FPS at 1280x1024. The Voodoo machines returned 37 and 33 FPS, respectively.

    Am I going to upgrade from my Elsa Erazor X2 to a GTS or Voodoo 4/5 or Radeon? Nope. Not until Q1 2001, anyways (or I land a job that pays well enough to support my technowhoring *g*). Would the other guys? Probably when the NV20 is released and the NV15 takes a price cut.

    Rafe

    V^^^^V
  • > Agreed, Quake is already running at 110 fps on my PC (and don't tell me 30 fps is good enough... that has been discussed in the past)

    Yes, it's a curve of decreasing returns...

    The jump from 10 fps to 30 fps is much more eye-pleasing than a jump from 60 fps to 80 fps.
    Anything over 60 (to 72) and you won't be able to tell the difference.

    HOWEVER, we do need obscene frame rates, so we can apply full scene anti-aliasing.

    Think of the difference texture filtering makes: linear (usually software) versus bi-linear filtering.
    Basically, instead of needing 4x the resolution with linear filtering, we can achieve the look of a MUCH higher resolution via bi-linear filtering.
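    A rough sketch of what bi-linear filtering actually does (my own toy single-channel version, just to illustrate the idea, not any card's real implementation): point sampling grabs one texel, while bi-linear blends the four neighbouring texels by the fractional texture coordinate, which is why it hides low texture resolution so well.

      #include <stddef.h>

      /* tex: w*h single-channel texture; u,v in [0,1] */
      float sample_bilinear(const float *tex, size_t w, size_t h, float u, float v)
      {
          float x = u * (float)(w - 1);
          float y = v * (float)(h - 1);
          size_t x0 = (size_t)x, y0 = (size_t)y;
          size_t x1 = (x0 + 1 < w) ? x0 + 1 : x0;   /* clamp at the texture edge */
          size_t y1 = (y0 + 1 < h) ? y0 + 1 : y0;
          float fx = x - (float)x0, fy = y - (float)y0;

          float top = tex[y0 * w + x0] * (1.0f - fx) + tex[y0 * w + x1] * fx;
          float bot = tex[y1 * w + x0] * (1.0f - fx) + tex[y1 * w + x1] * fx;
          return top * (1.0f - fy) + bot * fy;      /* blend the two rows */
      }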

    Cheers
  • Does anybody actually know how NV ratings are measured? Is a NV20 chip twice as good as an NV10 chip? Is a NV10 chip 10 times as good as an NV1 chip? - Ekapshi.
  • by Keelor ( 95571 ) on Wednesday April 26, 2000 @04:14AM (#1109991)
    It ought to be interesting to see the 3dfx PR machine at work now. The true irony will be when they take the inevitable road of, "Sure, the Geforce GTS has a couple more FPS, but we have anti-aliasing, which improves the way games that are out now look!"

    Compare this to about a year and a half ago, when the TNT came out. "Sure, they support 32-bit color and higher texture sizes, but we have more FPS! No true gamer cares about how good their games look, they just want more FPS!"

    The sad thing is, I think 3dfx knew this would happen--that's why they've been pulling away from emphasizing the performance of the Voodoo 5, and instead hyping the full-screen anti-aliasing.

    On a side note, it now seems that the Voodoo 4 (the single VSA-100 chip) has no hope of seeing the light of the retail market. Some OEMs _might_ pick it up, but considering how it would stack up against the Voodoo 5 5500, it might be a bit of an embarrassment to release the Voodoo 4.

    ~=Keelor

  • > This is our right as USians

    Read the Declaration of Independence.

    We, therefore, the Representatives of the united States of America.

    It's uSA (lowercase u, since united is an adjective), not USians.
  • GeForce2 will run on the same drivers as the GeForce (which, consequently run on the same drivers as the TNT2Ultra, TNT2, and TNT). So, assuming that the beta drivers nVidia released yesterday are adequate, then the GeForce2 drivers will be as well.
  • by Keelor ( 95571 ) on Wednesday April 26, 2000 @04:19AM (#1109994)
    Okay, other posters have pointed out the fact that the GeForce 2 seems to be going in the right direction (with respect to power consumption, at least). However, I'd like to take a moment to agree with you--and point out the best part. Last I heard, the Voodoo 5 6000 (the one with four VSA-100 chips) won't have "a hard drive power cable plugged into it." Nope, instead, it will just have an external 110V power supply that will plug into the back. How's that for power consumption?

    ~=Keelor

  • Evolva hurts my brain to play. First of all, it has a wireframe mode where you get to see the actual polygons which are being texture mapped. There are lots and lots of polygons on the monsters, almost enough to make them look solid in wireframe mode. Seriously, that many. No kidding. When not using wireframe mode to cheat like a vicious loser so I could see through walls, I noticed that all the monsters looked more realistic and moved more fluidly than I have seen in any other action game; they actually appeared to have curved bodies, as opposed to the shaded boxes that make up Quake people and monsters. Evolva is good. I loved the game until the end of the second mission, when it told me "congratulations for completing the demo of Evolva" and I felt like a junkie whose needle was just yanked away from him mid-shootup.
  • In essence, this boils down to a matter of taste. You seem to be saying that all of the games which really require that level of graphic support aren't really your cup of tea, but for tens of thousands of people, it is precisely their cup of tea.

    On the RTS front, while it was certainly a playable-as-hell game, I heard plenty of people complain that their brand new, whopping fast machine was limited to 800x600 in Starcraft, just because Starcraft couldn't go any higher. Myth was the first (fairly) recent game to really start to reverse that trend: Homeworld tried to completely stand it on its ear.

    As magnificent a game as I thought Homeworld was, I really did feel that it was limited to some degree by the constraints of the technology: many strike craft + many ion cannons = big framerate losses.

    So, in other words, pay attention: I'm predicting that newer RTS games will benefit from better tech more than their predecessors did.

  • The NV ratings are just a convenient way to name R&D projects. It lets them have 4 variations that they can play with: the NV-11 is based on the NV-10, and one could assume that an NV-12, NV-13, and NV-14 would also be based on the NV-10 chip. As for the X-box, as I've said before, nVidia has yet to announce what chip will be in the X-box; it will probably be an entirely different chip from anything we've seen yet.
  • I think he meant the 5500 model will have a hard drive power cable plugged into it (I believe that's so, don't remember for sure, I just know the 5500 I saw didn't have any external power), but only the quad processor board (the 6000) will have an external power supply.

    Incidentally, 3dfx claims that this is because motherboard makers skimp on voltage regulator quality, so the motherboard apparently doesn't supply the right voltage. To this I say: Yeah, if you draw current over specifications, the voltage will drop, of course. :) We've all heard that sort of thing before from 3dfx. 'Oh, 32 bit is a waste of resources and completely useless. Look! Look! T-buffer!!! It's l33t!' Translation: Please wait until our engineers get around to catching up with the competition.
  • >Right now the money for a supreme video card is arguably better spent on tons o' RAM and an ATA-66 drive/controller combo for faster performance in everyday apps.

    Or, even better... U2W SCSI 8^) Still a sight better than ATA/66 for throughput and CPU usage. The ATA bus structure just isn't as flexible (2 drives/channel?!). Plus, easily attachable external devices.

    It does cost more, but the performance boost is well worth it.


  • Hey, let's give the company a chance. This time last week, I wasn't sure we'd see drivers at all until July -- beta or not. Kudos to Jim (et al.) at NVIDIA for working hard to get yesterday's drivers out.

    I expect the company will be doing more for and with the free software community... just give them a little time to adjust.

    --


  • Yes. I am a moron, if that's what you were getting at.
  • The drivers are beta, but they work just fine?

    Beta drivers are beta because they haven't been qualified, certified, tested, whatever. Heck, even non-beta drivers have bugs and problems! So I'd think if NVIDIA had beta drivers for Linux, that by the very definition of beta, they haven't tested it thoroughly enough to guarantee anything under Linux.

    Which goes back to poor Linux support, given how many generations of cards have come and gone under Linux now...

    -AS
  • You most certainly can get 100fps on a screen set at 85Hz. You disable wait-for-v-sync on your card, and it will put the new frame on the screen as soon as it is ready, regardless of whether or not the last frame is done being transferred. You get a tearing effect that looks very ugly, but if you are going for raw FPS then it's easy to live with.
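    To make that concrete, here's a toy simulation (purely illustrative numbers, not any real driver API): with v-sync off, more than one buffer swap can land inside a single 85Hz scanout, and each extra swap is a potential tear line.

      #include <stdio.h>

      int main(void)
      {
          const double refresh_hz = 85.0;   /* assumed monitor refresh rate */
          const double render_fps = 100.0;  /* assumed raw rendering speed  */
          const double refresh_period = 1.0 / refresh_hz;
          const double frame_period = 1.0 / render_fps;

          double next_frame = 0.0;          /* time the next frame finishes */
          for (int scanout = 0; scanout < 6; scanout++) {
              double end = (scanout + 1) * refresh_period;
              int swaps = 0;
              while (next_frame < end) {    /* frames completed during this scanout */
                  swaps++;
                  next_frame += frame_period;
              }
              printf("refresh %d: %d swap(s)%s\n", scanout, swaps,
                     swaps > 1 ? " -> visible tear" : "");
          }
          return 0;
      }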
  • not to rain on your parade, but you cannot run a v3 in 32 bpp mode... it just doesn't have it!

    Also, I would be more than curious to see what type of horsepower he has under the hood of the machine -- I have a roommate that just purchased an Athlon 700 (from a K6-2 450) and that alone was worth way more than if he had just bought the next best card out there (he runs a V3 3000, as do I... Unreal Tournament at 1600x1200, even at 16bpp, never looked so good...).
  • Actually, from what I understand, they can't release the code due to NDAs with other companies whose technology they use. Or something along those lines. Wonder if that'll change anytime soon. Prolly not.

  • The fastest 3d video available now for linux is from Nvidia.

    Linux is more than just the x86 chips. As far as I know the Nvidia cards won't work on the PPC flavor or on the Alpha flavor of Linux. I don't want to leave out our *BSD friends either. It can easily be argued, though, that the games that are available for Linux right now are pretty much all x86 closed source. The fact remains, though, that if any of the other platforms that Linux runs on or the *BSD people want to do anything with these cards, they are SOL. Would it really give competitors an advantage to release the register programming interface for their cards?
    Molog

    So Linus, what are we doing tonight?

  • Being a current GeForce DDR owner, I feel that there are a few concerns that must be made clear.

    The biggest problem, which no one has actually mentioned, is that nVidia is working so hard on getting new silicon out the door every six months that they just don't have the people to work on the drivers. This leaves the people with current GeForce cards a little annoyed after shelling out over 200 quid for the card. Agreed, the card performs well and I am happy with it, but the card has functionality that isn't even implemented in the drivers yet! Why do hardware houses insist on releasing their products before their drivers have the capability to exploit the hardware built into the card?

    My next card will probably not be another nVidia, unless, of course, their support and drivers are improved drastically. This is exactly the problem ATI faced; luckily they had a large OEM base that supported them.

    To make my point more valid, I used to work for the company behind the SuperFX chip for the Super Nintendo (used for Starfox and a few others), which progressed onto a core for the old Cyrix chip (the one with the graphics built into the chip). I know that when new silicon is being designed, the old drivers are nowhere near as well supported, due to the fact that people are working on the software for the new silicon.

    c0rarc

  • Are you just purposely looking for any excuse you can find to bash nVidia? Perhaps looking to gain a few karma points in the meantime? I have NEVER before heard ANYONE say that because something is beta, it could not possibly be stable. You know perfectly well that beta software can be very usable. Obviously, they are calling it beta because it is the first release, and you should never call your first release anything other than beta.

    If that isn't enough for you, I have a GeForce 256 and I am happily getting the same framerates in Quake 3 that I normally get in Windoze. Hell, even my 2D performance has been doubled by the new drivers. And guess what? I have not had a crash yet.


    ------

  • "Compare this to about a year and a half ago, when the TNT came out. "Sure, they support 32-bit color and higher texture sizes, but we have more FPS! No true gamer cares about how good their games look, they just want more FPS!"

    And at the same time nVidia PR was saying, "Sure, the V3 has more FPS than the TNT, but we have 32-bit color and higher texture sizes!". When nVidia's T&L was announced, 3dfx said it would be a while before T&L was properly supported. When 3dfx announced the T-buffer, nVidia claimed that gamers would prefer the T&L speed boost to the prettier FSAA (quote the nVidia PR rep, who asked in a public interview if 3dfx's VSA-100 stood for "Very Slow Architecture").

    These two companies have been bashing each other constantly. 3dfx uses their "PR specialist" Bubba (his real name) Wolford (sp?), while nVidia's attack dog is Derek Perez. Open your eyes, *all* corporate PR divisions are full of it; some are just a little better at convincing you of the contrary (as nVidia seems to have done to you).

  • you most certainly can get 100fps on a screen set at 85Hz.

    Well, Sir, I disagree.
    Let's say you're running at 100Hz and by disabling v-sync you get a steady 200fps. That means you'll get two halves of a frame in one screen refresh: 100/2 + 100/2 = 100 whole frames/sec. :-)


    A penny for your thoughts.
  • One would not really want to run at 110+ fps for all applications... but that is a good indication of the strength of the card, and one can increase the resolution until one is at about 70 fps, which is good for most good Quake rail users.

    Anyway, my monitor runs at an 85Hz refresh. It's pretty clear.
  • You seem to be saying that all of the games which really require that level of graphic support aren't really your cup of tea, but for tens of thousands of people, it is precisely their cup of tea.

    Well, yes, I'll admit that First Person Shooters, typically the graphic kings of the game world, aren't my favorite type of game. There is a certain thrill in running around Q3 like a madman, killing at will. But for the most part, I prefer a more strategy-filled game.

    Starcraft does indeed have limited graphics, but you've got to remember that the game is approaching 3 years old. The fact that the game is still actively played by literally thousands of people, both on BattleNet and by themselves, is, I believe, a testament to its superior gameplay.

    Anyway, the point I was trying to make was that game designers lately seem preoccupied with making their games prettier and flashier, rather than better. Take Force Commander, for example. LucasArts tried to put the RTS genre in a 3D environment, and failed miserably. The game is filled with 3D models and backgrounds, and you can rotate your camera to every conceivable angle. These features, however, make it almost impossible to effectively issue orders to your units and follow the action. Camera control is a mess, the interface is clunky, and the actual gameplay unexciting. I've also been playing Star Trek: Armada, and while it is a far better game than Force Commander, it also suffers from the same bad camera angles and tired, repetitive gameplay.

    I've not had a chance to download the Earth 2150 demo to try it out; hopefully it will be good. I've also heard good things about Homeworld and the upcoming Halo. And I guess bad games just come with the territory: there wouldn't be good games without bad ones. Still, it's frustrating to buy a game, get all the cool and neat graphic tricks in the world, and absolutely no play value.
  • I'm all for open-source, but heck, as long as they provide a driver, and it ends up being as good as the driver for another OS, then we should applaud them. Continue to coax them for open source drivers, yes, but don't write them off just yet. There's a lot of the driver that you would really need a good hardware spec for, and if they want to keep that under wraps, there's not a whole lot of performance gain that open source can easily provide. People will just say "well, I don't have the spec, so I won't work on this", and it won't be any better than before. You might be able to get some more stability out of it (definitely a good thing), but you still have to respect the steps they are taking now. Sooner or later they will come around, but if you don't support them, they won't support you.

    I don't entirely agree with your DRI comment, either... but that's ok 8^)

  • Any decent monitor will support a 110Hz refresh or better at 640x480. Check out this low end ViewSonic monitor: G655 15" [viewsonic.com]. If you want something larger there is this very nice PS790 19" [viewsonic.com] (I have one of these) and the totally outrageous P817 21" [viewsonic.com]. Both support better than 110Hz at 1024x768, and don't even list the refresh rate at 640x480. Of course, if you insist on getting that $200 19" monitor, you get what you pay for. There is much more to monitors than just the numbers. Better, more expensive monitors last longer and, more importantly, look better.

  • by raygundan ( 16760 ) on Wednesday April 26, 2000 @05:33AM (#1110015) Homepage
    The reason that 30fps (or lower for movies) looks acceptable is that the filming process produces motion blur. The motion blurred image is much closer to what our eyes get from reality than an every-single-frame-is-crystal-clear rendering from a video card. To produce a similar effect from a typical 3D card, you need enough more frames that your eyes can't see them all and produce the blurring on their own. (Like real life) It seems obvious that you need at least 2x the frames to get a blurring effect between them, since you have to have 2 frames to blur between.

    Newer video cards ARE beginning to incorporate motion blur, which will help enormously. But it is cheaper to simply up the framerate, at least up to a certain point. (Which I don't think we have really reached) Motion blurring sounds like a very computationally intensive thing to do.

    So there are reasons to go to 100fps-- if the frames are clear, it will take many more of them to approximate the effect that motion-blurred TV or film produce at 30fps.
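    For the curious, the accumulation trick boils down to something like this. (A minimal sketch, assuming you can afford to render several cheap sub-frames per displayed frame; the function and buffers here are hypothetical, not any card's actual motion blur support.)

      #include <stddef.h>

      /* Average n_subframes renderings, each sampled at a slightly different
       * moment within the frame's "exposure", into one displayed frame.
       * That temporal average is what approximates film-style motion blur. */
      void accumulate_motion_blur(float *out, const float *const *subframes,
                                  size_t n_subframes, size_t n_pixels)
      {
          for (size_t p = 0; p < n_pixels; p++) {
              float sum = 0.0f;
              for (size_t s = 0; s < n_subframes; s++)
                  sum += subframes[s][p];
              out[p] = sum / (float)n_subframes;
          }
      }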
  • I just picked one up this past weekend. Wouldn't have done it without the big tax return! (Thank you Uncle Sam!).

    Now to install some cool games and have some phun with it!

  • But I do have to comment on one thing:
    The sales of these cards is strictly a matter of business. What people do with them afterwards, and what they decide to do because of them is strictly a matter for the individual, rather than the "state" to decide. Any other course of action will simply be another step along the way to the iron fist of totalitarian government.

    By the same reasoning, people should be able to buy thermonuclear weapons, deadly biological viruses, nerve gas, and gun add-ons to let people shoot down police helicopters (obscure Simpsons reference). Sure, whatever.
  • Death toll caused by strong governments: in eight digits.

    Death toll caused by 3D Graphics Acceleration: in zero digits.

    Case closed.
    /.

  • Shiny's Sacrifice [sharkyextreme.com]
    WXP's Isle of Morg [sharkyextreme.com]
    Planet Moon's Giants: Citizen Kabuto [sharkyextreme.com]
    Computer Artworks' Evolva [sharkyextreme.com]

    i'm not quite finished wetting myself.
  • by Taos ( 12343 ) on Wednesday April 26, 2000 @05:49AM (#1110020) Homepage
    I'm a 3D Animator/Programmer and a student, so I find myself looking at the game cards to find which one is the best solution for my work. I can't afford the top of the line GL card that just gives me the raw crunching power I need. Therefore, fill rates and all the other bells and whistles don't faze me at all.

    The one thing that is most important to me, however, is support. I don't mean telephone, but platform. I recently got burnt on buying a Viper 2000 because they refuse to make NT drivers with any sort of hardware acceleration. Then Linux runs into the same problem. I was sold by their web site when I was deciding on the card for my new computer. Their web site turns out to be a flat out lie. And if there's one S3 developer out there reading this, I have a size 12 boot that has your name all over it.

    So now I'm incredibly leery of these game cards coming out with all these whiz-bang features that will probably only be developed for WinBlows 2000. I need drivers for NT4 because that's where the software is these days, and I need Linux drivers because that is where the graphics software is going and where I create most of my custom software. So when a company now comes out saying they're going to support this and that, but doesn't have the drivers to back it up, I'm just going to wait.

    This summer, I will probably just buy a GeForce I. Because now they have released the drivers for it under Linux (it isn't open source but I don't really care) and they've always had stellar NT support. I know people here like their drivers open source and their cards to be screaming fast, but I just want one that works as advertised and fits into my meager budget.

    Taos
  • Once every videocard has some form of 3d in it with driver support for all OS's (like what happened when all cards had 2d acceleration and drivers for everyone), which should happen in what, 18 months?, do we expect to see some killer app for 3d Web or VRML 3 or something? i would think so.

    i know there are tons of arguments against a 3d interface, but so were there against 3d games when Ultima Underworld came out! if i can move around with absolutely no effort under Q3A or UT, why can't a Q3A level be a web site?? sure it's faster to see all in one page, but if slashdot looked like a Q3A level (with news posted in floating billboards and sections looking like houses, buildings, huts and spaceships) would you log on? i would...

    ah, but there's the matter of download speed. Well, if every other person has downloaded Flash plugins and RealPlayer, and are now actually using them, couldn't they download a set of very compressed textures, so that when you log on all you download is the wireframe file and changing images? Wouldn't this be comparable with downloading HTML and changing images? C'mon programming gods, it can't be that hard, right? Besides, if all you have to download is a 3D browser, which already amounts to many megs, you could send all the basic textures there, couldn't you?

    whatdoyouthink people?
    ========================
  • Uncle Sam didn't help you, he just didn't screw you as much as you thought.
  • Go with the Voodoo. I've been down the binary only driver path once, and between total bugfests, X crashes, lockups, lacking features and incompatibility, no way am I doing that again.

    NVidia will continue to have crappy performance and unstable drivers in X until they shape up and get with the program. Whether or not they theoretically perform well under some Windows benchmark doesn't interest me in the least.

  • First off: Boy, do I want a GeForce 2 to replace my TNT1! OK, now for the off-topic content...

    "Pro-gun" goes not equal "pro-violence." That's an insulting bit of intellectualy laziness.

    The FACTS of life are somewhat UNPLEASANT. The FACT is that some day, I may need to use violence to protect myself or my family. A gun is the best tool for that. Do I look FORWARD to that? No. Do I prepare for it? Yes. Isn't it wise to set yourself up to WIN a confrontation that could otherwise result in your death?

    (If you think that there is never a call for violence, that there is always a peaceful solution, then you don't want to argue about guns, you want to argue about violence and the right to self defense, an altogether different topic which guns are only a facet of.)

    And the above poster has it right when he says that guns are an important part of protecting our freedom. I'm not saying we need an armed revolt now -- but can anyone guarantee that we won't in 100 years? 500?

    Ultimate authority flows from the barrel of a gun. If the people don't have some "authority" of their own someday they'll lose big to an invasive government.
  • Did 486 motherboards even have PCI?
  • Ok, this is getting a little ridiculous. I just bought a higher end system last summer and it is already a low end system. It's a good thing that I am not one of those people that must have the top of the line... yeah right... I'm already setting aside money to buy a new system this summer. I'm spending more money on my hardware than on any other hobby. No wait, it isn't a hobby anymore, it's an addiction.

    Need more (insert hardware here i.e. RAM, MHz, etc. etc.)!!!.


    flatrabbit,
    peripheral visionary
  • I personally have a TNT 1 card (I bought it because it was good enough, and cost $70) in my dual 500MHz system with 256MB of RAM. I get around 20fps, 8fps if anything interesting happens

    I also have a TNT 1 card in my single 500MHz system with 128MB of RAM, and I can tell you that this is entirely a driver issue. I average about 50fps in Q3, with a range of between 25 and 100 depending on what's going on. I'd suggest this card for hardcore windows gamers, but for a linux or dual boot system, the G400 is the best choice, and a card that will perform well under either OS.
    ----
    Dave
    Purity Of Essence
  • Yikes-o man! Get the latest drivers, stat! I have a Celery 333 with 64 MB RAM and a 16MB AGP TNT 1, and I get close to 40 fps at 800x600 using OpenGL or Direct3D in Unreal and Q3. It doesn't drop much below 25 or 30 fps even when the action heats up...

    BTW, thanks for the cool info on the power consumption.

    Christopher
  • No they aren't, not for Linux anyway; you must be thinking of the winblows drivers. And another thing, if you got that horrendous beta version to work properly on your Linux box I wanna know how.
  • At least, that's what Nick Triantos of nVidia, the guy responsible for the Linux drivers, just told me. Apparently, the drivers released yesterday have full support for the GF2. For once, we appear to have drivers for a new product before the Windoze people! :)

    If that thing really is in stores on friday... hell yeah...


    ------

  • by Malc ( 1751 ) on Wednesday April 26, 2000 @09:08AM (#1110040)
    I have a Leadtek GeForce DDR in my dual P2 450 machine. I upgraded it to two processors in February at which time Win NT started locking up several times a day. I recently upgraded the graphics drivers to version 3.72, and that seems to have saved NT. However, Quake 3 (when played in SMP mode) crashes (and sometimes locks up my machine) - I would guess the MTBF (mean time between failure) is about five minutes for Q3 with "r_smp 1".

    On top of that, Leadtek won't supply the Control Panel display settings stuff for NT that they have under Win98, so my gamma settings are too low (better than under Linux though where 3D games are unusable due to the darkness).

    Come on guys, fix the drivers. The fastest card in the world isn't much good if I can't use it.
  • I'm going to want a Voodoo5 for just this reason- I have a voodoo2 too! ;) PCI version will be available, and it will allow me to stick with my old machine (a Powermac, btw) and enjoy FSAA which I really like the idea of. I'm not a FPS guy, I'm an 'X-Plane' fan ;) plus, even with FSAA it should give a moderate speed increase on the voodoo2 which has only (*sigh* 'only') 12 megs :)
  • From what I've read, it looks like the Geforce's FSAA support will be slightly less stellar than 3dfx's--which is why I didn't mention it. To be perfectly honest, I could be entirely wrong. I'll wait for both before buying one.

    As for Glide, being as Glide is now open-source, I don't think that 3dfx could really say they have that up on nVidia.

    ~=Keelor

  • Alas poor VRML, for I knew him well. A couple years ago VRML was a big (sorta) thing on the web; people had the brand spanking new Pentium processor and 16-bit graphics. The problem with these was in-window 3D rendering. It is a lot easier to render a scene full screen than it is to render a real-time scene inside a window. Even now it requires a lot of processing horsepower to render a scene in another app's window. A full screen 3D browser might fix this problem though...
  • No kidding, it's almost May and nVidia has yet to release new drivers for their TNT2 chips for Win2k. I'm stuck using the most raw crappiest general TNT2 driver ever written, it's a real bitch to use for much of anything.
  • Yeah, we all know nVidia's chips can crank out the fps. What is more important to me as a customer of theirs is support: if I buy their card I would really like continued support even after the chip is a few months old. nVidia has this balls-out release schedule that makes them pump out new silicon every six months; I can't think of many people that upgrade that often. I've been waiting for three months for decent drivers for my TNT2 Viper V770 (which I bought in August) under Win2k. From 3dfx's website it looks like the V3 has Win2k and Mac OS drivers. That brings up the issue that nVidia is neglecting a market they could really cash in on: Macs. With the Sawtooth chipset Macs have come a lot closer to being really kickass gaming machines, especially combined with the G4 processor.
    The chip itself looks impressive, but then again so do the Radeon256 and VSA-100. ATi and nVidia have bump mapping, which is a really rad looking effect if it's used in a game, while the VSA-100 has the T-buffer that lets you do all sorts of video effects. I'm not sure yet which card is going to get my money. It will probably be the one that supports the platforms I'm cruising around on.
  • Actually, the GeForce 2 can't support full scene anti-aliasing at resolutions higher than 1280x1024x16 or 1024x768x32 because of the way the nVidia FSAA works. The video card renders the screen at twice the resolution (there's also a mode in which it only renders twice the resolution one way - I'm not sure whether vertical or horizontal) and shrinks it down to the original size. So FSAA at 1600x1200 would mean the video card rendering at 3200x2400, which it can't do.
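    In other words (a back-of-the-envelope sketch of the general supersampling idea, not NVIDIA's actual hardware path), the card renders to a buffer twice as wide and twice as tall and then box-filters each 2x2 block down to one screen pixel; at 1600x1200 that intermediate buffer alone would be 3200x2400 pixels.

      #include <stddef.h>

      /* Downsample a 2x-per-axis supersampled image: each output pixel is
       * the average of the corresponding 2x2 block in the big render. */
      void downsample_2x(float *dst, const float *src, size_t dst_w, size_t dst_h)
      {
          size_t src_w = dst_w * 2;   /* source rendered at double width and height */
          for (size_t y = 0; y < dst_h; y++) {
              for (size_t x = 0; x < dst_w; x++) {
                  const float *p = src + (2 * y) * src_w + 2 * x;
                  dst[y * dst_w + x] = (p[0] + p[1] + p[src_w] + p[src_w + 1]) * 0.25f;
              }
          }
      }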
  • All I said in my original statement was that beta drivers are not fine.

    Supported drivers are the goal, though some take that to mean binary, some take that to mean source, some take that to mean phone call support. Whatever.

    So being satisfied with beta is pointless. Beta is only temporary, transitory, towards final release.

    We, as consumers, should not be expecting beta stuff, in general, yet that is what we have with Windows, Netscape, and a whole raft of other software and products.

    -AS
