Technology

More on the GeForce 3 (177 comments)

Tom has extensive poop on the GeForce 3. Some history, and a ton of why this card may or may not be worth the $600 price tag. Lots of stuff on the Vertex Shaders (which look to be a truly amazing idea). Very worth a read if you're interested in state-of-the-art 3D, or just wanna sneak a preview of what next year's FPS will be doing.
  • by CoreWalker ( 170935 ) on Tuesday February 27, 2001 @06:32AM (#398716)
    I don't care what the context is, I'm always a little apprehensive about clicking on a link labeled as 'extensive poop'.
  • How much does a basic PC cost these days? Not a whole lot more than the G3, right? How long will it be before video card manufacturers are throwing in a free PC with the purchase of one of their cards?

  • Yeah, but with a Mac you are paying 3 or 400 bucks more to start with (when compared with PCs), so you are still paying A LOT just for a graphics card.

    Plus, there is the plain fact that if you buy a whole computer you are spending ~1500 bucks, which is over twice what buying the card alone would cost, even considering its inflated price. So if all you really want is that card, buying a Mac just to save money or have the card a month sooner is pretty dumb.

  • All I really want to understand is what the difference is between a $600 GeForce3 card [tomshardware.com] and a $1000 Oxygen GVX 420 card [62.189.42.82]. Why couldn't or shouldn't I pay $1000 for the higher priced card if it can do the same things a $600 card can do? And if an Oxygen card can't do what a GeForce 3 card can do, why not? See, the problem here is that I simply don't know enough about video cards and apparently I'm too busy to RTFM.
  • As a society, yes we do need to take a look at our pursuit of material goods. But I don't see much connection with buying this particular card and the third-world situation. Because most of "our basic consumer goods and foodstuffs" come from third-world countries, we shouldn't buy this card? Now, if it was shown that nvidia was exploiting third-world workers in sweatshop conditions a la Nike, then I would see your point.
  • Don't get me wrong here. I love nvidia's work, have a GeForce2 Pro in my computer as we speak, and think they do a great job. My worry is that MS & nvidia might be collaborating to an extent where they create a problem similar to the wintel situation of not so long ago, where the government had to step in to stop the high level of exclusionary collaboration in that relationship. Software + Hardware = a lock on the market.

    All that has to happen is that they become good enough buddies such that MS is customizing DirectX & their OS to work best with nvidia processors and nvidia works with MS to customize its hardware to work best with DirectX & Windows. Together they end up creating an "industry standard" that through patents & IP locks out or inhibits competitors. Anyone else thinking something like this is possible or already in the works? I'd hope that nvidia, being as innovative as it is, will examine history and try not to let itself get tied too closely to this beast.

    Oh, and as for the price thing, I've vowed not to pay more for my GPU than my CPU. $5-600? As much of a technophile as I am, it's just too dear for me. I'll be sticking to a $250 maximum, thanks.

  • You can get an ATI Radeon 32MB DDR for like $80 now... (maybe even less.) And ATI supports open source... so there is a DRI driver in development.
  • Please SLOW DOWN the video card technology! Not everyone can keep up. :( Thanks.

    Well, at least this didn't have to be moderated to get to +2... it's just your standard whiny rant.

    Whine, whine, whine, I can't afford the new stuff, so therefore no one else can, so all you video chip makers put a sock in it. First of all, nVidia wouldn't be doing this if they weren't going to make any money. Rest assured, sales will be brisk right from the start. Second, upgrades don't come _all_ that often. You have the latest crop of cards (the most expensive of which comes in at around $500-600). Did you whine then? Then you have the generation before that (or half generation), the TNT2 (I myself have a TNT2 Ultra), and that was, what, 12-18 months ago. I bet you whined then too. And of course before that still you had Voodoo 3, Voodoo 2, and a bunch of other also-rans.

    You see, this sort of progress waits for no one, except the marketing department. Are you also complaining about the speed race in the CPU market? I upgrade my CPU every 3 years, roughly. I went from 50 to 200 to 500 to probably 2000+ MHz (in another year) with more than enough performance to match. I bet you're complaining about that too, hmm?

    Your problem (and that of others like you) is that you have the 'old' generation of cards, which have suddenly been 'obsoleted' by this announcement from nVidia. Seeing as you probably paid a good bundle of money for that privilege, you now feel cheated. Or maybe you just got a 'new' card and the box is still lying in your trash can, in which case you feel cheated as well. I mean, man, here I bought a brand spanking new card for at least $300 and it was just a waste of money! Grow up. The vast majority of people will never buy the latest and greatest; they can't afford it or don't have the other prerequisite hardware to support it. Just wait 3-6 months after release and do what everyone else does: buy reasonable performance at a reasonable price.

  • Good luck. While the Xbox is generally built around PC architecture, the graphics chip is not going to be sitting on a standard AGP card.

    The better question is how hackable the Xbox will be. Would one be able to get a more full-featured OS than the cut-down Win2k kernel running on it? And get support for SVGA monitors and other standard peripherals needed to turn this into a really nice workstation?

  • Does "same driver package" eq "same driver .dlls"?

    I've seen it where a vendor packages the same installation with multiplatform installs, but it's usually a bunch of different files for separate platforms. This usually means that while an update will take place for a "current" device, the updates seldom get made for older/existing devices.

    It'd be an impressive act of hardware/software engineering if each new nVidia video card were actually compatible with the old cards' drivers, and only required new drivers to get new features. But I'm guessing it doesn't work this way.
  • I like what he says about the way nvidia names their products: he says GeForce2 was only a speed bump, and that GeForce3 is something completely new. He wishes GeForce2 had been named something else...

    But he did/is doing the same thing with Doom: Doom 2 was nothing more than an add-on to Doom, and Doom 3 will be _completely_ different. Hehe...
  • If Nvidia can release new products every 6 months, why should they stop? Releasing new cards doesn't make your card any slower. People out there, however, are willing to buy new cards, so nvidia keep making them.

    And there's no way that the GF3 will still be $600 a year from now. I'm guessing ~$250 OEM or less. And the release of the GF3 means all of the other cards out there now will be CHEAPER! Yes, the card that cost $400 a year ago is now far below $200.

    Fast release cycles may make people feel like they've got outdated technology, but really, they just mean that people get blazing fast products for a fraction of the price.
  • No. Actually NVidia came up with the ideas and then force-fed them to Microsoft for inclusion in DX8. And, no, not just any card maker can make a card that supports those features. If they could, why aren't they? Only ATI and NVidia make cards that support both standards (OpenGL and DirectX) to the letter and are fast.

    NVidia, as they pushed Microsoft to expand DX, also created new OpenGL extensions for these new features. These are NOT Microsoft's ideas. They are almost solely NVidia's.

    Justin Dubs
  • If the primary area where you would benefit from improved performance is games (which is true for a lot of us as most office apps have not been CPU limited for a while now), then the benefit of $600 spent on a GF3 is likely to be greater, and the cost less, than upgrading to a P4 and new motherboard.

    Not that I'm going to be buying a GF3 at that price...
  • "I find it a little worrying that so much of the work that has gone into the GeForce3 has been implementing unprecedented new features such as these vertex shaders, rather than improving more general stuff such as fillrate or transformation and lighting. This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features."

    Welcome to the wonderful world of feature bloat. Nvidia is probably thinking along the same lines as Microsoft. New formats for Word with every new version mean you have to have the latest version of Word (and not WordPerfect or... whoa, I can't think of any other word processors, have they really killed them all!!!???) to open it. Now that Nvidia is the undisputed leader (in terms of market cap) they will definitely fight to keep the lead with the same tactics MS has used successfully for years.

    As gaming becomes more and more the primary entertainment for humans in the 21st century, you can bet we'll have to deal more and more with this crap. If the money is there, the pigs will come.
    --
  • Does your $250 CPU come with 64MB of DDR RAM? Did you notice that the GPU has more transistors, and is on a .15u process? Are you aware that nVidia doesn't have the economies of scale that AMD, Moto and Intel enjoy? Just wondering...

    No (32MB), yes, and yes. This doesn't prevent the two from colluding to corner the 3D graphics market so they can maintain primacy as a platform for game programming, which despite what many home users like to profess is a major reason they buy computers.

    I think my post may have been a bit off-topic since it was about my own worries regarding the level of collusion between MS & nvidia. I was just wondering if anyone else was having the same thoughts though. It wouldn't be the first time a hardware & software company have colluded to reinforce each other's market share to the detriment of innovation and consumers' pocketbooks (wintel, anyone?).

    And what the hell does "economies of scale" mean? Each computer worth its salt sold these days comes with a graphics card just as it comes with a CPU. Hence, nvidia should have the same "economies of scale" (that is, available market and scope of exposure) for its product as a CPU company. I actually wouldn't take issue with nvidia gaining a monopoly in the GPU market - they do a good job and I love their products. What I'd be against though, and what's against the law, is collusion with others such as MS that gives nvidia's GPU a privileged relationship to DirectX and/or the OS and ends up excluding other GPU companies.

    Such a state would effectively end the possibility of competition in this market, which'd be a shame. It'd be as if MS engineered Windows such that it ran 50% slower on an AMD vs. Intel CPU.

  • Increasing fillrate is pointless, when things are already so memory-bound.

    Exactly. Anand's GeForce3 preview [anandtech.com] puts it this way: "even the GeForce2 GTS with its 800 Mpixels/s fill rate was only capable of a 300 Mpixels/s fill rate in a real world situation."

    If you look at the GeForce2 MX reviews, you'll see that it barely edges ahead of the SDR GeForce at higher resolutions, and falls well behind the DDR GeForce. Forget about doing old-fashioned FSAA on an MX.
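    A rough back-of-the-envelope in C shows where a number like that comes from (the ~5.3 GB/s bandwidth and the per-pixel byte counts are just assumptions for illustration, not vendor specs):

      /* Rough estimate of a memory-limited fill rate.  The bandwidth and
       * the per-pixel traffic below are illustrative guesses. */
      #include <stdio.h>

      int main(void)
      {
          double bandwidth = 5.3e9;        /* bytes/s, roughly a GTS-class card */
          /* per pixel: Z read + Z write + color write + one texel fetch,
           * all 32-bit, no compression */
          double bytes_per_pixel = 4 * 4.0;
          printf("memory-limited fill rate: ~%.0f Mpixels/s\n",
                 bandwidth / bytes_per_pixel / 1e6);
          /* prints ~331 Mpixels/s -- right around the ~300 Mpixels/s
           * real-world figure, versus 800 Mpixels/s theoretical */
          return 0;
      }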

  • by andr0meda ( 167375 ) on Tuesday February 27, 2001 @08:25AM (#398733) Journal

    So it's ok to be aware of market tactics principles and to approve or disapprove, but to actually NOT buy a graphics card which is superior to others because it makes the other businesses go slower?

    That's definitely NOT the right angle. In fact, if nVidia gets to sell this card as hot as the previous 2 versions, it can set (and raise) the standard in 3D lighting again, and frankly that's what I want. Of course monopolistic situations like e.g. SoundBlaster are absolutely bad for competition (and quality) in the market, but that's because a SoundBlaster (could also have used Windows here) is a good product, just not top of the line. That doesn't mean there are no other sound cards out there which are actually better, only that you'll have to pay more for those and go out and look for them.

    I support ATI and, to a lesser extent, Matrox, because they are the only rivals left in the field. But if I had to buy a card today, it wouldn't be either of those 2, because I simply want a 'standard' compliant, full-fledged and top of the line game experience, not the feeling that I did something good for market competition. In the end I might financially regret that choice, but if nVidia creates the best cards at the end of the day, I'm only happy to spend some cash on them. If someone else can do the same or top them AND has less expensive cards, obviously that's a thing to consider. But today I cheer for nVidia, as I have more pro than con.

  • I'd check out the Anandtech article if I were you. It looks like they put a lot of work into improving T&L and fillrate.

    "The GeForce3 is thus the first true GPU from NVIDIA. It still features the same T&L engine from before, however now it is fully programmable."

    Programmable is very good, maybe now we'll actually get to see hardware T&L in a game, rather than a press release.

    Also mentioned in the Anandtech article: "This is what gave the GeForce2 its "Giga-Texel" moniker since it was now able to boast a peak fillrate of 1.6 Gtexels/s. The GeForce3 takes this one step further by now being able to do quadtexturing in a single pass, offering a theoretical peak fillrate of 3.2 Gtexels/s, although it seems like it's definitely time to stop quoting theoretical fillrate numbers in this industry."

    I'd say that this is quite an improvement. I'll stop quoting now.

  • I agree with you entirely. I too have a Voodoo3 (2000 PCI) and well, it hasn't had a game thrown at it yet that it hasn't been able to handle. Quake3? Smooth enough for me. Unreal Tournament? Again, runs fine. Hell, my roommate has a Voodoo Banshee and he enjoys Unreal Tournament just fine. But then he doesn't care about how many FPS he's getting. And neither do I. I'll upgrade when everything starts to come to a screeching halt.

    On an entirely different note, I'm trying to get my old Voodoo Rush card to work with Linux but am not finding much info on it. Anyone got any pointers? It would be *much* appreciated, even if the answer is "no".
  • I've heard a lot of people saying "wait a couple months, the price will drop."

    There's no reason for that to happen, now that Nvidia has no real competition. Why not keep prices on the GF3 high? People who want it that badly will pay it, those who won't will have to buy some other Nvidia product if they want to keep up with the pack. Nvidia wins either way.

  • by Stormie ( 708 ) on Tuesday February 27, 2001 @06:35AM (#398737) Homepage

    I find it a little worrying that so much of the work that has gone into the GeForce3 has been implementing unprecedented new features such as these vertex shaders, rather than improving more general stuff such as fillrate or transformation and lighting. This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features.

    How is this a bad thing, I hear you ask? Well, it looks to me like an "embrace & extend" tactic. If the developers use vertex shaders to make their games look cooler, then other 3D chipmakers have to either scramble to provide the same features, or all the cool new games will run like ass on anything non-Nvidia. Only Nvidia can get away with a tactic like this because of their present dominance of the market. Witness ATI's Radeon - they added some very innovative features (like all the z-buffer accelerating tricks) but they were all dedicated to improving performance with current software. They couldn't introduce radical new features because nobody would use them, supported as they were only by a minority chipmaker.

    If you don't want to see the 3D industry completely monopolised by a single player, avoid the GeForce3, and avoid any games written to depend on its features. Support chipmakers that are seeking to make everything run better, like ATI and PowerVR.

  • I gotta say, I haven't bought myself a real 3d-powerhouse video card. But being the techie I am, I do obsess over them & I just have to give nVidia some immense props for all the work they did on this card. I haven't had a chance to read over all the tech previews (nor have I heard of there being any official benchmarkings of actual cards yet), but it sounds like they've definitely put a lot of work into this card.
    I do agree that it's really really insane for a video card to cost $600. But the fact is, if you don't want to pay $600, you don't have to. Right now, a GeForce 2 will more than get you by, & with the release of the GeForce 3, the cost for a 2 should hopefully drop dramatically (I know I have my fingers crossed! :-) And there are people out there willing to pay the $600 for the video cards, otherwise companies like nVidia wouldn't be making them (simple laws of supply and demand). And to those people that do buy the cards, I have to thank them, simply b/c it lowers the price on cards that more than suit my needs.
    As for the technology, hopefully once I make it through my classes today, I'll have some time to sit down & really try to grasp all this 3d-techno-wonder stuff that nVidia's apparently whipped up for us all. Even if I don't buy a GeForce 3, I still love the engineering efforts that went into designing it and love trying to learn about what those efforts produced.
    So, I guess what I'm trying to say is, that these cards do apparently have a place in the market, even if it's not for you or me, and it actually benefits us as well, by bringing more powerful cards down into our budgets. Plus the fact that it just gives all us techies lots of new reading material. So, I don't see any reason to complain. :-)
    -pepermil
  • First of all, I had a TNT and I'm ordering a Radeon SDR, so I'm not living on the cutting edge either.

    But still you're wrong: processing more than 1024*768 pixels even for a screen at this resolution IS interesting.
    Why? Anti-aliasing!

    And why use the number of pixels as the number of polygons? One is a 2D number, while polygons live in 3D.

    I don't know how many polygons are needed to render a tree (or worse, a landscape) realistically, but I think it's quite a lot!
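    To put some purely illustrative numbers on the anti-aliasing point, here's a tiny C calculation of what 2x2 supersampling costs at a fixed 1024x768 resolution:

      #include <stdio.h>

      int main(void)
      {
          long long w = 1024, h = 768, fps = 50;
          long long base = w * h * fps;          /* pixels/s with no AA */
          long long aa4x = base * 4;             /* 2x2 supersampling */
          printf("no AA : %lld pixels/s (~39 million)\n", base);
          printf("4x AA : %lld pixels/s (~157 million)\n", aa4x);
          return 0;
      }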

  • Has anyone noticed the shifting of power between the CPU and GPU? Once all we had was a RAMDAC. Now we have SIMD, FPUs, programmable GPUs running at faster and faster clocks. Tom's article goes so far as to say the GF3 kicks the P4's SIMD engine even in theory, running at 1/5 of the clock.

    Graphics is getting more expensive, too. A $600 graphics card compared to a $300 state-of-the-art processor.

    What about that massive die size and transistor count? Can you argue from this that the GF3 is more complex than the CPU now?

    Maybe as process sizes decrease and high die-size yields increase, it would be better just to stick the GPU and CPU on one chip. Or maybe put them on a processing bus so the GPU will "look" like another CPU? Something isn't right when the most complex part of the system isn't even on the system bus. I would predict this is the last card which is satisfied with the AGP 4x bus and the current system configuration.

  • No, newer cards need the newer drivers in order to make use of their new features, but you weren't worried about that; you wanted updated drivers for old cards. This provides that. Yes, the Detonators at any version will work for any chipset from TNT to GeForce3, and performance gains and added OS compatibility ARE added to older cards. Yeah, nVidia drivers are sweet as Hell, I just wish they would make a real driver for Be. If ATI has a card capable of nicely running Doom3 when that's available, that's probably what I'll get...

    -----------------------

  • And who says the folks who buy these cards don't ALSO give to charity? Do you have any facts to back this up?

    There are much richer folks out there than the average hardware geek... why don't you bother them first? Surely the ~$1000 that folks drop on their computers doesn't compare to the $50,000+ that many folks drop on their cars every year, or the $1,000,000+ that some people pay for homes.

    -

    Additionally, you're forgetting that consumers in this country don't buy directly from third-world laborers. We buy from supermarkets, who buy from distributors, who buy from shippers, who buy from farming distributors, who pay the workers. There's no way a consumer can influence this huge chain of sales. There's no chance to "boycott", as we need food from SOMEONE, and all supermarkets I know all behave this way. Unless you have a solution to break this chain, I suggest we worry about domestic problems first.

    And simply sending money over isn't the answer. Most of the aid that goes to other countries gets lost in government, and pouring more money in only makes the gov't richer and more influential.

    Anyway, please try to give solutions instead of crying about the problems.
  • How long will it be before someone ports Linux to the card itself? You could fit 10 in 1U of rack space...

  • Carmack has quite a bit to say on the subject as this .plan update is rather long [...]
    (Score: 2, Interesting)

    Amazing. Moderate up a link to Carmack's .plan.

    It only formed the basis of yesterday's story, entitled: Carmack on D3 on Linux, and 3D Cards [slashdot.org].

  • The problem with accelerating current games is that it is very difficult to do better at high resolutions without faster RAM. The GeForce3 does implement several new features to accelerate current games, such as the Z compression, fast Z clear and so forth.

    But, what the new hardware does give you is the ability to do more per memory cycle, which is key in the future.

    If you automatically say that anyone who innovates is trying to take over the market, then how can we ever get change? In this case, since the API is part of DX8, any video card manufacturer can implement the features. The exact same situation occurred with multi-texture on the Voodoo2, and no-one complained about anti-competitive influences then.

  • If the developers use vertex shaders to make their games look cooler, then other 3D chipmakers have to either scramble to provide the same features, or all the cool new games will run like ass on anything non-Nvidia. Only Nvidia can get away with a tactic like this because of their present dominance of the market

    Doesn't this sound familiar to you? /me ..oO($MS)
  • This is like Car and Driver saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds! It's the greatest car we've ever seen, full of new features... but we shouldn't buy them, because that will put Ford out of business!"

    No, it's more like saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds, but it requires a special kind of road to drive on, and if we all buy one then the department of transportation will upgrade all the roads, but other cars won't be able to use them, so we'll have to buy Chevys forever, and once we're dependent on them they can lower their quality and service and we'll have to accept it!"

    I hate poor analogies. But you sounded real cool.

    -Tommy

  • They will lower the cost because they won't make money at the $600 price point.

    $600 is really cheap for a professional card... but it is damn expensive for the much broader consumer market. So Nvidia makes a few bucks off of the prosumer market, creates a buzz surrounding the product, then sells cheaper versions (making up for the lower price in volume).
  • Well, they've got to compete against better budget cards. A GF2 can be had for ~$150 these days, and it's not 1/4 the speed of the GF3. Only a small fraction of people are willing to pay that much extra for whatever increase the GF3 provides.

    And nvidia still has competition from the Radeon2, and possibly BitBoys, if they ever release something.
  • Probably not worth the price . . . yet. You don't really need such a high-end card to play today's games, and if you wait a couple of months the price should drop to a more affordable $400 - $500.
  • by InfinityWpi ( 175421 ) on Tuesday February 27, 2001 @06:18AM (#398751)
    Another massive, expensive upgrade, that all the latest games will require you to use (after all, they won't run on old cards 'cause they can't be programmed)...

    Screw it. I'm not paying more than two hundred for a video card. Anyone who'd shell out six hundred for one of these is insane. You can get another box for that much, pre-rebate.
  • The Radeon2 is competition?
  • What? Sir, are you suggesting that spending $600 on a graphics card, to play all those 3D car games/shooters/etc., is a waste of money? You'd only waste it on a PS2 and a bunch of DVDs!
  • On another note - anyone ever own one of these? I'm talking about the Intergraph Voodoo Rush Extreme. You know I don't give a crap about what kind of performance it gave, the thing was HUGE. I remember opening up the box and thinking damn this thing is gonna rock. And of course it did for the time :-)
  • "And there are people out there willing to pay the $600 for the video cards, otherwise companies like nVidia wouldn't be making them "

    Does that go for the DreamCast too? (I know it was cheaper than $600 but the principle remains)
  • We all (or most of us) go into drool mode when we read about cool new hardware, but has anyone thought to find out about alt.OS support? Specifically Linux? Apple is now an OEM buyer of NVidia cards, which would lead one to believe they will have Mac OS X drivers for the GeForce3. And since OS X is based on Unix, it seems that it would not be too incredibly difficult to port the OS X drivers to Linux if NVidia is too stingy to do it for us. I hope that programmers with more knowledge than me can soon get their hands on a reference board or two. Penguin huggers should no longer be forced to suffer from cut-rate graphics because commercial companies don't want to expend the resources to port drivers for what is a growing and largely under-marketed group of knowledgeable computer users.
  • from the stuff-to-drool-on dept.
    Tom has extensive poop

    I don't know about you guys, but this certainly made the article seem more than a bit unappetizing.

  • Another problem with video cards is that the performance is becoming optimal anyway. There are 786,432 pixels on a screen (a 1024x768 screen, that is). At 50fps this is approximately 39 million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 39 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?

    Especially since all the uber-gamers will turn off as many options as they can to get extra fps, in the hopes that they can frag the other guy first.
  • Judging a video card based on framerates is as flawed as judging a processor based solely on the marketed clockspeed. The GPU also handles common effects such as anti-aliasing, lighting, fog, textures, and bump maps. High polygon counts and framerates are great; having extra cycles left over to make them look good is even better.

    (But I do agree with you in principle. My $90 GF2 MX card will serve me well for the next 2 years.)

    Neurosis
  • Well, since the Hercules press release [hercules.com] says the Prophet III will be released in March, and since that's in two days, I don't think the Mac will beat it by many many weeks unless it was released about two weeks ago.

  • This card is meant for games. There are other cards out there that are meant for content creation, like the quadro2 based ones (also by nvidia) or the firegl1, that have features like overlay planes that game cards usually lack.

    (except for the nvidia based ones, because their developers have something against overlay planes and ignore the fact that both softimage and maya use them. artisan (a tool in maya) takes a big performance hit as a result)

    the firegl1/2/3 are solid cards for work, but the linux drivers are still unstable, and closed source (at least for now). hopefully that will change..

  • Really people, the new card is cool! Remember back in the day when that was enough?

    Did the people who created the Altair bitch because there was no pre-existing software for it?
    Of course not, you dolts; the hardware hadn't been released yet. OBVIOUSLY the software will take time to develop. It's like you have to bitch about something in order to be cool, so you bitch about the fundamental law of the universe that cause must predate effect! Either buy the card for the memory speed-up and be happy when the software comes out, or wait until the software arrives and buy the card then. Sniveling is not required!

    You'd think that slashdot was run by nVidia competitors the way you people are complaining about the doom of the hardware market. Why doesn't anybody complain about the near monopoly Oracle has on databases? Because Oracle products generally kick ass! Why doesn't anyone complain about the near monopoly nVidia has? The exact same reason. If you don't like that nVidia has the biggest market share, then go make a better card. Otherwise, shut your trap. I would be right with you in complaint if they had a disingenuous monopoly based largely on cheating, but they don't! I have never been unhappy with an nVidia purchase and the market reflects that.

    And for all of you twits who whine that there isn't a driver for your 27 year old, vacuum tube driven, monochrome graphics card, please shut the hell up! You bought a card that came with a driver. That's what you paid for. ANYTHING ELSE YOU GET IS A GIFT FROM THE COMPANY! And that company exists to be profitable, not to feed your whining ass with legacy driver releases. If you want a driver for a new OS with an old card, you have a pair of well known choices: Buy a new card, or write your own damn driver. If you're unwilling to do either of those two things then you're just being a bitch.

    I need a Gin and Tonic.

  • In this case I hope that NVIDIA has applied for a patent early enough, because otherwise Rambus may follow its tradition, copy NVIDIA's design, patent it and then sue NVIDIA.

    no comment needed.
  • nvidia will not give you any specs, so you'd have to reverse engineer. With that amount of time you could probably implement the full driver for the much more free-software-friendly Radeon cards.
  • This is going to be around $530, and that's about $70 less than the Voodoo 5 6000 was going to cost, for a card that is miles better than the V5 6K. I would have considered a 6K had they been released, and this is a much better value.

    Before I got into gaming, I was into drinking in bars, mostly. $600? Two weeks, tops. Assuming the card lasts more than two weeks and never gives me a hangover, I'd say it's not a bad investment, or at least that I've made worse.
  • If you don't want to see the 3D industry completely monopolised by a single player, avoid the GeForce3, and avoid any games written to depend on its features. Support chipmakers that are seeking to make everything run better, like ATI and PowerVR.

    This has to be one of the dumbest things I've heard all month.

    They have improved "general stuff" like fillrates and T&L.

    Where do you think new features come from? This card will run all your old games, and better than any other card out there. On top of that, it will run all the new games coming out that support features exposed in DirectX 8.0 - which, in case you haven't figured it out yet, is what developers will be using to create those new games.

    And who is to say you have to use vertex lighting? Granted, it won't look as good, but you can keep your old card and use light mapping instead.

    ATI didn't pack any "new" features into their current crop of cards, because the APIs weren't there to take advantage of them when they were being developed. You can bet your ass they have their R&D departments all over the new functionality exposed in DirectX 8.0 and are busy creating new cards to go head-to-head with the GeForce3.

    This is a good thing. NVidia has raised the bar; now the others must try and top them. That's how we get better hardware, folks. It's not a bad thing.

  • I believe he meant kilohertz, since that's what sampling rates are measured in, not kilobits, which is what we measure (obviously) bitrates in. Which would make much more sense.
  • i actually hope ati gets better market share than nvidia. i know nvidia makes better hardware now, but if they become anything like a "standard" that can only mean trouble for free software. ati is the only real competition for them (unless matrox or some dark horse surprises us). while nvidia may be releasing linux drivers for now, it would not surprise me if down the road they and MS get more entangled and the linux support stops. and of course there's the whole closed source thing. never a chance of using another free OS (which is one of the major advantages of free software like linux in the first place!)

    and yes, nvidia's relations with MS are a small part of that concern. i think it's really funny that nvidia bought 3dfx and suddenly the linux drivers are gone from the 3dfx web site and not available from nvidia's either, yet all microsoft OSes are still supported...

    my next card will most likely be a radeon (or radeon2 if they are out by then). it will be nice to play with the AV stuff, and maybe even use it for work...

  • by f5426 ( 144654 ) on Tuesday February 27, 2001 @09:01AM (#398770)
    > This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features

    I disagree. Their programmable vertex shaders are a very good idea. Of course developers may want to directly access these features, and maybe make games that require a GeForce3. But there are very good sides too.

    First, having programmable vertex shaders can help them implement better OpenGL drivers (for instance glTexGen). This will help existing programs.

    Second, a lot of new tricks can be experimented with. 128 SIMD instructions is huge. I, for one, would love to hack on this for a few weeks. My mind boggles at all the tricks that can be done basically for free. Creative use of the vertex shader will undoubtedly be implemented by the competition, and would end up as "standard" OpenGL extensions.

    (Btw, I don't see any noise function, or texture lookup ops. Maybe I should check more closely).

    > avoid the GeForce3, and avoid any games written to depend on its features

    I don't see a reason to avoid the GF3. Of course, avoiding games that *only* support it is a good idea. In the same vein, we should avoid games that only support DirectX, and games that run on Windows.

    Not very realistic...

    Cheers,

    --fred
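
    PS: for anyone wondering what the shape of such a per-vertex program is, here is a CPU-side C sketch: attributes in, "constants" (matrix rows, a light direction) as parameters, a transformed position and a diffuse term out. It mirrors the idea of the dot-product style instructions, not NVidia's actual instruction set.

      #include <stdio.h>

      typedef struct { float x, y, z, w; } vec4;

      static float dot4(vec4 a, vec4 b)
      {
          return a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w;
      }

      /* "constant registers": rows of a transform matrix, a light direction */
      static vec4 mvp[4] = { {1,0,0,0}, {0,1,0,0}, {0,0,1,0}, {0,0,0,1} };
      static vec4 light_dir = { 0.0f, 0.0f, 1.0f, 0.0f };

      /* the per-vertex "program": one fixed routine applied to every vertex */
      static void vertex_program(vec4 pos, vec4 normal, vec4 *out_pos, float *out_diffuse)
      {
          out_pos->x = dot4(mvp[0], pos);   /* four dp4-style ops for the position */
          out_pos->y = dot4(mvp[1], pos);
          out_pos->z = dot4(mvp[2], pos);
          out_pos->w = dot4(mvp[3], pos);
          float d = dot4(normal, light_dir);
          *out_diffuse = d > 0.0f ? d : 0.0f;   /* clamp, like a max with zero */
      }

      int main(void)
      {
          vec4 p = { 1.0f, 2.0f, 3.0f, 1.0f }, n = { 0.0f, 0.0f, 1.0f, 0.0f }, out;
          float diffuse;
          vertex_program(p, n, &out, &diffuse);
          printf("pos = (%g %g %g %g), diffuse = %g\n", out.x, out.y, out.z, out.w, diffuse);
          return 0;
      }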
  • is a real time Ray Tracer.
  • You're talking past, I'm talking present. My point was that a long time ago professional 3D accelerators cost lotsa money. Comparatively, they still cost lots. nVidia is notching up the prices and entering the professional-grade video card price ranges.

    You can buy a 3DLabs AGP Oxygen GVX1 for around $600. This guy is billed as "Mid & High-end". I couldn't tell you how it stacks up against a GF2 or 3, but there's a price2price reference.
  • You can try going with a company that uses unified drivers.

    For instance NVidia offers the same driver set for everything from the TNT and TNT2 up to the GeForce2 Ultra (and maybe the GeForce3).

    Matrox is also famous for offering unified driver sets going back to the original Matrox Millennium.

    ATI is doing a poor job of supporting Win2000 even with their latest and greatest. There was a recent comparison of Win2000 vs Win98 drivers from ATI and NVidia, and the ATI drivers performed terribly! In some cases 1/2 the speed when tested under Win2000. Plus buggy (some tests could not be completed).

  • I find it a little worrying that so much of the work that has gone into the GeForce3 has been implementing unprecedented new features such as these vertex shaders, rather than improving more general stuff such as fillrate or transformation and lighting.

    Vertex shaders are a more general implementation of T&L. They will eventually replace it and are a huge improvement over it. Pixel shaders are similar, they replace the fixed function blending modes. They are both supersets of the older stuff which will still be accelerated just fine.

    This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features.

    How is this a bad thing, I hear you ask? Well, it looks to me like an "embrace & extend" tactic.

    Wrong. The APIs for accessing these new features are well documented and well defined. If you had bothered to check up on what exactly these features are, you would know this. That's not "embrace & extend". No one is prevented from implementing these features themselves.

    Of course, NVidia could conceivably whack a patent claim on anyone who does implement them (assuming they've got the patents, I haven't got time to check today), but as that would prevent anyone but NVidia from implementing the complete DX8 API, Microsoft probably wouldn't be too happy.

    Witness ATI's Radeon - they added some very innovative features (like all the z-buffer accelerating tricks) but they were all dedicated to improving performance with current software.

    Read the previews, GF3 includes a lot of features like this. You might also remember something called T&L. Fully backwards compatible... and NVidia were the first to introduce it in a consumer chip.

    Is it just me or is any sort of innovation at all these days getting slapped with the "embrace & extend" label just because (horror!) using it requires dropping some backward compatibility?
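
    One way to see that "superset" point in code: the old fixed-function modulate stage is just one particular per-pixel function, which a programmable unit can express along with arbitrary variations. A CPU-side C sketch, illustrative only, not any card's actual blend hardware:

      #include <stdio.h>

      typedef struct { float r, g, b; } rgb;

      /* fixed function: out = texel * diffuse (the classic "modulate" stage) */
      static rgb modulate(rgb t, rgb diffuse)
      {
          rgb o = { t.r * diffuse.r, t.g * diffuse.g, t.b * diffuse.b };
          return o;
      }

      /* "programmable" version: any arithmetic over the same inputs, here a
       * per-pixel blend of two textures followed by the same modulate */
      static rgb custom_shader(rgb t0, rgb t1, rgb diffuse, float blend)
      {
          rgb o;
          o.r = (t0.r * (1 - blend) + t1.r * blend) * diffuse.r;
          o.g = (t0.g * (1 - blend) + t1.g * blend) * diffuse.g;
          o.b = (t0.b * (1 - blend) + t1.b * blend) * diffuse.b;
          return o;
      }

      int main(void)
      {
          rgb t0 = { 0.8f, 0.6f, 0.4f }, t1 = { 0.2f, 0.2f, 0.9f }, dif = { 1.0f, 0.9f, 0.8f };
          rgb a = modulate(t0, dif), b = custom_shader(t0, t1, dif, 0.25f);
          printf("fixed function : %.2f %.2f %.2f\n", a.r, a.g, a.b);
          printf("custom program : %.2f %.2f %.2f\n", b.r, b.g, b.b);
          return 0;
      }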

  • where the price of a Radeon SDR and DDR is NOT the same.

    Maybe next time you could remember that not everybody lives in the US before saying stupid things.

    Yes, I know I can order from the US, but if I have a problem with the card I would be out of luck, plus the shipping fees are not exactly free, you know.
  • Where are you getting 0.9.7 drivers? The web and ftp sites only have 0.9.6. 0.9.7 is supposed to have some nice stuff in it, like support for the XRender extension, so at the moment I'm not using the nvidia driver, but rather the nv driver, because the render extensions are sweet.
  • I'm not sure your doomsday vision is really that bleak, though. We already have online communities where all the physical artifacts of your existence are missing. You don't know if I'm a one-eyed midget with parkinson's disease. You don't know if I'm a young black male or an old silver-haired grandmother. You only know me for my thoughts, and my ability to express those thoughts.

    Some people may say that we're losing something by interacting in this way -- but what we're gaining is so much better. It used to be people were forced to form communities with those around them -- purely due to geographic coincidence. Now I can form communities with people who think like I do, who appreciate what I appreciate, and who value what I value. All from the comfort of my home. I haven't been shut in -- I've opened up even more!

    Surely as our technology improves, this will continue (note I'm not suggesting better graphics cards will lead to an increase of this effect, just that it's already a beneficial phenomenon and this can't harm it in any way). Sure -- if we all had this virtual world, and we all could look however we wanted, you might see some physical prejudices creeping back in.... On the other hand, imagine the joy a wheelchair-bound or paralyzed person might have from moving their avatars around a truly interactive artificial world....
  • OK, I may be out of line here, but I think that any fool who would spend more on a video card than they spent on an entire system should be institutionalized... or perhaps that is the problem? Maybe, just maybe, there are folks out there that are so caught up in this techno-blizzard of equipment that they will pay any price just so they can say: "My X is bigger than yours!" I say phooey. This is a prevalent ideology here just as it is anywhere in the "real world". Consider most sects of the religious world: they all hate each other as they worship a bigger, better imaginary friend. The OS jihad, the inter-linux factions, the Intel/AMD war... the list goes on and on. Will it ever end? Sure, most likely once man has eradicated his existence.
  • by swb ( 14022 ) on Tuesday February 27, 2001 @06:45AM (#398779)
    I think the biggest problem is driver obsolescence. I have an "ancient" ATI Rage 128 video card (an AIW to be precise) and ATI has never delivered more than a "beta" set of drivers and applications (TV, etc.) for the AIW128 cards under Windows 2000. I'm doubting there will EVER be another set of drivers or tuner software for this card from ATI.

    The video card people seem to have like three people that write drivers, and they're always busy writing drivers for the "current" crop of cards, until the cards actually are available, at which point they switch to writing drivers for the "next" slate of cards and the "old" cards simply do not get new or improved drivers written for them. A new OS often means getting stuck with the OEM drivers provided in the OS.

    I'm perfectly happy with my ATI-128 performance in the games I've played with it. I've toyed with hunting down a $120 GeForce2 card, but for the reasons you stated I'm missing the "why" part, other than getting more modern and optimized drivers than I'll ever see for my existing card.
  • Cool, they even have a negative term for being the best at something and then getting better.

    Now, that's cynical.
  • Ha ha commoner! Everybody knows that a yaught costs MUCH more than a mere yacht! Ha ha ha! I bet you don't have a limozine, either!
  • I think that could be a self-fulfilling prophecy of sorts. Game developers must cater to the general user base, and if that base stagnates itself, then it necessarily holds up progress on all but the most cutting-edge games. As long as large strides forward are made in graphics, though, I believe we as a people are largely attracted to the newest and shiniest. Most non-gamers won't see the big graphical improvement from, say, Half-Life to No One Lives Forever unless they see them side-by-side. Once the improvement is more obvious (probably by next year), the cards will continue to sell.

    You also make a good point about the developer resources. But how much of that development time/effort/expense is because the engines cannot be reused? A new engine must be built (or licensed and then probably modified heavily) every couple of years. Once the horizon comes closer and we're gaining the 5% from 90 to 95, instead of the huge leaps like Quake II to Unreal Tournament (a jump in which much of Quake II's graphical technology became unusably obsolete), possibly these costs will go down and improved games will actually come more quickly, as a tweak to Quake VI is enough to warrant a fanfare and Quake VII isn't necessarily required.

  • Wow, them's some expensive clams.
  • Well, actually, the GeForce2 MX isn't really even a feature-reduced GeForce2 GTS. It's got the same core stuff, sure, but it supports TwinView, where the GeForce2 GTS does not, among other things.

  • Nitpick the second: Tom says there are no shadow-buffering demos out there, but there's one right here [nvidia.com].
    It demonstrates some of the problems and shortcomings of the technique.
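    For the curious, the core of the technique is just a per-pixel depth comparison against a depth map rendered from the light's point of view. A minimal C sketch with a made-up 4x4 shadow map; the bias constant hints at one of those shortcomings (self-shadowing artifacts):

      #include <stdio.h>

      /* depths as seen from the light, 0 = near, 1 = far (made-up values) */
      static float shadow_map[4][4] = {
          { 0.2f, 0.2f, 0.9f, 0.9f },
          { 0.2f, 0.2f, 0.9f, 0.9f },
          { 0.9f, 0.9f, 0.9f, 0.9f },
          { 0.9f, 0.9f, 0.9f, 0.9f },
      };

      static int in_shadow(int u, int v, float depth_from_light)
      {
          float bias = 0.01f;   /* fights self-shadowing "acne", one of the
                                   classic problems with the technique */
          return depth_from_light > shadow_map[v][u] + bias;
      }

      int main(void)
      {
          /* a fragment behind the near occluder vs. one in the open */
          printf("fragment A: %s\n", in_shadow(0, 0, 0.85f) ? "shadowed" : "lit");
          printf("fragment B: %s\n", in_shadow(2, 2, 0.85f) ? "shadowed" : "lit");
          return 0;
      }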
  • But you forget to mention that the DOT will publish the specs for the road, and has been upgrading the road on a regular basis, and all cars that expect to drive tout the fact that they're DirectROAD 7 compliant anyway. Besides, what's to prevent somebody from coding up a 'virtual pixel shader' that does it in software? By that logic, nobody should have supported the Voodoo 1 or anything like that! Then EVERYBODY would have to buy an expensive 3d video card!

  • I can't check on this at the moment, but I thought that 0.9-6 had support for XRender...

    Ranessin
  • You don't have to buy the card right now and pay a premium just to get FSAA as the only feature you can play with *NOW*.

    You don't need to buy the latest CPU either, i.e. paying $300-400 more for only a few hundred MHz more.

    Thing is, this will drive down the price of the GTS Ultra or GTS 64MB DDR, which can be used right now for gaming. In a year or more from now the GeForce3 will have come down in price (to decent gaming board levels) and will support the games out by then.

    Oh, one last thing: there's not only GAMING in the world. That card competes with the big-ass 3D accelerators for CAD/3D modeling/3D animating... why would I shell out $2000+ if a Quadro3 (probably) will cost me $1000 (or better yet, $500 plus a modification, like with the GeForce2 :) ).

    That thing must scream in lightwave.
  • Nice one sir. Props to you as well!

    __

  • Definitely my philosophy. I do plan a little, I mean, I bought my ASUS V7700 GeForce2 GTS Deluxe for $405 Canadian (tax included) and damn if it isn't the best video card I've ever had. I know it'll last me a good long while (I was using an i740 before that and it had been good to me for a long time. I expect this card to play the things I want to play even longer). I bought it with the intention of not having to upgrade my video card for a while. So what if I paid double what I could have paid. I expect it to last twice as long. =)

  • But, take a look at DX8. MS said "here's a DX8 pre-release spec." Nvidia replied "No, it's not. DX8 needs to be THIS." MS said "oh, ok," and changed it. Nvidia, not MS defined DX8. They also knew beforehand that this would happen. So they got to design for DX8 before anyone else knew what it was. A significant advantage. Mind, I like features, and my reaction can best be described as "drooooool," but DX8 is more a description of the GF3 than the GF3 is an implementation of DX8.
  • How did you get "insightful" mod points from that??

    A: You can tell the difference between 30 and 200 fps. Maybe not between 70 and 200, but 30 and 200, yes. And a system that gives you 30 fps in one place will bog down to 10 fps in another. If you can get 70 fps, it will likely only bog down to 30 fps when things get ugly.

    B: Getting tech like this out there allows game developers to push the boundaries even further. Now granted, we didn't need the explosion of colored lights that happened when the voodoo2 came out, but still, the point is good. As the tech grows, the developers can use a toolset much richer than they had before. Look at the differences between Quake 1 and Quake 3. The difference between a Voodoo 1 and GeForce2. Imagine that level of difference from what we have today....

    C: Your example uses 1024x768. Why should we settle for such a lame approximation of reality? My desktop is 1600x1200. I drop to 1024x768 for my gaming, because anything higher causes noticeable performance degradation. I used to settle for 512x384. Now I can't imagine going back to that. And in a few years, I won't imagine being forced to go back to 1024x768.

    Nobody's forcing you to buy these new toys. Not everyone needs them. Personally, I can't see spending 10 grand on a home stereo -- after a certain level, they all sound the same to me. But I surely don't say it's "against all common sense" that someone might. I buy my toys, you buy yours, and we'll all live happily ever after.
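
    As a footnote to point A: the 30-vs-70 fps argument is easier to see in frame times. A small C illustration; the 3x worst-case factor is just an assumption to show the shape of the argument:

      #include <stdio.h>

      int main(void)
      {
          double avg_fps[] = { 30.0, 70.0, 200.0 };
          for (int i = 0; i < 3; i++) {
              double frame_ms = 1000.0 / avg_fps[i];
              double worst_ms = frame_ms * 3.0;   /* assume heavy scenes cost ~3x */
              printf("%3.0f fps avg -> %5.1f ms/frame, worst ~%5.1f ms (%2.0f fps)\n",
                     avg_fps[i], frame_ms, worst_ms, 1000.0 / worst_ms);
          }
          return 0;
      }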
  • When I clicked through, the first banner ad was a Unicef-Please-Donate-Money-to-Save-The-Children-Fund.

    Pretty impressive really. Anyone who might feel a twinge of conscience when following a link to a $600 video card they're thinking of buying is almost immediately comforted with a charity banner where they can assuage said conscience.

  • "I often wonder why people spend an absolute fortune buying the lates video cards when the simple fact is that the card will not be used to its utmost capability for several years"

    Why do people buy SUVs or luxury cars when a Geo Metro or Mini will do? Why do people buy the latest fashions of clothes when they could get last year's in the sales?

    A lot of people want to play the latest games when they first come out, and have the best machine there. If you don't feel that pressure, then I congratulate you: you're in a sensible minority.

    "Noone can tell the difference between 30fps and 200fps anyway"

    That's so untrue. I can tell the difference between 50 and 60, no problem. After playing at 60 or above, a drop to 50 or below is very noticeable, definitely not as smooth, and it hampers playing ability until one adjusts.

    "Another problem with video cards is that the performance is becoming optimal anyway"

    No. There's a very long way to go yet. With more power, there are so many features they can add. Go and read something about 3D graphics, and you will realise how limited these cards still are.

    "At 50fps this is approximately 39 million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 39 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?"

    You can't base polygon count on pixel count. Some polygons get rendered, but then are obscured by polygons in front of them. So yes, you do have to pay for something you can't see ;)

    "It is like insisting on a 500kbit sampling rate, when 70kbit sampling rates are perfect to the human ear"

    Not all sounds are sensed via the ear.

    In general, I do agree with your sentiment. I bought an original GeForce 256 DDR when it first came out. I'm still trying to justify the expense. If I had waited, I could have got a better GeForce for less. I'll do that again. I'm sure I'll buy a GeForce 3 eventually, but I'll wait until the Ultra model is cheap enough (I just know that there will be several generations of the card).
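
    Here's the earlier overdraw point with some made-up numbers in C: the card shades far more pixels than end up visible, and polygon throughput depends on average polygon size, so pixels per second and polygons per second just aren't the same quantity:

      #include <stdio.h>

      int main(void)
      {
          long long visible = 1024LL * 768 * 50;   /* visible pixels per second */
          double overdraw = 2.5;                   /* assumed average depth complexity */
          double shaded = visible * overdraw;      /* pixels the card actually processes */
          double avg_poly_pixels = 40.0;           /* assumed average polygon size */
          printf("visible pixels/s : %lld\n", visible);
          printf("shaded pixels/s  : %.0f\n", shaded);
          printf("polygons/s       : ~%.1f million\n", shaded / avg_poly_pixels / 1e6);
          return 0;
      }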
  • The whole point of the modern "3D accelerator" was to bring 3D graphics to the consumer at modest prices as compared to 3DLabs, SGI, and their ilk. Now, it looks like nVidia is either knowingly or unwittingly attempting to enter that territory by increasing prices to the "professional" 3D range.

    Nobody is going to program exclusively for this card until it saturates the user base. Which, at this price level, ain't gonna happen soon.

    Wonder if the "professionals" will strike back :)
  • I never thought I would see the development of a new feature shot down because it was good. This is stupid. Nvidia is innovating new features and products that people like john carmack [planetquake.com] want. They have the fastest XFree86 setup now, even if it is closed source. People are so paranoid, but most forget that most of the profit is made in OEM sales, and a 600 dollar card won't make OEMs happy.

    This is like Car and Driver saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds! It's the greatest car we've ever seen, full of new features... but we shouldn't buy them, because that will put Ford out of business!"
  • As a hardware designer, I'm very pleasantly surprised by the functionality and likely performance of the new chip. The new memory controller architecture and usage optimisations are particularly impressive; the array of new functionality is mind-blowing.

    However, they still seem to be missing some basic things. The 2D quality is only going to be "the same as GeForce2" - i.e. well below ATI and especially Matrox at high resolutions. There is no multi-monitor support. All the anti-aliasing modes use uniform sampling (which is very bad). The filtering used in anti-aliasing and mipmap generation seems to be plain averaging (i.e. a hideously ugly box filter) - when very little extra effort would permit gaussian or even programmable filters, providing much better image quality. The video features don't stack up to those of ATI's Radeon.

    As truly stunning as I find the 3D and memory architecture of the card, it's really going to be difficult to justify buying one unless the machine is used exclusively for games. For anyone with a monitor bigger than 17" it really won't cut the mustard for anything else - same as GeForce and GeForce2. I hope they hurry up with the GeForce3 MX and pray that it's not too badly crippled - no-one seems to want to make a card that's suitable for work _and_ high-end 3D gaming, since Matrox released the G400 back in 1999. Sigh.
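
    To illustrate the filtering complaint, here is a toy 1-D comparison in C of a plain box downsample against a slightly wider weighted kernel when resolving 2x supersamples across a hard edge. Both kernels are made up for illustration; the point is only that the box filter leaves a harder, more aliased transition:

      #include <stdio.h>

      int main(void)
      {
          /* one row of 2x supersamples straddling a hard black/white edge */
          float s[8] = { 0, 0, 0, 0, 1, 1, 1, 1 };

          for (int o = 0; o < 4; o++) {
              /* box filter: plain average of the two samples in the pixel */
              float box = (s[2 * o] + s[2 * o + 1]) / 2.0f;

              /* wider weighted kernel: also peek at one sample either side */
              int l = (2 * o - 1 < 0) ? 0 : 2 * o - 1;
              int r = (2 * o + 2 > 7) ? 7 : 2 * o + 2;
              float w = (s[l] + 2 * s[2 * o] + 2 * s[2 * o + 1] + s[r]) / 6.0f;

              printf("pixel %d: box = %.2f  weighted = %.2f\n", o, box, w);
          }
          return 0;
      }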
  • you can bet a quadro chip based on this will be released. quadro is nvidia's line of "professional" hardware, in other words, meant to get real work done. and still good for games.
  • What if the "objects" are intersecting? How do you sort them in a meaningful way? And how do you properly render these intersecting objects if you've turned off z-buffering? (perhaps you draw intersecting objects separately, with the z-buffer on)
  • I plan on buying five of them, and then having fun by taking them out with my shotgun while sailing on my 120 ft. yaught. Later on, myself and five gorgeous women will laugh about the poor geeks, drink a fine wine, and perhaps top the night off (well not really) with a nice cuban cigar.
  • Some people use all the neato features of a card like this to get faster OpenGL performance out of Lightwave or Maya or some such.

    It is great that we can use it for games too, but that isn't the point for many. I am sure there will be an even more expensive version of this in Nvidia's Quadro line, it'll have greater throughput and more processing power...so it'll get bought. It'll make DOOM 3 scream, but that isn't why you buy it.

    Unless you are a, "soul of independent means."
  • Ditto, but my price point is $100. Of course, running a measly 500 PIII is going to leave me in the dust anyway.

    Hence my purchase of an N64 (and probably a PS2 in another year). It was cheap, I can play it on a 35 inch screen from my couch (my desk chair is comfy, but nothing like my couch). Things rarely crash (weird problem with Gauntlet Legends). I can buy tons of cheap games on eBay and at FuncoLand.

    Did I mention that it's cheap and I can play on a big screen?

    Hey, I love PC games. I got a Voodoo II VERY early. But spending $1000 per year on upgrades is nuts. I've got too many hobbies to keep up this computer upgrade business (ever seen the NOS prices on old Honda motorcycle parts?)

    If that is your thing, by all means, go for it. But as for me, I'll be excusing myself from the party now.

  • by Phinf ( 168390 ) on Tuesday February 27, 2001 @07:20AM (#398831) Homepage
    Keep in mind that the features of the Gforce3 are using are things that make it compatable with DX8. these arent Nvidia's features they are DX8's. any card developer can make a card that supports these features.

    I think that this is a good thing. If we can make features standard in DX8, then different cards will support these same features, and maybe we can get game devs to use them if they can be implemented on different cards.

  • There's a list of previews here [tweak3d.net] Tom's is one of the many out there...
  • by jonathanclark ( 29656 ) on Tuesday February 27, 2001 @10:12AM (#398841) Homepage
    I think one of the coolest things about the GF3 is the ability to get a pixel write count back for a polygon - i.e. you draw a polygon and it tells you how many pixels made it past the alpha and z-buffer tests and were "written" to the frame buffer.

    And since you can turn off color and z writes, you can test visibility without changing the frame buffer at all. This is perfect for a portal game, where you can cull entire rooms when they aren't visible for reasons you traditionally couldn't compute - say, a big fireball right in front of your face, or a character or pillar blocking the view. If you have a few monsters that take a significant amount of time to draw, you can first test whether they are visible at all by rendering a cheap coverage polygon.

    You can use this to test the visibility of lens flares so they fade smoothly in and out as they go behind other objects.

    You can also use this in game logic in combination with a shadow map to tell how much "in the shadows" characters are. This can make the AI more realistic.

    Getting pixel write counts back from the hardware has a very long latency, so it can't be performed super frequently - but it's a lot faster than trying to read and process the z-buffer yourself. (A rough sketch of how this looks through an API is at the end of this comment.)

    And for those of you not wanting to spend $600 for a GF3, just wait for the Xbox - it includes almost exactly the same hardware for half the price.
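
    Here's roughly how that visibility test looks in practice. This sketch uses the occlusion-query interface that OpenGL later standardised (GL_SAMPLES_PASSED, core in GL 1.5, which needs an extension loader on most platforms); on the GeForce3 itself the same idea was exposed through an NV extension, and the draw calls below are placeholders.

        #include <GL/gl.h>

        extern void drawMonsterBoundingBox();   // placeholder: cheap coverage geometry
        extern void drawMonster();              // placeholder: the expensive model

        // Assumes GL_DEPTH_TEST is already enabled from normal scene rendering.
        void drawMonsterIfVisible() {
            GLuint query = 0;
            glGenQueries(1, &query);

            // Disable colour and depth writes so the test leaves the frame buffer untouched.
            glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
            glDepthMask(GL_FALSE);

            glBeginQuery(GL_SAMPLES_PASSED, query);
            drawMonsterBoundingBox();           // still depth-TESTED against the scene
            glEndQuery(GL_SAMPLES_PASSED);

            glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
            glDepthMask(GL_TRUE);

            // Because of the latency mentioned above, real code would do other work here
            // (or check last frame's result) instead of stalling on the query like this.
            GLuint samplesVisible = 0;
            glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesVisible);

            if (samplesVisible > 0)
                drawMonster();                  // only pay the full cost if it can be seen

            glDeleteQueries(1, &query);
        }
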
  • by mr.nobody ( 113509 ) on Tuesday February 27, 2001 @06:23AM (#398843)

    http://www.bluesnews.com/plans/1/ [bluesnews.com]

    Carmack has quite a bit to say on the subject as this .plan update is rather long (a little too long for a /. comment I think).

  • by pepermil ( 146926 ) on Tuesday February 27, 2001 @06:23AM (#398844)
    In case anyone wants a quick link to the other big reviews...
    Sharky Extreme: http://www.sharkyextreme.com/hardware/articles/nvidia_geforce3_preview/ [sharkyextreme.com]
    AnandTech: http://www.anandtech.com/showdoc.html?i=1426 [anandtech.com]
    HardOCP: http://www.hardocp.com/articles/nvidia_stuff/gf3_tech/i [hardocp.com]
    -pepermil
  • No offense, but having worked with this architecture for a while now, I have to say that the NV2x approach isn't an attempt to hamstring the graphics industry. It's an attempt to raise the bar of hardware design and bring the industry to a new level of verisimilitude in graphics rendering. Criticising nVidia for being a monopoly because they have the technical smarts to develop a revolutionary rather than an evolutionary solution just doesn't make any sense.

    With the programmability of the vertex and pixel shaders, graphics applications are now free to create a whole new engine architecture... one that's free from the idea of fixed-format vertex data but instead is purely representation-driven. Because you can pass any binary data that you want to the vertex shader, you no longer have to represent the properties of your surfaces in an implicit format whose characteristics are defined by the fixed capabilities of the hardware. Now that this programmability is available, you can encode surface data in a format that actually stores exactly what you want. The NV2x is the first hardware engine that I feel can be called a "GPU" in more than just name - its capabilities will allow application developers to craft graphics engines that just aren't possible on a card that is "dedicated to improving performance with current software", as you say you'd like to see. ATI recognize this - witness their Pixel Tapestry technology for pixel shading. (A rough CPU-side sketch of what this freedom looks like is at the end of this comment.)

    NV2x is the same kind of advance over GeForce2 that the original 3dfx Voodoo cards were over the prevailing PowerVR and VIRGE chips back in (1997?). You didn't see anybody complaining that 3dfx were trying to lock people into their proprietary technology back then, for the simple reason that everyone recognised the potential that was inherent in the change of focus. It took a while for games to become "3D Card Required"... but I'm 100% certain that nobody wants to go back to Quake II-era rendering. The benefit to the application programmer and the consumer is obvious. NV2x may not be the winning solution in the new space that's opening up - but it's a damn good opening salvo.

    To paraphrase your post: If you don't want to see the capabilities of 3D graphics engines advance beyond the current status quo, avoid the GeForce3. And miss out.

    - Butcher

    P.S. You'll be amazed when you see what we can do with this technology. This is a great time to be a game designer or game player.
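
    As promised, a rough CPU-side sketch of what "representation-driven" vertex data means. This is only an analogy - real GeForce3 vertex programs are written in DirectX 8 / NV_vertex_program assembly, and the struct layout and blend scheme here are invented for illustration.

        #include <cstdint>

        struct AppVertex {              // application-defined layout, not a fixed T&L format
            float pos[3];               // object-space position
            float bonePos[3];           // a second position, for simple two-bone blending
            float boneWeight;           // 0..1 blend factor
            std::uint32_t packedNormal; // normal packed however the app likes
        };

        struct ClipVertex {             // what the rasteriser ultimately needs
            float clip[4];              // homogeneous clip-space position
            float color[4];
        };

        // The "vertex program": free to unpack and combine its inputs however it likes,
        // as long as it produces a clip-space position (plus whatever gets interpolated).
        ClipVertex runVertexProgram(const AppVertex& v, const float mvp[16]) {
            float p[3];
            for (int i = 0; i < 3; ++i)
                p[i] = v.pos[i] * (1.0f - v.boneWeight) + v.bonePos[i] * v.boneWeight;

            ClipVertex out{};
            for (int row = 0; row < 4; ++row)       // row-major 4x4 modelview-projection
                out.clip[row] = mvp[row * 4 + 0] * p[0] + mvp[row * 4 + 1] * p[1]
                              + mvp[row * 4 + 2] * p[2] + mvp[row * 4 + 3];

            out.color[0] = out.color[1] = out.color[2] = v.boneWeight; // visualise the weight
            out.color[3] = 1.0f;
            return out;
        }

    The point is that nothing above is dictated by the hardware's idea of what a vertex "is" - the fixed-function pipeline could never consume a layout like AppVertex.
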

  • W does *not* stand for weight, Tom!

    W is the homogeneous coordinate (pg. 204, Computer Graphics: Principles and Practice, Foley & van Dam).

    (Pre-DX7 had something called "rhw", which stands for "reciprocal homogeneous W".)
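
    For the curious, a minimal sketch of what that W is actually for: after the projection matrix, a vertex sits in homogeneous clip space, and the hardware divides by W to get back to ordinary coordinates.

        struct Vec4 { float x, y, z, w; };
        struct Vec3 { float x, y, z; };

        // Perspective divide: homogeneous clip space -> normalised device coordinates.
        Vec3 perspectiveDivide(const Vec4& clip) {
            float rhw = 1.0f / clip.w;   // the old "reciprocal homogeneous W"
            return { clip.x * rhw, clip.y * rhw, clip.z * rhw };
        }
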
  • Well, at least the V3 has 32-bit color (right?), so we don't have to go into that particular neck of the woods. Sigh. About your math and the related issues, I have only two things to add right now:
    • One-pixel polygons are good. Perhaps not in yesteryear's games, which feature huge flat surfaces, but various forms of higher-order surfaces with curves are definitely the trend today. Pixar [pixar.com] render their movies using subdivision surfaces tessellated until each polygon is less than one pixel in the final image. We want that.
    • Multi-pass [lysator.liu.se] rendering is good. Many effects in games are achieved by rendering each pixel more than once. 3dfx realized this back with the Voodoo2 and added support for multiple textures per pixel. That is good, because it allows you to send a triangle to the hardware once but get it textured twice, saving one pass of geometry transform. As soon as an effect requires an additional pass, that reduces your effective polygon throughput quite a lot, of course. So, my point is that even though it might sound excessive based on simple "# of pixels on screen" arguments, huge polygon and pixel fillrates are good, because they allow more passes and thereby more flexibility and coolness in effects. (A sketch of the two approaches follows at the end of this comment.)
    I don't think quality issues can be compared straight across between audio and video. Or, rather, I'm not good enough with audio to see where seemingly superfluous performance (oxymoron?) can be put to use.
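
    As promised above, a sketch of the multi-pass point in classic fixed-function OpenGL: a base texture modulated by a lightmap either costs two full passes over the geometry, or one pass if the hardware can apply two textures at once. drawLevelGeometry() and the texture ids are placeholders, and the multitexture path assumes GL 1.3 / ARB_multitexture.

        #include <GL/gl.h>

        extern void drawLevelGeometry();        // placeholder: submits the level's triangles
        extern GLuint baseTex, lightmapTex;     // placeholder texture objects

        // (a) Two passes: the same geometry is transformed and rasterised twice;
        //     the second pass multiplies the frame buffer by the lightmap.
        void drawMultipass() {
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, baseTex);
            glDisable(GL_BLEND);
            drawLevelGeometry();                        // pass 1: base texture

            glBindTexture(GL_TEXTURE_2D, lightmapTex);
            glEnable(GL_BLEND);
            glBlendFunc(GL_DST_COLOR, GL_ZERO);         // framebuffer *= lightmap
            glDepthFunc(GL_EQUAL);                      // only touch pixels laid down in pass 1
            drawLevelGeometry();                        // pass 2: geometry sent again
            glDepthFunc(GL_LESS);
            glDisable(GL_BLEND);
        }

        // (b) One pass with multitexture: both textures bound to separate units and
        //     combined per pixel, so the geometry is transformed only once.
        //     Assumes drawLevelGeometry() emits coordinates for both texture units.
        void drawMultitextured() {
            glActiveTexture(GL_TEXTURE0);
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, baseTex);

            glActiveTexture(GL_TEXTURE1);
            glEnable(GL_TEXTURE_2D);
            glBindTexture(GL_TEXTURE_2D, lightmapTex);
            glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

            drawLevelGeometry();

            glDisable(GL_TEXTURE_2D);                   // leave unit 1 disabled afterwards
            glActiveTexture(GL_TEXTURE0);
        }
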
  • Oh darn, I knew I should have looked that one up. ;^) Well then, I guess I can add a bullet:
    • Modern 3D hardware supports 32 bits per pixel (RGBA with 8 bits per channel), at least 24 bits per pixel of Z (depth) buffer, and minimum 8 bits of stencil. These are all handy to have around when creating good-looking graphics (although more than 8 bits per channel would be a Good Thing).
    Thanks for the heads-up.
  • Back when nVidia first announced their GPU, they touted it as good not just because it had such amazing rendering capability, but also because it took more processor-intensive operations off the CPU. The line I remember, both from them and from proponents, was "from the CPU to the GPU, freeing up the CPU for things such as AI and physics."

    Well, I for one would really like to see that become a reality. Now, I know everyone who has ever touched a compiler (and often many who don't even know what a compiler does) will give a different opinion on how this is possible, how it has been done, or how it will never be done, etc. And most say that programmers now need to step up and use the features (I agree with this one). But my question is: are APIs keeping up with it? How is DirectX handling the more advanced features (not just "is it supported", but is it clean and efficient)? How about OpenGL? How are the middleware projects, like WINE with DirectX, coming along?

    I am really impressed with the graphics detail and performance out there right now. I personally want to see more stability. While I am aware of the argument (and it does hold some water, I admit) that a higher fps gives better all-around performance, it is still common for intense graphics scenes to chug on your machine. I wouldn't mind seeing the ability to average out the fps better, against a target set by the user - some method for the hardware (or the engine) to reduce quality, or only certain effects, when it detects a forest of high-quality polygons and its own slow speed. (A rough sketch of the idea is at the end of this comment.)

    And I would REALLY like to see some better AI, and maybe some APIs that make use of hardware-driven AI... OK, just kidding on that one, but perhaps a set of advanced AI libs with an API isn't too far out there. Tie these in with any of the methods for network-distributed processing and you have an amazing LAN party setup. Throw some together on your home server farm and now you have your game server set to go... ahhh. Soon it will be like the old BBS days... but with better graphics and real-time interaction.

    Well, end of wish list... maybe the internet fairy will bless this if it is seen and make it real.
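
    For the frame-rate smoothing idea a couple of paragraphs up, a minimal sketch of the feedback loop an engine (rather than the hardware) could run: measure the last frame's time and nudge a detail knob toward a user-set target. The thresholds and step sizes are made up for illustration.

        #include <algorithm>

        struct DetailController {
            float targetMs = 33.3f;   // user-chosen target frame time (~30 fps)
            float detail   = 1.0f;    // 1 = full quality, lower = coarser LOD / fewer effects

            void update(float lastFrameMs) {
                if (lastFrameMs > targetMs * 1.1f)
                    detail = std::max(0.25f, detail - 0.05f);   // scene is chugging: back off quickly
                else if (lastFrameMs < targetMs * 0.8f)
                    detail = std::min(1.0f, detail + 0.01f);    // headroom: creep back up slowly
            }
        };
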

  • But spending $1000 per year on upgrades is nuts.

    Why is it that people complain about "having" to upgrade twice yearly at great expense (supposedly because it's required to run the latest games), and at the same time complain that the features won't be used by the latest games for 12-18 months anyway?

    Come on people, make up your mind. If you like it now, buy it now. If you can wait, buy it later and save some money. I just don't understand all this moaning about it.

  • by Namarrgon ( 105036 ) on Tuesday February 27, 2001 @07:53AM (#398881) Homepage
    The programmable shaders are very cool, but don't forget the other features:

    - "Lightspeed Memory Architecture", similar to ATI's HyperZ (but more effective), with an interesting crossbar memory controller & Z compression, requires no support, and makes your existing games run faster.

    - "Quincunx multisampling FSAA", a high-quality, more efficient AA method makes your existing games look nicer at considerably less performance cost than previously possible.

    Increasing fillrate is pointless when things are already so memory-bound. T&L is improved in the way developers have been asking for most: programmability. And as mentioned elsewhere, these are all standard DirectX 8 features, so you're not required to be "nVidia compatible", just DX8 compatible, which is expected anyway.

  • That's the kind of upgrade that could really increase my productivity - I mean, all those xterms could render way faster with that card in my box. I have to talk to my boss about this.
    --

"If it ain't broke, don't fix it." - Bert Lantz
