More on the GeForce 3
Tom has extensive poop on the GeForce 3. Some history, and a ton of why this card may or may not be worth the $600 price tag. Lots of stuff on the Vertex Shaders (which look to be a truly amazing idea). Very worth a read if you're interested in state-of-the-art 3D, or just wanna sneak a preview at what next year's FPS will be doing.
nice link (Score:5)
Buy a video card, and we'll throw in the computer (Score:2)
Re:$350 BTO (Score:1)
Plus, there is the plain fact that if you buy a whole computer you are spending ~1500 bucks, which is over twice what buying the card alone would cost, even considering its inflated price. So if all you really want is that card, buying a Mac just to save money or to have the card a month sooner is pretty dumb.
Re:Video Cards (Score:1)
Re:It's Too Much (Score:1)
NVidia & MS - Too close for comfort? (Score:1)
Don't get me wrong here. I love nvidia's work, have a GeForce2 Pro in my computer as we speak, and think they do a great job. My worry is that MS & nvidia might be collaborating to an extent where they create a problem similar to the wintel situation of not so long ago, where the government had to step in to stop the high level of exclusionary collaboration in that relationship. Software + Hardware = a lock on the market.
All that has to happen is that they become good enough buddies that MS is customizing DirectX & their OS to work best with nvidia processors, and nvidia works with MS to customize its hardware to work best with DirectX & Windows. Together they end up creating an "industry standard" that through patents & IP locks out or inhibits competitors. Anyone else thinking something like this is possible or already in the works? I'd hope that nvidia, being as innovative as it is, will examine history and try not to let itself get tied too closely to this beast.
Oh, and as for the price thing, I'm avowed not to pay more for my GPU than my CPU. $500-600? As much of a technophile as I am, it's just too dear for me. I'll be sticking to a $250 maximum, thanks.
Re:Driver obsolescence (Score:1)
Re:I'm sick of upgrading! (Score:1)
Well, at least this didn't have to be moderated to get to +2... it's just your standard whiny rant.
Whine, whine, whine, I can't afford the new stuff, so therefore no one else can, so all you video chip makers put a sock in it. First of all, nVidia wouldn't be doing this if they weren't going to make any money. Rest assured, sales will be brisk right from the start. Second, upgrades don't come _all_ that often. You have the latest crop of cards (the most expensive of which comes in at around $500-600). Did you whine then? Then you have the generation before that (or half generation), the TNT2 (I myself have a TNT2 Ultra), and that was, what, 12-18 months ago. I bet you whined then too. And of course before that still you had Voodoo 3, Voodoo 2, and a bunch of other also-rans.
You see, this sort of progress waits for no one, except the marketing department. Are you also complaining about the speed race in the CPU market? I upgrade my CPU every 3 years, roughly. I went from 50 to 200 to 500 to probably 2000+ MHz (in another year) with more than enough performance to match. I bet you're complaining about that too, hmm?
Your problem (and that of others like you) is that you have the 'old' generation of cards, which have suddenly been 'obsoleted' by this announcement from nVidia. Seeing as you probably paid a good bundle of money for that privilege, you now feel cheated. Or maybe you just got a 'new' card and the box is still lying in your trash can, in which case you feel cheated as well. I mean, man, here I bought a brand spanking new card for at least $300 and it was just a waste of money! Grow up. The vast majority of people will never buy the latest and greatest; they can't afford it or don't have the other prerequisite hardware to support it. Just wait 3-6 months after release and do what everyone else does: buy reasonable performance at a reasonable price.
Re:Cheaper to yank the video chip out of XBOX? (Score:1)
The better question is how hackable will the XBox be..Would one be able to get a more full featured OS than the cut down Win2k kernel running on it? And get support for SVGA monitors and other standard peripherals needed to turn this into a really nice workstation?
Re:Driver obsolescence (Score:1)
I've seen it where a vendor packages the same installation with multiplatform installs, but it's usually a bunch of different files for separate platforms. This usually means that while an update will take place for a "current" device, the updates seldom get made for older/existing devices.
It'd be an impressive act of hardware/software engineering if each new nVidia video card were actually compatible with the old card's drivers, and only required new drivers to get new features. But I'm guessing it doesn't work this way.
GeForce / Doom naming scheme (Score:1)
But he did/is doing the same thing with Doom: Doom 2 was nothing more than an add-on to Doom, and Doom 3 will be _completely_ different. Hehe...
Re:I'm sick of upgrading! (Score:1)
And there's no way that the GF3 will still be $600 a year from now. I'm guessing ~$250 OEM or less. And the release of the GF3 means all of the other cards out there now will be CHEAPER! Yes, the card that cost $400 a year ago is now far below $200.
Fast release cycles may make people feel like they've got outdated technology, but really, they just mean that people get blazing fast products for a fraction of the price.
Re:Nvidia embracing and extending? (Score:1)
NVidia, as they pushed Microsoft to expand DX, also created new OpenGL extensions for these new features. These are NOT Microsoft's ideas. They are almost solely NVidia's.
Justin Dubs
A perspective on the price. (Score:1)
Not that I'm going to be buying a GF3 at that price...
Feature Bloat (Score:1)
welcome to the wonderful world of feature bloat. Nvidia is probably thinking along the same lines as Microsoft. New formats for Word with every new version mean you have to have the latest version of Word (and not WordPerfect or... whoa, i can't think of any other word processors, have they really killed them all!!!???) to open it. now that Nvidia is the undisputed leader (in terms of market cap) they will definitely fight to keep the lead with the same tactics MS has used successfully for years.
as gaming becomes more and more the primary entertainment for humans in the 21st century, you can bet we'll have to deal more and more with this crap. if the money is there, the pigs will come.
--
Re:NVidia & MS - Too close for comfort? (Score:1)
Does your $250 CPU come with 64MB of DDR RAM? Did you notice that the GPU has more transistors, and is on a .15u process? Are
you aware that nVidia doesn't have the economies of scale that AMD, Moto and Intel enjoy? Just wondering...
No (32MB), yes, and yes. This doesn't prevent the two from colluding to corner the 3D graphics market so they can maintain primacy as a platform for game programming, which despite what many home users like to profess is a major reason they buy computers.
I think my post may have been a bit off-topic since it was about my own worries regarding the level of collusion between MS & nvidia. I was just wondering if anyone else was having the same thoughts, though. It wouldn't be the first time a hardware & software company have colluded to reinforce each other's market share to the detriment of innovation and consumers' pocketbooks (wintel, anyone?).
And what the hell does "economies of scale" mean? Every computer worth its salt sold these days comes with a graphics card just as it comes with a CPU. Hence, nvidia should have the same "economies of scale" (that is, available market and scope of exposure) for its product as a CPU company. I actually wouldn't take issue with nvidia gaining a monopoly in the GPU market - they do a good job and I love their products. What I'd be against, though, and what's against the law, is colluding with others such as MS in a way that gives nvidia's GPU a privileged relationship to DirectX and/or the OS and ends up excluding other GPU companies.
Such a state would effectively end the possibility of competition in this market, which'd be a shame. It'd be as if MS engineered Windows such that it ran 50% slower on an AMD vs. Intel CPU.
fill rate vs. memory bandwidth (Score:1)
Exactly. Anand's GeForce3 preview [anandtech.com] puts it this way: "even the GeForce2 GTS with its 800 Mpixels/s fill rate was only capable of a 300 Mpixels/s fill rate in a real world situation."
If you look at the GeForce2 MX reviews, you'll see that it barely edges ahead of the SDR GeForce at higher resolutions, and falls well behind the DDR GeForce. Forget about doing old-fashioned FSAA on an MX.
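A rough back-of-the-envelope sketch of why real-world fill rate lands so far below the theoretical number. The bandwidth and per-pixel traffic figures here are my own assumptions for illustration, not numbers from Anand's article:

```c
#include <stdio.h>

/* Sketch: memory bandwidth, not pipeline fill rate, is the ceiling.
   Assumed figures (mine, not Anand's): ~5.3 GB/s of DDR bandwidth on
   a GeForce2 GTS, and ~12 bytes of frame-buffer traffic per drawn
   pixel (32-bit color write + 32-bit Z read + Z write), ignoring
   texture fetches entirely. */
int main(void)
{
    double bandwidth_bytes = 5.3e9;   /* assumed memory bandwidth   */
    double bytes_per_pixel = 12.0;    /* assumed color + Z traffic  */
    double ceiling = bandwidth_bytes / bytes_per_pixel;

    printf("Memory-limited ceiling: ~%.0f Mpixels/s\n", ceiling / 1e6);
    /* Roughly 440 Mpixels/s -- well under the 800 Mpixels/s theoretical
       fill rate, and texture reads push the real number down further,
       toward the ~300 Mpixels/s quoted above. */
    return 0;
}
```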
? That's just ridiculous (Score:5)
So it's ok to be aware of market tactics principles and to approve or disapprove, but to actually NOT buy a graphics card which is superior to others because it makes the other businesses go slower?
That's definitely NOT the right angle. In fact, if nVidia gets to sell this card as hot as the previous 2 versions, it can set (and raise) the standard in 3d lighting again, and frankly that's what I want. Of course monopolistic situations like e.g. Soundblaster are absolutely bad for competition (and quality) in the market, but that's because a Soundblaster (could also have used Windows here) is a good product, just not top of the line. That doesn't mean there are no other soundcards out there which are actually better, only that you'll have to pay more for those and go out and look for them.
I support ATI and, to a lesser extent, Matrox, because they are the only rivals left in the field. But if I had to buy a card today, it wouldn't be either of those 2, because I simply want a 'standard' compliant, full fledged and top of the line game experience, not the feeling that I did something good for market competition. In the end I might financially regret that choice, but if nVidia creates the best cards at the end of the day, I'm only happy to spend some cash on them. If someone else can do the same or top them AND has less expensive cards, obviously that's a thing to consider. But today I cheer for nVidia, as I have more pro than con.
Actually, they did improve T&L and fillrate (Score:1)
I'd check out the Anandtech article if I were you. It looks like they put a lot of work into improving T&L and fillrate.
"The GeForce3 is thus the first true GPU from NVIDIA. It still features the same T&L engine from before, however now it is fully programmable."
Programmable is very good; maybe now we'll actually get to see hardware T&L in a game, rather than a press release. Also mentioned in the Anandtech article: "This is what gave the GeForce2 its "Giga-Texel" moniker since it was now able to boast a peak fillrate of 1.6 Gtexels/s. The GeForce3 takes this one step further by now being able to do quadtexturing in a single pass, offering a theoretical peak fillrate of 3.2 Gtexels/s although it seems like it's definitely time to stop quoting theoretical fillrate numbers in this industry."
I'd say that this is quite an improvement. I'll stop quoting now.
Re:I use a Voodoo 3 (Score:1)
On an entirely different note, I'm trying to get my old Voodoo Rush card to work with linux but am not finding much info on it. Anyone got any pointers? It would be *much* appreciated, even if the answer is "no".
The sad thing is... (Score:2)
There's no reason for that to happen, now that Nvidia has no real competition. Why not keep prices on the GF3 high? People who want it that badly will pay it, those who won't will have to buy some other Nvidia product if they want to keep up with the pack. Nvidia wins either way.
Nvidia embracing and extending? (Score:5)
I find it a little worrying that so much of the work that has gone into the GeForce3 has been implementing unprecedented new features such as these vertex shaders, rather than improving more general stuff such as fillrate or transformation and lighting. This leads me to believe that Nvidia's goal with this chipset is not to improve the 3D gaming experience of their customers, but rather to lure developers into using these (admittedly excellent) new features.
How is this a bad thing, I hear you ask? Well, it looks to me like an "embrace & extend" tactic. If the developers use vertex shaders to make their games look cooler, then other 3D chipmakers have to either scramble to provide the same features, or all the cool new games will run like ass on anything non-Nvidia. Only Nvidia can get away with a tactic like this because of their present dominance of the market. Witness ATI's Radeon - they added some very innovative features (like all the z-buffer accelerating tricks) but they were all dedicated to improving performance with current software. They couldn't introduce radical new features because nobody would use them, supported as they were only by a minority chipmaker.
If you don't want to see the 3D industry completely monopolised by a single player, avoid the GeForce3, and avoid any games written to depend on its features. Support chipmakers that are seeking to make everything run better, like ATI and PowerVR.
Re:Oh, great... (Score:1)
I do agree that it's really really insane for a video card to cost $600. But the fact is, if you don't want to pay $600, you don't have to. Right now, a GeForce 2 will more than get you by, & with the release of the GeForce 3, the cost for a 2 should hopefully drop dramatically (I know I have my fingers crossed!)
As for the technology, hopefully once I make it through my classes today, I'll have some time to sit down & really try to grasp all this 3d-techno-wonder stuff that nVidia's apparently whipped up for us all. Even if I don't buy a GeForce 3, I still love the engineering efforts that went into designing it and love trying to learn about what those efforts produced.
So, I guess what I'm trying to say is, that these cards do apparently have a place in the market, even if it's not for you or me, and it actually benefits us as well, by bringing more powerful cards down into our budgets. Plus the fact that it just gives all us techies lots of new reading material. So, I don't see any reason to complain.
-pepermil
Sorry but you're wrong (Score:1)
But still you're wrong: processing more than 1024*768 pixels even for a screen at this resolution IS interesting.
Why? Anti-aliasing!
And why use the number of pixels as the number of polygons?? One is a 2D number, while polygons live in 3D.
I don't know how many polygons are needed to render a tree (or worse, a landscape) realistically, but I think it is quite huge!
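To put a number on the anti-aliasing point: with supersampling you render several samples for every pixel the screen actually shows. A tiny sketch; the 4x factor is just an assumed example:

```c
#include <stdio.h>

/* Sketch: supersampled anti-aliasing renders more pixels than the
   screen displays. 4x supersampling of a 1024x768 frame is an assumed
   example factor, not a GeForce3-specific limit. */
int main(void)
{
    long screen   = 1024L * 768L;   /*   786,432 visible pixels   */
    long rendered = screen * 4;     /* 3,145,728 rendered samples */
    printf("%ld samples rendered for %ld visible pixels\n",
           rendered, screen);
    return 0;
}
```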
GPU or CPU? (Score:1)
Graphics is getting more expensive, too. A $600 graphics card compared to a $300 state-of-the-art processor.
What about that massive die size and transistor count? Can you argue from this that the GF3 is more complex than the CPU now?
Maybe as process sizes decrease and die yields increase, it would be better just to stick the GPU and CPU on one chip. Or maybe put them on a processing bus so the GPU will "look" like another CPU? Something isn't right when the most complex part of the system isn't even on the system bus. I would predict this is the last card which is satisfied with the AGP 4x bus and the current system configuration.
Re:Driver obsolescence (Score:2)
-----------------------
Re:It's Too Much (Score:1)
There are much richer folks out there than the average hardware geek... why don't you bother them first? Surely the ~$1000 that folks drop on their computers doesn't compare to the $50,000+ that many folks drop on their cars every year, or the $1,000,000+ that some people pay for homes.
-
Additionally, you're forgetting that consumers in this country don't buy directly from third-world laborers. We buy from supermarkets, who buy from distributors, who buy from shippers, who buy from farming distributors, who pay the workers. There's no way a consumer can influence this huge chain of sales. There's no chance to "boycott", as we need food from SOMEONE, and all the supermarkets I know of behave this way. Unless you have a solution to break this chain, I suggest we worry about domestic problems first.
And simply sending money over isn't the answer. Most of the aid that goes to other countries gets lost in government, and pouring more money in only makes the gov't richer and more influential.
Anyway, please try to give solutions instead of crying about the problems.
Forget about drivers... (Score:2)
Re:Carmack on GeForce3 (Score:1)
Amazing. Moderate up a link to Carmack's .plan.
It only formed the basis of yesterday's story, entitled: Carmack on D3 on Linux, and 3D Cards [slashdot.org].
Re:Nvidia embracing and extending? (Score:1)
But, what the new hardware does give you is the ability to do more per memory cycle, which is key in the future.
If you automatically say that anyone who innovates is trying to take over the market, then how can we ever get change? In this case, since the API is part of DX8, any video card manufacturer can implement the features. The exact same situation occurred with multi-texture on the Voodoo2, and no-one complained about anti-competitive influences then.
Re:Nvidia embracing and extending? (Score:1)
doesn't this sound familiar to you?
Re:Nvidia embracing and extending? (Score:2)
No, it's more like saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds, but it requires a special kind of road to drive on, and if we all buy one then the department of transportation will upgrade all the roads, but other cars won't be able to use them, so we'll have to buy Chevys forever, and once we're dependent on them they can lower their quality and service and we'll have to accept it!"
I hate poor analogies. But you sounded real cool.
-Tommy
Re:The sad thing is... (Score:1)
$600 is really cheap for a professional card... but it is damn expensive for the much broader consumer market. So Nvidia makes a few bucks off of the prosumer market, creates a buzz surrounding the product, then sells cheaper versions (making up for the lower margin in volume).
Re:The sad thing is... (Score:1)
And nvidia still has competition from the Radeon2, and possibly BitBoys, if they ever release something.
Probably not worth the price . . . (Score:1)
Oh, great... (Score:3)
Screw it. I'm not paying more than two hundred for a video card. Anyone who'd shell out six hundred for one of these is insane. You can get another box for that much, pre-rebate.
Re:The sad thing is... (Score:1)
Re:Probably not worth the price . . . (Score:1)
Re:I use a Voodoo 3 (Score:1)
Re:Oh, great... (Score:1)
Does that go for the DreamCast too? (I know it was cheaper than $600 but the principle remains)
Support? (Score:1)
Yuck (Score:1)
Tom has extensive poop
I don't know about you guys, but this certainly made the article seem more than a bit unappetizing.
Re:I use a Voodoo 3 (Score:1)
Especially since all the uber-gamers will turn off as many options as they can to get extra fps, in the hopes that they can frag the other guy first.
Re:I use a Voodoo 3 (Score:1)
(But I do agree with you in principle. My $90 GF2 MX card will serve me well for the next 2 years.)
Neurosis
Re:my new Power Mac G4 will have one - for $450!!! (Score:1)
Well, since the Hercules press release [hercules.com] says the Prophet III will be released in March, and since that's in two days, I don't think the Mac will beat it by many many weeks unless it was released about two weeks ago.
Re:$600 too much ? Not if you have work to do... (Score:1)
(except for the nvidia based ones, because their developers have something against overlay planes and ignore the fact that both softimage and maya use them. artisan (a tool in maya) takes a big performance hit as a result)
the firegl1/2/3 are solid cards for work, but the linux drivers are still unstable, and closed source (at least for now). hopefully that will change..
I think Slashdot needs more WHINERS! (Score:1)
Did the people who created the Altair bitch because there was no pre-existing software for it?
Of course not, you dolts; the hardware hadn't been released yet. OBVIOUSLY the software will take time to develop. It's like you have to bitch about something in order to be cool, so you bitch about the fundamental law of the universe that cause must predate effect! Either buy the card for the memory speed up and be happy when the software comes out, or wait until the software arrives and buy the card then. Sniveling is not required!
You'd think that slashdot was run by nVidia competitors the way you people are complaining about the doom of the hardware market. Why doesn't anybody complain about the near monopoly Oracle has on databases? Because Oracle products generally kick ass! Why doesn't anyone complain about the near monopoly nVidia has? The exact same reason. If you don't like that nVidia has the biggest market share, then go make a better card. Otherwise, shut your trap. I would be right with you in complaint if they had a disingenuous monopoly based largely on cheating, but they don't! I have never been unhappy with an nVidia purchase and the market reflects that.
And for all of you twits who whine that there isn't a driver for your 27 year old, vacuum tube driven, monochrome graphics card, please shut the hell up! You bought a card that came with a driver. That's what you paid for. ANYTHING ELSE YOU GET IS A GIFT FROM THE COMPANY! And that company exists to be profitable, not to feed your whining ass with legacy driver releases. If you want a driver for a new OS with an old card, you have a pair of well known choices: Buy a new card, or write your own damn driver. If you're unwilling to do either of those two things then you're just being a bitch.
I need a Gin and Tonic.
we all love rambus (Score:1)
In this case I hope that NVIDIA has applied for a patent early enough, because otherwise Rambus may follow its tradition, copy NVIDIA's design, patent it and then sue NVIDIA.
no comment needed.
don't waste your money (Score:1)
Video Cards (Score:1)
Not insane... (Score:1)
This is going to be around $530, and that's about $70 less than the Voodoo 5 6000 was going to cost, for a card that is miles better than the V5 6K. I would have considered a 6K had they been released, and this is a much better value.
Before I got into gaming, I was into drinking in bars, mostly. $600? Two weeks, tops. Assuming the card lasts more than two weeks and never gives me a hangover, I'd say it's not a bad investment, or at least that I've made worse.
Re:Nvidia embracing and extending? (Score:1)
This has to be one of the dumbest things I've heard all month.
They have improved "general stuff" like fillrates and T&L.
Where do you think new features come from? This card will run all your old games, and better than any other card out there. On top of that, it will run all the new games coming out that support features exposed in DirectX 8.0 - which, in case you haven't figured it out yet, is what developers will be using to create those new games.
And who is to say you have to use vertex lighting? Granted, it won't look as good, but you can keep your old card and use light mapping instead.
ATI didn't pack any "new" features into their current crop of cards, because the APIs weren't there to take advantage of them when they were being developed. You can bet your ass they have their R&D departments all over the new functionality exposed in DirectX 8.0 and are busy creating new cards to go head-to-head with the GeForce3.
This is a good thing. NVidia has raised the bar; now the others must try and top them. That's how we get better hardware, folks. It's not a bad thing.
Re:I use a Voodoo 3 (Score:1)
higher hopes for ati (Score:1)
and yes, nvidia's relations with MS are a small part of that concern. i think it's really funny that nvidia bought 3dfx and suddenly the linux drivers are gone from the 3dfx web site and not available from nvidia's either, yet all microsoft OSes are still supported...
my next card will most likely be a radeon (or radeon2 if they are out by then). it will be nice to play with the AV stuff, and maybe even use it for work...
Re:Nvidia embracing and extending? (Score:3)
I disagree. Their programmable vertex shaders are a very good idea. Of course developers may want to directly access these features, and maybe make games that require a GeForce3. But there are very good sides too.
First, having programmable vertex shaders can help them implement better OpenGL drivers (for instance glTexGen). This will help existing programs.
Second, a lot of new tricks can be experimented with. 128 SIMD instructions is huge. I, for one, would love to hack on this for a few weeks. My mind blows at all the tricks that can be done basically for free. Creative uses of the vertex shader will undoubtedly be implemented by the competition, and would end up as "standard" OpenGL extensions.
(Btw, I don't see any noise function, or texture lookup ops. Maybe I should check more closely.)
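Since the thread keeps circling around what "programmable vertex shaders" actually means in practice, here's a rough from-memory sketch of a minimal program under NVIDIA's NV_vertex_program OpenGL extension, wrapped in C. Treat the exact tokens, entry points and assembly syntax as assumptions (check the extension spec); the shape is the point: a tiny assembly program you load, bind and enable per vertex stream.

```c
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>   /* assuming the NV_vertex_program tokens live here */
#include <string.h>

/* Minimal vertex program: transform the position by a tracked
   modelview-projection matrix (constants c[0]..c[3]) and pass the
   vertex color straight through. Syntax sketched from memory. */
static const char vp[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"
    "END\n";

void setup_vertex_program(GLuint id)
{
    /* Entry points as I remember them from the extension spec; real
       code would fetch them via wglGetProcAddress/glXGetProcAddress. */
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                    (GLsizei)strlen(vp), (const GLubyte *)vp);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glEnable(GL_VERTEX_PROGRAM_NV);
}
```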
> avoid the GeForce3, and avoid any games written to depend on its features
I don't see a reason to avoid the GF3. Of course, avoiding games that *only* support it is a good idea. In the same vein, we should avoid games that only support DirectX, and games that run on windows.
Not very realistic...
Cheers,
--fred
The damn thing (Score:1)
Re:Moving into 3DLabs territory (Score:1)
You can buy a 3DLabs AGP Oxygen GVX1 for around $600. This guy is billed as "Mid & High-end". I couldn't tell you how it stacks up against a GF2 or 3, but there's a price-to-price reference.
Re:Driver obsolescence (Score:2)
For instance, NVidia offers the same driver set for everything from the TNT and TNT2 up to the GeForce2 Ultra (and maybe GeForce3).
Matrox is also famous for offering unified driver sets going back to the original Matrox Millennium.
ATI is doing a poor job of supporting Win2000 even with their latest and greatest. There was a recent comparison of Win2000 vs Win98 drivers from ATI and NVidia, and the ATI drivers performed terribly! In some cases 1/2 the speed when tested under Win2000. Plus buggy (some tests could not be completed).
Re:Nvidia embracing and extending? (Score:1)
Vertex shaders are a more general implementation of T&L. They will eventually replace it and are a huge improvement over it. Pixel shaders are similar, they replace the fixed function blending modes. They are both supersets of the older stuff which will still be accelerated just fine.
Wrong. The APIs for accessing these new features are well documented and well defined. If you had bothered to check up on what exactly these features are, you would know this. That's not "embrace & extend". No one is prevented from implementing these features themselves.
Of course, NVidia could conceivably whack a patent claim on anyone who does implement them (assuming they've got the patents, I haven't got time to check today), but as that would prevent anyone but NVidia implementing the complete DX8 API, Microsoft probably wouldn't be too happy.
Read the previews, GF3 includes a lot of features like this. You might also remember something called T&L. Fully backwards compatible... and NVidia were the first to introduce it in a consumer chip.
Is it just me or is any sort of innovation at all these days getting slapped with the "embrace & extend" label just because (horror!) using it requires dropping some backward compatibility?
Maybe in the US, but I live in France (Score:1)
Maybe next time you could think about the fact that not everybody lives in the US before saying stupid things.
Yes, I know I can order from the US, but if I have a problem with the card, I would be out of luck, plus the shipping fees are not exactly free, you know.
Re:Support? (Score:1)
Re:Blurring of truth and virtual reality (Score:1)
Some people may say that we're losing something by interacting in this way -- but what we're gaining is so much better. It used to be people were forced to form communities with those around them -- purely due to geographic coincidence. Now I can form communities with people who think like I do, who appreciate what I appreciate, and who value what I value. All from the comfort of my home. I haven't been shut in -- I've opened up even more!
Surely as our technology improves, this will continue (note I'm not suggesting better graphics cards will lead to an increase of this effect, just that it's already a beneficial phenomenon and this can't harm it in any way). Sure -- if we all had this virtual world, and we all could look however we wanted, you might see some physical prejudices creeping back in.... On the other hand, imagine the joy a wheelchair-bound or paralyzed person might have from moving their avatars around a truly interactive artificial world....
$600 clams (Score:1)
Driver obsolescence (Score:3)
The video card people seem to have like three people that write drivers, and they're always busy writing drivers for the "current" crop of cards, until the cards actually are available, at which point they switch to writing drivers for the "next" slate of cards and the "old" cards simply do not get new or improved drivers written for them. A new OS often means getting stuck with the OEM drivers provided in the OS.
I'm perfectly happy with my ATI-128's performance in the games I've played with it. I've toyed with hunting down a $120 GeForce2 card, but for the reasons you stated I'm missing the why part, other than getting more modern and optimized drivers than I'll ever see for my existing card.
Feature Bloat? HAHAHA (Score:1)
Cool, they even have a negative term for being the best at something and then getting better.
Now, that's cynical.
Re:$600 too much for you? Can't afford one? (Score:1)
Re:When "blazing fast!" becomes "who cares?" (Score:1)
I think that could be a self-fulfilling prophecy of sorts. Game developers must cater to the general user base, and if that base stagnates itself, then it necessarily holds up progress on all but the most cutting-edge games. As long as large strides forward are made in graphics, though, I believe we as a people are largely attracted to the newest and shiniest. Most non-gamers won't see the big graphical improvement from, say, Half-Life to No One Lives Forever unless they see them side-by-side. Once the improvement is more obvious (probably by next year), the cards will continue to sell.
You also make a good point about the developer resources. But how much of that development time/effort/expense is because the engines cannot be reused? A new engine must be built (or licensed and then probably modified heavily) every couple of years. Once the horizon comes closer and we're gaining the 5% from 90 to 95, instead of huge leaps like the one from Quake II to Unreal Tournament (in which much of Quake II's graphical technology became unusably obsolete), possibly these costs will go down and improved games will actually come more quickly, as a tweak to Quake VI is enough to warrant a fanfare and Quake VII isn't necessarily required.
Re:$600 clams (Score:1)
Re: MX (Score:1)
Re:Z Occlusion culling (Score:1)
It demonstrates some of the problems and shortcomings of the technique.
Re:Nvidia embracing and extending? (Score:2)
Re:Support? (Score:1)
I can't check on this at the moment, but I thought that 0.9-6 had support for XRender...
Ranessin
Re:Oh, great... (Score:1)
You don't need to buy the latest CPU either, i.e. paying $300-400 more for only a few hundred MHz more.
Thing is, that will drive the price of the GTS Ultra or GTS 64MB DDR down, which can be used right now for gaming; in a year or more from now the GeForce3 will have come down in price (decent gaming board) and will support the games out by then.
Oh, one last thing: there's not only GAMING in the world. That card competes with the big-ass 3D accelerators for CAD/3D modeling/3D animating.... why would I shell out $2000+ if a Quadro3 will (probably) cost me $1000 (or better yet, $500 plus a modification, like with the GeForce2)?
That thing must scream in lightwave.
Re:Fsking Porser (Score:1)
__
Re:Oh, great... (Score:1)
Re:Nvidia embracing and extending? (Score:2)
Re:I use a Voodoo 3 (Score:2)
A: You can tell the difference between 30 and 200 fps. Maybe not between 70 and 200, but 30 and 200, yes. And a system that gives you 30 fps in one place will bog down to 10 fps in another. If you can get 70 fps, it will likely only bog down to 30 fps when things get ugly.
B: Getting tech like this out there allows game developers to push the boundaries even further. Now granted, we didn't need the explosion of colored lights that happened when the voodoo2 came out, but still, the point is good. As the tech grows, the developers can use a toolset much richer than they had before. Look at the differences between Quake 1 and Quake 3. The difference between a Voodoo 1 and GeForce2. Imagine that level of difference from what we have today....
C: Your example uses 1024x768. Why should we settle for such a lame approximation of reality? My desktop is 1600x1200. I drop to 1024x768 for my gaming, because anything higher causes noticeable performance degradation. I used to settle for 512x384. Now I can't imagine going back to that. And in a few years, I won't imagine being forced to go back to 1024x768.
Nobody's forcing you to buy these new toys. Not everyone needs them. Personally, I can't see spending 10 grand on a home stereo -- after a certain level, they all sound the same to me. But I surely don't say it's "against all common sense" that someone might. I buy my toys, you buy yours, and we'll all live happily ever after.
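Quick arithmetic behind point A above. The "worst case costs about 2.5x the average frame" factor is purely an assumed number for illustration:

```c
#include <stdio.h>

/* Sketch of the headroom argument in point A: if the ugliest scenes
   cost roughly 2.5x the average frame time (an assumed factor, purely
   for illustration), a higher average fps is what keeps the worst
   case playable. */
int main(void)
{
    double avg_fps[] = { 30.0, 70.0 };
    for (int i = 0; i < 2; i++) {
        double avg_ms   = 1000.0 / avg_fps[i];   /* average frame time */
        double worst_ms = avg_ms * 2.5;          /* assumed 2.5x spike */
        printf("%2.0f fps average -> roughly %2.0f fps in the ugly spots\n",
               avg_fps[i], 1000.0 / worst_ms);
    }
    return 0;
}
```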
Tom already caters for this kind of troll :) (Score:2)
Pretty impressive really. Anyone who might feel a twinge of conscience when following a link to a $600 video card they're thinking of buying is almost immediately comforted with a charity banner where they can assuage said conscience.
Re:I use a Voodoo 3 (Score:2)
Why do people buy SUVs or luxury cars when a Geo Metro or Mini will do? Why do people buy the latest fashions of clothes when they could get last years in sales?
A lot of people want to play the latest games when they first come out, and have the best machine there. If you don't feel that pressure, then I congratulate you: you're in a sensible minority.
"Noone can tell the difference between 30fps and 200fps anyway"
That's so untrue. I can tell the difference between 50 and 60, no problem. After playing at 60 or above, a drop to 50 or below is very noticeable, definitely not as smooth, and hampers playing ability until one adjusts.
"Another problem with video cards is that the performance is becoming optimal anyway"
No. There's a very long way to go yet. With more power, there are so many features they can add. Go and read something about 3D graphics, and you will realise how limited these cards still are.
"At 50fps this is approximately 37 million pixels per second. So it is intuitively obvious to all that a video card with a performance in excess of 37 million polygons per second will not provide any better performance under those conditions. Why pay extra for something you can't see?"
You can't base polygon count on pixel count. Some polygons get rendered, but then are obscured by polygons in front of them. So yes, you do have to pay for something you can't see.
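One concrete way to see the overdraw point; the depth-complexity figure here is an assumed example, not a measurement of any real game:

```c
#include <stdio.h>

/* Sketch: visible pixels understate rendered pixels because of
   overdraw. An average depth complexity of 3 is an assumed,
   typical-ish figure used only for illustration. */
int main(void)
{
    double visible = 1024.0 * 768.0 * 50.0;  /* pixels/s actually seen   */
    double depth_complexity = 3.0;           /* assumed average overdraw */
    printf("~%.0f Mpixels/s visible, ~%.0f Mpixels/s actually rendered\n",
           visible / 1e6, visible * depth_complexity / 1e6);
    return 0;
}
```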
"It is like insisting on a 500kbit sampling rate, when 70kbit sampling rates are perfect to the human ear"
Not all sounds are sensed via the ear.
In general, I do agree with your sentiment. I bought an original GeForce 256 DDR when it first came out. I'm still trying to justify the expense. If I had waited, I could have got a better GeForce for less. I'll do that again. I'm sure I'll buy a GeForce 3 eventually, but I'll wait until the Ultra model is cheap enough (I just know that there will be several generations of the card).
Moving into 3DLabs territory (Score:2)
Nobody is going to program exclusively for this card until it saturates the user base. Which, at this price level, ain't gonna happen soon.
Wonder if the "professionals" will strike back
Re:Nvidia embracing and extending? (Score:2)
This is like Car and Driver saying "Chevy has made a new car that is only 2 grand, gets 110 mpg, and goes from 0-60 in 4.7 seconds! It's the greatest car we've ever seen, full of new features..... but we shouldn't buy them, because that will put Ford out of business"
Still some chinks in the armour... (Score:2)
Re:Still some chinks in the armour... (Score:2)
Re:Z Occlusion culling (Score:2)
$600 too much for you? Can't afford one? (Score:2)
$600 too much ? Not if you have work to do... (Score:3)
It is great that we can use it for games too, but that isn't the point for many. I am sure there will be an even more expensive version of this in Nvidia's Quadro line; it'll have greater throughput and more processing power... so it'll get bought. It'll make DOOM 3 scream, but that isn't why you buy it.
Unless you are a "soul of independent means."
Re:Oh, great... (Score:2)
Hence my purchase of an N64 (and probably a PS2 in another year). It was cheap, I can play it on a 35 inch screen from my couch (my desk chair is comfy, but nothing like my couch). Things rarely crash (weird problem with Gauntlet Legends). I can buy tons of cheap games on eBay and at FuncoLand.
Did I mention that it's cheap and I can play on a big screen?
Hey, I love PC games. I got a Voodoo II VERY early. But spending $1000 per year on upgrades is nuts. I've got too many hobbies to keep up this computer upgrade business (ever seen the NOS prices on old Honda motorcycle parts?)
If that is your thing, by all means, go for it. But as for me, I'll be excusing myself from the party now.
Re:Nvidia embracing and extending? (Score:3)
I think that this is a good thing. If we can make features standard in DX8, then different cards will support these same features, and maybe we can get game devs to support these features if they are able to be implemented in different cards.
More previews... (Score:2)
GF3 also has "free" visibility testing (Score:3)
And since you can turn off color and z writes, you can test visibility with no changes to the frame buffer. This is perfect for a portal game where you can cull entire rooms if they are not visible because of things you traditionally couldn't compute: if there is a big fireball in front of your face, or a character/pillar is blocking the view. If you have a few monsters that require a significant amount of time to draw, you can test whether they are visible by rendering a coverage polygon first.
You can use this to test the visibility of lens flares so they fade smoothly in and out as they go behind other objects.
You can also use this in game logic in combination with a shadow map to tell how much "in the shadows" characters are. This can make the AI more realistic.
Getting pixel write counts back from the hardware has a very long latency, so it can't be performed super frequently - but it's a lot faster than trying to read and process the z-buffer yourself.
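For anyone curious how the pattern above looks in code, here's a rough OpenGL-flavored sketch in C. glColorMask/glDepthMask are standard GL; the pixel-count readback is represented by a placeholder function, since the exact query interface on this hardware is not something I'll quote from memory:

```c
#include <GL/gl.h>

/* Hypothetical helpers standing in for the application's own code. */
extern void draw_bounding_box(const void *monster);  /* cheap proxy geometry */
extern void draw_monster(const void *monster);       /* the expensive model  */
extern unsigned query_pixels_passed(void);           /* placeholder for the
    hardware pixel-count readback (an occlusion-query style extension),
    not a real GL entry point */

void draw_if_visible(const void *monster)
{
    /* Turn off color and depth writes so the test leaves the frame
       buffer untouched, exactly as described above. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);

    draw_bounding_box(monster);        /* only the Z test runs */

    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    /* High-latency readback: in real code you'd batch many of these
       per frame rather than stalling on each one. */
    if (query_pixels_passed() > 0)
        draw_monster(monster);
}
```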
And for those of you not wanting to spend $600 for a GF3, just wait for the Xbox - it's including almost exactly the same hardware for half the price.
Carmack on GeForce3 (Score:3)
http://www.bluesnews.com/plans/1/ [bluesnews.com]
Carmack has quite a bit to say on the subject as this .plan update is rather long (a little too long for a /. comment I think).
The Other Big Reviews (Score:4)
Sharky Extreme: http://www.sharkyextreme.com/hardware/articles/nv
AnandTech: http://www.anandtech.com/showdoc.html?i=1426 [anandtech.com]
HardOCP: http://www.hardocp.com/articles/nvidia_stuff/gf3_
-pepermil
Leading the way forward (Score:2)
No offense, but having worked with this architecture for a while now, I have to say that the NV2x approach isn't an attempt to hamstring the graphics industry. It's an attempt to raise the bar of hardware design and bring the industry to a new level of verisimilitude in graphics rendering. Criticising nVidia for being a monopoly because they have the technical smarts to develop a revolutionary rather than an evolutionary solution just doesn't make any sense.
With the programmability of the vertex and pixel shaders, graphics applications are now free to create a whole new engine architecture... one that's free from the idea of fixed-format vertex data but instead is purely representation-driven. Because you can pass any binary data that you want to the vertex shader, you no longer have to represent the properties of your surfaces in an implicit format whose characteristics are defined by the fixed capabilities of the hardware. Now that this programmability is available, you can encode surface data in a format that actually stores exactly what you want. The NV2x is the first hardware engine that I feel can be called a "GPU" in more than just name - its capabilities will allow application developers to craft graphics engines that just aren't possible on a card that is "dedicated to improving performance with current software" as you cite you'd like to see. ATI recognize this - witness their Pixel Tapestry technology for pixel shading.
NV2x is the same kind of advance over GeForce2 that the original 3dfx Voodoo cards were over the prevailing PowerVR and VIRGE chips back in (1997?). You didn't see anybody complaining that 3dfx were trying to lock people into their proprietary technology back then, for the simple reason that everyone recognised the potential that was inherent in the change of focus. It took a while for games to become "3D Card Required"... but I'm 100% certain that nobody wants to go back to Quake II-era rendering. The benefit to the application programmer and the consumer is obvious. NV2x may not be the winning solution in the new space that's opening up - but it's a damn good opening salvo.
To paraphrase your post: If you don't want to see the capabilities of 3D graphics engines advance beyond the current status quo, avoid the GeForce3. And miss out.
- Butcher
P.S. You'll be amazed when you see what we can do with this technology. This is a great time to be a game designer or game player.
Argh, one typo spotted (Score:2)
(W stands for homogeneous coordinates. Pg 204, Computer Graphics: Principles and Practice, Foley & van Dam)
(Pre-DX7 had something called "rhw", which stands for "reciprocal homogeneous W")
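For anyone who hasn't run into W before, a tiny worked example of the homogeneous divide and why storing rhw = 1/w per vertex was handy; this is textbook math (see the Foley & van Dam reference above), nothing GeForce3-specific:

```c
#include <stdio.h>

/* A homogeneous point (x, y, z, w) projects to (x/w, y/w, z/w).
   Storing rhw = 1/w turns the per-coordinate divide into multiplies,
   which is why older pipelines asked for it per vertex. */
int main(void)
{
    double x = 2.0, y = 4.0, z = 8.0, w = 2.0;
    double rhw = 1.0 / w;
    printf("(%g, %g, %g, %g) -> (%g, %g, %g)\n",
           x, y, z, w, x * rhw, y * rhw, z * rhw);
    return 0;
}
```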
Re:I use a Voodoo 3 (Score:2)
Re:I use a Voodoo 3 (Score:2)
from the hype (Score:2)
Well, I for one would really like to see that become a reality. Now I know everyone who has ever touched a compiler (and often many who don't even know what a compiler does) will give a different opinion on how this is possible, how it has been done, or how it will never be done, etc. And most say that now programmers need to step up and use the features. (I agree with this one.) But my question is, are APIs keeping up with it? How is DirectX handling more advanced features (not just is it 'supported', but is it clean and efficient)? How about OpenGL? How are the middleware projects coming, like WINE with DirectX and so forth?
I am really impressed with the graphics detail and performance out there right now. I personally want to see more stability. While I am aware of the argument (and it does hold some water, I admit) about a higher fps giving better all-around performance, it is still common for intense graphics scenes to chug on your machine. I wouldn't mind seeing the ability to average out the fps better, as set by the user. Some method on the hardware to reduce the quality of certain methods only if it detects a forest of high quality polygons and its own slow speed.
And I would REALLY like to see some better AI, and maybe some APIs that make use of hardware-driven AI... ok, just kidding on that one, but perhaps a set of advanced AI libs with an API isn't too far out there. Tie these in with any of the methods for network distributed processing and you have an amazing LAN party setup. Throw some together on your home server farm and now you have your game server set to go... ahhh. Soon it will be like the old BBS days... but with better graphics and real time interaction.
Well, end of wish list... maybe the internet fairy will bless this if it is seen and make it real.
Re:Oh, great... (Score:2)
Why is it people are complaining about "having" to upgrade twice yearly at great expense (supposedly because it's required to run the latest games), and then at the same time they complain that the features won't be used by the latest games for 12-18 months anyway!
Come on people, make up your mind. If you like it now, buy it now. If you can wait, buy it later and save some money. I just don't understand all this moaning about it.
Re:Nvidia embracing and extending? (Score:3)
- "Lightspeed Memory Architecture", similar to ATI's HyperZ (but more effective), with an interesting crossbar memory controller & Z compression, requires no support, and makes your existing games run faster.
- "Quincunx multisampling FSAA", a high-quality, more efficient AA method makes your existing games look nicer at considerably less performance cost than previously possible.
Increasing fillrate is pointless, when things are already so memory-bound. T&L is improved, in the way that developers have been asking for most: programmability. And as mentioned elsewhere, these are all standard DirectX8 features, so you're not required to be "nVidia compatible", just DX8 compatible, which is expected anyway.
Gee... (Score:2)
--