Graphics Software

GeForce 3 Review on Adrenaline Vault

JWhiton writes: "The lucky folks at the Adrenaline Vault have managed to get their hands on a reference GeForce 3 board and have run it through the standard gamut of tests. Looks like it really performs well at very high res with 32-bit color due to its efficient new rendering techniques. However, it lags behind the GeForce2 in a few areas. You can find the article here." We've already run a few things about the GeForce 3, but the more the merrier.


  • by Anonymous Coward
    As sites such as tomshardware.com point out, there is little reason to buy a GeForce3 now if you are a casual user: the performance benefit in current games is small. The situation is drastically different if you are a developer, however: this card is the first to fully expose pixel shaders, vertex shaders and other cool DirectX8 stuff. These are going to form the graphical basis for 2002 games. The only real benefit of the GF3 in current games is Nvidia's fast Quincunx antialiasing; but combine that with the $500 price tag and you see why most hardware sites are advocating against an immediate purchase.
  • by Anonymous Coward on Sunday April 08, 2001 @05:16AM (#307549)
    Did you at least notice that the bars' lengths were not proportional to the results they express?
    I just see the GF3 being less than 10% slower in 2D than a GF2 or Radeon. That's not a big deal at all...
    It could become one if this continues, but I am pretty sure this is due to some immature drivers.
  • If anything, these benchmarks should show you how little the current hardware T&L is being used and how little it helps performance.

    The Anandtech benchmarks were done some time ago with drivers that are a month or two old. The feedback from Anandtech, other sites, and game developers has prompted more work on the drivers, and the result is this:

    • The so-called "performance" hit in Giants was really a Giants bug... the demo shows the hit, but the real game has a patch available that greatly speeds up the Kyro and makes it look much nicer image-quality-wise
    • The graphics and performance bugs found in MBTR turned out to be MBTR bugs (notice a theme here?) which were fixed in a patch by the MBTR developers. The game now runs much faster without the noted graphics bugs. (Does the GeForce3 even show textures in this game yet? ;)
    • Much of the Evolva DX8 benchmark penalty has been fixed (driver bug), and the Evolva people are adding features like the 8-layer multitexturing stuff that the Kyro supports but the GeForce 2 falls flat on its face with.

    Some benchmarks with the latest drivers and all the common tweaks which don't create compatibility problems:

    http://www.rivastation.com/3dprophet4500-64mb.htm [altavista.com] (use Babelfish) Note how the GTS beats the Kyro at 16-bit but the Kyro beats the GTS at 32-bit... at low resolutions the GTS wins, but as the resolution is bumped up the GTS starts hitting bandwidth limitations which the Kyro doesn't see at all

    MBTR Benchmark [altavista.com] The GTS wins at 16-bit, the Kyro wins at 32-bit... GASP! What?!?! T&L didn't help the bandwidth-limited GTS at 32-bit?!?! OH NO!

    MDK2 [altavista.com] MDK2 is about the only game currently where T&L counts... but look at 1280x1024@32-bit color: the KyroII turns in an acceptable 61fps compared to the GeForce GTS's 56fps. Why? The GTS's DDR-powered memory bandwidth is capped while the KyroII's smart way of rendering means its non-DDR bandwidth isn't even sweating. What good is T&L if you can't fit all the textures through your memory bus?

    Full Screen Antialiasing [altavista.com] The Kyro wipes the floor with the GTS when FSAA is used at decent resolutions... sorry, Nvidia fans, you'll have to fork out $500 for non-broken FSAA.

    And next it's time to discuss the myth that the GeForce 2's T&L will be even remotely useful in a year or two... Aquanox (DX8 features) [altavista.com] This benchmark stresses DX8 features such as the programmable T&L and all the cool features Doom3 will be using. The Kyro lays around at 10fps and mysteriously drops off at the high end (driver bug?). But what is shocking is how low the GTS scores are. Even at 640x480 it's not even "playable". Proof positive that the GeForce 2's onboard T&L goes out the window when DX8 takes off, and it's in the same boat as the Kyro.

    What does this all mean? It means if you have the cash, get the GeForce3... If you don't, don't bother paying all the extra cash for the GeForce2 GTS, Pros, Ultras, etc., as they will be useless with Doom 3 and friends anyway. Do what I'm going to do: pick up a Kyro II for below $150 in a month, use it for a year, and grab a next-generation GeForce3 when they are at a reasonable price (> $250-$300 is not reasonable for any video card... bleh)

    I'd like to also note that you obviously don't own Tribes2 or understand how fucked up an engine it is if you dare mention it in the same context as card performance. My G400 is performing much better than Voodoo5s with this game! And sub-$300 GeForce2 owners are constantly bitching about performance as $150 Radeons beat them consistently. When you go from quality at 100% to the lowest quality settings and you only get a 10fps boost, something is fucked up.

    Ok.. I'm done ranting... hope someone found this informative :)

  • Sorry folks... I cut and pasted all those links and it turns out they use some weird cookie shit... sigh...

    Well anyway, to see the benchmarks pop this URL: http://www.rivastation.com/3dprophet4500-64mb.htm

    Into http://world.altavista.com [altavista.com] and select German to English. On the site you'll see a sidebar on the right side with links to the individual benchmarks.

  • Going from a $300 Ultra to a $150 Kyro II would be a step back. But going from a $200-250 GTS to the $150 kyro would not.

    See my post above... all indications are that memory bandwidth is much more important than T&L with current games, and the GeForce2 ends up doing T&L in software mode with the programmable T&L features in DX8 (which means the GF2 is in the same boat as the Kyro).

    If you can afford the Ultras or Geforce 3s then go for it. But keep in mind the Ultra becomes as useless as the Kyro when Doom 3 ships ;)

    As for the CPU, anyone who uses a P3-500 obviously isn't getting great performance no matter what card they have... if you have a P3-500, skip the vid card upgrade... grab a 1GHz AMD chip and mobo for that $200-250 you would have blown on the GTS.

    I'm excited about this card as it gives a "fresh" approach to the problem of limited bandwidth that even the Geforce 3 keeps hitting. If the Kyro 2 sells well, I can't wait to see their next offering that brings in the T&L from their arcade boards, higher clock speeds, more pipelines, DDR, while keeping the tile renderer approach.

    Hmmm... I wonder if the rumors about Creative Labs becoming the second mainstream card manufacturer to adopt the Kyro II are correct... If they are, expect nvidia to start dropping prices on the GTS fast! :)
  • What good will your T&L do if your higher-geometric-detail games will be maxing out the card's bandwidth? Show me a $300 Nvidia card that doesn't suffer from memory bandwidth problems at high resolutions and I might believe you :)
  • Perhaps it's the programmable T&L having to be emulated in software for the GF2s? Would memory bandwidth be the issue? (Yeah... I'm sounding like a broken record :) See my post below.
  • The bottleneck in Tribes 2 isn't the 3D card. It's the shitty engine. As I said before, have you even played the game? Checked the forums while people try like mad to get more than 30fps at 1024xwhatever on their GeForce2s? Until Tribes2 is fixed we won't know what benefits the engine. It's a fun game with a fucked engine... glad I plopped down my $40... sigh...
  • Small single-player worked great for me... I was getting 30fps at 800x600x32 on a G400 (which is great considering how old the damn card is), but the minute I go online to servers which the browser says have 20 ping and 20+ people, performance goes to shit. Even if they are not visible to me and I'm staring at the ground I get like 15fps (I was forced to turn it down to 640x480 for the 5fps gain... which in itself says something is seriously wrong). I hope Sierra fixes it... :/
  • A 32MB GTS may go for $120, but the 64MB DDR GTSs are nowhere to be seen for that price. Or am I missing something? If you can find a 64MB GTS for $150 from a well-known vendor please point it out... I would be ecstatic and would buy it right away :) I've also seen quotes that Doom 3 at 800x600x32 with all the features will run at 30fps on a GeForce3... Though since I haven't seen a single interview where he discusses pure performance numbers yet, I'm treating it as a rather depressing rumor and one more reason not to pay more than $150 for a card that will only last a year or two until DX8 is in full force.

  • This is nVidia. They've had fully functional drivers on Linux for a while now.

    I play Team Arena every night under Linux beautifully on my GeForce2 GTS (with my USB Intellimouse Optical, even). Stuff Just Works under Linux now. It's awesome.

    ... now if only Loki would get their ass in gear and get the Tribes 2 Linux client out the door...
    --
  • Shut up! Those morons with too much money bring the costs down for the rest of us who are not gullible enough to get the cutting-edge whiz-bang hardware.
    ___
    • I will buy a GeForce 3 someday as well... probably within the next 8 months... but it won't be the first generation, otherwise known as the "MX" version of the card... I will wait to buy the "Pro" or "Ultra" version of the card, which will cost as much then as this first-gen card does now.

    The MX chipsets from nVidia are not first-run chipsets. They're the economy chipset, with half the number of pipelines and neutered in other ways. These are the cards that your average Joe buys when he doesn't want to spend more than $150 on a card, yet thinks he needs the "latest and greatest". The funny thing is that a GeForce 256 is about the same price as a GeForce2 MX, yet the GeForce 256 will perform better because it's not been castrated.

    nVidia has various different chipsets that span the market vertically, from the MX for average Joes, to the GO for mobile users, to the Pro and Ultra (where the Ultra is typically a 6-month refresh) for the gamers, to the Quadro for graphics professionals. Check out www.nvidia.com [nvidia.com] for more info on their product lines.

  • by augustz ( 18082 ) on Sunday April 08, 2001 @12:40PM (#307561)
    It makes complete sense to release the card. Anyone who is going to be developing games with an eye towards the future would be STUPID not to go get one of these babies. And I mean, REALLY STUPID. $500? Unless you plan on being a developer at the cheapest, rattiest development spot around, $500 should be in your budget.

    There is a ton of focus on the fact that the GeForce 3 doesn't blow the socks off of the GeForce 2 in current, limited games. I'm assuming the same idiots posting that blather asked, "What's the point of the Gutenberg press? It can't reproduce full-color paintings very well, and the existing technology does." Both the new idiots and the idiots back then don't get the fact that the new technology makes an array of new things possible. If you are not a developer, BE GLAD that they are releasing the card publicly so anyone who wants to try their hand at it can. It will mean better games for you in the long haul.

  • That is all fine and good, but my fucking NVidia card doesn't work for X in either PPC or x86. Why? Because the drivers are ASS and nobody can get in there to fix them. Now that NVidia is the only 3D card manufacturer in town, they managed to hoodwink PHBs into lucrative OEM deals and lock-ins... all of which conveniently disregard the fact that NVidia's 2D performance 1) sucks and 2) is unstable in just about ANY system, even ones that are supported.

    NVidia can blow me, and as soon as somebody else topples their 3d card monopoly, I will never buy their cards ever again.
  • Did that. They don't work. I played with them for 2 months. I have a dual processor Celeron 500 with a BX chipset and NOTHING works.

    And they don't work under PPC. I've tried. Send me a version compiled for the PPC and I'll try it.
  • I should clarify... I don't have the source. Care to send me a tarball, or a precompiled version for ppc?
  • I am absolutely positive the Nvidia drivers do NOT work on my SMP system. I have tried every option under the sun. Neither the stock nv driver nor the nvidia devel drivers work properly.

    I am also absolutely positive that I can't physically build PPC binaries, since, hey, guess what? No source code.

    Again, I reiterate: NVidia, blow me.
  • http://www.wgz.org/chromatic/nvidia/analysis.html
  • I was having some strange problems with the lameness filter for some reason ("junk character post"). Here is the Beyond3D link: http://www.beyond3d.com/interviews/cebit2001/index2.php
  • Look at the docs already linked.

    As geometric detail increases, so does required bandwidth. So does overdraw. This is where the Kyro shines. Take a look at some of the benches with nice high-polygon-count scenes; the Kyro2 does REAL nice.

    Lack of T&L is a BS argument that nVidia-only fanboys keep putting up, which has little basis in reality.

    1. The highest-selling cards right now have extremely limited T&L units, or none at all. (Dell and Gateway sell machines by default with Rage128s and TNT2s.)
    2. Game manufacturers look not only at current-day cards but at what the majority of the market has, so titles that rely completely on T&L won't appear for at least a year (DX8 titles probably longer), ESPECIALLY with the slowing economy.
    3. The software T&L implementation for the Kyro2 is extremely good, and makes use of extensions in modern processors DESIGNED to do those manipulations.
    4. The Kyro2 has extremely advanced texturing and can do 8-layer multitexturing in a single pass, far beyond what the GeForce can do.
    5. The mere fact that the Kyro2 is coming out with a big name behind it, and will most likely get developers' attention, indicates that they will design games with at least some thought to it.
    6. nVidia has managed to piss off Hercules (see earlier post), and Creative will no longer be selling graphics cards, which means the low-end retail market will see severe competition between the Kyro2 and the MX. (And if Creative starts making the Kyro2, as many rumors say, nVidia will have further problems.) See why this is important in 5.
    7. Best-selling games aren't 3D anyway.

    You're just using T&L as an excuse not to buy a Kyro2, when it performs extremely well on current-day games.

  • FACT: The Kyro2 performs very well on games USING hardware T&L; only a few titles have shown difficulties, and there are driver optimizations and such on the way for them. This has been stated again and again in several links provided by others.

    The average person doesn't care about technology; they care about what works. It doesn't matter that you've written a 3D engine, which is, BTW, very impressive, but what DOES matter is consumer trends. You can't design a game around high-end hardware and sell it.

    You haven't addressed any of the issues I have brought up, only stated that the nVidia cards have more features. Of course, the Kyro does have hardware EMBM and 8-layer multitexturing as features. Do they not matter?

    Again, highly detailed scenes, such as those present in Doom3, have LOTS of overdraw; haven't you read the quotes from John Carmack? Funny how you kind of brushed this issue aside.

    CPU requirements? Which would you rather do: pay $150 for a video card and put the other $100-150 toward a better CPU? (Take a look at the difference $150 can make on Pricewatch: Duron 750MHz - $45, Thunderbird 1.2GHz @ 266MHz FSB - $193.) If you were an OEM, which would you rather offer:

    1.2 GHZ System ---- specs

    750Mhz System --- specs

    And of course, the argument is simply that the GeForce having T&L makes it superior? LOL! People bought WinModems, didn't they? And you still haven't refuted that the Kyro2 performs as well as or better than several other cards in DX7 and DX8 games using T&L, or how well it performs in GL games.

    It is a budget card, aimed at people buying MXs, NOT GeForce3s. The fact that the performance is excellent is only a side benefit. Do you have hard numbers from several games backing your claims on hardware T&L?

    Let's take a look:

    * Quake3 1600x1200x32 Normal - GeForce2 MX (20.1) - Kyro2 (41.6) - Performance Benefit 207%
    * MDK2 (T&L enabled) 1600x1200x32 - GeForce2 MX (23.7) - Kyro2 (41.4) - Performance Benefit 174%
    * UT 1600x1200x16 Min Frame Rate - GeForce2 MX (10.8) - Kyro2 (22.66) - Performance Benefit 210%
    * Serious Sam (a new game) @ 1600x1200x32 - GeForce2 MX (12.5) - Kyro2 (34.2)
    * MBTR (w/o driver fixes) @ 1024x768 - GeForce2 MX (17.1) - Kyro2 (30.8) - 1.80x

    Some of those games could be considered old tech. But Serious Sam and MBTR both represent a sample of the type of games we will see this year. Looking at just how far ahead of the MX the Kyro2 is, I fail to see how the GeForce2 MX, even with T&L, compares to the Kyro2.

    If the rest of the card can't back up the limited geometry processing seen in the MX, it is worthless.

    I would rather upgrade my CPU alone in a couple of months than my CPU and video card. Even on DX8 titles (Aquanox) the Kyro2 performs about the same as the GeForce2 and Radeon.

    You failed to make any claims regarding the utility of T&L, other than, "I want to see the industry move forward."

    Quite frankly, what's best for the consumer right now is what matters to them. They don't give a crap about what YOU do or don't want in the future. As I said before, the Kyro2 has a nice software T&L implementation. (Which, in DX8 games, if I recall correctly, the GeForce2 MX will fall back to something similar to.)

    Hardware T&L is cool, but like HDTV it will have to wait till prices come down and the technology is a bit more developed. The GeForce3 is the first example of a developed card. It is also, just like an HDTV, too expensive for the consumer, who won't see any benefit except in a handful of token titles. Titles like The Sims, C&C RA2, and Diablo 2 are the best sellers.

    Well, I like to think that game developers consider people who have purchased their computer in the past year, the great majority of whom own Rage128s and TNT2s, and don't screw them over by forcing them to upgrade every 6 months, as some companies would.

    Good game engines will know when to take advantage of features that are there, or not.

  • by amccall ( 24406 ) on Sunday April 08, 2001 @06:26AM (#307570) Homepage
    For those that have not yet seen this little spectacle of corporate greed, you might be interested in taking a look here:

    Nvidia PDF document link in order to avoid lameness filter. [savagenews.com]

    nVidia has recently admitted that this document is real but says that it was intended for marketing people only. I would contend that the document really has been sent to several businesses, which several industry sources say is true.

    In the PDF nvidia basically says that one of their key video card makers is a crap company. What was that maker's response? "They are right to be scared. 3D Prophet 4500 will really be a great product. Street date: May 16th, 2001." - Claude Guillemot, President, Hercules Technologies.

    But that isn't as interesting as the news that follows:

    Hercules, one of nVidia's major card manufacturers, will not make a new MX card:

    "Well, Nvidia stopped production of the MX chips; these will be replaced by the MX200 and MX400. We will not produce a board based on the MX200 or MX400 chips. The big problem is that the new MX series will have a bigger price tag than the Kyro II based boards and the performance will be slower than the Kyro II based boards. So we don't think the new MX series will sell very well."

    Thought this was all a bit interesting. Lesson:

    don't go around making your big suppliers mad, even if it is with leaked documents.

  • Every review I have seen on the Kyro II has been in a GHz+ Athlon machine. If you were to compare the Kyro II to the GeForce MX in a 500MHz Intel machine, the numbers would look very, very different. The GeForce cards perform extremely well in "older" machines, whereas the Kyro II will not.

    The Kyro II has about the same feature set as a juiced-up Voodoo2. It's got a higher fill rate and supports more texture units, but that's about it. The GeForce3, on the other hand, supports some extremely cool NEW features for developers to take advantage of. I do agree that the GeForce3 is not something users would be interested in at this point in time because of the price, but for 3D developers, it's a wet dream.

    And one thing that should not be glossed over is that NVIDIA has superb Linux drivers for their cards *TODAY*. If GeForce3's were currently available, you could pop one in your machine and be up and running within minutes (the current drivers have GeForce3 support).

    On the other hand, Kyro drivers do not yet exist. They have promised Linux support, but it remains to be seen if/when support will be available and what quality the drivers will be. NEVER purchase hardware based on a promise from a vendor, because these promises are often not kept.

    If you're running Windows and have a very fast processor, the Kyro II will give you respectable performance at a nice price point. If you want to run under Linux with a card with a proven track record, buy NVIDIA.

    -Mark
  • If you can call unstable drivers that may or may not work for you a "proven track record"...

    The NVIDIA drivers have been stable for quite some time. Stop spreading FUD just because they're not open source.

    I just purchased a Radeon All-In-Wonder a couple of weeks ago and put it into an Athlon/SD11 box. Unfortunately for me, the Radeon driver does not work with the Irongate chipset at all. The only way to get X to run is with the "NoAccel" option, meaning I get no 2D acceleration and no 3D acceleration. This is a known bug (#121904), and has been known since at least November 2000. Excuse me for saying so, but the open source magic just isn't working for 3D drivers. The G400 drivers were stable at one point (although not even close to compliant). The latest drivers consistently lock up my system within 15 minutes of use.

    I've been running stable on many machines (most of them SMP) with the NVIDIA drivers for well over 9 months. The NVIDIA drivers are fast, compliant, and stable. I am deeply regretting my decision to purchase a Radeon which is completely useless to me under Linux.

    -Mark
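    For reference, the "NoAccel" workaround mentioned above is a standard XFree86 device option; a minimal sketch of where it goes in XF86Config-4 is below (the Identifier string is just a placeholder, and this only disables acceleration rather than fixing the underlying DRI bug):

        Section "Device"
            Identifier "Radeon AIW"            # placeholder name
            Driver     "radeon"
            Option     "NoAccel" "true"        # no 2D/3D acceleration -- workaround only
        EndSection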
  • Maybe for you, moron, but certainly not for everyone.

    I stand in awe of your maturity. You're obviously an intelligent person with much experience on the matter.

    Out of curiosity, I did a Google search on "ranessin NVIDIA" and came up with a few Slashdot hits. Here is one of your posts:

    nVidia's drivers are fine as long as you don't have a problem. If you do, you're screwed.

    With the DRI drivers, if you have a problem or uncover a bug, just ask the DRI developers and it's usually fixed in a timely fashion.


    So the bug which causes the Radeon to totally lock a machine with the Irongate chipset has been known by the DRI developers for 6 months, yet it still exists.

    BTW, is your second V5 processor supported yet?

    In my personal experience, I've never been screwed with the NVIDIA drivers. I am, however, screwed in a major way by the DRI Radeon drivers.

  • Don't accuse me of spreading FUD, and I won't point out that you're a moron. Quite simple.

    But you are spreading FUD. A very small number of NVIDIA users experience problems with certain hardware combinations. The same is true of the DRI drivers. The NVIDIA drivers are no less stable than the DRI drivers, and in many cases, more stable. They certainly offer more features and are far easier to install.

    No, but simply because 3dfx went belly-up at the time they were working on it.

    One argument that is used all the time in support of open source drivers is that if a company goes belly-up, progress on the driver proceeds. This doesn't seem to be happening with the 3DFX drivers.

    I was screwed for a very long time with no decent 3D drivers because they promised them and then took over a year to deliver.

    Poppycock. The promise that NVIDIA made was to deliver direct rendering drivers when XFree86 4 was released. XFree86 4.0.0 was released in March, 2000, and the NVIDIA XF4 GLX drivers were released in April, 2000.

    I am still screwed because their drivers don't like my system.

    And I'm screwed because the DRI Radeon drivers don't like my system. What's your point? If you're using this as your basis for saying the NVIDIA drivers are unstable, then you're also saying that the DRI drivers are unstable.

    You seem to be implying that the DRI drivers are perfect and that any bugs that pop up are instantly fixed. This is simply not true. The Radeon was released what, 10 months ago? The drivers are still incomplete and in a state of flux. In my case, they don't work at all.

    The NVIDIA drivers were released in April 2000, with full support for the GeForce2. Yes, the first few releases were a bit shaky, but from the 0.9.5 release on, the drivers were quite solid.

    The NVIDIA drivers are not perfect. But neither are the DRI drivers. If the NVIDIA drivers are "unstable" by your measure, then you have to say that the DRI drivers are also unstable. The fact that they are closed source should have no bearing on the evaluation of stability. Just because something can be fixed does not mean it will be fixed.

  • On my Linux box my nVidia TNT2 works perfectly and the 2D performance is stellar, so what to say about your statement of instability? That you speak without knowledge! Before making such absolute claims, make sure of what you say. Regards, Antonio. P.S.: I beg your pardon for my English; I don't live in an anglophone country, so I don't get much practice.
  • From the review page:

    System Requirements:

    IBM PC or compatible with available AGP bus slot (AGP 1.0 or 2.0 complaint)

    See? You must have the proper complaint before the card will work.

  • Are there components on the card that are really costly to bring the price up?

  • I think that 3x speed improvement was under DirectX 8 games, mainly.

    And yes, I'm posting this way way late. I know. It's mainly to test my karma to see if I'm getting the +1 automatic bonus finally.

    -brennan

  • Kyro II has no T&L. You think that isn't important for today's games? Well, you better hope you don't like Tribes 2.

    ------
  • 3fps difference in a game that is not T&L intensive. Whoop-de-do.

    ------
  • Again, geometric detail generally is not so much limited by VRAM bandwidth. Fillrate is. See my other posts.

    ------
  • My GTS was $120, and that was quite some time ago... Are you using PriceWatch [pricewatch.com]? ::looks at PriceWatch:: umm... the prices appear to have gone up... odd... $130 for a GTS.

    Doom 3 is designed to run on the GeForce and up, according to Carmack. Of course, it will look much better on the GeForce3, but it will run on basically any T&L card.

    ------

  • I've played it a bit, and it appears that it gets a lot choppier in internet play. Single-player and LAN games work wonderfully for me...

    ------
  • Unstable for *you*, maybe. I haven't had any sort of stability problem on NVidia hardware in almost two years... and the problem then was the motherboard. That's on either Linux or Windows.

    Now, why can't you make it work on PPC? Because the drivers aren't compiled for PPC. Duh. As for x86, I'd suggest trying the latest drivers, and seeking help on #nvidia at irc.openprojects.net.

    ------

  • There is no PPC version. And no, of course I don't have the source.

    ------
  • The document you refer to was written over a year ago, before NVidia released their XFree4 drivers. It addresses exactly none of my points, and has no relevance to anything which exists today. What is your point?

    ------
  • The page you linked to discusses memory bandwidth problems between the chipset and the on-board memory. This is not really a T&L issue. T&L info is usually pulled from AGP memory in large DMA transfers, or streamed directly from the CPU via the AGP bus. On-board memory bandwidth primarily affects fill rate, which is why the Kyro II kicks ass in that department.

    Trust me, I have been writing graphics software for two years. T&L is important. Going from a GF2 to a Kyro II would be a major step backwards, and future games will run much better on the GF2 than on the Kyro II. As a matter of fact, the GF2 beats the Kyro II by far in current T&L-heavy games. Look at that AnandTech review you linked, under Quake3. Even at extremely high resolutions, the Ultra beats the Kyro by over a third (15fps). I'd like to see some Tribes 2 benchmarks. I suspect they will be similar. Tribes 2 actually has different system requirements based on your hardware -- GF2 requires a PII 300MHz, while a Kyro requires a PIII 500MHz.

    ------
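    To put rough, purely illustrative numbers on the fill-rate point above -- the triangle count, overdraw, and per-pixel byte figures below are assumptions made for the sake of the comparison, not measurements from any card:

        /* Back-of-the-envelope comparison: per-second vertex traffic vs.
         * per-second pixel traffic. All inputs are illustrative assumptions. */
        #include <stdio.h>

        int main(void)
        {
            const double fps      = 60.0;
            const double tris     = 20000.0;   /* assumed triangles per frame      */
            const double vtx_size = 32.0;      /* assumed bytes per vertex         */
            const double width    = 1024.0, height = 768.0;
            const double overdraw = 3.0;       /* assumed average overdraw         */
            const double px_bytes = 12.0;      /* 4B color write + 8B Z read/write */

            const double geom_bw  = tris * 3.0 * vtx_size * fps;  /* bytes per second */
            const double pixel_bw = width * height * overdraw * px_bytes * fps;

            printf("geometry traffic: %7.1f MB/s\n", geom_bw  / 1e6);
            printf("pixel traffic:    %7.1f MB/s\n", pixel_bw / 1e6);
            /* texture fetches would add to the pixel side, widening the gap */
            return 0;
        }

    Even with generous geometry figures, the per-frame pixel traffic to local VRAM dwarfs the vertex traffic coming over AGP, which is why the bandwidth crunch shows up in fill rate rather than in T&L.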

  • Quake3 has much higher geometric detail than UT, hence the difference. (Part of the higher polygon count is due to crappiness of the Q3 engine, though.)

    Yes, I prefer UT over Q3, but that is beside the point. Games coming out *now* have high geometric detail levels -- higher than Q3 -- and that means you want a card with T&L support.

    ------

  • Just out of curiosity, how low do you have to set your graphics settings to make that work? On my GF2 I have no trouble maxing out everything. Without T&L, however, you probably won't be able to set the geometric details very high. I haven't seen actual benchmarks, but neither have you. :)

    As for the DX8 not supporting GF2 T&L... The GeForce 1&2 support programmable vertex shaders a la DX8, so I'm not sure what you mean. I assure you NVidia and Microsoft would never let DX8 out the door if it didn't support current hardware.

    ------

  • Hmm... I'm not sure if the NVidia drivers currently support vertex shaders on the GF2. I know the GL ones say it won't be supported until R10, which apparently hasn't been released yet. But it is supposed to be there eventually.

    I suppose buying a Kyro to tide yourself over until the GF3 is at a reasonable price is a good idea, as long as you have a nice, fast CPU to back it up. :) The tile based rendering is actually a pretty cool technology... I'd love to see a card with both tile rendering and the feature set of the GF3. I suspect NVidia's next big core change will be such a card, but that's probably a year off or more. :(

    I just don't like it when people claim that a technology is useless because games don't use it. If that mentality keeps up, then people won't buy the cards with the newer features, and thus developers will continue to not use the features. Then the industry goes nowhere. Luckily, NVidia has been the best choice for both current and future games for a while now, and thus they have managed to keep the industry advancing despite the chicken-and-egg dilemma. Kyro is advancing the industry in its own way (tile-based rendering), but at the same time it is a step back in T&L, which I see as the more important technology at the moment. :S

    ------

  • Interesting... I just tested vertex shaders on my GF2, and they seem to be supported just fine. Perhaps Aquanox uses some special feature that isn't supported, like n-patches... Thanks for the link, though.

    ------
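    As an aside, one simple way to see whether the driver even advertises the vertex-program extension discussed in this subthread is to look at the GL extension string. This is only a sketch: it assumes a current GL context (created here with GLUT) and says nothing about whether the path runs in hardware or in software.

        /* Sketch: query the GL extension string for NV_vertex_program.
         * GLUT is used only to create and make current an OpenGL context. */
        #include <stdio.h>
        #include <string.h>
        #include <GL/glut.h>

        int main(int argc, char **argv)
        {
            const char *ext;

            glutInit(&argc, argv);
            glutCreateWindow("ext-check");   /* creates a current GL context */

            ext = (const char *) glGetString(GL_EXTENSIONS);
            if (ext && strstr(ext, "GL_NV_vertex_program"))
                puts("GL_NV_vertex_program is advertised by this driver");
            else
                puts("GL_NV_vertex_program is not advertised");
            return 0;
        }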
  • As geometric detail increases, so does required bandwidth. So does overdraw.

    Usually, no. As I mentioned elsewhere, geometry data is usually pulled from AGP memory, NOT from VRAM. Thus, it isn't much affected by the VRAM bandwidth. Furthermore, geometry data takes far less bandwidth than pixel data, so even if VRAM bandwidth does affect it, it doesn't affect it much.

    As for overdraw, no, increasing geometric detail does not increase overdraw. Usually, added geo detail means *smaller* triangles. The total surface area rendered is about the same, which means the overdraw is about the same.

    Lack of T&L is a BS argument that nVidia-only fanboys keep putting up, which has little basis in reality.

    Umm... would you care to tell me what makes you think that? Personally, I tend to believe the T&L thing based on my experience in 3D graphics [gauge3d.org]. Other people knowledgeable in the field, like John Carmack, Tim Sweeney, all of SGI, and others, generally seem to agree that T&L is, well... a good thing.

    You know, the CPU speed requirements for Tribes 2 are 40% lower if you have a GeForce 2 than if you have a Kyro. I don't think they made that number up. Perhaps, however, you think that Tribes 2 is not part of reality? I know, it's hard to believe that it was finally released. Maybe you haven't received your copy yet, and don't believe it exists. But it does.

    I, instead, assert that the argument that T&L doesn't matter is a BS argument taken up by Kyro and 3dfx fanboys.

    You're just using T&L as an excuse not to buy a Kyro2, when it performs extremely well on current-day games.

    Huh? I'm not on NVidia's payroll. Why would I support NVidia without a good reason? Hell, let me go right out and say that I like the Radeon. It's a pretty good card. It has T&L.

    I'll tell you why I want people to buy T&L cards: I do NOT want to be limited by having to support cards with no geometry acceleration. I prefer that the industry move *forwards*. Yes, the tile-based rendering is nice, but it is not worth the loss of T&L.

    The argument about current games not using it is not only wrong (current games DO use T&L), it is stupid. It's a chicken-and-egg problem. If none of the gamers buy T&L hardware, none of the developers will use it. Fortunately, NVidia has managed, for the most part, to keep their hardware competitive in current games at the same time as they add new features for future games. I like it when the industry advances. Don't you?

    ------

  • According to the NVidia GL extensions docs, NV_vertex_program is supported in hardware on the NV1x series (GeForce 1&2), but only using the R10 drivers.

    ------
  • Troll? Probably, but I wouldn't want anyone to be misled by your words, so...

    1) removing all content relating to linux.3dfx.com and 3dfxgamers.com

    That was 3dfx that did that, NOT NVidia. NVidia is NOT responsible for supporting 3dfx's current product line. NVidia gave 3dfx a fat wad of cash which 3dfx is *supposed* to be spending on supporting their users. Hmm...

    2) not updating either of the Linux, Mac, or MS Windows codebase for the 3dfx Voodoo products.

    Again, that is not NVidia's responsibility. NVidia did NOT buy 3dfx's current product line. If you go out and buy a Voodoo card right now, the money goes to 3dfx, NOT NVidia. Thus, NVidia has no reason to be supporting Voodoo cards.

    3) releasing closed-source drivers for their video cards that prevent accurate security audits

    I think this has been argued enough, but just in case, here is the primary argument: NVidia's GL drivers contain MUCH MUCH more than just a hardware interface. OpenGL drivers are required to implement all of OpenGL's functionality, even if the hardware doesn't support it. Thus, NVidia's GL drivers contain software implementations of anything that isn't supported in all of their cards. This includes an extremely good T&L implementation, which every one of NVidia's competitors would probably love to have. It would not be legal for them to release that open source -- their shareholders would sue them!

    4) conspiring with the enemy ala Microsoft

    NVidia does not suck up to Microsoft. Microsoft sucks up to NVidia. Has NVidia ever done anything purely for Microsoft's sake? No. They are providing hardware for the X-Box, but at a price, and at the same time they are providing hardware for the Indrema, a fact which I'm sure Microsoft is not happy about. Furthermore, NVidia has the best 3D hardware drivers available on Linux (closed source or not). Microsoft has acknowledged Linux to be their primary competitor, so I'm sure they'd prefer that NVidia not support Linux. Last, but not least, NVidia has the best GL implementation of any consumer 3D hardware vendor. There is no advantage whatsoever to using Direct3D rather than OpenGL on NVidia hardware. As a matter of fact, the GL implementation is often said to be *better* than the D3D implementation. I don't think Microsoft likes that, either.

    In summary, NVidia makes a point of not being manipulated by Microsoft, and they are damned proud of it.

    5) not boosting and advertising the widespread software development on the Indrema console, which has incorporated their hardware.

    What the hell are you talking about!? NVidia doesn't advertise for the X-Box either. They also don't advertise for any of the graphics card manufacturers that use NVidia's chipset. As a matter of fact, NVidia does not (for the most part) advertise *any* individual product which uses their hardware. They only advertise their chipset itself.

    It is Indrema's responsibility to advertise their own hardware. I want Indrema to succeed too, but NVidia is the last place I'd blame if they fail.

    ------

  • The reason the GeForce 3 is slower in a lot of tests is that it doesn't push pixels out of the 3D pipeline as fast as the current crop of cards do. This is not all bad, because chances are it will actually have a pixel to push every clock, since the rendering pipeline is a lot faster. This does not show with current games because they render quite simple scenes; with games like Doom 3 the story will be quite different, because cards like the GeForce 2 (and variants) won't manage to push a pixel every clock, while the GeForce 3 will.

    There are big speed improvements to be had on the GeForce 3 -- the memory access, for instance, uses a crossbar switch (which Crays use to keep the pipeline happy). There is a lot of special stuff in there; you just won't notice it until you start trying to render the amazing stuff being shown off in the GeForce 3 demos and videos (like the Apple / GeForce 3 / Doom 3 vid).

    There is loads of technical info on this if you look for it, mine came from http://www.tomshardware.com if you are interested.

  • Note that the GeForce 3 benchmarks were run against the GeForce 2 Pro, not the GeForce 2 Ultra, nVidia's previous high-end product. The GeForce 3 appears to be slightly slower than the GeForce 2 Ultra on standard benchmarks. That's a surprise; a few months ago, a 3X speed improvement was claimed.

    It looks like you get a speed improvement only if the application directly uses the vertex-programmable features of the new board.

    There's a lack of info on how well this board works when you have multiple 2D and 3D windows open. That's important for professional users doing development. (I'm likely to have Softimage|3D, Visual C++, and an app of my own all open at once, for example.) With all that programmability in the board, context switches may be a problem. Does anybody know? If you have one of these boards, bring up a few apps in different windows and report back. Thanks.

  • I think what NVIDIA did was realize that we don't need an extra 50 frames per second in Quake 3 when you're already getting 150. People don't realize that neither our monitors nor our eyes can take advantage of that. The only thing left to improve is picture quality, which means higher resolutions and more extra effects.
  • HardOCP did a comparison [hardocp.com] between the GF2 and the GF3. Of particular note are the benchmarks of DroneZ, an upcoming OpenGL-based game that has special modes optimized for the GF3's features. Using the generic settings, the GF3 was about 15-25% faster. But with the extra GF3 stuff enabled, the GF3 was faster by a factor of between two and four. Remember that this is with an OpenGL benchmark, not D3D, so don't start worrying that nVidia is ignoring OGL. I imagine that John Carmack will be taking advantage of this kind of stuff in DOOM3 as well.
  • I saw that too... and went ahead and placed my order. Cool policy.
  • The GeForce3 seems to have a very similar level of performance at lower resolutions. However, crank it up to 1600x1200x32 and watch it rise above everything else.

    The GeForce 1 & 2 simply die at high resolutions and 32-bit color compared to the GeForce 3.

  • Measuring brute-force speed is a very narrow analysis. The GeForce 3, albeit not worth $500, is an order of magnitude faster than the GeForce 2. You'll see this as games such as Doom3 take advantage of bump mapping, T&L2, and higher poly counts (not to mention a lot of other stuff).

    It's not just "frames per second", it's the quality of the frame. The GF3 didn't add much in terms of maximum throughput; they mainly added features that allow a dramatic improvement in graphics while maintaining a high frame rate.
  • Sorry, Temporal is right. If you review the press release of the acquisition on nVidia's site, you'll see that they essentially acquired "intellectual property", prototypes, and the VSA stuff they did with Quantum. No employees, support staff/systems, etc. 3dfx is still a company and is still operating its own website.
  • Absolutely nothing. The GeForce 2 Pro, in the tests, beat the GeForce3 handily in most areas.
  • I disagree on the point about closing down the 3DFX driver sites. Unless you have a really good source to back that up, my understanding is that nVidia did it as a cost-cutting measure to consolidate the two companies. It was a bad decision.
  • I've never been a big nVidia fan (the last two cards I bought were Voodoos -- perfect for running Unreal Tournament), but I'm wondering if we'll see better results once they leave reference-board land. It's possible that the drivers may have internally left a few diagnostic switches on for nVidia engineers to mull over, and perhaps the final boxed product will run substantially better than these tests show.

    As it stands now, I found the results to be more than a little disappointing. IANAE (I am not an engineer), but if your company makes a new card it should always be faster than your previous cards. Period.

    Although, I guess what someone else said is true. This would be a fine time to buy a GeForce 2 Pro...

  • I honestly don't care if it pushes said pixels "better"; I'm like most people: if it ain't going faster than what I've got for what I'm playing, I ain't going to buy it.

    It's like owning a Porsche which you regularly drive above 90 mph, and some salesman offering you a Toyota Camry that can reach that speed while burning gas cleaner but taking longer to get there. "It burns fuel better." I don't care if it burns fuel better! I'm interested in speed.

    For now I have a Voodoo 3500 that has performed more than admirably, even with its 16MB of RAM and inability to render 32-bit 3D graphics. I don't push my games past 800x600 resolution anyway, because I'm interested in them looking fast and smooth more than pretty. The way things are standing, I'll probably buy a GeForce 2 Pro.

  • Note: All tests were carried out using Windows 98SE.

    Which brings up the question of drivers for other platforms. Of course you could use generics. Some folks may like to use some of the more exotic features that you can get at by hacking the Win Registry.

    But this might not be needed in some cases. For example, we all know that Linux users are all serious coders dedicated to the OS Revolution, and so *never* have any time available for something as trivial as games

    [JOKE! JOKE!]

    Check out the Vinny the Vampire [eplugz.com] comic strip

  • What the hell are you talking about!? NVidia doesn't advertise for the X-Box either.

    Not true. See http://www.nvidia.com/products.nsf/xbox.html

    They have a list of their products by platform, and under Game Console they have Xbox and not Indrema. I agree that you can't blame NVidia for Indrema's troubles, but they certainly aren't treating the two consoles equally.
  • If you want to run under Linux with a card with a proven track record, buy NVIDIA.

    If you can call unstable drivers that may or may not work for you a "proven track record"...

    Ranessin
  • The NVIDIA drivers have been stable for quite some time.

    Maybe for you, moron, but certainly not for everyone.

    Ranessin
  • I stand in awe of your maturity. You're obviously an intelligent person with much experience on the matter.

    Don't accuse me of spreading FUD, and I won't point out that you're a moron. Quite simple.

    So the bug which causes the Radeon to totally lock a machine with the Irongate chipset has been known by the DRI developers for 6 months, yet it still exists.

    I did say usually. In either case, a bug is more likely to get fixed in a timely manner with the open source DRI drivers than with nVidia's closed source drivers. How long did nVidia users suffer with leaky drivers? At least with the DRI drivers, you have instant access to the code and can fix bugs yourself if you so desire.

    BTW, is your second V5 processor supported yet?

    No, but simply because 3dfx went belly-up at the time they were working on it. The same thing could, in theory, happen with any company. BTW, the Glide source code is out there. The only reason it hasn't been finished is because no one is interested enough.

    In my personal experience, I've never been screwed with the NVIDIA drivers.

    I was screwed for a very long time with no decent 3D drivers because they promised them and then took over a year to deliver. I am still screwed because their drivers don't like my system.

    Ranessin
    Yes, the first few releases were a bit shaky, but from the 0.9.5 release on, the drivers were quite solid.

    For you. There are still a lot of people who have problems with these drivers. I'm not alone.

    Ranessin
  • One big reason to purchase this card is that it supports all those new DirectX8 extensions -- stuff like vertex blending (I think that's what it's called) and pixel shaders, and the nFiniteFX engine. Still, you already have one big video card investment right now. Maybe in a few months they'll shrink it down to a more manageable and cost-effective size with more features (a la the GeForce to GeForce2 transition).
  • I expressed similar sentiment a while back about the 1.33Ghz Athlon: Far more expensive than a 1.1Ghz model, but with only a modest performance gain. So I wondered, why would anyone pay that much more for a small incremental upgrade?

    I found that most people were interested in the release not because they wanted to buy the 1.33Ghz chip, but because it forces the others (like the 1.1Ghz) down dramatically.

    It's sort of the same here. The GeForce3 will simply force down the price of cards like your (excellent) GeForce2 GTSII.

    I agree that the price for this card is pretty insane, but without it, your GeForce2 GTSII would probably be sitting in a similar price range.

  • Why on Earth didn't they benchmark with 3DMark 2001? If you want to see how well the GF3 performs against other cards, this is one of the benchmarks that really should be used.

    Why? Because it actually USES the NV20 DirectX8 optimisations. I believe the GF3 is currently the only DX8 card around, and if you compare a GF2 or Radeon against it in 3DMark2001 then it would be miles ahead.

    Why they chose 2000 I have no idea, but there you go.
  • by LordArathres ( 244483 ) on Sunday April 08, 2001 @04:24AM (#307616) Homepage
    I own a GeForce2 GTSII with 64 MB of DDR RAM. What is the incentive for my going out and buying a $500+ video card? ...None. The current one is just fine and will be just fine for a while. The price of video cards is insane; for $500 you could get a 1GHz AMD, MB, 256 MB RAM, case, and HD, plus some other stuff, easy. This card shouldn't catch on quickly because there isn't any software to take full advantage of it. I think the only people that will buy this card are those that want to stay "cutting-edge" and have too much money. I think I'll wait till the GeForce4 comes out because it looks as though the software side has a while to catch up.

    Arathres


    I love my iBook. I use it to run Linux!
  • Yea... it has no T&L... but did you read the reviews I mentioned? --- I play Tribes 2 on my system right now (and love it to death)... with a card that has no T&L (an Nvidia Vanta 16MB)... T&L isn't needed, and especially with the Kyro II's efficiency it isn't missed either.

    The future games using DirectX-8 behave the *exact* same on any of the GeForce 2 cards as they do on the Kyro or any other card. Meaning that future games *don't even use* today's form of T&L for any benefit. --- Again... read the Anandtech review, please.
  • To make it easier to get to the reviews where they explain why T&L *in its current GeForce 2 form* isn't as important as it would seem, read Anandtech's review here: http://www.anandtech.com/showdoc.html?i=1435

    and Sharky Extreme's review can be found here: http://www.sharkyextreme.com/hardware/previews/hercules_3dprophet_4500_kyro2/

    Again... I'll be getting a GeForce 3 in its Pro or Ultra form... but for anyone wanting to upgrade before then, the Kyro II card for 150 bucks is the best thing out for the price.
  • Sorry to provide so much proof, but the info just keeps getting better. -- Here the Kyro *stomps* on the 400-dollar GeForce 2, *without T&L*...

    http://www.anandtech.com/showdoc.html?i=1435&p=14

    ---- Next-gen games will require the GeForce *3* T&L, but until a better version of the GeForce 3 comes out, this is where your money ought to go (it is only 149 dollars, after all).

  • Check out the same review for the Unreal Tournament benchmarks. --- If you're talking about Quake 3... well... then yea... that game has always loved the GeForce T&L... but if you're into Quake as your FPS of choice, you're missing out on a lot.
  • I agree that future games will benefit greatly from T&L... in fact, in future DirectX-8 games I would be surprised if the Kyro II gets even 20 fps with 'em.

    I'm just saying that those games are not out *now*... and if you're buying something now, go with a Kyro, because those games are quite a few months off, and when they finally do come out... get a GeForce 3 then (in a Pro version), because it will be absolutely *necessary* then.

    I'm speaking to the gaming hardcore mainly, I guess... (who don't mind upgrading twice in one year)... but really, who else is in the market for a GeForce 3 right now anyway? The Kyro is so cheap it saves you money in the long run, because the GeForce 3 Pro will be much better than the current GeForce 3 you can buy now.
  • Heh... I'm running default mode with my card on Tribes, but the FPS is very smooth (1GHz Athlon, 256MB DDR RAM). --- My system would benefit greatly from a new card, and that is why I'm going with a Kyro to hold me over until the Pro version of the GeForce 3 is released.

    As for current-generation T&L... I was reading a review (at Anandtech as well, I'm sure) that showed the GeForce 2 Ultra, Kyro, and Radeon getting 15fps each (even across all cards) in Aquanox (a DirectX-8 game), and how this was amazing because the T&L built into both the Radeon and the Nvidia wasn't helping them out at all. The GeForce 3 was getting 35 fps... over twice as much... showing for sure how dependent future DirectX-8 games will be on DirectX-8 hardware.
  • On AnandTech (which did a much better GeForce 3 review)...

    http://www2.anandtech.com/showdoc.html?i=1442&p=11

    Certainly the Kyro did poorly on the DirectX-8 Aquanox... but for that matter so did the GeForce 2 and Radeon.
  • Check out:

    http://www2.anandtech.com/showdoc.html?i=1442&p=1

    A much more professional and thorough review.
  • I have a fairly fast computer (1GHz Athlon, 256MB DDR RAM) but I play all my games at near the lowest resolution, with a lot of the eye candy off, just 'cause I am a fiend for framerates. I guess to me I picture it as how much the card will help you win. I guess I could entertain the idea of "What kind of experience do I want this game to be?" Maybe it is just the games I play... (lots of first-person shooters online).

    I never really put much stock in Eye Candy.... No matter how pretty the game is in the first half hour... you usually forget about that anyway later on if the gameplay is good.
  • I agree with you a hundred percent. -- My current situation is such that I run all of today's games fine with a 1GHz Athlon and 256MB DDR RAM... but I am using an Nvidia Vanta 16MB! ;)

    I believe my system should see a big enough leap from the Kyro II to last until the Pro GF3 comes out, when it will then be a necessity.

    If I had a $300 Ultra I would not then buy a Kyro... nor would I recommend that anyone else do so. But if you have something *truly old* like a Voodoo X, then the Kyro can give you a taste of high performance for cheap. Unless you're a hardcore gamer you probably aren't in the market for a GF3 right now anyway.
  • You know, the CPU speed requirements for Tribes 2 are 40% lower if you have a GeForce 2 than if you have a Kyro. I don't think they made that number up.

    That is the spec for the original Kyro... Not the Kyro II. Unfortunately I have not heard of anyone using Tribes 2 in the testing of the Kyro II.

  • Check out the review at Anandtech... much more thorough in its review of the GF3...

    http://www2.anandtech.com/showdoc.html?i=1442&p=1
  • It might be honest... but it isn't really thorough... for example, why didn't they use 3DMark2001 during the testing? 3DMark2001 is the best thing the GeForce 3 has going for it right now... it gives a glimpse into the importance of DirectX-8 hardware support for future DX8 games.

    Heh... if you're spending $500+ on a card, you're probably gonna want to read all the reviews you can. The "no need to go elsewhere" argument doesn't work... especially when the Avault review was *so* narrow.
  • by minus23 ( 250338 ) on Sunday April 08, 2001 @04:25AM (#307630)
    If you want to spend the cash on the GeForce 3, I applaud your gaming dedication. I really do. If money were absolutely not an issue for me, I would be there with you.

    I will buy a GeForce 3 someday as well... probably within the next 8 months... but it won't be the first generation, otherwise known as the "MX" version of the card... I will wait to buy the "Pro" or "Ultra" version of the card, which will cost as much then as this first-gen card does now.

    In the meantime... the Kyro II card beats the GeForce 3 and the GeForce 2 in some instances. It also only costs 150 bucks. It uses a totally different way of rendering its stuff... and it works great on today's games... but sucks about as much as a GeForce 2 on tomorrow's games. -- But for 150 bucks... I'll get it today (or in a week when it comes out)... and get the Pro version of the GeForce 3 later, around the time, I imagine, when the X-Box is released.

    Read a review of the Kyro II at Anandtech.com... or a few others have reviewed it as well... they all have glowing reviews for it... *for the price*.

    Also... main product page can be found here ---> http://us.hercules.com/mediaroom/pr/lookpr.php3?pr=101
  • Well actually, I'm seriously thinking about buying an Athlon 1.33 GHz processor. It's priced at $222 after price cuts this past week. Considering the overall cost of building a new pc, I think the $47 extra cost over a 1.1 GHz thunderbird is only a "modest" price increase. But I do agree with your point. I'm also going to be buying a $155 32 MB Geforce 2 GTS Pro card. The price is right and the price/performance is damn good compared to a $500 Geforce 3 card.
  • See http://firingsquad.gamers.com/features/nvidia1215/

    FiringSquad: Will NVIDIA continue driver support for 3dfx's Voodoo 3,4,5 line?

    We're not purchasing 3dfx's current product line. We're only purchasing 3dfx's core assets. Support of 3dfx's current products will remain with 3dfx.
  • I agree with you whole-heartedly!! Do you guys realize how hard it is to find Win2k drivers for my Banshee?! I used to go to 3dfxgamers.com, but now Nvidia would probably have me shot if I tried to write them myself!!!

  • What was your incentive for getting a GeForce2 GTS II with 64megs of ram?
  • I know the drivers will get better, but I found it really disappointing that a $500 card is so slow [avault.com] at something as simple as 2D graphics. I hope they get it better.
  • NVidia will be getting ~$500 for these things.

    I think you got this totally wrong...
    Nvidia pays M$ (millions) for someone to start manufacturing their chips, then some bucks per chip.
    Then nVidia takes its cut and sells those chips to the gfx card manufacturer, which makes the cards. The card manufacturer pays for the PCB
    and the high-speed memory chips, and takes its own cut (a percentage over its costs). Then the retailer takes his cut too. For the volumes of first-generation parts, nVidia makes quite little cash. In the end Nvidia gets less than $100 per chip -- probably $50. And that needs to offset the initial $1-2M NRE and the development costs.
    (NRE = non-recurring engineering costs: a price paid to the foundry each time a different chip starts production; ANY change to the chip means paying it again.)
    These initial GF3 chips need to ship in quantities of 20k before Nvidia gets its manufacturing costs back, at these high prices. But always remember that there are more eaters of this pie than nVidia.
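    As a sanity check on that 20k figure, using only the low ends of the ranges quoted above ($1M NRE, roughly $50 to nVidia per chip) and ignoring ongoing development costs entirely:

        \text{break-even volume} \;\approx\; \frac{\text{NRE}}{\text{nVidia's take per chip}}
        \;\approx\; \frac{\$1\,\text{M}}{\$50\ \text{per chip}} \;=\; 20{,}000\ \text{chips}

    With the other ends of those ranges the same ratio lands anywhere from about 10k to 40k chips, so the 20k figure is in the right ballpark.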
  • In the article they fault the GeForce3 for having a 20% lower clock frequency than the GF2 Ultra.
    Well, there are reasons for that.
    At a given voltage your power consumption is
    x watts per gate, per MHz. So 20% lower clock frequency means 20% lower power consumption...
    Not quite. The speed of a transition depends on voltage: lower voltage = slower transistors.
    But power consumption depends on voltage squared, so lowering the voltage may lead to up to a 40% reduction in total power consumption. Then the die shrink from 0.18 to 0.15 micron reduces power consumption by 44% compared to the GTS. After working through those two figures, and noting that the GF3 has 128% more transistors, the peak power consumption of the chip is still higher than the GTS Pro's, and would be MUCH higher than the GTS Pro's if they ran at the same clock frequency.
    There is also the point that GF2 Pros and Ultras are the higher-frequency parts binned from volume production. The process advantage would give the GF3 a better chance of reaching high clock frequencies, but then we must remember: this is their first layout of the new T&L unit, probably generated automatically by tools with some human assistance, while the GTS2 has a second-generation layout with more customization by humans (the T&L of the GF1 with a new layout).
    The GF1 didn't have a much higher frequency than the TNT2 Ultras, did it? When you get a totally new design of a large, fancy unit, expect it to come in at a lower clock frequency than the old, refined version that has a more optimized layout and fewer gates consuming power. (Exception: if the layout is done by hand in the first place, e.g. the P4, you can get high frequency by design.)
    Also, if the design is the clock speed limiter, then changes that make the pipeline better balanced and/or give it more stages usually bring a frequency advantage, but in the case of gfx cards that's not the most feasible way.
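    Written out as a single relation, the scaling above is roughly the standard dynamic-power rule; the 2.28, 0.56, and 0.8 factors below are just this post's own rough figures (and they overlap somewhat), so treat the result as an order-of-magnitude sketch, not a measurement:

        P_{\mathrm{dyn}} \;\propto\; N \cdot C \cdot V^{2} \cdot f
        \qquad\Longrightarrow\qquad
        \frac{P_{\mathrm{GF3}}}{P_{\mathrm{GTS}}} \;\approx\;
        \underbrace{2.28}_{N\ \text{(transistors)}} \times
        \underbrace{0.56}_{C V^{2}\ \text{(shrink + lower voltage)}} \times
        \underbrace{0.8}_{f\ \text{(clock)}} \;\approx\; 1.0

    In other words, even at a 20% lower clock on the smaller process, the roughly 2.3x transistor count leaves the GF3 at GTS-class peak power or a bit above, which is the point being made.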
  • To those who think x watts may be too much to pay per gate: x isn't an integer... x is around 40*10^-9.
  • >That's nice, but who the fuck cares how much profit nVidia gets? I still spent $500. Is your point that I should feel sorry for them and that this is a justified cost regardless of the performance?

    Well, my point was that there is a reason for the high cost of these boards other than corporate greed. They charge what they can get, but that's nothing compared to what they make selling high-volume chips. This chip is for those who are willing to pay that price. Nvidia makes chips for everyone: for those who are willing to pay more for their chips and for those who are not. The GF3 obviously targets, for now, the market that is willing to pay more. The performance... well, I know some chip design, and the clock frequency is exactly where I think a chip of that size should be. [Not everyone has cryo coolers for their gfx cards.]
    As for performance, the 2D numbers simply show that these are immature drivers, just like it's an immature chip. There is always a price to pay at the start of a new architecture; it's usually lower in price/perf than the older ones.
    Well, the card isn't made for you or me. It's made for the kids of rich people, at the beginning.
    You are not getting this for the same reason you wouldn't buy a 1.5GHz P4 with 256MB of RAM.
    You are not in the performance market (those users who are willing to pay twice as much to get a 20% increase in their favourite apps).
  • "This card shouldnt catch on quick becuase there isnt any software to take full advantage of it."
    "I think ill wait till the GeForce4 comes out becuase it looks as though the software section has a while to catch up."

    Am I the only one sensing irony here? :)

    Regards,
    infernix
  • This is a little depressing if you consider how much ass the NV20 was envisioned to be able to kick when it was first announced. Too often the shipping products aren't quite the revolution they seemed in that initial glowing press release. Yes, it sounds great to be able to get higher framerates at the higher resolutions with better picture quality. To that extent this is a successful upgrade. But as the article mentions, NVidia will be getting ~$500 for these things. We should be able to expect an all-around ass-whooping of the GF2 Ultra, and especially the Radeon, at that price.
  • I think you got this totally wrong... always remember that there are more eaters of this pie than nVidia.

    That's nice, but who the fuck cares how much profit nVidia gets? I still spent $500. Is your point that I should feel sorry for them and that this is a justified cost regardless of the performance? No, I think your point is that I should have said "People will be paying ~$500 for these things" instead of 'nVidia will be getting ~$500 for these things". Thanks for the lesson in nitpicking.
  • Well, the card isn't made for you or me. It's made for the kids of rich people, at the beginning.

    Good point there. I'm definitely not eager to spend that kind of Ching, but luckily there are some people out there who will. I still think the thing should be resoundingly better to cost almost twice as much. Similar to the P4, it seems that both are positioned as hedges against future developments. "Just wait till 'X' is optimized or 'X, Y, & Z' games ship".

"What man has done, man can aspire to do." -- Jerry Pournelle, about space flight

Working...