Graphics Software

The Good Old Days of 3Dfx

Fosters writes: "There's a short story about the old days of the 3D graphics world, when 3dfx were the kings of 3D, and how things have changed in today's industry. The author talks about how that came about, albeit somewhat light-heartedly. This quote sums it up: 'To this day, I truly believe that this was the turning point for 3dfx and their Super-G downhill slide (that's a winter Olympics event). And it wasn't because of some fancy technology, a military leader (depends on how you look at some of the former VPs at 3dfx), or even a drill instructor named Zim (yeah, Starship Troopers - too easy). 3dfx started to lose their fan-boys and early technology adopters to NVIDIA then, who were waiting and watching, as "Bugs" do, with the TNT, TNT2, and something more than "22 bits" of color.'"

  • That doesn't mean jack. Glide is dead, especially when their next card comes out.
  • Yeah, and the same hardware developers will tell you that a GeForce on a really fast computer, with hardware T&L turned on, will be slower than with T&L turned off, because it becomes a limiting factor. 3dfx has always been hesitant to put tech in until they think it can be done right, and as far as I can see, they've always been pretty correct. And antialiasing is the way of the future. That's why, for example, Dreamcast DOA2 looks so much better than the PS2 version; no jaggies. Fewer polygons, sure, but it still looks so much better....
  • umm, nVidia?
    I just downloaded their newest source and I counted 3 switch statements in the entire source.
    And they have direct support for the KX133 and KT133 finally. Can't wait until I COMPILE this and install it from the SOURCE direct from

    ftp://ftp1.detonator.nvidia.com/pub/drivers/english/XFree86_40/0.9-5/NVIDIA_kernel-0.9-5.tar.gz


    Sanchi

  • You're definitely an FSAA addict; an addiction worse than being a girl-drink drunk. You obviously forget that all GeForce owners like to have >30fps in all situations. The only real reason why FSAA is there is to help in screenshots; FSAA's antics can really fudge up a game which uses detailed sprites (take a look at Half-Life and Soldier of Fortune under FSAA; I thought it was downright ghastly).

    To sum up, I'd rather have the jaggies so I can gauge movement and angles than have the "blurries," which serve no other purpose than lying to your eyes.

  • Who's blaming 3DFX? I'm totally blaming my case manufacturer..... Did you read the DIE COMPAQ, DIE DIE part???

    :)

  • Try the fish [altavista.com]. And if you can figure out what language the article was written in please let me know.
  • Well, this is true only if the entire CPU is calculating T&L. The whole point of offloading it to the graphics card is that it frees up a rather mundane graphics task so that the CPU can do other things like AI routines.
  • A while ago I walked into a computer store to buy a 3D card. On the shelf was the 3dfx Voodoo 3 and some card with the NVIDIA TNT2 chipset. I bought the NVIDIA card because it was about $30.00 cheaper than the Voodoo, but I found it was not what they told me it would be.

    The game that I bought with it, which I was assured would work with the card, only worked after about three patches. Even then it crashed when the really high-level graphics had to be loaded. Not much other software supported it either. Even the game that came with it only worked about a third of the time.

    The end result? I traded in the card for a Voodoo 3 a few days later. I couldn't be happier. It draws great graphics and runs every piece of 3D software I have come across...including the software that came with it.

    The Voodoo 3 may not have all the features of the TNT2, but at least I can be sure that it will run most of the software that I have without hassle. That is also why 3dfx can continue along with inferior technology...because it is compatible.

  • Don't forget that the Voodoo3 was limited to 256x256 textures while the TNT2 supported 2048x2048-pixel textures. While this didn't come into play nearly as much as the 16-bit color and 16 megs of memory, it was still a stupid move by 3dfx (rough numbers in the sketch after this comment).

    I was always entertained by 3dfx's late notion of including support for 22.5 bit colour, as a halfway point between 16 and 32. I don't remember how they were going to get half a bit, so if someone else knows the details, I'd be most interested.
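
    As a rough illustration of the texture-size gap above, here is a minimal sketch of the arithmetic, assuming uncompressed textures and ignoring mipmaps and padding:

    def texture_bytes(width, height, bits_per_texel):
        # Size of a single uncompressed texture, in bytes.
        return width * height * bits_per_texel // 8

    print(f"256x256 @ 16 bpp:   {texture_bytes(256, 256, 16) / 1024:.0f} KB")     # ~128 KB
    print(f"2048x2048 @ 16 bpp: {texture_bytes(2048, 2048, 16) / 2**20:.0f} MB")  # ~8 MB
    print(f"2048x2048 @ 32 bpp: {texture_bytes(2048, 2048, 32) / 2**20:.0f} MB")  # ~16 MB
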
  • It takes a leader with a real inferiority complex (like Bill Gates's) to keep up the fight after all the competitors have been slain, and it takes a lot of cojones to squash all future competitors before they can rise to the challenge.

    Creative seems to be doing a pretty good job of this lately. Sunk their competition into bankruptcy with frivolous litigation and then swooped in to buy up the pieces. Which is really a shame because Aureal arguably had better technology.

    I'll tip my hat to 3dfx. They may be losing the war, but at least they've been mostly fighting a fair fight. Companies like Creative, or even Intel piss me off because they are on top and complacent, but they stay there through marketing and lawsuits, rather than technical merit.

    At least AMD processors have fared better than the Aureal cards :/

    - Just another AC

  • by RISCy Business ( 27981 ) on Sunday September 24, 2000 @09:28PM (#757330) Homepage
    Well, they're both making chipsets. Wow.

    3Dfx makes cards. nVidia does not - they simply make chips and sell them to manufacturers to make the card.

    They both claim to be better than the other. Now, gee, am I the only one seeing atypical competition there?

    3Dfx wants to innovate - i.e., use a stagnant chipset for a base and make a better design. nVidia wants to go faster - i.e., redesign every 6 to 12 months to milk out another 3FPS on systems that aren't already running into bus limitations.

    Regardless, I won't buy either. Why? Because they both claim OpenGL support. Now, I don't know about you folks, but seeing as I work in AutoCAD frequently, that means hardware support. Neither of them has it. ATI doesn't. #9 didn't. Why? Most of them view software as the future.

    The mistake both nVidia and 3Dfx are making is that they're trying to take on the world when they don't have the staff, technology, or know-how to do it. I've seen the Voodoo3, the Voodoo5, the TNT2, the GeForce MX, etcetera. The 2D quality, quite honestly, SUCKS. I've seen better 2D rendering in a blender. And I'm not talking a video blender, I'm talking the one in my kitchen. The refresh rates are for the most part inadequate for professional graphics work, the 2D image quality is abysmal at best, and considering I'm hitting bus limitations, I don't see how the extra 30FPS it's capable of in the chipset is going to help any.

    I remember back about a year or two ago, I had a 3DLabs Permedia2 8M PCI card and the Voodoo had just come out. My friend snagged one, I scoffed. I said 'ha! My card already does all that stuff.' His 8M Voodoo continuously and routinely got smoked by my Permedia2 when it came to games. QuakeGL, Final Fantasy 7, etcetera. My Voodoo2s (2x 12M - for Unreal Tournament, there is no other option) combined with my #9 RevolutionIV 32M are still VERY hard pressed to beat the Permedia2 in any number of tests.

    Now if you go read AnandTech sometime, you'll note that a lot of the cards these days - at least the gaming cards - are getting *OBSCENE* FPS rates. >70FPS. And when you pair the big names - nVidia, 3Dfx, Matrox, ATI, etc - up in, say, a pIII 733 or whatever it is, they ALL get the SAME RATE at 640x480x whatever depth. Why? The bus is full. Can't push any more data than that. And what are these companies doing about it?

    3Dfx is adding an external power supply for all the active cooling you're going to need just to run the Voodoo6 at normal speeds. nVidia is gleefully ignoring it and boasting a faster and bigger chip. ATI's touting more and more memory. Now, bear in mind, if the chipset doesn't use the memory for ZBuffering (mind you, not true ZBuffering), 64M of DDR SDRAM is doing you no good - 1800x1440 only needs around 14M or 16M IIRC (rough arithmetic in the sketch at the end of this comment). The companies are putting memory on the cards to make them look bigger, perhaps perform a bit better, and ignoring the core problems.

    Quite frankly, I couldn't care less whether or not 3Dfx is 'stagnating' or nVidia is 'amazing' or what have you. I need a card that works with and around bus limitations, that can do 2D and hardware OpenGL, that can do what I need. I don't buy mass-marketed cards because unlike the Permedia2, which *was* mass marketed (Diamond FireGL 1000 Pro (PCI and AGP)) and an excellent card, today's cards are the equivalent of junk for me.

    What do I use? Well, now instead of putting multiple cards in a single system, I'm stuck using top end cards. We're talking cards that cost more than your typical PC and more than a well configured laptop in some cases.

    I just purchased, much against my desires but in tune with my *needs*, a $4,200 Wildcat 4210 graphics adapter. What is it? Dual pipeline. Dual head and a few more outputs. 90 Hz at 1824x1368. One AGP Pro 110 and two PCI connectors. All on a single card. That requires 110W of power. Wildcat was just bought by 3DLabs, the name in 'affordable' cards. (The Permedia3 is affordable, but not enough for what I do.) I was forced into spending more for a single video card than I spent on the entire system. ($2,935 for the curious. I reused the 18G SCSI-UW disks and controller.)

    Now maybe some of you don't have this problem. Actually, I'm betting most of you don't. But those of us who don't actually do this for a living, per se, but need the hardware anyway (I use AutoCAD for various engine modification work on a very regular basis), are getting screwed by the dick wars between 3Dfx and company. It used to be that I could do just fine with a happy Permedia2 and AutoCAD R14. Then it was a #9 Revolution IV 32M. I went to go buy something with excellent 2D quality that could perform better than the #9 Revolution IV and found out that nothing does. If I want 2D, I have to go Matrox, which doesn't perform terribly well under AutoCAD R14 or AC2K. If I want real rendering performance, I have to go up to the professional cards, which I really didn't want to do. Now maybe the 4210 was overkill, but quite frankly, any of these cards is a pain to find and order. I could have probably gotten a 3DLabs Oxygen GVX420, but they also made the mistake of ignoring bus issues, and boom. The card ends up limited by the bus, performing really not all that much better than the other options. Just with ZBuffering and a $2300 pricetag. A single AGP/PCI combination (yes, two connectors, two PCBs) still runs into bus limits before the card hits its own.

    I don't know about you, but I really feel cheated.

    Maybe I'll just put the Wildcat on eBay. Bidding starts at $1.

    =RISCy Business
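
    A rough sketch of the frame-buffer arithmetic behind the 1800x1440 figure mentioned above, assuming double buffering plus a Z-buffer at the same depth (texture storage and alignment are ignored):

    def framebuffer_mb(width, height, bits_per_pixel, buffers=3):
        # front buffer + back buffer + Z-buffer by default, all at the same bit depth
        return width * height * (bits_per_pixel // 8) * buffers / 2**20

    print(f"1800x1440, 16-bit: {framebuffer_mb(1800, 1440, 16):.1f} MB")  # ~14.8 MB
    print(f"1800x1440, 32-bit: {framebuffer_mb(1800, 1440, 32):.1f} MB")  # ~29.7 MB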

  • OK... 3dfx is great.

    Ranessin
  • "How is it that, in January, I bought a dang-fast TNT2 for $60, while the Voodoo2, a slower card, sold for over $100 everywhere I looked? Simple - the different board manufacturers compete with each other" The Voodoo 2 WAS sold by different board manufacturers, and you can still find V2's by Creative and Diamond in some places.
  • Geforce DDR myself. (Former 3dfx user.) I don't see myself going back anytime soon.
  • Have you ever looked at a recent nVidia-based card? Come back and tell me you still think 3dfx is great.
  • Anyone can sell information these days. It's nothing special, anymore -- the internet has lowered the price/datum ratio to all-time levels. To succeed, you have to add value, and what easier value to add is there than verve? It boosts your credibility, and it especially raises your stature in markets that cater to other smug assholes (no offense to 3d gamerz, but that's the generalization).
  • What is 3dfx doing wrong?

    Look at their stock price and earnings for the last year or so. Where is the Voodoo5-6000 that was announced last year and still doesn't have a release date?

    And their next-gen card, code-named RAMPAGE, has been in development for 3.5 years with $50 million in R&D, and it doesn't have a release date.

  • by Anonymous Coward
    "And, so far, only they and Matrox have opened up all their specs. (3dfx was first, though)"

    Incorrect. Matrox provided all the info needed for Utah-GLX to write 3d Drivers for their cards long before 3dfx released any information at all. 3dfx did have binary x86 only Glide drivers for linux for a while, which is probably what confused you.

    3dfx didn't get open source friendly until it became clear that nvidia was kicking their ass on the high end. product differentiation is the name of the game for the also-rans.
  • by Phokus ( 192971 ) on Sunday September 24, 2000 @05:34PM (#757338)
    "The King is dead, long live the king!"

    OK, well, at least the first part of that cliché is true. The turning point of 3dfx is quite ironic though. After Nvidia came out with the TNT (which was the first consumer chip to have 32-bit color), and both Nvidia and 3dfx began revealing their plans for their next generation chips, we were all surprised by 3dfx's arrogance (or naiveté) when they announced they would not include 32-bit color.

    "Speed is King"

    That was 3dfx's response when everyone questioned their next product, the Voodoo 3. Alright, so both products were relatively comparable in speed, and 3dfx still had some clout with Glide games (Tribes, anyone?). Lessons (not) Learned?

    Now what has happened? Nvidia introduces hardware T&L on their chips, but the controversy is, who needs hardware T&L when there are no games supporting it (back then, that is)? 3dfx yet again rested on their laurels and decided to let Nvidia introduce it in their products first. While it's quite true that HW T&L was not really important back then, Nvidia was smart and marketed it as the next revolution in 3d acceleration. I mean, who wouldn't want to remove the CPU bottleneck and let the graphics card handle most of the 3d rendering? Sadly, 3dfx was only able to say "see? We have 32 bit now!"

    The irony that is 3dfx

    Now what do we have with the Voodoo 5? Still, 3dfx REFUSES to incorporate onboard T&L when it's becoming more and more apparent that it is important these days. However, now 3dfx is ditching their "speed is king" philosophy and is trying to be innovative with their anti-aliasing and T-Buffer technology. But it seems that Nvidia has learned from 3dfx's mistakes and has included anti-aliasing technology of their own.

    The road ahead?

    After all these mistakes, has 3dfx learned their lesson yet? Who knows, perhaps 3dfx was right all along about not needing HW T&L right now (still, not many popular games support it) and they may very well outdo Nvidia with the release of the Napalm. But you have to admit, Nvidia played the marketing card extremely well with their Geforce cards, even if T&L wasn't really useful at the time. I pray to God that 3dfx will get their feet back on the ground, otherwise we may see another monopoly in the computing industry.

  • Does anyone else think 3DFX started on the downhill run around the time of the Voodoo2/Voodoo Banshee series?
    Personally, I love my little Voodoo2....the only issue of course being limited to a resolution of 800x600. Even with overclocking, there's only so much you can do with the card.
    Before the Voodoo2, 3DFX chips were blowing people away...the Voodoo2 came along and, for a short period of time, was amazing. I remember playing Quake 2 on my Voodoo2 and being told that my CPU wasn't fast enough to make the Voodoo2 really perform properly. I stuck a Voodoo2 inside a box with a better processor, and while it was nice, the limitations it was bound by prevented me from going out and grabbing a Voodoo3.
    Nowadays, people want high framerates and DVD decoding (or so it seems). Every box I pick up these days says "DVD decoding". I have a GeForce 256 32mb DDR and lemme tell you, no matter which drivers I use, the DVD decoding SUCKS. Maybe my system is holding it back, but honestly....For gaming, I choose NVIDIA....for DVD I pick ATI....wouldn't it be nice for someone to throw together an MB with 2 AGP ports, perhaps one for an addon card?

    As long as 3dfx keeps up with NVIDIA and ATI as far as releases, they'll do fine. As far as Linux support, 3DFX has easily provided the quickest and easiest. While this may change in the future, if they keep catering to the geek community in this way, 3dfx will most likely make some kind of comeback.

  • I'm in the UK - no idea if it's different. Cheers though :)
  • 1. Having the same scores at 640x480 means that the app is CPU limited.
    2. The Permedia1 and 2 cards got smoked by a Voodoo1 in any game. I know because I had a FireGL 1000 Pro. It sucked in GLQuake, hardly pushing 15-20 fps at 640x480.
    3. I wonder what your Wildcat can do about the (maximum) 1 GB/sec speed of the FSB in your system...
  • The game that I bought with it, which I was assured would work with the card, only worked after about three patches. Even then it crashed when the really high-level graphics had to be loaded.

    If it's a bad game, it's a bad game; the card you're using can't fix bugs in the program. Sin will crash whether you're running a Voodoo card or one from nVidia.

    The Voodoo 3 may not have all the features of the TNT2, but at least I can be sure that it will run most of the software that I have without hassle. That is also why 3dfx can continue along with inferior technology...because it is compatible.

    The only "compatibility" that the Voodoo has over the TNT2 is glide, which is dying. Even 3dfx have admitted so, and will concentrate on Direct3d and OpenGL, which all games are now designed around anyway.
  • by Anonymous Coward
    You have some good points, but also made some ill-informed comments.

    While it is true that most consumer cards have OpenGL drivers that are...lacking...for professional work, you completely miss the point of having more memory on graphics cards: textures. Texture thrashing can be a significant problem if a level in, say, an FPS uses a lot of textures, or high-res textures. By keeping them on the card rather than having to continually transfer them over the bus, performance is increased considerably. This is also why texture compression, IMHO, is something that all game developers and card/chipset manufacturers need to support. This is a reaction to the limitations of the system bus, so your complaint about card/chipset manufacturers seems a little unfounded.

    And, not speaking as a graphics professional, I find today's cards' 2D performance to be adequate -- decent image quality and insane refresh rate support mean no eyestrain for me. Dunno what your standards are, but again, I'm no professional hehehe.
  • Bought Deus Ex and have a GeForce 2? Gotta wait for the patch. Same goes for many games. I want NVidia to get this right, but they're obviously focusing elsewhere.

    Nothing's wrong with the Geforce 2 or its drivers, the problem is with the game. Deus Ex is based on the Unreal engine, which is glide-centric and was glide-only when it first came out.

    Try playing Deus Ex on a Voodoo under Direct3d or OpenGL and you'd have the same problems.

  • Get a sense of humor.

    Click away; I've got tons more karma than you'll ever have.
  • With the Detonator 3 drivers, I get 62 fps in Quake 3 at fastest settings at 640x480. If I crank it up to high quality I only lose 7 fps.

    And this is on a 450 celeron with a TNT2 Ultra.
  • John Carmack said:
    > All of the modern cards have full rasterization
    > support for OpenGL, but I guess you are referring
    > to geometry acceleration.
    [and]
    > Today, there isn't a single combination of
    > rendering attributes that will let a wildcat
    > out-rasterize a GeForce2.

    What is the case with antialiased lines? Both 3D and 2D? These were once considered a 'professional' level feature. Are these now accelerated in consumer cards?

    --
    - Antony Suter (antony@mira.net) "Examiner" openpgp:71ADFC87
    - "And how do you store the nuclear equivalent of the universal solvent?"

  • Ah, the screams of those on the bleeding edge of technology. Well, I'm glad you're out there, dumping money into R&D with the purchase of $4,200 graphics cards; I'm sure it will benefit me someday.

    Quite frankly, if your current needs are a dual pipeline/dual head display with 90Hz at 1824x_whatever_, you should be paying through the nose for it. I think you're stuck not being able to make a distinction between home systems and true workstations. Kind of like a GT driver complaining about the stock brakes on an Audi. The Voodoo/ATI/NVIDIA chipsets you mention are clearly for the home market.

    You mention your Permedia card "smoking" a Voodoo2. That's splendid. I hope your friend discovered the price of your card and had a good laugh. Bleeding edge technology is always a lot of fun, especially if you can afford/justify it. Spend the extra money if you need to, but don't tell me that graphics card wars that allow me to purchase a GeForce SDR (I know, you wouldn't touch it) for ~$130 are a waste of time. I'm loving it.

    On a technical note, at 640x480 you do see filling of the PCI/AGP bus. I know you don't work at those resolutions, and I don't play at them. At anything above 800x600, the bus ceases to be the limiting factor.

    RC (I bet you use RDRAM, don't you?)
  • by John Carmack ( 101025 ) on Monday September 25, 2000 @09:00AM (#757349)
    >Q. After reading the voodooextreme interview, it sounds like you are pursuing an almost completely different rendering pass/phases with Doom 3. Can you give us any more details? :-)

    It adds up to lots and lots of passes. I am expecting the total overdraw to average around 30x when you have all features enabled.

    >Q. Could you give us your thoughts on T&L? Why does 3Dfx say it's not important?

    Contrary to some comments here, 3dfx didn't just "decide not to put in T&L"; they didn't have that option. Product development cycles can take years, and you can't just change your mind at the end.

    They don't have it, so naturally they downplay the importance of it.

    John Carmack

  • by John Carmack ( 101025 ) on Sunday September 24, 2000 @10:44PM (#757350)
    Actually, even the original Verite V1000 could do 32 bit color rendering.

    At a whopping 6 mpix or so...

    Rendition did a lot of things right, even on their very first card. They had all the blend modes and texture environments from the very beginning. The only thing they didn't have was per-pixel mip-mapping.

    If they had delivered the V2xx series on time, they could have had a strong foothold before the Voodoo2. The V3xx series would have been a solid TNT competitor, but again, it wasn't ready on time. They wound up ditching that entire generation.

    John Carmack
  • NVidia does this so that they don't have to maintain multiple driver cores. They use the same core set of code for all OS's and all products, from the TNT to the GeForce2.
  • Okay, maybe if you have a slower processor T&L can help out quite a bit. But as it stands, the current crop of T&L units found in nVidia and ATi products aren't a whole heck of a lot faster than letting a p3/athlon do it with their extra instruction sets (the S3 attempt was a heck of a lot slower).

    In some situations the on-board T&L unit can actually slow things down. A considerable amount of data must be sent to and from the card when using hardware T&L. If you're already bouncing off the memory bandwidth limitation of your video card (1600x1200x32 anyone?) then you'll see little or no advantage to hardware T&L (a rough bandwidth estimate follows this comment).

    Of course I think hardware T&L units are generally a Good Thing. They're just a bit over-hyped IMHO.
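
    A minimal sketch of the bandwidth point above, using assumed round numbers (about 32 bytes per vertex, no vertex reuse, and an AGP 4x peak of roughly 1 GB/s):

    TRIANGLES_PER_FRAME = 100_000   # assumed scene complexity
    VERTS_PER_TRIANGLE = 3          # worst case: no strips, no index reuse
    BYTES_PER_VERTEX = 32           # position + normal + texture coordinates, roughly
    FPS = 60
    AGP_4X_PEAK = 1_066_000_000     # ~1 GB/s theoretical peak, in bytes/s

    traffic = TRIANGLES_PER_FRAME * VERTS_PER_TRIANGLE * BYTES_PER_VERTEX * FPS
    print(f"vertex traffic: {traffic / 1e6:.0f} MB/s "
          f"(~{100 * traffic / AGP_4X_PEAK:.0f}% of AGP 4x peak)")
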
  • by Anonymous Coward
    Ontopic?
  • Considering the fact that even now, even the fastest 3D cards give poor performance in 32-bit color, and considering that it doesn't make much difference when your textures are compressed to death ... I wonder if they are so wrong stating that 32-bit color is useless?
  • by bonzoesc ( 155812 ) on Sunday September 24, 2000 @04:33PM (#757355) Homepage
    3dfx didn't just get mercilessly ripped apart by nVidia - they let it happen. When the TNT2, supposedly the first consumer chip faster than Voodoo2 SLI, came out, sporting 32-bit color and support for over 32 megs of video ram, 3dfx countered months later with the Voodoo3, which had the 16-bit color and 16-megs-of-ram limitation of the Voodoo2. That was really the first nail in the coffin of 3dfx. Their release schedule can also be blamed, because nVidia has a new chip out every year or so, and sells them to many different board manufacturers, causing competition. 3dfx makes their own boards, just like ATI.

    How is it that, in January, I bought a dang-fast TNT2 for $60, while the Voodoo2, a slower card, sold for over $100 everywhere I looked? Simple - the different board manufacturers compete with each other, trying to sell their TNT2 board over somebody else's. The 3dfx board manufacturer just tries to sell their boards to Voodoo zealots, who are, for the most part, GeForce believers now.

    Tell me what makes you so afraid
    Of all those people you say you hate

  • by John Carmack ( 101025 ) on Sunday September 24, 2000 @11:21PM (#757356)

    >Regardless, I won't buy either. Why? Because
    >they both claim OpenGL support. Now, I don't
    >know about you folks, but seeing as I work in
    >AutoCAD frequently, that means hardware support.
    >Neither of them have it. ATI doesn't. #9 didn't.
    >Why? Most of them view software as the future.

    All of the modern cards have full rasterization support for OpenGL, but I guess you are referring to geometry acceleration.

    The situation has changed since you last looked at it.

    The Nvidia GeForce cards have an extremely capable geometry accelerator, and they have the ability to fetch display lists either over AGP with a large bandwidth savings due to vertex reuse, or store the display lists completely in local memory to remove all vertex traffic from the bus.

    The issue with professional OpenGL support has mostly been the focus of the driver writers, not the hardware. I think that Nvidia's partnering with ELSA to work on professional app certification with the Nvidia hardware was an extremely good move.

    There are a few edges that the expensive professional boards still have over the nvidia consumer cards, but not many:

    You can get more total memory, like a 32mb framebuffer and 64mb texture memory configuration. We will probably see workstation graphics systems with up to a gig of memory within a year. Consumer cards will offer 128mb next year, but the workstation cards can easily maintain an advantage there.

    This has a cost, though: large, expandable memory subsystems can't be clocked as high as the single-option, short trace layouts that nvidia does. Even dual pipe workstation boards can't match the memory bandwidth of a GeForce2.

    You generally get better high end DACs and shielding on workstation boards. The upper end of the consumer boards will do the high numbers, but it just isn't as clean of a signal.

    Dual monitor has been supported much better on the workstation boards. This is starting to become a feature on consumer boards, which is welcome.

    The consumer cards are still skimping on iterator precision bits. Under demanding conditions, like very large magnitude texcoord values stretched a small increment across a large number of pixels, you can see many consumer cards start getting fuzzy texel edges while the workstation cards still look rock solid.

    Probably the most noticeable case is in edge rasterization, where some workstation cards are so good that you don't usually notice T-Junction cracks in your data, while the consumer cards have them stand out all over the place.

    Next year's consumer cards should fix that.

    When the consumer cards first started posting fill rate numbers higher than the professional boards, it was mostly a lie. They got impressive numbers at 640x480 in 16 bit color, without blending, depth buffering, and filtering, but if you turned on 32 bit, depth, blend, trilinear, and ran at high res, they could fall to 1/4 or less of the quoted value.

    Today, there isn't a single combination of rendering attributes that will let a wildcat out-rasterize a GeForce2.

    Wildcat was supposed to offer huge 8 and 16 way scalability that would offset that, but it doesn't look like it is coming any time soon.

    The workstation vendors do stupid driver tricks to make CDRS go faster, while consumer vendors do stupid driver tricks to make Q3 go faster.

    We bought three generations of intergraph/intense3D products, but the last generation (initial wildcat) was a mistake. We use nvidia boards for both professional work and gaming now. I still think the professional boards are a bit more stable, but they fell behind in other features, especially fill rate. Being an order of magnitude cheaper doesn't directly factor into our decisions, but it would for most people.

    John Carmack

  • by bbay ( 192854 )
    Holy cow. It's like they ran the article through Babelfish several times in a row. I've seen better grammar from Dr. Sbaitso.
  • No, the voodoo 2 was 3dfx's most successful card and it's also the card that most people associate with 3dfx's success. ANYONE who wanted good 3d graphics bought one, even though they were somewhat pricey. The Banshee is the product that started giving the consumer doubts about 3dfx's future though.
  • Slightly off topic, but..... Has anyone else noticed that 3DFX cards seem to be a lot longer than any other GFX cards?
    I mean, my Voodoo2 barely fit into my last CPU (DIE COMPAQ, DIE DIE!), and I mean, with a little work (read: bending) I managed to get the card in. I haven't had the problem since...but then, today, reading the posts, I saw some shots of the Voodoo 5....and it just astonished me that the cards don't really seem to be case friendly....just from looking at it, I don't think it would fit into my case, and I ain't bending a $500 card just so it fits in there.....
    I wonder if this has some crazy effect on our buying habits....

    Or I'm just nuts :)

    (Check out http://www.mooshware.com/images/3dfx/V5-6000.jpg to see what I mean)

  • by Anonymous Coward
    The argument can also be made that the reason there aren't many games that REALLY support hardware T&L is that it was not adopted by both NVIDIA and 3dfx.

    You need to figure about two years after a feature set is first broadly introduced for games to broadly accept it (because of the turnaround time in game development).

    If 3dfx had even a weak T&L processor in the Voodoo3 3500 and later cards, that would have signaled the go-ahead to game developers to push hardware T&L in their games right then and there.

    Unfortunately, it's not going to be until 2003 that you start to see games that REALLY take advantage of hardware T&L, not only using it to push optional higher polygon renderings faster, but to load balance between rendering on the video card and more advanced AI, physics, and simulation models on the CPU.

    THAT is the cost of 3dfx's arrogance this time... 32-bit color was a minor stumble compared to the difference between 100,000+ polys and 10,000+ polys. FSAA is something that is cool, but will only really be fully taken advantage of when the rendering of the world looks less like Lego.
  • Above an average of about 40FPS, nobody notices anymore - they can't! As others have mentioned, the top end is probably closer to 60fps than 40.

    More important, though, is the headroom you get with a faster card. A game like Q3 has a standard deviation of about 7fps, which means that at a 40fps average over 15% of your frames are under 33fps, and about 3% are under 26fps. These are very noticeable slowdowns.

    At an 80fps mean, your standard deviation may jump to 14 fps (it's not a linear progression in real life, but for argument's sake...), 97% of your frames are at 52fps+, and 99.85% above 38fps. So it's smooth all the time, not just when you're standing around with nothing happening (the arithmetic is sketched after this comment).

    And that's why NVidia is still in business.
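
    A minimal sketch of that arithmetic, assuming frame rates follow a normal distribution (an idealization; real frame-time distributions are skewed):

    from math import erf, sqrt

    def fraction_below(threshold, mean, stddev):
        # Normal CDF: fraction of frames expected below `threshold` fps.
        return 0.5 * (1 + erf((threshold - mean) / (stddev * sqrt(2))))

    # 40 fps average, 7 fps standard deviation
    print(f"under 33 fps: {fraction_below(33, 40, 7):.1%}")   # ~15.9%
    print(f"under 26 fps: {fraction_below(26, 40, 7):.1%}")   # ~2.3%

    # 80 fps average, 14 fps standard deviation
    print(f"under 52 fps: {fraction_below(52, 80, 14):.1%}")  # ~2.3% (97.7% at 52fps+)
    print(f"under 38 fps: {fraction_below(38, 80, 14):.1%}")  # ~0.1%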

  • As for waiting for the "patch" for Deus Ex: it has been out almost as long as the game. It was the game engine at fault. Specifically, the engine had "texture thrashing" problems. Eidos has a new dll to download. Also, most games that I have seen that run better on a Voodoo than anything else do so because the game has been optimized for Glide and not anything else. Hence, if you run a game like Deus Ex on a Voodoo, but put it in Direct3D mode, you will get the same crappy frame rates that everyone gets with their Direct3D cards. So, this is an issue having more to do with the games than NVidia's drivers.
  • Stop using windows goddamnit!!!

    How would I look outside?
  • Same here. I only had 3dfx boards (all three of the Voodoos) until finally I broke down and bought myself a GeForce DDR. I had no idea what I was missing, and I'm not going back anytime soon either.
  • I prefer 3DFX's approach of improving quality over nVidia's raw-speed philosophy. The main reason to go with nVidia is an attempt at future-proofing. Graphics cards are still such a hotbed of development, though, that future-proofing probably isn't possible.
  • Rendition's not dead, it just looks that way...
    Rendition Revival [levitate.org]


    --------
  • Comment removed based on user account deletion
  • by Watts ( 3033 )
    I remember those early days of 3D gaming as I was able to bask in the glory of VQuake. It's too bad that Rendition has sort of faded out. The Verite chipset was actually fairly decent for the time but it got squashed by the juggernaut that was Voodoo.

    If anyone wants to read more about VQuake, the first 3D accelerated version of Quake, I'd suggest checking out the articles Michael Abrash wrote for Dr. Dobbs magazine, some of which were reprinted in his Graphics Programming Black Book. I believe that book is out of print now as well, though.

    I was lucky enough to see Abrash give a speech last year and was able to talk to him about some things, including VQuake. In retrospect, it's amazing what was done with such a low fill rate. :)
  • Virtually every reviewer benchmarks cards on Quake 3. The GeForce cards score highly here. They don't consider FSAA because it doesn't do much for Quake 3. Add in FSAA and GeForce performance plummets.

    I'm not really into twitch games, though I do enjoy a quick bash at Unreal now and then. I am into sims, and FSAA is a godsend. On a low-spec system like mine (dual Celeron 400), the V5 is much the better choice where FSAA is a must.
  • God, I was thinking exactly the same thing. It reminded me of the line from MST3K:

    "Filmed in Confuso-Vision for your viewing displeasure."

    I have rarely read a more incoherent article. It didn't actually SAY anything.

    One of the downsides of the web: most of us aren't lucky enough to have an editor. This guy needed one.

  • Nothing's wrong with the Geforce 2 or its drivers, the problem is with the game. Deus Ex is based on the Unreal engine, which is glide-centric and was glide-only when it first came out.

    It's not just Deus Ex. In general, the GeForce 2 drivers seem to be wackier than those for other cards. I've seen cases where installing newer drivers trashes your machine in weird ways--ways you wouldn't expect to be related to video card drivers. There's a good list of games with GeForce 2 trouble. Great card, but the drivers were released much too early. Don't let devotion to Nvidia blind you here.

    In case I look like a pro-3dfx zealot, I'm not happy with the recent Voodoo cards either. The power consumption is about 4x what it should be. Good grief.
  • The best card I ever bought was a Millennium 1. I am still using it in my OpenBSD box. I spent nearly $500 on it -- back when I didn't have much money -- and I used that card steadily for probably three years. When I bought it, it was the fastest DOS VGA card you could buy, and it was also damn fast in Windows. 3-D acceleration hadn't even been thought of yet.

    I haven't been afraid since to spend a lot of money on a video card, but I'm starting to re-evaluate that a bit. NVidia is moving SO fast with their cards that it's getting foolish to try to buy their top-of-the-line; in six months it will be half-assed at best. It has let them gain a lot of ground but it sure does shorten the life cycle of gfx cards. :(

    There's an upside to all the progress too, of course. I downloaded and ran that XL-R8R utility. Pardon my language, but F*CK, that is an impressive demo. I remember the old Amiga demo scene -- those guys would have shit their pants when they saw that (and probably still will). I wouldn't have believed it could be done live until I saw it. I figured graphics like that were another two or three years out -- WRONG. Wow. Recommended. (www.madonion.com)

  • He did mean in 3d. Having 32bit color in 2d is something I remember being able to do in the 2meg trident days. Maybe not a wise idea, but possible.
    treke
  • Last I heard Tribes 2 was in the "Linux isn't out of the question" zone. Dynamix wasn't committing to ever releasing a Tribes 2 Client, but it wasn't quite agreeing to one either. Hope it does happen though.
    treke
  • I just downloaded their newest source and I counted 3 switch statements in the entire source.
    And they have direct support for the KX133 and KT133 finally. Can't wait until I COMPILE this and install it from the SOURCE direct from
    ftp://ftp1.detonator.nvidia.com/pub/drivers/english/XFree86_40/0.9-5/NVIDIA_kernel-0.9-5.tar.gz

    Bahaha! What's that, the kernel source for their DRI/DRM ripoff? NVidia are most definitely keeping their X/GLX drivers closed. They need to release the kernel source because compiled kernel modules are tied so closely to the kernel version. It's easier just to release source and let people do a little compiling.

  • I don't know about you, but i'm more interested in what nader's running -- if you know what I mean.
  • I dunno about having to be rich... I got a WinFast GF2MX for $140, and I'm quite happy with the 40-70fps I get in Counter-Strike. Great performance for low cost is what is propelling nVidia into the territories formerly covered by 3dfx, and nVidia may soon be invading 3DLabs' and the chipset manufacturers' territories.

    Personally, I was hoping for a buyout of the Aureal assets by nVidia, but that's not happening. Although.... Word has it that nVidia "acquired" Aureal's techs. Whether it was sneaky and underhanded, I don't know, but it could bode well for the future.
  • Actually, the 32-bit controversy was almost as big a stumbling block for 3dfx as the T&L one, because almost EVERYONE heard about it. If you aren't the type of company that likes to be first in their industry, you're likely to fail. Even if the new feature isn't particularly useful, or isn't useful at that particular time, if the consumer sees it as being a big plus, you're going to be in deep trouble. I mean, 3dfx was almost like free marketing for Nvidia, because ALL the websites and magazines published their blunders in comparison to Nvidia's innovation.

    And don't worry about T&L not coming soon to games. While it is true that the only games that support it right now are the lesser known titles, we should see more mainstream titles that support it by the time Napalm rolls around. Hey, doesn't Soldier of Fortune support it? That's a good sign already...

  • ... and for good reason. Because 3dfx is basically reusing the same old architecture, and their processing chips cannot really go any faster than they already are, they decided to just slap on multiple chips in order to compensate for their lack of engineering. The Voodoo 5 has like 4 VSA-100 chips on it, I believe (correct me if I'm wrong).

    Of course, since you increase the number of processors, you also have to increase the amount of memory. That is why 3dfx's cards are getting bigger and bigger. However, with the release of Napalm, the card will be 'normal size' since it's an entirely new architecture.

  • No, 3dfx was smart to release FSAA. And still, it is the best and fastest implementation to date, because it is hardware, whereas nVidia used some kind of driver-trick to scale down the resolution, giving something that's slower and blurrier.

    Also, they never supported T-buffering, which may not be used much yet, but definitely proves to be great when implemented, unlike HW T&L, which sometimes actually lowered image quality.
  • But you must admit that, even in Q3, the benefit of HW T&L is minimal at best.
  • The article says it runs at 90 MHz, but in reality it's 100. This means it's slightly faster in single-texturing games.
    If it's a bad game, it's a bad game; the card you're using can't fix bugs in the program. Sin will crash whether you're running a Voodoo card or one from nVidia.

    I admit that poor code is poor code, but the fact is that the game was coded for both cards and only one worked. If the card properly worked with OpenGL as it claimed it did, then there is no reason the TNT2 should fail where the Voodoo didn't. Unless of course the TNT2 uses some other sort of OpenGL standard than everyone else...

  • considering 3DFX is seeing how many chips they can plug onto a board, I'd say _they_ are the ones trying to use raw power/speed to achieve goals.
  • happened when they went proprietary. Basically: They got greedy. This allowed an opening for Nvidia to swoop in and snap up that market. The other stuff, like 32bit color (which _is_ better) and such just made it easier.
  • If your fridge starts up while you're playing Q3A with this monster you might blow a fuse :)

    And how much lift do those fans generate? What if you opened your case and those fans just blew your computer on to its side?

  • I just downloaded their newest source and I counted 3 switch statements in the entire source.

    Why does it matter how many switch statements they used?
  • Fact? I don't know about your machine, but mine flies with my GeForce at 32-bit color. And there is no doubt it looks better. I used to be a 'nah, I don't need 32-bit color' person 'til I spent some time switching between them. A _lot_ of it depends on whether the game designers took the time to provide true 32-bit color maps. So what is wrong is that it isn't even _close_ to a fact that performance is poor. Incidentally, it also flew under my TNT & TNT2. Now I just use 1200x1024 instead of 1024x700. To suggest not using them because of performance just means you need a new machine.
  • The only "compatibility" that the Voodoo has over the TNT2 is glide, which is dying. Even 3dfx have admitted so, and will concentrate on Direct3d and OpenGL, which all games are now designed around anyway.

    People have been predicting the death of Glide for about three years now and it's still supported. Granted Q3A knocked a nail into its coffin by going OpenGL only, but it is still around, eg. UT, etc.

    Direct3d is still a poor cousin (maybe the upcoming DirectGL will be an improvement). OpenGL is improving now that it's actually being used.

    Frankly, the main reason I still buy 3dfx is a personal one, a matter of principle: I find it hard to support a company that condones whores like Tom Pabst. But that's just my opinion; someday I'll give nVidia a try, maybe.
  • I suspect 3DFX has a problem at the core of their chips that they are having a hard time working around. Why else would it take dual chips to even come close to keeping up with a GeForce2? (And it doesn't, by the latest reviews, though I suspect when the drivers get better it will be much closer.)

  • You need *one* libGL for all the DRI cards... This means one libGL for ATI Rage 128, Matrox G400, Intel 810, and 3dfx Voodoo3/4/5. This is the libGL shipped with XFree86 4.0.* and up. nVidia requires a completely different libGL.

    Ranessin
  • Should also mention that the V5 needs its own power supply (what a pain)
  • Fewer colours need less memory for the same resolution. At the time, any more than 16mb for 16-bit colour would be overkill, much like 64mb for 32-bit colour would be. Having owned a Voodoo2 and then a GeForce, though, 32-bit colour is a drastic improvement that they shouldn't have put off so long, IMHO.

    As for the 22.5bit colour, it's actually 16bit dithered to 32 (or close to... I think they lose a few bits for some reason). It's "22.5 bit" in name only.
    ---
    Where can the word be found, where can the word resound? Not here, there is not enough silence.
  • While yes, that is the kernel source, you can DL the GLX source from ftp://ftp1.detonator.nvidia.com/pub/drivers/english/XFree86_40/NVIDIA_GLX-0.9-5.tar.gz

    Nvidia has had the source open since they released 0.92. The default may be to install an RPM, but the tars have been there the whole time.

    The FTP site is down as of 0-dark-30, or else I would tell you the number of switch statements in that also.

    Sanchi
  • I don't even know how to respond to this, your video card drivers must be playing a horrible trick on you.
  • by Jett ( 135113 ) on Sunday September 24, 2000 @06:13PM (#757396)
    The TNT was NOT the first consumer card with 32-bit support. That would be the Rendition Verite 2100/2200 (it also had a pretty decent full OpenGL ICD).

    I agree 3dfx has lagged on implementing features, not counting the FSAA/T-Buffer deal in the VSA-100 generation, which is pretty cute. It would be nice if they led the market into new features rather than the other way around.

    I strongly believe that 3dfx is positioning themselves for a solid comeback. They bought out Gigapixel, which had some really great tech from everything I have read. Low-power, high-performance stuff, with very low transistor counts; they were a finalist in the X-box bid but lost out to Nvidia. The "Rampage" core has been in development for a very long time now; a huge amount of 3dfx's R&D budget has gone to developing it. Considering the resources they've thrown at it, I don't see any reason why it won't kick ass, unless they run into another component shortage (one of their biggest problems has been the RAM market).

    I think 3dfx is going one of two ways: either they release the first "Rampage" core product and it kicks all ass and the company bounces back, or they release the "Rampage" product and it doesn't do very well and they continue their downward spiral and are bought out by another company. I personally think the first is more likely, but then I own a little 3dfx stock, so I'm naturally a little optimistic.
  • That is !exactly! what I thought while reading the article. 95% babble and 5% relevant content.

    Someone needs to teach that guy the meaning of the word concise.
  • IIRC it has two VSA-100 chips on it, each with its own 32MB memory bank.

    (I've been debating a better video card than a Voodoo3 3000 16MB AGP for playing Counter-Strike, and have been checking out the Voodoo5 vs the GeForce 2 cards. I may have mixed up the details though, so correct me if I'm wrong.)

    eudas
  • Nobody seems to have mentioned what I believe is the real reason for this company's slide.

    3dfx used to sell their chips to OEMs, in much the same way nVidia does. But after the Banshee, they bought a video card maker, and announced that they would no longer sell their chips to OEMs. In my opinion, this was a blatant attempt to monopolise the 3D card market.

    Naturally, the video card makers had no alternative but to buy chips from other companies, or go out of business. This set up a situation of 3dfx vs the rest of the industry, and combined with nVidia's superior technology, the rest is history.

    Pride goeth before a fall, as they say.

  • I'd be very surprised if SOF supported T&L, considering that it uses the Q2 engine. I know MDK2 was the first game I heard about that supports T&L. Tribes 2 and Doom 3 will both support it... and they're both coming out for Linux too!! :-)
  • I agree! I had to reread most of the paragraphs 5 or more times before I could parse what it was the guy was trying to say. And even after that, there was nothing of interest to be had in the article.
  • >I was always entertained by 3dfx's late
    >notion of including support for 22.5 bit
    >colour, as a halfway point between 16 and
    >32. I don't remember how they were going
    >to get half a bit, so if someone else knows
    >the details, I'd be most interested.

    The colour depth is a technical value which describes the number of distinct colours that a card can produce. 16 bit equals 65,536 colours; 32 bit equals more than 16.7 million colours. The problem with these technical numbers is that they don't take into account what a human eye can distinguish. Visual perception is not linear. Users of early drivers for TNT cards used to have the problem of games showing up too dark. That is because games were optimized for Voodoo2 gamma values. Gamma is a number which describes the non-linearity of the digital-to-analogue conversion. The Voodoo2 had a gamma value which "wasted" fewer colours on the dark end of the scale and was better adapted to human vision. To get the number of distinguishable colours which can be put on the screen by a Voodoo2, a graphics card with the now more common TNT1 gamma would need 22.5 bits per pixel (a small numeric illustration follows this comment).
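
    A small numeric illustration of the gamma point above; the gamma values are assumed for illustration only, and this is not an attempt to reproduce the 22.5-bit figure:

    def to_linear(code, bits, gamma):
        # Linear-light intensity for an integer code under a simple power-law gamma.
        return (code / (2 ** bits - 1)) ** gamma

    # The darkest step (code 0 -> code 1) shrinks dramatically as gamma rises,
    # i.e. more of the code range is spent where the eye is most sensitive.
    for gamma in (1.0, 1.7, 2.2):
        step = to_linear(1, 16, gamma) - to_linear(0, 16, gamma)
        print(f"gamma {gamma}: darkest step = {step:.2e} of full scale")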

  • by TDSObeseWhale ( 221698 ) on Sunday September 24, 2000 @06:19PM (#757420) Homepage
    It is not infrequent that we see articles labeling gamers as "fanboys" of 3dfx or nVidia... Let me tell you right away that 90% of gamers are not "fanboys". Like it or not, most gamers will become fans of the company with the fastest across-the-board chip, which currently is nVidia with their GeForce2 Ultra (ATI took the title for a bit, but that is arguable due to ATI's humiliating performance with full-screen anti-aliasing). We gamers are not the strange breed many label us as; we don't tend to flock to a brand due to any sort of allegiance to a name, but rather we will end up flocking to the company with the best, fastest chip.

    Now, onto my main point. 3dfx has a very thin fanbase, and there is a clear reason behind it. 3dfx, time after time, again and again, has been making the same mistake of denying the future. We've heard the argument that the Voodoo3's lack of 32-bit color support (and the memory to handle that feature) is what started 3dfx on its downward spiral. Yet 3dfx has AGAIN made this mistake with their resistance towards using a hardware transform and lighting solution... Any game developer will tell you that hardware T&L is the way of the future, and 3dfx is shooting themselves in the foot. And what about the FSAA card? 3dfx did have a point when the Voodoo5 was released and creamed the GeForce 2 in antialiasing performance, but those days are over, and with new drivers, the GeForce 2 beats the Voodoo5 in its home territory, FSAA.

    So to conclude, we "nVidia fanboys" have reason behind what we believe in. As soon as 3dfx comes out with a better chip than nVidia, count me in on the 3dfx bandwagon. -Matt "ObeseWhale" Grinshpun
    -The Darker Sector [3dactionplanet.com]
    -Website coming soon! Team Corrosive Quake 3 mods.

  • Lisa "I recognize all three of those words but that statement doesn't make any sense."

    [sign on a closed-down movie theatre that reads: Yahoo Serious Festival]
  • by Bilestoad ( 60385 ) on Sunday September 24, 2000 @04:36PM (#757428)
    When the Voodoo II was current I was using a Matrox Millennium II with a Voodoo II - best of both worlds. The Matrox was an uncompromising professional card, laid out by engineers who understood analog and how to produce a sharp, sharp signal. The Voodoo was the best 3d around, of course.

    Today you get all in one cards, but a lot of them are still not as good as the Matrox was. It's trial and (expensive) error to get one that's as satisfying in a text editor as it is in Tomb Raider. Anyone care to name a combined card, maybe a GeForce II or Radeon, that is as sharp as it should be in the highest resolutions?

    The problem goes away when LCD monitors become affordable in 21/22" sizes, but I think that's at least a year away.
  • by Fervent ( 178271 ) on Sunday September 24, 2000 @04:38PM (#757430)
    Given the relatively quick turnarounds technology companies have, I don't feel inclined to put 3DFX down for the count. They were a pioneer (along with PowerVR) and pioneers aren't always rewarded in their time.

    In PowerVR's example, they no longer make PC cards in bulk, but their chipset is in use with the Dreamcast. It has shown to be surprisingly robust and has turned the Dreamcast from a dark horse into a system with some incredible games (Sonic Adventure, NFL 2K1, Jet Grind Radio, etc.)

    Same goes for ATI, which has been running in neutral for the last few years, then released the jaw-dropping Radeon this year.

    Point is, don't count 3DFX out yet. Their latest chipsets are nothing extraordinary, but a few engineers and some faith within their infrastructure might be enough to turn them around.

  • They open their source code. You guys know who to buy stuff from.
  • Nvidia pushes the limit with every new generation of card they make. 3dfx just sells an overclocked version of the original Voodoo card. They can best be compared to Intel and MS. MS has been selling Windows 95 under different names for the last five years with a little more every new generation. Intel has been doing the same with the Ppro core. Nvidia learned their lesson with the NV1 and came back looking for blood with the TNT. It has been downhill for 3dfx ever since.
  • by Froid ( 235187 ) on Sunday September 24, 2000 @04:46PM (#757441)
    Success generally breeds complacency, and even the most intelligent falter when they've finally reached the top and try to stay there. It takes a leader with a real inferiority complex (like Bill Gates's) to keep up the fight after all the competitors have been slain, and it takes a lot of cojones to squash all future competitors before they can rise to the challenge. Thankfully, many companies can recover after their pointy reckoning; look at the IBM of today as compared to the IBM of even ten years ago, and you'll see a company that's managed to extricate its head from its nether-regions after falling from the top. Will 3dfx be like IBM, or will they follow the likes of Polaroid (and Kodak, at least with their copiers)? That remains to be seen.
  • I recently upgraded from a Matrox Millennium G200... absolutely beautiful card. Wonderful 2D performance. I upgraded to a Creative Labs OEM TNT2, and it actually looks better! And that was before I bumped the refresh rate up from 85Hz to 120Hz... impossible on the Matrox at my resolution.

    In short, the Matrox was the best of the times. In fact, the best looking card I've seen was the Matrox G400.

    "Evil beware: I'm armed to the teeth and packing a hampster!"
  • Nvidia is clearly the graphics chip maker to shoot for now, obviously.


    What I _am_ concerned about is the proprietary infrastructure they've put in place in order to support direct rendering. Correct me if I'm wrong, but they aren't using the standard "stuff" that XFree provides but their own special infrastructure to support direct rendering. Their binary drivers have recently caught up in performance to their Windows cousins, but I'm always wary of having binary-only drivers. Want to upgrade to the newest kernel? Got the source? No problem! Binary only? You may have to wait for the vendor to release a version for that kernel...


    As for my next system, I will purchase a Matrox G450 w/ dualhead. They've even released driver source/binaries to support dualhead, etc. (link provided below)


    Matrox beta drivers [matrox.com]
  • Everyone talks about 3DFX and Nvidia from their vicarious marketing experience, which usually ends up being on the PC, because that's where people are using their systems. However, no one seems to realise that 3DFX supplied the graphics hardware for arcade machines as well. Does anyone know what is happening on that front?

    3DFX had a bigger market than the PC.

    As a side note, does anyone else think that Nvidia needs new spin doctors? What the hell is an 'N' Vidia, or a 'G' (e)force? It just sounds bad to me.

    Furthermore, what happens when we run out of high-testosterone burning, exploding and speedy names for video cards? :)
  • That must be the most obnoxious article I've read this year. I think it actually has more "Gee-I'm-clever" lines (NOT) than actual content.

    I read it, but I'm still not quite sure what it was about.


    --

  • My first 3d accelerator was a Voodoo 2, 8MB of RAM. I remember my reaction was, "oohh....I can't tell where the pixels are anymore. Graphics are good. I will never be without 3Dfx again."

    When the Voodoo 3 came out, I, being a loyal fan of 3Dfx, rushed out to buy one. At the same time, my friend upgraded his Voodoo 2 to a TNT2. I didn't notice much of a difference at all, except that I could now watch TV on my computer. However, my friend had more colors at a much faster rate than I could get on the Voodoo 3.

    When the next generation of chips began being released this summer, I was about to upgrade my computer. So I waited a month or two, and read the reviews of all of them. I really couldn't see anything that would make me choose the Voodoo 5. I really wanted to support 3Dfx, but I just couldn't justify that decision. I ended up upgrading to the ATI Radeon 32MB DDR, and I love it. I hope 3Dfx can get their act back together, as they are one of the more supportive companies toward open sourcers.

  • Only difference is ATI has always catered to OEM's who were looking for value instead of the highest frame rate in Quake II. They always made plenty of money in the OEM business. 3dfx decided to concentrate on the retail channel where only about 20% of all graphics cards are sold. Creative has been successful there because it makes good products. Lately 3dfx hasn't been in the good products category when compared to Nvidia. And now ATI which has long been a leader of mediocrity has them beat in performance. They aren't dead yet, but their next product better be pretty good. And it better cost less than $600 and not have to be plugged into an outlet. It's one thing to engineer a low end card and market it to the lower end. But 3dfx made the mistake of marketing low performing cards to the high end where every last frame counts.
  • He said "I'd hate to end on a negative (I'll use the Intel740 for that later)"

    What the heck was he talking about?

    He never came back to the i740 issue.

    Was he talking about the mediocre performance for dirt-cheap price? Was he talking about the immense headaches suffered by those who installed them in VIA or SIS chipset boards?

    Sure, the board took me 14 hours to get running with my VIA MVP3-based FIC 503+ board 3 years ago, and another 10 hours yesterday to get it running correctly with my new VIA KT133-based Abit KT7-RAID board, but the card cost me $38 three years ago when the ATI Expert series were going for over $100. Yeah, I didn't get quite the performance of the TNT, much less the TNT2, but the image quality is still excellent, and it'll do until Xmas when I get a GeForce MX-based card. (Good performance, very reasonable price at under $150 street.)

    Why did I post this? Well, I remember those days, and remember the extraordinary prices that have always been charged for the hot video cards of the day. And reminiscing about the "good ole days" is kind of fun... And I want to make sure, for anyone who scrounges up old hardware and didn't know: DON'T try to install an i740 board in a VIA or SIS based motherboard unless you have no other choice!

  • The irony of the death of 3dfx is that 3D cards in general have been in a downward slide for several years now, with the last great cards being based on the Voodoo 2 and Voodoo Banshee. The TNT came the following summer (and remember, we're only talking about 1998 here), and brought driver instability to new lows. Before the problems were ironed out, the TNT2 was released and continued NVidia's reputation as the poorest driver writers in the industry. The GeForce and GeForce 2 have the same general troubles. Bought Deus Ex and have a GeForce 2? Gotta wait for the patch. Same goes for many games. I want NVidia to get this right, but they're obviously focusing elsewhere.

    The downfall of 3DFX was the fanboy cry of "16 bits per pixel sucks!" which is something that was picked up from interviews with John Carmack. 16 bits per pixel *can* suck, depending on what you want to do. If you're doing half a dozen passes per triangle, then, yes, you need more color resolution. If you're not, then there's no issue. This is a good example of fanboy-oriented web journalism running amok and having real consequences.

    In truth, many developers, including myself, really like 3dfx cards. The drivers are rock solid. Glide is the most predictable 3D API. Yes, OpenGL, blah, blah, blah, but Glide is number one in terms of stability.
  • You didn't look very far, did you? I still see no source for Module-nvkernel, which is statically linked with the rest of the source. You have a little bit of glue code, nothing more. All you do by buying an nvidia card is help them roll back the gains free software has made. I'm not even a free software zealot, I just want software that doesn't suck. A kernel that crashes because of a closed driver is software that sucks.
  • 3Dfx obviously fell into a position of comfort and even arrogance as the #1 card maker, and it hurt them pretty badly. They started producing products that didn't live up to expected improvements, and had some delays that cost them market share.

    This let NVidia come out on top.

    Now I see a few minor delays in some NVidia products, and I'm concerned that NVidia might be starting to show the same kind of behavior 3Dfx did. NVidia needs to be careful to stay on top, and not let itself fall into the same pattern 3Dfx did.
