The Good Old Days of 3Dfx
Fosters writes: "There's a short story about the old days of the 3D graphics world, when 3dfx were kings of 3D, and how things have changed in today's industry. The author talks about how that came about, albeit somewhat light-heartedly. This sums it up, as the author says: 'To this day, I truly believe that this was the turning point for 3dfx and their Super G downhill slide (that's a winter Olympics event). And it wasn't because of some fancy technology, a military leader (depends on how you look at some of the former VPs at 3dfx), or even a drill instructor named Zim (yeah, Starship Troopers - too easy). 3dfx started to lose their fan-boys and early technology adopters to NVIDIA then, who were waiting and watching, as "Bugs" do, with TNT, TNT2 and something more than 22 bits of color.'"
Re:3dfx still has a chance (Score:1)
Re:The same mistakes, again and again (Score:2)
Re:3dfx still has a chance (Score:2)
I just downloaded their newest source and I counted 3 switch statements in the entire thing.
And they finally have direct support for the KX133 and KT133. Can't wait until I COMPILE this and install it from the SOURCE direct from
ftp://ftp1.detonator.nvidia.com/pub/drivers/eng
Sanchi
Re:Obviously, you love Adobe and Apple as well (Score:2)
To sum up, I'd rather have the jaggies so I can gauge movement and angles than have the "blurries," which serve no purpose other than lying to your eyes.
Re:Length.... (Score:1)
If you can't read ????? (Score:2)
Re:The same mistakes, again and again (Score:1)
The software support gives 3dfx the advantage. (Score:1)
A while ago I walked into a computer store to buy a 3D card. On the shelf were the 3dfx Voodoo 3 and a card with the NVIDIA TNT2 chipset. I bought the NVIDIA card because it was about $30.00 cheaper than the Voodoo, but it was not what they told me it would be.
The game that I bought with it, which I was assured would work with the card, only worked after about three patches. Even then it crashed whenever the higher-end graphics had to be loaded. Not much other software supported it either. Even the game that came with it only worked about a third of the time.
The end result? I traded in the card for a Voodoo 3 a few days later. I couldn't be happier. It draws great graphics and runs every piece of 3D software I have come across...including the software that came with it.
The Voodoo 3 may not have all the features of the TNT2, but at least I can be sure that it will run most of the software that I have without hassle. That is also why 3dfx can continue along with inferior technology...because it is compatible.
Re:Self-Inflicted Wound (Score:1)
I was always entertained by 3dfx's late notion of including support for 22.5 bit colour, as a halfway point between 16 and 32. I don't remember how they were going to get half a bit, so if someone else knows the details, I'd be most interested.
Re:No surprise; most wounds are self-inflicted (Score:2)
Creative seems to be doing a pretty good job of this lately. They sank their competition into bankruptcy with frivolous litigation and then swooped in to buy up the pieces. Which is really a shame, because Aureal arguably had better technology.
I'll tip my hat to 3dfx. They may be losing the war, but at least they've been mostly fighting a fair fight. Companies like Creative, or even Intel piss me off because they are on top and complacent, but they stay there through marketing and lawsuits, rather than technical merit.
At least AMD processors have fared better than the Aureal cards :/
- Just another AC
3Dfx vs nVidia vs The World (Score:5)
3Dfx makes cards. nVidia does not - they simply make chips and sell them to manufacturers to make the card.
They both claim to be better than the other. Now, gee, am I the only one seeing atypical competition there?
3Dfx wants to innovate - i.e., use a stagnant chipset as a base and make a better design. nVidia wants to go faster - i.e., redesign every 6 to 12 months to milk out another 3 FPS on systems that aren't already running into bus limitations.
Regardless, I won't buy either. Why? Because they both claim OpenGL support. Now, I don't know about you folks, but seeing as I work in AutoCAD frequently, that means hardware support. Neither of them has it. ATI doesn't. #9 didn't. Why? Most of them view software as the future.
The mistake both nVidia and 3Dfx are making is that they're trying to take on the world when they don't have the staff, technology, or know-how to do it. I've seen the Voodoo3, the Voodoo5, the TNT2, the GeForce MX, etcetera. The 2D quality, quite honestly, SUCKS. I've seen better 2D rendering in a blender. And I'm not talking a video blender, I'm talking the one in my kitchen. The refresh rates are for the most part inadequate for professional graphics work, the 2D image quality is abysmal at best, and considering I'm hitting bus limitations, I don't see how the extra 30 FPS the chipset is capable of is going to help any.
I remember back about a year or two ago: I had a 3DLabs Permedia2 8M PCI card and the Voodoo had just come out. My friend snagged one; I scoffed. I said, "Ha! My card already does all that stuff." His 8M Voodoo continuously and routinely got smoked by my Permedia2 when it came to games - GLQuake, Final Fantasy 7, etcetera. My Voodoo2s (2x 12M - for Unreal Tournament, there is no other option) combined with my #9 Revolution IV 32M are still VERY hard pressed to beat the Permedia2 in any number of tests.
Now if you go read AnandTech sometime, you'll note that a lot of the cards these days - at least the gaming cards - are getting *OBSCENE* FPS rates: >70 FPS. And when you pair the big names - nVidia, 3Dfx, Matrox, ATI, etc. - up in, say, a PIII 733 or whatever it is, they ALL get the SAME RATE at 640x480 at whatever depth. Why? The bus is full. You can't push any more data than that. And what are these companies doing about it?
3Dfx is adding an external power supply for all the active cooling you're going to need just to run the Voodoo6 at normal speeds. nVidia is gleefully ignoring it and boasting a faster and bigger chip. ATI is touting more and more memory. Now, bear in mind, if the chipset doesn't use the memory for Z-buffering (mind you, not true Z-buffering), 64M of DDR SDRAM is doing you no good - 1800x1440 only needs around 14M or 16M, IIRC. The companies are putting memory on the cards to make them look bigger, perhaps perform a bit better, and ignoring the core problems.
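For anyone wondering where a figure like 14M or 16M comes from, here's a quick back-of-the-envelope sketch (my own arithmetic, not a spec sheet - assuming a double-buffered color buffer plus a single Z-buffer):

```python
# Back-of-the-envelope framebuffer math -- my own numbers, assuming a
# double-buffered color buffer plus a single Z-buffer.
def framebuffer_mb(width, height, color_bits=16, z_bits=16, color_buffers=2):
    color = width * height * (color_bits // 8) * color_buffers
    z = width * height * (z_bits // 8)
    return (color + z) / (1024 * 1024)

print(f"1800x1440, 16-bit color + Z: {framebuffer_mb(1800, 1440):.1f} MB")
print(f"1800x1440, 32-bit color + Z: {framebuffer_mb(1800, 1440, 32, 32):.1f} MB")
# ~14.8 MB and ~29.7 MB respectively -- so most of a 64M card's memory
# is really there for textures, not the framebuffer.
```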
Quite frankly, I couldn't care less whether 3Dfx is 'stagnating' or nVidia is 'amazing' or what have you. I need a card that works with and around bus limitations, that can do 2D and hardware OpenGL, that can do what I need. I don't buy mass-marketed cards because, unlike the Permedia2, which *was* mass marketed (Diamond FireGL 1000 Pro, PCI and AGP) and an excellent card, today's cards are the equivalent of junk for me.
What do I use? Well, now instead of putting multiple cards in a single system, I'm stuck using top end cards. We're talking cards that cost more than your typical PC and more than a well configured laptop in some cases.
I just purchased, much against my desires but in tune with my *needs*, a $4,200 Wildcat 4210 graphics adapter. What is it? Dual pipeline. Dual head and a few more outputs. 90 Hz at 1824x1368. One AGP Pro 110 and two PCI connectors. All on a single card. That requires 110W of power. Wildcat was just bought by 3DLabs, the name in 'affordable' cards. (The Permedia3 is affordable, but not enough for what I do.) I was forced into spending more for a single video card than I spent on the entire system. ($2,935 for the curious. I reused the 18G SCSI-UW disks and controller.)
Now maybe some of you don't have this problem. Actually, I'm betting most of you don't. But those of us who don't exactly do this for a living, per se, but need the hardware anyway (I use AutoCAD for various engine modification work on a very regular basis) are getting screwed by the dick wars between 3Dfx and company. It used to be that I could do just fine with a happy Permedia2 and AutoCAD R14. Then it was a #9 Revolution IV 32M. I went to go buy something with excellent 2D quality that could perform better than the #9 Revolution IV and found out that nothing does. If I want 2D, I have to go Matrox, which doesn't perform terribly well under AutoCAD R14 or AC2K. If I want real rendering performance, I have to go up to the professional cards, which I really didn't want to do. Now maybe the 4210 was overkill, but quite frankly, any of those cards is a pain to find and order. I could have probably gotten a 3DLabs Oxygen GVX420, but they also made the mistake of ignoring bus issues, and boom: the card ends up limited by the bus, performing really not all that much better than the other options. Just with Z-buffering and a $2300 price tag. A single AGP/PCI combination (yes, two connectors, two PCBs) still runs into bus limits before the card hits its own.
I don't know about you, but I really feel cheated.
Maybe I'll just put the Wildcat on eBay. Bidding starts at $1.
=RISCy Business
Re:Who says 3dfx is down and out? (Score:1)
OK... 3dfx is great.
Ranessin
Bad example... (Score:2)
Re:Self-Inflicted Wound (Score:1)
Re:Who says 3dfx is down and out? (Score:1)
Anyone can sell information (Score:1)
Re:Who says 3dfx is down and out? (Score:1)
Look at their stock price and earnings for the last year or so. Where is the Voodoo5 6000 that was announced last year and still doesn't have a release date?
And their next-gen card, code-named RAMPAGE, has been in development for 3.5 years and $50 million of R&D, and it doesn't have a release date either.
Re:here come the 3dfx-haters...sigh (Score:2)
Incorrect. Matrox provided all the info needed for Utah-GLX to write 3D drivers for their cards long before 3dfx released any information at all. 3dfx did have binary, x86-only Glide drivers for Linux for a while, which is probably what confused you.
3dfx didn't get open-source friendly until it became clear that NVIDIA was kicking their ass on the high end. Product differentiation is the name of the game for the also-rans.
The irony that is 3dfx (Score:5)
OK, well, at least the first part of that cliche is true. The turning point of 3dfx is quite ironic, though. After Nvidia came out with the TNT (which was the first consumer chip to have 32-bit color), and both Nvidia and 3dfx began revealing their plans for their next-generation chips, we were all surprised by 3dfx's arrogance (or naivety) when they announced they would not include 32-bit color.
"Speed is King"
That was 3dfx's response when everyone questioned their next product, the Voodoo 3. All right, so both products were relatively comparable in speed, and 3dfx still had some clout with Glide games (Tribes, anyone?). Lessons (not) Learned?
Now what has happened? Nvidia introduces hardware T&L on their chips, but the controversy is: who needs hardware T&L when there are no games supporting it (back then, that is)? 3dfx yet again rested on their laurels and decided to let Nvidia introduce it in their products first. While it's quite true that HW T&L was not really important back then, Nvidia was smart and marketed it as the next revolution in 3D acceleration. I mean, who wouldn't want to remove the CPU bottleneck and let the graphics card handle most of the 3D rendering? Sadly, 3dfx was only able to say, "See? We have 32-bit now!"
The irony that is 3dfx
Now what do we have with the Voodoo 5? Still, 3dfx REFUSES to incorporate onboard T&L when it's becoming more and more apparent that it is important these days. However, 3dfx is now ditching their "speed is king" philosophy and trying to be innovative with their anti-aliasing and T-Buffer technology. But it seems that Nvidia has learned from 3dfx's mistakes and included anti-aliasing technology of its own.
The road ahead?
After all these mistakes, has 3dfx learned their lesson yet? Who knows - perhaps 3dfx was right all along about not needing HW T&L right now (still, not many popular games support it), and they may very well outdo Nvidia with the release of Napalm. But you have to admit, Nvidia played the marketing card extremely well with their GeForce cards, even if T&L wasn't really useful at the time. I pray to God that 3dfx will get back on their feet, otherwise we may see another monopoly in the computing industry.
What about Add-On Cards? (Score:1)
Personally, I love my little Voodoo2....the only issue, of course, being that it's limited to a resolution of 800x600. Even with overclocking, there's only so much you can do with the card.
Before the Voodoo2, 3DFX chips were blowing people away...the Voodoo2 came along and, for a short period of time, was amazing. I remember playing Quake 2 on my Voodoo2 and being told that my CPU wasn't fast enough to make the Voodoo2 really perform properly. I stuck the Voodoo2 inside a box with a better processor, and while it was nice, the limitations it was bound to prevented me from going out and grabbing a Voodoo3.
Nowadays, people want high framerates and DVD decoding (or so it seems). Every box I pick up these days says "DVD decoding". I have a GeForce 256 32mb DDR and lemme tell you, no matter which drivers I use, the DVD decoding SUCKS. Maybe my system is holding it back, but honestly....for gaming, I choose NVIDIA; for DVD, I pick ATI. Wouldn't it be nice for someone to throw together a motherboard with 2 AGP ports, perhaps one for an add-on card?
As long as 3dfx keeps up with NVIDIA and ATI as far as releases go, they'll do fine. As far as Linux support goes, 3DFX has easily provided the quickest and easiest. While this may change in the future, if they keep catering to the geek community this way, 3dfx will most likely make some kind of comeback.
Re:There were advantages (Score:1)
Re:3Dfx vs nVidia vs The World (Score:1)
ummmm (Score:1)
If it's a bad game, it's a bad game; the card you're using can't fix bugs in the program. Sin will crash whether you're running a Voodoo card or one from nVidia.
The Voodoo 3 may not have all the features of the TNT2, but at least I can be sure that it will run most of the software that I have without hassle. That is also why 3dfx can continue along with inferior technology...because it is compatible.
The only "compatibility" that the Voodoo has over the TNT2 is Glide, which is dying. Even 3dfx have admitted so, and will concentrate on Direct3D and OpenGL, which all games are now designed around anyway.
Re:3Dfx vs nVidia vs The World (Score:1)
While it is true that most consumer cards have OpenGL drivers that are...lacking...for professional work, you completely miss the point of having more memory on graphics cards: textures. Texture thrashing can be a significant problem if a level in, say, an FPS uses a lot of textures, or high-res textures. By keeping them on the card rather than having to continually transfer them over the bus, performance is increased considerably. This is also why texture compression, IMHO, is something that all game developers and card/chipset manufacturers need to support. This is a reaction to the limitations of the system bus, so your complaint about card/chipset manufacturers seems a little unfounded.
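To put a rough number on the thrashing point, here's a toy estimate (the texture count and size are made-up illustration values, not figures from any real game):

```python
# Toy texture-memory estimate -- texture count and size are hypothetical,
# just to show how quickly textures outgrow a card's local memory.
def texture_mb(size, bytes_per_texel=2, mipmapped=True):
    base = size * size * bytes_per_texel
    # A full mipmap chain adds roughly one third on top of the base level.
    total = base * 4 // 3 if mipmapped else base
    return total / (1024 * 1024)

textures_in_level = 200                    # assumed
per_texture = texture_mb(512)              # 512x512, 16-bit texels, mipmapped
print(f"{textures_in_level} textures: {textures_in_level * per_texture:.0f} MB")
# ~133 MB -- far more than fits on the card, so without enough local
# memory (or texture compression) the driver streams textures over the
# bus every frame, and performance tanks.
```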
And, not speaking as a graphics professional, I find today's cards' 2D performance to be adequate -- decent image quality and insane refresh rate support mean no eyestrain for me. Dunno what your standards are, but again, I'm no professional hehehe.
not nVidia's problem (Score:1)
Nothing's wrong with the GeForce 2 or its drivers; the problem is with the game. Deus Ex is based on the Unreal engine, which is Glide-centric and was Glide-only when it first came out.
Try playing Deus Ex on a Voodoo under Direct3D or OpenGL and you'd have the same problems.
To the moderators (Score:2)
Click away I've got tons more Karma than you'll ever have.
have you actually done comparisons? (Score:1)
And this is on a 450 celeron with a TNT2 Ultra.
Re:3Dfx vs nVidia vs The World (Score:1)
> All of the modern cards have full rasterization
> support for OpenGL, but I guess you are referring
> to geometry acceleration.
[and]
> Today, there isn't a single combination of
> rendering attributes that will let a wildcat
> out-rasterize a GeForce2.
What is the case with antialiased lines? Both 3D and 2D? These were once considered a 'professional' level feature. Are these now accelerated in consumer cards?
--
- Antony Suter (antony@mira.net) "Examiner" openpgp:71ADFC87
- "And how do you store the nuclear equivalent of the universal solvent?"
Re:3Dfx vs nVidia vs The World (Score:2)
Re:nope (Score:4)
It adds up to lots and lots of passes. I am expecting the total overdraw to average around 30x when you have all features enabled.
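For a sense of scale, here's what 30x average overdraw implies in raw fill rate - the resolution and frame rate below are my own example values, not anything from the interview:

```python
# Fill rate implied by ~30x average overdraw -- example numbers only.
width, height, fps, overdraw = 1024, 768, 60, 30
pixels_per_sec = width * height * fps * overdraw
print(f"{pixels_per_sec / 1e9:.2f} Gpixels/s of effective fill rate needed")
# ~1.42 Gpixels/s, and that's before counting the texture fetches and
# blending that each of those passes also requires.
```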
>Q. Could you give us your thoughts on T&L? Why does 3Dfx say it's not important?
Contrary to some comments here, 3dfx didn't just "decide not to put in T&L"; they didn't have that option. Product development cycles can take years, and you can't just change your mind at the end.
They don't have it, so naturally they downplay the importance of it.
John Carmack
Re:nope (Score:4)
At a whopping 6 mpix or so...
Rendition did a lot of things right, even on their very first card. They had all the blend modes and texture environments from the very beginning. The only thing they didn't have was per-pixel mip-mapping.
If they had delivered the V2xx series on time, they could have had a strong foothold before the Voodoo2. The V3xx series would have been a solid TNT competitor, but again, it wasn't ready on time. They wound up ditching that entire generation.
John Carmack
Re:Nvidia, proprietary concerns (Score:1)
Hardware T&L isn't a big issue... (Score:1)
In some situations the on-board T&L unit can actually slow things down. A considerable amount of data must be sent to and from the card when using hardware T&L. If you're already bouncing off the memory bandwidth limitation of your video card (1600x1200x32 anyone?) then you'll see little or no advantage to hardware T&L.
Of course I think hardware T&L units are generally a Good Thing. They're just a bit over-hyped IMHO.
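A rough sketch of why the bandwidth argument works out that way (every per-frame number here is an illustrative assumption, not a measurement):

```python
# Why hardware T&L can't help when you're framebuffer-bandwidth bound --
# every number below is an illustrative assumption.
verts_per_frame = 30_000          # a fairly busy scene for this era
bytes_per_vertex = 32             # position, normal, color, texcoords
vertex_traffic = verts_per_frame * bytes_per_vertex

width, height, bytes_per_pixel, overdraw = 1600, 1200, 4, 3
# Each rendered pixel costs at least a color write plus a Z read and write.
pixel_traffic = width * height * overdraw * bytes_per_pixel * 3

print(f"vertex data : {vertex_traffic / 1e6:5.2f} MB per frame")
print(f"pixel data  : {pixel_traffic / 1e6:5.1f} MB per frame")
# At 1600x1200x32 the pixel traffic dwarfs the vertex traffic, so moving
# transform & lighting onto the card doesn't shift the bottleneck.
```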
3dFx (Score:1)
32 bits -- I'm not sure they were wrong! (Score:2)
Self-Inflicted Wound (Score:5)
How is it that, in January, I bought a dang-fast TNT2 for $60, while the Voodoo2, a slower card, sold for over $100 everywhere I looked? Simple - the different board manufacturers compete with each other, trying to sell their TNT2 board over somebody else's. The 3dfx board manufacturer just tries to sell their boards to Voodoo zealots, who are, for the most part, GeForce believers now.
Tell me what makes you so afraid
Of all those people you say you hate
Re:3Dfx vs nVidia vs The World (Score:4)
>Regardless, I won't buy either. Why? Because
>they both claim OpenGL support. Now, I don't
>know about you folks, but seeing as I work in
>AutoCAD frequently, that means hardware support.
>Neither of them have it. ATI doesn't. #9 didn't.
>Why? Most of them view software as the future.
All of the modern cards have full rasterization support for OpenGL, but I guess you are referring to geometry acceleration.
The situation has changed since you last looked at it.
The Nvidia GeForce cards have an extremely capable geometry accelerator, and they have the ability to fetch display lists either over AGP with a large bandwidth savings due to vertex reuse, or store the display lists completely in local memory to remove all vertex traffic from the bus.
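To illustrate the kind of saving vertex reuse buys, here's a toy regular-grid mesh (purely illustrative - this is not the actual display-list format):

```python
# Vertex reuse on a toy (n+1) x (n+1) grid mesh -- purely illustrative,
# not the actual GeForce display-list format.
n = 100
triangles = 2 * n * n
unindexed = 3 * triangles          # every triangle resends its 3 vertices
indexed = (n + 1) * (n + 1)        # each shared vertex sent only once
print(f"unindexed: {unindexed} vertices, indexed: {indexed} vertices")
print(f"~{unindexed / indexed:.1f}x less vertex data over AGP")
# ~5.9x for this mesh; real meshes vary, but shared vertices are the
# whole point of the bandwidth savings described above.
```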
The issue with professional OpenGL support has mostly been the focus of the driver writers, not the hardware. I think that Nvidia's partnering with ELSA to work on professional app certification with the Nvidia hardware was an extremely good move.
There are a few edges that the expensive professional boards still have over the nvidia consumer cards, but not many:
You can get more total memory, like a 32mb framebuffer and 64mb texture memory configuration. We will probably see workstation graphics systems with up to a gig of memory within a year. Consumer cards will offer 128mb next year, but the workstation cards can easily maintain an advantage there.
This has a cost, though: large, expandable memory subsystems can't be clocked as high as the single-option, short trace layouts that nvidia does. Even dual pipe workstation boards can't match the memory bandwidth of a GeForce2.
You generally get better high end DACs and shielding on workstation boards. The upper end of the consumer boards will do the high numbers, but it just isn't as clean of a signal.
Dual monitor has been supported much better on the workstation boards. This is starting to become a feature on consumer boards, which is welcome.
The consumer cards are still skimping on iterator precision bits. Under demanding conditions, like very large-magnitude texcoord values stretched a small increment across a large number of pixels, you can see many consumer cards start getting fuzzy texel edges while the workstation cards still look rock solid.
Probably the most noticeable case is in edge rasterization, where some workstation cards are so good that you don't usually notice T-junction cracks in your data, while on the consumer cards they stand out all over the place.
Next year's consumer cards should fix that.
When the consumer cards first started posting fill rate numbers higher than the professional boards, it was mostly a lie. They got impressive numbers at 640x480 in 16 bit color, without blending, depth buffering, and filtering, but if you turned on 32 bit, depth, blend, trilinear, and ran at high res, they could fall to 1/4 or less of the quoted value.
Today, there isn't a single combination of rendering attributes that will let a wildcat out-rasterize a GeForce2.
Wildcat was supposed to offer huge 8 and 16 way scalability that would offset that, but it doesn't look like it is coming any time soon.
The workstation vendors do stupid driver tricks to make CDRS go faster, while consumer vendors do stupid driver tricks to make Q3 go faster.
We bought three generations of intergraph/intense3D products, but the last generation (initial wildcat) was a mistake. We use nvidia boards for both professional work and gaming now. I still think the professional boards are a bit more stable, but they fell behind in other features, especially fill rate. Being an order of magnitude cheaper doesn't directly factor into our decisions, but it would for most people.
John Carmack
Ugh (Score:1)
Re:What about Add-On Cards? (Score:1)
Length.... (Score:1)
I mean, my Voodoo2 barely fit into my last PC (DIE COMPAQ, DIE DIE!), and with a little work (read: bending) I managed to get the card in. I haven't had the problem since...but then, today, reading the posts, I saw some shots of the Voodoo 5....and it just astonished me that the cards don't really seem to be case friendly....just from looking at it, I don't think it would fit into my case, and I ain't bending a $500 card just so it fits in there.....
I wonder if this has some crazy effect on our buying habits....
Or I'm just nuts
(Check out http://www.mooshware.com/images/3dfx/V5-6000.jpg to see what I mean)
Re:The irony that is 3dfx (Score:1)
You need to figure about two years after a feature set is first broadly introduced for games to broadly accept it (because of the turnaround time in game development).
If 3dfx had put even a weak T&L processor in the Voodoo3 3500 and later cards, that would have signaled the go-ahead to game developers to push hardware T&L in their games right then and there.
Unfortunately, it's now going to be 2003 before you start to see games that REALLY take advantage of hardware T&L - not only using it to push optional higher-polygon renderings faster, but to load-balance between rendering on the video card and more advanced AI, physics, and simulation models on the CPU.
THAT is the cost of 3dfx's arrogance this time... 32-bit color was a minor stumble compared to the difference between 100,000+ polys and 10,000+ polys. FSAA is cool, but it will only really be fully taken advantage of when the rendering of the world looks like something less blocky than Lego.
Re:Yeah 3dfx stinks, but Nvidia? Please... (Score:1)
More important, though, is the headroom you get with a faster card. A game like Q3 has a standard deviation of about 7fps, which means over 15% of your frames are under 33fps, and about 3% are under 26fps. These are very noticeable slowdowns.
At 80fps mean, your standard deviation may jump to 14 fps (it's not a linear progression in real life, but for argument's sake...), 97% of your frames are at 52fps+, and 99.85% above 38fps. So it's smooth all the time, not just when you're standing around with nothing happening.
And that's why NVidia is still in business.
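For anyone who wants to check the parent's percentages, they drop out of a simple normal-distribution model; the 40 fps mean below is my own inference from the numbers given, not something the parent stated:

```python
# Checking the parent's frame-rate percentages under a normal model.
# The 40 fps mean is inferred, not stated; the rest matches the post.
from math import erf, sqrt

def frac_below(threshold, mean, stddev):
    """Fraction of frames slower than `threshold` fps (normal CDF)."""
    return 0.5 * (1 + erf((threshold - mean) / (stddev * sqrt(2))))

print(f"{frac_below(33, 40, 7):.0%} of frames under 33 fps")      # ~16%
print(f"{frac_below(26, 40, 7):.0%} of frames under 26 fps")      # ~2%
print(f"{1 - frac_below(52, 80, 14):.1%} of frames at 52+ fps")   # ~97.7%
print(f"{1 - frac_below(38, 80, 14):.2%} of frames at 38+ fps")   # ~99.87%
```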
Re:Brought down by fanboy journalism (Score:1)
Re:"Bugs" (Score:1)
How would I look outside?
Re:Self-Inflicted Wound (Score:1)
TBuffer (Score:1)
Re:What about Rendition?! (Score:1)
Rendition Revival [levitate.org]
--------
Re: (Score:1)
Re:nope (Score:1)
If anyone wants to read more about VQuake, the first 3D-accelerated version of Quake, I'd suggest checking out the articles Michael Abrash wrote for Dr. Dobb's Journal, some of which were reprinted in his Graphics Programming Black Book. I believe that book is out of print now as well, though.
I was lucky enough to see Abrash give a speech last year and was able to talk to him about some things, including VQuake. In retrospect, it's amazing what was done with such a low fill rate.
Concentration on Quake does not help 3dfx (Score:1)
I'm not really into twitch games, though I do enjoy a quick bash at Unreal now and then. I am into sims, and FSAA is a godsend. On a low-spec system like mine (dual Celeron 400), the V5 is much the better choice where FSAA is a must.
Re:We have a winner (Score:2)
I have rarely read a more incoherent article. It didn't actually SAY anything.
One of the downsides of the web: most of us aren't lucky enough to have an editor. This guy needed one.
Re:not nVidia's problem (Score:2)
It's not just Deus Ex. In general, the GeForce 2 drivers seem to be wackier than those for other cards. I've seen cases where installing newer drivers trashes your machine in weird ways - ways you wouldn't expect to be related to video card drivers. There's a good list of games with GeForce 2 trouble. Great card, but the drivers were released much too early. Don't let devotion to Nvidia blind you here.
In case I look like a pro-3dfx zealot: I'm not happy with the recent Voodoo cards either. The power consumption is about 4x what it should be. Good grief.
Re:There were advantages (Score:2)
I haven't been afraid since to spend a lot of money on a video card, but I'm starting to re-evaluate that a bit. NVidia is moving SO fast with their cards that it's getting foolish to try to buy their top-of-the-line; in six months it will be half-assed at best. It has let them gain a lot of ground but it sure does shorten the life cycle of gfx cards.
There's an upside to all the progress too, of course. I downloaded and ran that XL-R8R utility. Pardon my language, but F*CK, that is an impressive demo. I remember the old Amiga demo scene - those guys would have shit their pants (and probably still will) when they saw that. I wouldn't have believed it could be done live until I saw it. I figured graphics like that were another two or three years out - WRONG. Wow. Recommended. (www.madonion.com)
Re:maybe im stupid (Score:1)
treke
Re:True, but not true at the same time (Score:1)
treke
Re:3dfx still has a chance (Score:1)
Bahaha! What's that, the kernel source for their DRI/DRM ripoff? NVidia are most definitely keeping their X/GLX drivers closed. They need to release the kernel source because compiled kernel modules are tied so closely to the kernel version. It's easier just to release source and let people do a little compiling.
Re:To the moderators (Score:1)
Re:There were advantages (Score:1)
Personally, I was hoping for a buyout of the Aureal assets by nVidia, but that's not happening. Although.... word has it that nVidia "acquired" Aureal's techs. Whether it was sneaky and underhanded, I don't know, but it could bode well for the future.
True, but not true at the same time (Score:1)
And don't worry about T&L not coming to games soon. While it is true that the only games that support it right now are lesser-known titles, we should see more mainstream titles that support it by the time Napalm rolls around. Hey, doesn't Soldier of Fortune support it? That's a good sign already...
Well yes, it's obvious that they've gotten larger (Score:2)
Of course, as you increase the number of processors, you also have to increase the amount of memory. That is why 3dfx's cards are getting bigger and bigger. However, with the release of Napalm, the card will be 'normal size' since it's an entirely new architecture.
FSAA (Score:1)
Also, they never supported the T-buffer, which may not be widely used yet but definitely proves to be great when implemented, unlike HW T&L, which sometimes actually lowered image quality.
Re:nope (Score:1)
Voodoo Banshee runs @ 100 MHz (Score:1)
Re:ummmm (Score:1)
If it's a bad game, it's a bad game; the card you're using can't fix bugs in the program. Sin will crash whether you're running a Voodoo card or one from nVidia.
I admit that poor code is poor code, but the fact is that the game was coded for both cards and only one worked. If the card properly supported OpenGL as it claimed to, then there is no reason the TNT2 should fail where the Voodoo didn't. Unless, of course, the TNT2 uses some other sort of OpenGL standard than everyone else...
Interesting point of view... (Score:1)
3DFX downwards slide.. (Score:1)
Be careful... (Score:1)
And how much lift do those fans generate? What if you opened your case and those fans just blew your computer on to its side?
Re:3dfx still has a chance (Score:1)
What? Are you running a P1? (Score:1)
Re:ummmm (Score:1)
People have been predicting the death of Glide for about three years now and it's still supported. Granted, Q3A knocked a nail into its coffin by going OpenGL-only, but it is still around, e.g. UT, etc.
Direct3d is still a poor cousin (maybe the upcoming DirectGL will be an improvement). OpenGL is improving now that it's actually being used.
Frankly, the main reason I still buy 3dfx is a personal one, on a matter of principle: I find it hard to support a company that condones whores like Tom Pabst. But that's just my opinion; someday I'll give nVidia a try, maybe.
Re:The same mistakes, again and again (Score:1)
Re:Nvidia, proprietary concerns (Score:1)
You need *one* libGL for all the DRI cards... This means one libGL for ATI Rage 128, Matrox G400, Intel 810, and 3dfx Voodoo3/4/5. This is the libGL shipped with XFree86 4.0.* and up. nVidia requires a completely different libGL.
Ranessin
Re:Well yes, it's obvious that they've gotten larg (Score:1)
Re:Self-Inflicted Wound (Score:1)
As for the 22.5-bit colour, it's actually 16-bit dithered up toward 32 (or close to it... I think they lose a few bits for some reason). It's "22.5 bit" in name only.
---
Where can the word be found, where can the word resound? Not here, there is not enough silence.
Re:3dfx still has a chance (Score:1)
Nvidia has had the source open since they released 0.92. The default may be to install an RPM, but the tarballs have been there the whole time.
The FTP site is down as of 0-dark-30, or else I would tell you the number of switch statements in that one as well.
Sanchi
your fps deviates from 160 to 30? What a lie... (Score:1)
nope (Score:4)
I agree 3dfx has lagged on implementing features, not counting the FSAA/T-Buffer deal in the VSA-100 generation, which is pretty cute. It would be nice if they led the market into new features rather than the other way around.
I strongly believe that 3dfx is positioning themselves for a solid comeback. They bought out Gigapixel, which had some really great tech from everything I have read - low-power, high-performance stuff with very low transistor counts; they were a finalist in the Xbox bid but lost out to Nvidia. The "Rampage" core has been in development for a very long time now; a huge amount of 3dfx's R&D budget has gone into developing it. Considering the resources they've thrown at it, I don't see any reason why it won't kick ass, unless they run into another component shortage (one of their biggest problems has been the RAM market).
I think 3dfx is going one of two ways: either they release the first "Rampage" core product and it kicks all ass and the company bounces back, or they release the "Rampage" product and it doesn't do very well and they continue their downward spiral and are bought out by another company. I personally think the first is more likely, but then I own a little 3dfx stock, so I'm naturally a little optimistic.
Re:We have a winner (Score:1)
Someone needs to teach that guy the meaning of the word concise.
Re:Well yes, it's obvious that they've gotten larg (Score:1)
(I've been debating a better video card than a Voodoo3 3000 16MB AGP for playing Counter-Strike, and have been checking out the Voodoo5 vs the GeForce 2 cards. I may have mixed up the details though, so correct me if I'm wrong.)
eudas
Greed was their downfall (Score:2)
3dfx used to sell their chips to OEMs, in much the same way nVidia does. But after the Banshee, they bought a video card maker, and announced that they would no longer sell their chips to OEMs. In my opinion, this was a blatant attempt to monopolise the 3D card market.
Naturally, the video card makers had no alternative but to buy chips from other companies, or go out of business. This set up a situation of 3dfx vs the rest of the industry, and combined with nVidia's superior technology, the rest is history.
Pride goeth before a fall, as they say.
Re:True, but not true at the same time (Score:1)
Re:Ugh (Score:2)
Re:Self-Inflicted Wound (Score:2)
>notion of including support for 22.5 bit
>colour, as a halfway point between 16 and
>32. I don't remember how they were going
> to get half a bit, so if someone else knows
>the details, I'd be most interested.
The colour depth is a technical value which describes the number of distinct colours that a card can produce: 16 bit equals 65,536 colours, 32 bit equals more than 16.7 million colours. The problem with these technical numbers is that they don't take into account what a human eye can distinguish, because visual perception is not linear. Users of early drivers for TNT cards used to have the problem of games showing up too dark. That is because games were optimized for Voodoo2 gamma values. Gamma is a number which describes the non-linearity of the digital-to-analogue conversion. The Voodoo2 had a gamma value which "wasted" fewer colours on the dark end of the scale and was better adapted to human vision. To get the number of distinguishable colours which can be put on the screen by a Voodoo2, a graphics card with the now more common TNT1 gamma would need 22.5 bits per pixel.
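A small toy illustration of the mechanism (the gamma values are generic examples, not measured Voodoo2 or TNT figures):

```python
# How gamma redistributes the same stored levels along the brightness
# scale -- gamma values are generic examples, not measured card figures.
def levels_in_dark_half(bits, gamma):
    """Encoded levels whose displayed intensity ends up below 0.5."""
    n = 2 ** bits
    return sum(1 for i in range(n) if (i / (n - 1)) ** gamma < 0.5)

for gamma in (1.0, 2.2):
    dark = levels_in_dark_half(8, gamma)
    print(f"gamma {gamma}: {dark}/256 levels land in the dark half")
# 128/256 vs 187/256: a different gamma curve concentrates the same
# number of stored levels in the range where the eye is most sensitive,
# which is where "effective bit depth" comparisons like 22.5 bits come from.
```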
The same mistakes, again and again (Score:3)
Now, on to my main point. 3dfx has a very thin fanbase, and there is a clear reason behind it. 3dfx, time after time, again and again, has made the same mistake of denying the future. We've heard the argument that the Voodoo3's lack of 32-bit color support (and the memory to handle that feature) is what started 3dfx on its downward spiral. Yet 3dfx has AGAIN made this mistake with their resistance towards using a hardware transform and lighting solution... Any game developer will tell you that hardware T&L is the way of the future, and 3dfx is shooting themselves in the foot. And what about the FSAA card? 3dfx did have a point when the Voodoo5 was released and creamed the GeForce 2 in antialiasing performance, but those days are over, and with new drivers the GeForce 2 beats the Voodoo5 in its home territory, FSAA.
So to conclude, we "nVidia fanboys" have reason behind what we believe in. As soon as 3dfx comes out with a better chip than nVidia, count me in on the 3dfx bandwagon. -Matt "ObeseWhale" Grinshpun
-The Darker Sector [3dactionplanet.com]
-Website coming soon! Team Corrosive Quake 3 mods.
We have a Simpsons quote (Score:2)
[sign on a closed-down movie theatre that reads: Yahoo Serious Festival]
There were advantages (Score:3)
Today you get all-in-one cards, but a lot of them are still not as good as the Matrox was. It's trial and (expensive) error to get one that's as satisfying in a text editor as it is in Tomb Raider. Anyone care to name a combined card, maybe a GeForce II or Radeon, that is as sharp as it should be at the highest resolutions?
The problem goes away when LCD monitors become affordable in 21/22" sizes, but I think that's at least a year away.
Turnarounds (Score:3)
In PowerVR's case, they no longer make PC cards in bulk, but their chipset is in use in the Dreamcast. It has shown itself to be surprisingly robust and has turned the Dreamcast from a dark horse into a system with some incredible games (Sonic Adventure, NFL 2K1, Jet Grind Radio, etc.).
The same goes for ATI, which had been running in neutral for the last few years, then released the jaw-dropping Radeon this year.
Point is, don't count 3DFX out yet. Their latest chipsets are nothing extraordinary, but a few engineers and some faith within their infrastructure might be enough to turn them around.
3dfx still has a chance (Score:2)
Death of 3dfx (Score:2)
No surprise; most wounds are self-inflicted (Score:3)
Re:There were advantages (Score:2)
In short, the Matrox was the best of its time. In fact, the best-looking card I've seen was the Matrox G400.
"Evil beware: I'm armed to the teeth and packing a hampster!"
Nvidia, proprietary concerns (Score:2)
Nvidia is clearly the graphics chip maker to beat right now.
What I _am_ concerned about is the proprietary infrastructure they've put in place in order to support direct rendering. Correct me if I'm wrong, but they aren't using the standard "stuff" that XFree provides but their own special infrastructure to support direct rendering. Their binary drivers have recently caught up in performance to their Windows cousins, but I'm always wary about having binary-only drivers. Want to upgrade to the newest kernel? Got the source? No problem! Binary only? You may have to wait for the vendor to release a version for that kernel...
As for my next system, I will purchase a Matrox G450 w/ dualhead. They've even released driver source/binaries to support dualhead, etc. (link provided below).
Matrox beta drivers [matrox.com]
Arcade Games (Score:2)
3DFX had a bigger market than the PC.
As a side note, does anyone else think that Nvidia needs new spin doctors? What the hell is an 'N' Vidia, or a 'G' (e)force. It just sounds bad to me.
Furthermore, what happens when we run out of high-testosterone burning, exploding and speedy names for video cards?
We have a winner (Score:3)
That must be the most obnoxious article I've read this year. I think it actually has more "Gee-I'm-clever" lines (NOT) than actual content.
I read it, but I'm still not quite sure what it was about.
--
A personal experience (Score:2)
When the Voodoo 3 came out, I, being a loyal fan of 3Dfx, rushed out to buy one. At the same time, my friend upgraded his Voodoo 2 to a TNT2. I didn't notice much of a difference at all, except that I could now watch TV on my computer. My friend, however, had more colors at a much faster rate than I could get on the Voodoo 3.
When the next generation of chips began being released this summer, I was about to upgrade my computer. So I waited a month or two and read the reviews of all of them. I really couldn't see anything that would make me choose the Voodoo 5. I really wanted to support 3Dfx, but I just couldn't justify that decision. I ended up upgrading to the ATI Radeon 32MB DDR, and I love it. I hope 3Dfx can get their act back together, as they are one of the more supportive companies toward open sourcers.
Re:Turnarounds (Score:2)
He left me hanging (Score:2)
What the heck was he talking about?
He never came back to the i740 issue.
Was he talking about the mediocre performance for a dirt-cheap price? Was he talking about the immense headaches suffered by those who installed them in VIA or SiS chipset boards?
Sure, the board took me 14 hours to get running with my VIA MVP3-based FIC 503+ board 3 years ago, and another 10 hours yesterday to get it running correctly with my new VIA KT133-based Abit KT7-RAID board, but the card cost me $38 back then, when the ATI Expert series was going for over $100. Yeah, I didn't get quite the performance of the TNT, much less the TNT2, but the image quality is still excellent, and it'll do until Xmas, when I get a GeForce MX-based card. (Good performance, very reasonable price at under $150 street.)
Why did I post this? Well, I remember those days, and I remember the extraordinary prices that have always been charged for the hot video cards of the day. And reminiscing about the "good ole days" is kind of fun... And I want to make sure, for anyone who scrounges up old hardware and didn't know: DON'T try to install an i740 board in a VIA or SiS based motherboard unless you have no other choice!
Brought down by fanboy journalism (Score:2)
The downfall of 3DFX was the fanboy cry of "16 bits per pixel sucks!" which is something that was picked up from interviews with John Carmack. 16 bits per pixel *can* suck, depending on what you want to do. If you're doing half a dozen passes per triangle, then, yes, you need more color resolution. If you're not, then there's no issue. This is a good example of fanboy-oriented web journalism running amok and having real consequences.
In truth, many developers, including myself, really like 3dfx cards. The drivers are rock solid. Glide is the most predictable 3D API. Yes, OpenGL, blah, blah, blah, but Glide is number one in terms of stability.
Re:3dfx still has a chance (Score:2)
I hope Nvidia learns from 3Dfx's mistake. (Score:2)
This let NVidia come out on top.
Now I see a few minor delays in some NVidia products, and I'm concerned that NVidia might be starting to show the same kind of behavior 3Dfx did. NVidia needs to be careful to stay on top, and not let itself fall into the same pattern 3Dfx did.