Video Card History
John Mathers writes "While searching the net for information on old Voodoo Video Cards, I came across this fabulous Video Card History article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the Video Cards that revolutionized the personal computer. Here's a quote: 'When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card, a setup known as piggybacking.'"
Revisionist History? (Score:5, Interesting)
Re:Revisionist History? (Score:5, Informative)
And absolutely no mention of Matrox whatsoever... despite the fact that their add-on 3D accelerator was arguably superior to the Voodoo, and the Parhelia is the ONLY 3D solution to support 3 display devices.
Re:Revisionist History? (Score:5, Informative)
And no mention about that company whatsoever
But hey, what can you expect from someone who is (probably) an FPS kiddie biased against Matrox, among others? Because if he were JUST an FPS kiddie (and not anti-Matrox too), he'd mention the fact that for about half a year in '99 Matrox was the leader in BOTH performance and quality... too bad only the second has held true since then.
Matrox put themselves in obscurity. (Score:5, Interesting)
However, Matrox has shown one thing since the days of the Millennium cards... and that is that they don't care about the consumer market.
They left the consumer behind to go after the business market, and it has done them well. As for 3D gaming, they were irrelevant back in the days of 3dfx because 3dfx marketed their cards and their API to the people that mattered: developers.
Matrox has often had superior technology; their problem is that it rarely does anything people really want. (And a handful of geeks doesn't count.)
Re:Revisionist History? (Score:4, Insightful)
A) The Verite chip had no OpenGL support - it could only run vQuake, a specially made version of Quake for the Verite. And even then it was slow. It was also kinda pokey for Direct3D stuff.
B) S3 also had no OpenGL support - and limited Direct3D support - most Direct3D games did not support it (for instance, it didn't support uploading textures...).
C) Matrox - except for their high-end equipment - also wasn't nearly fast enough to play GLQuake. The Mystique was not nearly fast enough to play actual video games.
Why do I keep mentioning Quake? I think in '96 it was the defining game. If your card could run GLQuake smoothly, you were in the zone. And the only cards that could run it even close to smoothly cost well over 2000 dollars. Don't believe me? This is actually all in the GLQuake readme (more or less).
When my Intel P-120 first started GLQuake on the Voodoo 1, I just about crapped my pants. It was smooth, fluid, and it looked awesome! No other video card at the time for 150-200 dollars could deliver those kinds of results.
Re:Revisionist History? (Score:4, Informative)
http://www.accelenation.com/?ac.id.123.1
Re:Revisionist History? (Score:5, Funny)
Re:Revisionist History? (Score:2)
Re:Revisionist History? (Score:3, Insightful)
Just playing Devil's Advocate here, but it wasn't until 3dfx hit the market that the 3D accelerator became a mainstream gaming product. Creative Labs didn't invent the sound card, but they sure made that market blossom.
Re:Revisionist History? (Score:5, Informative)
Any article which tries to encapsulate the history of 3D cards but fails to mention the Verite cards is a piss-poor article right from the get-go.
Only 1996 to the Present (Score:5, Interesting)
I'd love to see a history of all video cards for the PC platform...
Re:Only 1996 to the Present (Score:2, Interesting)
Re:Only 1996 to the Present (Score:3, Interesting)
Re:Only 1996 to the Present (Score:2)
Hmmm, yeah, Trident and Tseng go way back. Oak Technology made a lot of cards... Diamond was hot for a while, but in my memory, at least, they came into the game pretty late; I didn't see them till the mid '90s.
Re:Only 1996 to the Present (Score:3, Interesting)
It was a 10MHz XT. Sweeeeeet!
Ah... those were the days :-) (Score:3, Insightful)
Re:Ah... those were the days :-) (Score:2)
Re:Ah... those were the days :-) (Score:2)
Re:Ah... those were the days :-) (Score:3, Informative)
Re:Ah... those were the days :-) (Score:5, Interesting)
OTOH, most of the peers of the early PCs had total crap text modes; they couldn't do what the PC could do. (Yes, this includes the Apple. There were no Macs yet.) This is one of the major reasons the PC ended up dominating; text mode was simply more important. Remember that back then almost all business use and a good amount of home use was in text mode (word processing, spreadsheets, financial software, etc.).
The original IBM PC and its clones usually came with a specially designed monochrome text mode monitor with relatively high resolution (720 x something, no dot pitch to worry about). The monitors had a very long persistence phosphor that totally eliminated flicker. The monochrome text-mode video cards had a very nice serif font stored in their ROMs. IBM's intent was to recreate the feel of their expensive dedicated mainframe text terminals.
This setup had a very high quality feel, and you could stare at it all day without getting eye strain. Early color graphics monitors, OTOH, were horrible at showing text. This was compounded by the crappy fonts that were shipped with most early graphic OSes. This made most of the PC's early competitors pretty useless for doing long stretches of serious work.
IBM's attempt to provide color graphics did suck big time [*]. Originally, you had to buy two graphics adapters and two separate monitors to get text and graphics on the same machine. One of Compaq's claims to fame was getting the patent on unifying the PC's high-quality text mode and its graphics modes on a single graphics adapter and monitor.
[*] The original 16-color mode of the EGA and VGA cards must have been designed by somebody who was high on crack. You can't get at the pixel memory without setting up a bewildering array of registers that control mandatory and mostly non-useful logic operations on your bits. The memory is accessed as 4 independent planes, so you have to unnaturally slice every pixel up into individual bits and have a PhD in boolean logic to get them on the screen as you intended. It could easily take a newbie a whole day of reading manuals and hacking before they could get a single white dot on the screen.
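Here's roughly what that register dance looked like in practice: a minimal sketch of plotting one pixel in mode 12h (640x480, 16 colors) using VGA write mode 2. The port numbers and register indices are the standard Graphics Controller ones; the Borland-style outportb()/MK_FP() calls and the put_pixel() wrapper are just one way to do it:

```c
#include <dos.h>   /* outportb(), MK_FP() - Borland/Turbo C style */

/* Plot pixel (x, y) in color 0..15 in mode 12h (640x480, 16 colors).
   The framebuffer at A000:0000 is four bit-planes deep: one byte
   covers 8 horizontal pixels, one bit per plane per pixel. */
void put_pixel(int x, int y, unsigned char color)
{
    unsigned char far *vram = (unsigned char far *) MK_FP(0xA000, 0);
    unsigned int offset = (unsigned int) y * 80u + (x >> 3); /* 80 bytes/line */
    volatile unsigned char latch;

    outportb(0x3CE, 5);  outportb(0x3CF, 0x02);            /* write mode 2 */
    outportb(0x3CE, 8);  outportb(0x3CF, 0x80 >> (x & 7)); /* mask: 1 pixel */

    latch = vram[offset];   /* dummy read loads the four plane latches */
    vram[offset] = color;   /* low 4 bits fan out, one bit to each plane */

    outportb(0x3CE, 5);  outportb(0x3CF, 0x00);            /* back to write mode 0 */
    outportb(0x3CE, 8);  outportb(0x3CF, 0xFF);            /* restore bit mask */
}
```

Two port writes per register, a dummy read just to load the plane latches, and four planes hiding behind a single byte write: the newbie's lost day is entirely believable.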
Re:Bullshit (Score:2, Insightful)
EGA was expensive and slow by comparison.
Re:Bullshit (Score:5, Informative)
Re:Ah... those were the days :-) (Score:3, Funny)
What are you talking about? On CGA, you had complete choice! You could use either the rasta red/green/yellow palette, or the nuclear pink/cyan/white palette! What more do you want?
Re:Only 1996 to the Present (Score:5, Insightful)
Back then, the hardware specs (so you could program the device) came with all the accessories you bought for your PC. Imagine that.
Printers came with a book of all the escape codes, video cards told you which modes they supported, modems had AT command set references...
Try getting the specs for a PCI card nowadays...
Re:Only 1996 to the Present (Score:3, Interesting)
Man - that brought a smile to my face. I remember that picture. And the clown? Remember the clown at 320x240? It could really show you what a good picture could look like even at that low resolution.
I remember the first game I had that made use of that mode (on a PS/2 Model 70): Mean Streets. Anyone remember that? Had a blast playing that Christmas morning.
And then there was fractint, which coul
Re:Only 1996 to the Present (Score:3, Informative)
Re:Only 1996 to the Present (Score:2)
Re:Only 1996 to the Present (Score:5, Insightful)
It almost sounds like the author only talked about the cards he owned.
Just on the nVidia side, he barely mentioned the TNT and its various derivatives, didn't mention the TNT2 Ultra or other TNT2 cards (except the baseline), and didn't mention that the GeForce 256 came in SDR and DDR versions, pretty much solidifying the future of DDR on video cards (because there was little other difference between the cards to explain the difference in benchmarks). Not to mention the later GF2 upgrades, the GF3, and the GF4.
Even with his early mentions of ATI he missed the mark a bit. ATI wasn't aiming for the 3D market so much because they had a solid hold on the OEM market, which didn't care (at the time) about 3D. When the OEMs started to care, nVidia had their chipset ready, in part because of their Xbox work (or they got the Xbox work because they were working on chipsets for the OEM market; either way, it wasn't long before they were releasing motherboard chipsets), and a solid hold on the lead in 3D graphics technology.
Beyond that, he mentions that nVidia 'bought out 3dfx', which isn't quite right, since nVidia simply bought most of their IP and left the company to its own devices (3dfx basically sold all of their assets and shut down).
Overall, it's a very light article that could be surpassed by a quick read through the review history on most sites that review graphics hardware.
Nostalgia (Score:3, Funny)
I remember when we would write ASCII graphics contouring programs for line printers!
Re:Nostalgia (Score:2, Interesting)
Take me back to 1994, when the real 3D cards cost $3000.00 and lived only in CAD workstations.
1996.. not long ago at all.
Re:Nostalgia (Score:3, Funny)
Well, sort of. (Score:5, Informative)
This isn't entirely correct, as any Voodoo 1 user could tell you. The card took up its own slot and used a pass-through video cable to connect the monitor: when a Voodoo-compliant video signal was detected, it hijacked the output to the monitor and took over.
Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.
M-
Re: (Score:2)
Re:Well, sort of. (Score:4, Informative)
At the ISP where I worked I had two Voodoo 2 cards, which, on a lowly PII-350, ran Unreal Tournament with full detail at 1024*768 at a massive framerate!
Confusion with later Voodoo cards? (Score:2)
Re:Confusion with later Voodoo cards? (Score:2)
Which is a pretty simple way to get double the performance. I wonder why no one's done this recently...
Re:Confusion with later Voodoo cards? (Score:3, Informative)
They are, in the chip itself, sorta. Modern all-in-one GPUs have multiple texture pipelines, which does split some of the load on the silicon level. It's not SLI, but it's the same concept.
The problem is SLI only doubles the fillrate. Both 3D chipsets need to work on the exact same dataset. SLI was a great boost back when 3D hardware was slow at texturing. These days the hardware can pump a couple thousand
Re:Confusion with later Voodoo cards? (Score:2)
But if both cards are on the AGP bus (I know, only one client devi
parallel graphics pipelines (Score:4, Informative)
To do 60 frames per second, you have roughly 16ms to generate a frame. A couple of those ms will be gobbled up by I/O transactions and various wait states, so you're already at the point where double the power is only going to result in about 1.75x the performance. This will also be highly dependent on how well the 3D code can be parallelized (are there a lot of read_pixels callbacks that require both GPUs and both banks of memory to talk to each other? etc.).
This has actually been done by SGI for a while now. A couple of years ago they took their Origin 3000 architecture, stuck on dozens of V12 GPUs, and tiled the graphics for higher performance. That concept has been tweaked for their Onyx4 systems... one large single computer with up to 34 ATI FireGL X1 GPUs. 16 GPUs work on each display in a 4x4 grid: each GPU generates its 400x300 piece, and 16 of those are composited in real time to make up a 1600x1200 display. I believe the biggest such machine to date has 32 GPUs powering two zippy fast 1600x1200 displays and 2 GPUs driving an additional 4 lesser-powered displays. SGI gets quite a speedup by doing it that way, with 16 GPUs per display, but there's also a lot of overhead (even more in SGI's case, with 34 AGP 8X busses in said system). Their implementation of OpenGL and OpenGL Performer is tweaked for this, though.
So yeah, it can be done, but the fact that the GPUs will spend a significant amount of time doing non-rendering tasks (I/O, waiting for data, copying the final result to the frame buffer, etc.) means you won't see nice linear scaling. The cost of making custom hardware and custom drivers also adds up. With top-end PC 3D accelerators already costing $400, I can't picture many users shelling out $1000+ for a dual-GPU card.
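The parent's 1.75x figure is basically Amdahl's law applied to the frame budget. Here's a quick sketch using the comment's own rough numbers (a ~16.7ms frame with ~2ms of serial overhead; both figures are illustrative assumptions, not measurements):

```c
#include <stdio.h>

/* Amdahl-style estimate of multi-GPU frame scaling, assuming a 60fps
   frame budget (~16.7ms) of which ~2ms is serial overhead (I/O, wait
   states, compositing) that extra GPUs cannot hide. */
int main(void)
{
    const double frame_ms  = 1000.0 / 60.0;  /* ~16.7 ms per frame */
    const double serial_ms = 2.0;            /* does not parallelize */
    const double render_ms = frame_ms - serial_ms;
    int gpus;

    for (gpus = 1; gpus <= 16; gpus *= 2) {
        double t = serial_ms + render_ms / gpus;
        printf("%2d GPU(s): %5.1f ms/frame  (%.2fx speedup)\n",
               gpus, t, frame_ms / t);
    }
    return 0;
}
```

Under those assumptions, two GPUs deliver about 1.8x, and no number of GPUs can beat the roughly 8x ceiling the serial 2ms imposes, which is part of why heavily tiled systems like SGI's depend on tuned drivers to keep the serial portion small.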
dual voodoo2 via SLI (Score:2)
A lot of professional/expensive 3D systems before the Voodoo2 used a similar technique. If one of the texture ram modules comes loose on an SGI Indigo2 MaximumImpact, textured mode
Re:Well, sort of. (Score:3, Informative)
The Voodoo2 cards started at $250 and $350 (or somewhere around there) for the 8MB and 12MB models, respectively. The only way to get the 1024x768 mentioned in the article was to have two 12MB cards in SLI mode (which meant connecting the two V2 cards with a small ribbon cable inside the case). Additionally, the pass-through cables that came with most V2 cards cause
Relays (Score:2)
Re:Well, sort of. (Score:2, Redundant)
Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.
I personally think (but I am biased) that the PowerVR PCX1/2 solution was nicer. It also piggy-backed off the 2D card but, instead of using cables, it squirted pixels (in packets) across the P
Nice article and all.. (Score:5, Interesting)
The Memories... Ahhhh (Score:5, Interesting)
When I saw Quake 2 CTF for the first time at the Respawn LAN party, Zoid [iwarp.com] showed us, on this decked-out system, how totally amazing it was. I remember how gorgeous q2ctf1 looked the first time my eyes caught it. It was magic. I even wrote about it. You could never have seen it if it weren't for the people at 3dfx, who pretty much paved the way for all the gaming rigs we've got now. It's a shame that the same people who built this dream had to shut their business down.
I guess that's how we reward our innovators today... with steady, constant competition, or you're out. Seems cold, doesn't it?
Re:The Memories... Ahhhh (Score:2)
What are you talking about? Wing Commander's a bit choppy on my machine, but F19 Stealth Fighter screams along. So smooth. I tried that Quake game, but it was hard finding space on my 80MB hard drive. Then it was unplayably slow. How can these other games you mention be even slower - you can't get slow
Re:The Memories... Ahhhh (Score:2)
My point was specifically about gameplay, which has slowed in FPS games since the early ones. I think I was trying to correlate this slowed gameplay to the raised detail levels in games, and that the raised detail levels were only a result of faster, better 3d cards.
Don't believe me? Load up a game of TW and see what I mean, with the harpoon gun. You can't get that kind of speed from any other
FastSilicon.com (Score:3, Funny)
Almost... (Score:2)
Lost $50 on that Voodoo II (Score:2, Funny)
Blurb from article (Score:5, Funny)
Nuke warm cards huh? How many fans do you need for one of those?
The Internet needs an editor or two hanging around.
Slashdotted (Score:2, Funny)
XGL? (Score:4, Interesting)
Re:XGL? (Score:2)
Re:XGL? (Score:2)
Nathan
Re:XGL? (Score:2)
It's pretty nifty; using Exposé can be quite addictive. Big..small..big..small..big..
Re:XGL? (Score:2, Informative)
This isn't quite true. Most of the desktop rendering is still done by the CPU in the same way that it was done before QE was added to OS X. It's simply the individual windows that are rendered by QE, and OpenGL handles the 'surface' which is handed
An interesting tidbit. (Score:5, Informative)
nVidia's pioneering RIVA 128 chipset was the first that could compete in 3D against the vaunted Voodoo cards of that period; once nVidia unveiled the groundbreaking TNT chipset, it was pretty much over for Voodoo's separate-board approach. This is what spurred ATI into developing the Rage Pro and later the Rage 128 chipsets in the late 1990s, starting the competition between ATI and nVidia that has lasted to this day.
Re:An interesting tidbit. (Score:2)
ATI's RageII/RageII+/RageII+DVD, the forerunner of the RagePro, was a decent all-in-one performer for certain games. The drivers were AWFUL, though, so it's hard to tell whether the problems were with the silicon itself. I don't think there was ever even a DVD player that made use of the RageII+DVD. ATI's head was
I had one (Score:4, Interesting)
Ah... those were the days.
Re:I had one (Score:2)
Re:I had one (Score:2)
Memories... (Score:3, Interesting)
We sold Diamond Monster 3Ds like hotcakes back at Best Buy in the mid '90s.
Then the Voodoo Rush came out. All in one. It stunk.
Then the Voodoo II came out. Remember, you could buy two of the cards and some games (via SLI) would run faster than with just one!
Then they did the combination card again... the Voodoo Banshee. Worked pretty well.
Then NVIDIA wiped them off the face of the earth.
When will it go back to the CPU? (Score:2, Interesting)
Re:When will it go back to the CPU? (Score:2)
speed/cheap/general purpose
pick two
Re:When will it go back to the CPU? (Score:2, Informative)
Re:When will it go back to the CPU? (Score:2)
That's a good point. Video cards, NICs, etc... all with their own processors and RAM. This is out of control! I long for the days when downloading a file or viewing a graphic would actually tie up your machine.
Oops, did I type that out loud?
Some egregious errors here... (Score:5, Informative)
Erm. That's not even enough to fill a single horizontal bar of the screen (unless you're running at 320*240 resolution). Perhaps they meant megapixels? This was hardly the only such error I noticed, though - these guys really need to have someone proofread their articles.
Want to read more about older video cards? (Score:5, Informative)
Check out Tom's Hardware Guide [tomshardware.com]
http://www20.tomshardware.com/graphic/1997.html [tomshardware.com]
http://www20.tomshardware.com/graphic/1998.html [tomshardware.com]
http://www20.tomshardware.com/graphic/1999.html [tomshardware.com]
http://www20.tomshardware.com/graphic/2000.html [tomshardware.com]
http://www20.tomshardware.com/graphic/2001.html [tomshardware.com]
Orchid Righteous 3D (3dfx Voodoo) (Score:3, Interesting)
Took my breath away.. (Score:3, Interesting)
Now I have a 256MB GeForce FX 5600 (with some letters after it) and all games look amazing; in my other PC I have a 64MB GeForce2 4400 (I think) and all games look good. Shame they don't play like Unreal did.
PS: that Voodoo2 is still going; it's running on a P3 500 with an 8MB Rage card, and it can still play Quake3 at 800x600 with pretty good texturing, and fast too.
Ahhh... any other memories or first-time looks at the games that made you go 'ohhhhh, that's pretty'?
Re:Took my breath away.. (Score:2)
I was so taken by the 'great outdoors' that I grabbed the cheats off the web and just flew around that area. Absolutely one of the most impressive games I have played, eyecandy-wise. Not to mention the run down the hallway after the 'creature' and *hearing* your shipmates get torn apart on the other side of the door.
All with the magic of a
Re:Took my breath away.. (Score:2)
Between the graphics and the scripting of those first few levels (let's face it, the game couldn't keep pace with itself), I was hooked on the new graphical and gameplay goodness.
These days, it seems as though the scripting is taking the main stage, and maybe someday we'll reminisce on how great this or that sequence was, even if i
Tomb Raider (Score:2)
Useless Article. (Score:2)
Could use more info (Score:3, Insightful)
And if it's just 3D chipsets that count, what about the [near useless] S3 ViRGE, before the Voodoo? What about the extra details, like 3dfx buying out STB to manufacture its own integrated 2D/3D solutions (Voodoo3 onwards), effectively pissing off an entire industry?
Oh well. Maybe next time.
"Video Cards" started in 1996? (Score:3, Interesting)
And if it's a Video Card history, why no mention of EGA/CGA?
Sounds more like "the 3D accelerator world since the Voodoo" history. It's articles like this that make me wish the slashdot editors would remember they have some readers that are older than high school age.
[end rant]
These days... (Score:5, Interesting)
When you go back about 3 or 4 years... when you contrasted an Oxygen video card or a FireGL against a TNT or 3dfx card, you could see where the extra money went. But now, today's consumer-grade video cards are more than capable. In fact, a lot of people I know who work as graphic artists use regular Radeons or GeForce 4s in their workstation machines. Outside of, say, Pixar, I just don't understand people buying the workstation-class cards.
Re:These days... (Score:2)
What? No mention of the IBM CGA card (Score:5, Interesting)
gaming hardware in servers? (Score:3, Informative)
This card was massive and would never have been used in a server.
Re:gaming hardware in servers? (Score:2)
First, the Voodoo 5 6000 was NEVER sold at retail or OEM. Accordingly, how could anyone buy one and install it in a server?! Second, why would anyone put a 3D card in a server?! Heck, do server boards even have AGP slots?!
The "history" is not worth the code it was written in.
Terrible Timeline (Score:2)
Never mind the historical inaccuracies; the guy writes like he's in the 4th grade.
Here's my favorite typo: "As time went on, ATI and NVIDIA battled between themselves, releasing card after card. The cards released then were rather nuke warm."
Yeah. We wish.
Tal
Re:Terrible Timeline (Score:2)
Sadly, the last card 3dfx constructed was the Voodoo 5 6000, which was rarely seen at all. That is rather hard to believe seeing that it's one of the biggest graphics cards I have ever seen. It's equipped with 4 GPUs (That's right, 4.) and 128 megabits of memory. This card was mostly only seen in servers though
WHAT? In servers? OK buddy.
Not to mention the fact that he completely missed the original TNT. What a dipshit.
Quake 1 without Voodoo? (Score:2)
I still have the Q1 CD, but it occurred to me -- can I even run it and get good graphics without a Voodoo card, or am I stuck with software rendering? IIRC the Q1 patch was Voodoo-specific.
I also wonder whether Q1 wasn't a DOS game as well, which might make it impossible to run on XP unless a subsequent Windows version was released.
Re:Quake 1 without Voodoo? (Score:2, Informative)
Re:Quake 1 without Voodoo? (Score:2, Informative)
Not a complete history by any means... (Score:3, Informative)
The Monster3D memories (Score:3, Funny)
I also recall the controversy over transparent water in Quake and how that was considered "cheating" by and large. Those poor non-accelerated folks had to get in the water first to see anything!
Me, I'd just wait until they all jumped in the water and fire off that Lightning Gun. Sure it's suicide, but is it really suicide when you get to roast five or more people at the same time? DM3, how I miss thee.
Not accurate. (Score:2, Informative)
This text has some flaws... nVidia didn't buy 3dfx or its assets; it won them in a lawsuit with 3dfx.
-B
Matrox (Score:3, Funny)
Talking about older cards (Score:2)
Both in high-end workstations and on home desktops? I remember seeing ads for true-color boards in 1989 Mac magazines. When were they available for the PC? Were they available at all for Intel PCs or Amigas earlier than for Macs?
If someone is kind enough to answer in a nice way (I could not find an answer on Google), please consider making it ready for a write-up at E2 [everything2.net].
History of Video only back to 1996? (Score:2)
Please Mod Up - Fastsilicon.com Response (Score:5, Informative)
-JT
There's so much missing (Score:4, Interesting)
Some of my favorite cards were the 'decelerators', such as the Yamaha device. They hadn't yet figured out how to do 'perfect scan', so if you rendered a pair of triangles with a common edge, the pixels on that edge would be rendered in both triangles. If you rendered a square tessellated as triangles in the obvious way, the corner pixels were rendered 6 times. I had arguments with those guys about performance. They told me my drivers sucked because I couldn't match their laboratory performance. It's astonishing that a company could bring a device as far as first silicon without knowing how to rasterize a triangle correctly! Even without such mistakes they were still slow, as the PCI bus was no way to send 3D instructions, geometry, and textures anywhere. It would often take longer to format the data and send it to the device than to simply rasterize directly to screen memory in software - even on early Pentiums!
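For what it's worth, the 'perfect scan' problem described above is exactly what the top-left fill rule in modern rasterizers solves: a pixel on an edge shared by two triangles is owned by exactly one of them. A sketch of the idea, assuming y-up coordinates and counter-clockwise vertex order (the function names and demo triangles are mine, not from any particular card):

```c
#include <stdio.h>

typedef struct { int x, y; } Vec2;

/* Twice the signed area of (a, b, p); positive when p lies to the
   left of the directed edge a->b. */
static int edge_fn(Vec2 a, Vec2 b, Vec2 p)
{
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

/* An edge "owns" its boundary pixels if it is a horizontal top edge
   (going left) or a left edge (going down). Two triangles sharing an
   edge traverse it in opposite directions, so exactly one owns it. */
static int is_top_left(Vec2 a, Vec2 b)
{
    return (a.y == b.y && b.x < a.x) || (b.y < a.y);
}

/* Does triangle (v0, v1, v2) cover pixel p? Strictly inside always
   counts; on-edge pixels count only for the owning triangle. */
static int covers(Vec2 v0, Vec2 v1, Vec2 v2, Vec2 p)
{
    int w0 = edge_fn(v1, v2, p);
    int w1 = edge_fn(v2, v0, p);
    int w2 = edge_fn(v0, v1, p);
    return (w0 > 0 || (w0 == 0 && is_top_left(v1, v2)))
        && (w1 > 0 || (w1 == 0 && is_top_left(v2, v0)))
        && (w2 > 0 || (w2 == 0 && is_top_left(v0, v1)));
}

int main(void)
{
    /* A square split along its diagonal; pixel p sits on the shared edge. */
    Vec2 a = {0, 0}, b = {8, 0}, c = {8, 8}, d = {0, 8}, p = {4, 4};
    printf("tri1: %d, tri2: %d\n", covers(a, b, c, p), covers(a, c, d, p));
    return 0;
}
```

This prints tri1: 1, tri2: 0 - the shared-edge pixel is rasterized exactly once instead of twice (or six times at a vertex shared by six triangles).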
Then there was the first nvidia card, which you may or may not know about. My God, this thing was bad. Now, I can't remember the exact details (this was many years ago), but it was very like the Sega Saturn's 3D system. (I don't think there's a coincidence here; the card came with a PC version of Virtua Fighter, so I guess Sega and Nvidia were working together.) Basically it was a sprite renderer. A square sprite renderer. But it had been hacked so that the spans of the sprites could be lines that weren't raster-aligned, so you could render twisted rectangles. With some deviousness you could render polygons with perspective, and you had a 3D renderer. But it basically always 'pushed' an entire sprite, so it was pretty well impossible to do any kind of clipping. It was next to impossible to map the functionality to any kind of 3D API, and so it could only run applications dedicated to it. Again they complained that we were unable to write proper 3D drivers for their card. Admittedly their design did at least allow some games to run fast, but I'm still amazed by the lack of understanding shown by the early nvidia guys. So when they eventually overtook 3dfx, I was completely blown away.
And then there was the question of APIs. In the old days there was no Direct3D. There was OpenGL, but most PCs were a long way from having the power for a full OpenGL implementation. Early on, only one company was interested in OpenGL - 3Dlabs. They were the only company who understood what they were doing on PCs in those early days. So there was a variety of APIs: Renderware, RenderMorphics, and BRender, among others. RenderMorphics was eventually bought by MS and became Direct3D. The first few revisions were terrible, but as they always do, MS eventually 'got it'. Renderware [renderware.com] is still going. They are part of Canon. Anyone who knows Canon will be aware that they patent everything. If you dig out the early Canon patents you'll find they patented fast rendering of speculars by a technique which meant the speculars didn't actually move as the viewpoint moved. (If you know 3D you should be laughing loudly right about now.) But Renderware did get their act together and now have a 3D API that runs on a few consoles. And some of the earliest and coolest 3D hacks were first patented by them. BRender just disappeared, though Jez San, the guy behind it, recently received an OBE for his influence on the British computer games industry. (Gossip tidbit: at one point SGI was looking for a 3D API for PCs and chose BRender over OpenGL for their FireWalker game development system.) If you dig into the pre-pre-history of 3D accelerators, you'll see that San's company, Argonaut, developed the first commercial 3D accelerator (though not a PC card): the Super FX chip for the SNES, used for Starfox.
And this is all from memory so please accept my apologies for errors and post corrections!
Away from "Accelerated"? (Score:3, Interesting)
For example, I wonder how many FPS a P4 3.2 or an Opteron could pump out at 1024x768x16bit in Q3 with only software rendering.
Re:1996? (Score:2)
Remember, computer years are like dog years, they advance so much faster than anything else.
I admit the article isn't the best in the world, but it's still interesting to read. I was expecting bumph on old Hercules cards and Tridents and the like, but it still took me on a little trip down memory lane.
Re:Forgotten cards (Score:2)
The S3 ViRGE also suffered from poor drivers in its later days. I really wish I knew just how well it could have stacked up against the Voodoo. I remember the patched version of Descent you mentioned! The software version looked good; the hardware version was amazing! There were also some other 3D demos on the bonus CD; one was a walk through a torch-lit castle. Probably looks like crap by today's standards,
I'm happy with ATI & Linux (Score:3, Informative)
ATI generally releases a new WHQL Windows driver about once a month and a new Linux driver about every 6 weeks. I've had no problems with their XFree86 4.3 driver. They don't have a FreeBSD driver, though, but I guess a PowerBook would give somewhat of the same experience (BSD-based OS, XFree86-based X environment, Radeon 9600, plus Quartz/DisplayPDF and access to Mac apps). Mac OS X also has the ATI (and nVIDIA) drivers built in, and they are updated with the software upd
Forceware -- the name sounds evil, in a bad way (Score:2)