Video Card History 390
John Mathers writes "While searching the net for information on old Voodoo video cards, I came across this fabulous Video Card History article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the video cards that revolutionized the personal computer. Here's a quote: "When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as 'piggybacking.'"
Revisionist History? (Score:5, Interesting)
Only 1996 to the Present (Score:5, Interesting)
I'd love to see a history of all video cards for the PC platform...
Nice article and all.. (Score:5, Interesting)
Re:Only 1996 to the Present (Score:2, Interesting)
The Memories... Ahhhh (Score:5, Interesting)
When I saw Quake 2 CTF for the first time at the Respawn LAN party, Zoid [iwarp.com] showed us on this decked-out system how totally amazing it was. I remember how gorgeous q2ctf1 looked the first time my eyes caught it. It was magic. I even wrote about it. You could never have seen it if it weren't for the people at 3dfx, who pretty much paved the way for all the gaming rigs we've got now. It's a shame that the same people who built this dream had to shut their business down.
I guess that's how we reward our innovators today... keep up with the steady, constant competition, or you're out. Seems cold, doesn't it?
Re:Only 1996 to the Present (Score:3, Interesting)
Re:Nostalgia (Score:2, Interesting)
Take me back to 1994, when the real 3D cards were $3,000 and only in CAD workstations.
1996.. not long ago at all.
XGL? (Score:4, Interesting)
I had one (Score:4, Interesting)
Ah... those were the days.
Memories... (Score:3, Interesting)
We sold Diamond Monster 3Ds like hotcakes back at Best Buy in the mid-'90s.
Then the Voodoo Rush came out. All in one. It stunk.
Then the Voodoo II came out. Remember, you could buy two of the cards (SLI) and some games would run faster than with just one!
Then they did the combination card again... the Voodoo Banshee. Worked pretty well.
Then NVIDIA wiped them off the face of the earth.
When will it go back to the CPU? (Score:2, Interesting)
Orchid Righteous 3D (3dfx Voodoo) (Score:3, Interesting)
Took my breath away.. (Score:3, Interesting)
Now I have a 256MB GeForce FX 5600 (some letters after it) and all games look amazing; in my other PC I have a 64MB GeForce2 (4400, I think) and all games look good. Shame they don't play like Unreal did.
PS: that Voodoo2 is still going. It's running in a P3 500 with an 8MB Rage card, and I can still use it to play Quake 3 at 800x600 with pretty good texturing, and fast as well.
Ahhh... any other memories or first-time looks at the games that made you go 'ohhhhh, that's pretty'?
"Video Cards" started in 1996? (Score:3, Interesting)
And if it's a Video Card history, why no mention of EGA/CGA?
Sounds more like a "the 3D accelerator world since the Voodoo" history. It's articles like this that make me wish the Slashdot editors would remember they have some readers who are older than high-school age.
[end rant]
These days... (Score:5, Interesting)
Go back about 3 or 4 years: when you contrasted an Oxygen video card or a FireGL against a TNT or 3dfx card, you could see where the extra money went. But now, today's commercial-grade video cards are more than capable. In fact, a lot of people I know who work as graphic artists use ordinary Radeon or GeForce4 cards in their workstation machines. Outside of, say, Pixar, I just don't understand people buying the workstation-class cards.
What? No mention of the IBM CGA card (Score:5, Interesting)
Re:Ah... those were the days :-) (Score:5, Interesting)
OTOH, most of the peers of the early PCs had total crap text modes; they couldn't do what the PC could do. (Yes, this includes the Apple. There were no Macs yet.) This is one of the major reasons the PC ended up dominating; text mode was simply more important. Remember that back then almost all business use and a good amount of home use was in text mode (word processing, spreadsheets, financial software, etc.).
The original IBM PC and its clones usually came with a specially designed monochrome text mode monitor with relatively high resolution (720 x something, no dot pitch to worry about). The monitors had a very long persistence phosphor that totally eliminated flicker. The monochrome text-mode video cards had a very nice serif font stored in their ROMs. IBM's intent was to recreate the feel of their expensive dedicated mainframe text terminals.
This setup had a very high quality feel, and you could stare at it all day without getting eye strain. Early color graphics monitors, OTOH, were horrible at showing text. This was compounded by the crappy fonts that were shipped with most early graphic OSes. This made most of the PC's early competitors pretty useless for doing long stretches of serious work.
IBM's attempt to provide color graphics did suck big time [*]. Originally, you had to buy two graphics adapters and two separate monitors to get text and graphics on the same machine. One of Compaq's claims to fame was getting the patent on unifying the PC's high-quality text mode and its graphics modes on a single graphics adapter and monitor.
[*] The original 16-color mode of the EGA and VGA cards must have been designed by somebody who was high on crack. You can't get at the pixel memory without setting up a bewildering array of registers that control mandatory and mostly non-useful logic operations on your bits. The memory is accessed as 4 independent planes, so you have to unnaturally slice every pixel up into individual bits and have a PhD in Boolean logic to get them on the screen as you intended. It easily could take a newbie a whole day of reading manuals and hacking before they could get a single white dot on the screen.
Re:Only 1996 to the Present (Score:3, Interesting)
Man - that brought a smile to my face. I remember that picture. And the clown? Remember the clown at 320x240? It could really show you what a good picture could look like even at that low resolution.
I remember the first game I had that made use of that mode (on a PS/2 Model 70): Mean Streets. Anyone remember that? Had a blast playing that Christmas morning.
And then there was Fractint, which could use "other" modes and draw fractals beyond the 640x480 modes, up to things like 732x5??. Of course, it took hours on the 16MHz 386DX in that box.
Ah, times were simpler back then...
Re:Only 1996 to the Present (Score:3, Interesting)
It was a 10MHz XT. Sweeeeeet!
Those were the days ... (Score:2, Interesting)
The only problem was that not enough games HAD 3D patches. A standard was missing. No game company wanted to write 3D patches for ALL the cards out there. Then the Voodoo1 came along, and it was WAY faster than anything else, and it had Glide (which apparently was pretty easy to program for). Suddenly, almost all new games came out with 3dfx support - and you had games you NEVER could have played on the old 2D hardware. The funny thing was, once you had a 3dfx card in your machine, the processor power was not that important anymore. The only thing that mattered was that you HAD a Voodoo card in there. No Voodoo - no serious gaming. Voodoo in there - happiness.
Well, then Quake and Quake2 came along, and you all know the rest.
The only thing to remember is that the Voodoo1 DID revolutionize gaming. It was a quantum leap. Either you had one, and then you could game; or you did not have one, and then you wanted one.
Matrox put themselves in obscurity. (Score:5, Interesting)
However, Matrox has shown one thing since the days of the Millennium cards... and that is they don't care about the consumer market.
They left the consumer behind to go for the business market, and it has done them well. As for 3D gaming, they were irrelevant back in the days of 3dfx because 3dfx marketed their cards and their API to the people who mattered: developers.
Matrox has often had superior technology; their problem is that it rarely does anything people really want. (And a handful of geeks doesn't count.)
There's so much missing (Score:4, Interesting)
Some of my favorite cards were the 'decelerators', such as the Yamaha device. They hadn't yet figured out how to do 'perfect scan', so if you rendered a pair of triangles with a common edge, the pixels on that edge would be rendered in both triangles. If you rendered a square tessellated as triangles in the obvious way, the corner pixels were rendered 6 times. I had arguments with the guys about performance; they told me my drivers sucked because I couldn't match their laboratory performance. It's astonishing that a company could bring a device as far as first silicon without knowing how to rasterize a triangle correctly! Even without such mistakes they were still slow, as the PCI bus was no way to send 3D instructions, geometry, and textures anywhere. It would often take longer to format the data and send it to the device than to simply rasterize directly to screen memory in software - even on early Pentiums!
Then there was the first Nvidia card, which you may or may not know about. My God, this thing was bad. I can't remember the exact details (this was many years ago), but it was very like the Sega Saturn's 3D system. (I don't think that's a coincidence; the card came with a PC version of Virtua Fighter, so I guess Sega and Nvidia were working together.) Basically it was a sprite renderer. A square sprite renderer. But it had been hacked so the spans of the sprites could be lines that weren't raster-aligned, so you could render twisted rectangles. With some deviousness you could render polygons with perspective, and you had a 3D renderer. But it basically always 'pushed' an entire sprite, so it was pretty well impossible to do any kind of clipping. It was next to impossible to map the functionality to any kind of 3D API, so it could only run applications dedicated to it. Again they complained that we were unable to write proper 3D drivers for their card. Admittedly their design did at least allow some games to run fast, but I'm still amazed by the lack of understanding shown by the early Nvidia guys. So when they eventually overtook 3dfx I was completely blown away.
And then there was the question of APIs. In the old days there was no Direct3D. There was OpenGL, but most PCs were a long way from having the power for a full OpenGL implementation. Early on only one company was interested in OpenGL - 3Dlabs. They were the only company who understood what they were doing on PCs in those early days. So there was a variety of APIs: RenderWare, RenderMorphics, and BRender among others. RenderMorphics was eventually bought by MS and became Direct3D. The first few revisions were terrible, but as they always do, MS eventually 'got it'.

RenderWare [renderware.com] is still going; they are part of Canon. Anyone who knows Canon will be aware that they patent everything. If you dig out the early Canon patents you'll find they patented fast rendering of speculars by a technique which meant the speculars didn't actually move as the viewpoint moved. (If you know 3D you should be laughing loudly right about now.) But RenderWare did get their act together and now have a 3D API that runs on a few consoles. And some of the earliest and coolest 3D hacks were first patented by them.

BRender just disappeared, though Jez San, the guy behind it, recently received an OBE for his influence on the British computer games industry. (Gossip tidbit: at one point SGI were looking for a 3D API for PCs and chose BRender over OpenGL for their FireWalker game development system.) If you dig into the pre-pre-history of 3D accelerators, you'll see that San's company, Argonaut, developed the first commercial 3D accelerator (though not a PC card): the Super FX chip for the SNES, used for Star Fox.
And this is all from memory so please accept my apologies for errors and post corrections!
Away from "Accelerated"? (Score:3, Interesting)
For example, I wonder how many FPS a P4 3.2 or an Opteron could pump out at 1024x768 in 16-bit color in Q3 with only software rendering.