Graphics Software Hardware

Video Card History

John Mathers writes "While searching the net for information on old Voodoo video cards, I came across this fabulous video card history article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the video cards that revolutionized the personal computer. Here's a quote: 'When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card, a setup known as piggybacking.'"
This discussion has been archived. No new comments can be posted.

  • Revisionist History? (Score:5, Interesting)

    by Maradine ( 194191 ) * on Monday November 10, 2003 @11:15AM (#7433993) Homepage
    I note that the history of this article starts in 1996 . . . one year after Rendition's Verite chip became the first consumer add-on 3D accelerator.
  • by dada21 ( 163177 ) <adam.dada@gmail.com> on Monday November 10, 2003 @11:16AM (#7433998) Homepage Journal
    I was excited to load this article up, hoping to see my first VGA card, by Paradise. I believe it was called the Paradise 256K or something like that. I had a Sony VGA monitor, and my friends and I were blown away by some 320x200, 256-color graphic of a parrot. Then we found a nude GIF. Whoa. I think I had that card about two years before any game really supported it, although Police Quest in EGA mode was nothing like what we could imagine.

    I'd love to see a history of all video cards for the PC platform...
  • by fault0 ( 514452 ) on Monday November 10, 2003 @11:18AM (#7434009) Homepage Journal
    but it'd be nice to have a history of things before 1996 (i.e., pre-Voodoo cards). Video card history between 1996 and 2000 was very well documented, perhaps thanks to all of the articles that came out near/after 3dfx's demise, and most of us remember everything within the last three years.
  • by shawn(at)fsu ( 447153 ) on Monday November 10, 2003 @11:18AM (#7434012) Homepage
    I had a Paradise card. Wow, that takes me back. When pictures almost looked like real pictures.
  • by dolo666 ( 195584 ) * on Monday November 10, 2003 @11:19AM (#7434016) Journal
    I remember my first Voodoo cardie. I was playing TWCTF [thunderwalker.net] a lot with my buds, and many of them had fast systems (at the time) running glquake/glqw [gamespy3d.com]. Finally, after being a software user for so long and getting decent lag-frags, I did the unthinkable and ditched the software client for better visuals with my very own piggybacked Voodoo card from 3dfx. Gaming has changed quite a bit since then, but you have to understand how much fun it was playing Quake in software mode. The mods were cool too, but everything about that experience was killer fast. Since then, games have mostly slowed down on the PC; Quake 2 and Quake 3 were much slower. The speed of play for TW back in software was intense. You had to hold your adrenaline rush to the bitter end of any match. By the time I was playing for ZFA [142.179.67.73], everyone had a 3D card. I can remember the Q2 LAN parties when guys would show up with their configs all set for zero textures and no coloured lighting. The levels would all be just plain white, and guys would be saying how awesome it was that they could get 100 fps doing this. To me, it always took something away from the game to run configs like that, even if it could give you an edge in matches.

    When I saw Quake 2 CTF for the first time at the Respawn LAN party, Zoid [iwarp.com] showed us, on a decked-out system, how totally amazing it was. I remember how gorgeous q2ctf1 looked the first time my eyes caught it. It was magic. I even wrote about it. You could never have seen it if it weren't for the people at 3dfx, who pretty much paved the way for all the gaming rigs we've got now. It's a shame that the same people who built this dream had to shut their business down.

    I guess that's how we reward our innovators today... with steady, constant competition, or you're out. Seems cold, doesn't it?
  • by Maradine ( 194191 ) * on Monday November 10, 2003 @11:19AM (#7434018) Homepage
    What were the big players back then? Paradise, Trident, and Tseng, right? Man. MCGA rocked.
  • Re:Nostalgia (Score:2, Interesting)

    by Lumpy ( 12016 ) on Monday November 10, 2003 @11:21AM (#7434031) Homepage
    Huh? In 1996 I was playing 3D-accelerated games on my ViRGE 3D card. 3D gaming had been around for a while by that point...

    Bring me back to 1994, when the real 3D cards were $3,000 and found only in CAD workstations.

    1996... not long ago at all.
  • XGL? (Score:4, Interesting)

    by Doc Ruby ( 173196 ) on Monday November 10, 2003 @11:23AM (#7434042) Homepage Journal
    How long until history catches up with the X Window System, and I can get an X server that renders entirely to the OpenGL API? I'd love all those panel edges, drop shadows, animated buttons, textured skins, and other 3D "embossed" window decorations to come from my video card. The X server code could be much smaller, factoring out all the rendering code that it could reuse by calling the OpenGL API. And the X graphics primitives could become unified behind a widely cross-platform API, already implemented by blazingly fast hardware in the most competitive market in computing. And once XGL implemented the current style of X server display, we'd have an open, popular, and modular platform for experimenting with 3D spaces for "desktop" visualization. Let a thousand raytraced xscreensavers bloom!
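
    Purely as an illustration of the idea (not anything that actually exists, and certainly not an X server): a toy GL program that draws a "window" rectangle and its translucent drop shadow entirely through OpenGL, using period-appropriate immediate mode and GLUT. The coordinates and colors are arbitrary numbers made up for the example.

        /* Toy sketch: let OpenGL render "window decorations" (a window
           rectangle with a drop shadow) instead of 2D server-side code.
           Build with GLUT/freeglut, e.g.: cc xgl_sketch.c -lglut -lGL */
        #include <GL/glut.h>

        static void draw_rect(float x, float y, float w, float h,
                              float r, float g, float b, float a)
        {
            glColor4f(r, g, b, a);
            glBegin(GL_QUADS);
            glVertex2f(x, y);
            glVertex2f(x + w, y);
            glVertex2f(x + w, y + h);
            glVertex2f(x, y + h);
            glEnd();
        }

        static void display(void)
        {
            glClear(GL_COLOR_BUFFER_BIT);
            glEnable(GL_BLEND);
            glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

            /* drop shadow: translucent dark quad offset behind the "window" */
            draw_rect(-0.43f, -0.62f, 1.0f, 1.2f, 0.0f, 0.0f, 0.0f, 0.4f);
            /* the "window" body plus a title-bar strip */
            draw_rect(-0.50f, -0.55f, 1.0f, 1.2f, 0.9f, 0.9f, 0.9f, 1.0f);
            draw_rect(-0.50f,  0.50f, 1.0f, 0.15f, 0.2f, 0.3f, 0.8f, 1.0f);

            glutSwapBuffers();
        }

        int main(int argc, char **argv)
        {
            glutInit(&argc, argv);
            glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
            glutInitWindowSize(640, 480);
            glutCreateWindow("GL-drawn window decoration sketch");
            glClearColor(0.5f, 0.55f, 0.6f, 1.0f);  /* "desktop" background */
            /* default projection is the identity, so coordinates run -1..1 */
            glutDisplayFunc(display);
            glutMainLoop();
            return 0;
        }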
  • I had one (Score:4, Interesting)

    by chunkwhite86 ( 593696 ) on Monday November 10, 2003 @11:24AM (#7434050)
    I had one of these original Voodoo I PCI boards. It had a VGA passthru connector on the back. The card didn't even have any heatsink or fan at all on it! I remember it ran at 43 MHz or something like that, but I had overclocked mine to a whopping 47 MHz! I glued a motherboard northbridge heatsink to the Voodoo chip to dissipate the extra heat, but I lost the neighboring PCI slot due to the size of the heatsink.

    Ah... those were the days.
  • Memories... (Score:3, Interesting)

    by vasqzr ( 619165 ) <`vasqzr' `at' `netscape.net'> on Monday November 10, 2003 @11:24AM (#7434051)

    We sold Diamond Monster 3Ds like hotcakes back at Best Buy in the mid-'90s.

    Then the Voodoo Rush came out. All in one. It stunk.

    Then the Voodoo II came out. Remember, you could buy two of the cards and run them in SLI, and some games would run faster than with just one!

    Then they did the combination card again... the Voodoo Banshee. Worked pretty well.

    Then NVIDIA wiped them off the face of the earth.
  • by Thinkit3 ( 671998 ) * on Monday November 10, 2003 @11:24AM (#7434052)
    At some point shouldn't we just have really versatile CPUs? All these 3D cards are just kludges that happen to be tuned for 3D processing. They can do other general purpose processing as well. Thus the CPU can do their processing, given enough versatility.
  • by malf-uk ( 456583 ) on Monday November 10, 2003 @11:31AM (#7434097)
    Now there was a card that announced it was taking over the monitor - the not-so-delicate *clang* as its mechanical switch moved.
  • by boogy nightmare ( 207669 ) on Monday November 10, 2003 @11:32AM (#7434117) Homepage
    I remember plugging a 'piggyback' 12MB Voodoo2 into my 4MB (or was it 8MB?) Hercules graphics card. I remember installing Unreal and firing it up; when you get out of the ship for the first time and see that waterfall with the music playing, I thought it was the most amazing thing I had ever seen. To this day it still ranks up there with the first time I saw a dinosaur in Jurassic Park, thinking 'this is the way to go' and being seriously in awe of all things to do with computer graphics.

    Now I have a 256MB GeForce FX 5600 (some letters after it) and all games look amazing; in my other PC I have a 64MB GeForce2 4400 (I think) and all games look good. Shame they don't play like Unreal did :(

    PS: that Voodoo2 is still going. It's running on a P3 500 with an 8MB Rage card, and I can still use it to play Quake 3 at 800x600 with pretty good texturing, and fast as well :)

    Ahhh, any other memories or first-time looks at the games that made you go 'ohhhhh, that's pretty'?
  • by green pizza ( 159161 ) on Monday November 10, 2003 @11:37AM (#7434154) Homepage
    Does anyone else notice that this "Video Card" history starts off with about the third consumer 3D accelerator? They didn't even mention the groundbreaking Rendition Verite. Nor any of the non-PC 3D systems that came before it (Jim Clark / SGI's Geometry Engine based systems in 1983, or the image processors from Evans & Sutherland).

    And if it's a Video Card history, why no mention of EGA/CGA?

    Sounds more like "the 3D accelerator world since the Voodoo" history. It's articles like this that make me wish the Slashdot editors would remember they have some readers who are older than high school age.

    [end rant]
  • These days... (Score:5, Interesting)

    by dark-br ( 473115 ) on Monday November 10, 2003 @11:38AM (#7434159) Homepage
    I have never understood how this breed of cards still exists to this day. Really... the difference between a "stock" GeForce and a workstation-class Quadro just doesn't justify the cost difference anymore.

    When you go back about three or four years, when you contrasted an Oxygen video card or a FireGL against a TNT or 3dfx card, you could see where the extra money went. But now, today's commercial-grade video cards are more than capable. In fact, a lot of people I know who work as graphic artists use regular Radeons or GeForce 4s in their workstation machines. Outside of, say, Pixar, I just don't understand people buying the workstation-class cards.

  • by pegr ( 46683 ) * on Monday November 10, 2003 @11:38AM (#7434160) Homepage Journal
    What? No mention of the IBM CGA card that you could destroy by putting it into video modes it didn't support? One of the few circumstances in which PC hardware could be broken by software. That in itself should be worth mentioning!
  • by Waffle Iron ( 339739 ) on Monday November 10, 2003 @12:05PM (#7434363)
    It's so sad when people get excited about PC graphics cards. It wasn't/isn't because they were good; it's because they were /finally/ able to start doing what other platforms had been doing for years.

    OTOH, most of the peers of the early PCs had total crap text modes; they couldn't do what the PC could do. (Yes, this includes the Apple. There were no Macs yet.) This is one of the major reasons the PC ended up dominating; text mode was simply more important. Remember that back then almost all business use and a good amount of home use was in text mode (word processing, spreadsheets, financial, etc.).

    The original IBM PC and its clones usually came with a specially designed monochrome text mode monitor with relatively high resolution (720 x something, no dot pitch to worry about). The monitors had a very long persistence phosphor that totally eliminated flicker. The monochrome text-mode video cards had a very nice serif font stored in their ROMs. IBM's intent was to recreate the feel of their expensive dedicated mainframe text terminals.

    This setup had a very high quality feel, and you could stare at it all day without getting eye strain. Early color graphics monitors, OTOH, were horrible at showing text. This was compounded by the crappy fonts that were shipped with most early graphic OSes. This made most of the PC's early competitors pretty useless for doing long stretches of serious work.

    IBM's attempt to provide color graphics did suck big time [*]. Originally, you had to buy two graphics adapters and two separate monitors to get text and graphics on the same machine. One of Compaq's claims to fame was getting the patent on unifying the PC's high-quality text mode and its graphics modes on a single graphics adapter and monitor.

    [*] The original 16-color mode of the EGA and VGA cards must have been designed by somebody who was high on crack. You can't get at the pixel memory without setting up a bewildering array of registers that control mandatory and mostly non-useful logic operations on your bits. The memory is accessed as 4 independent planes, so you have to unnaturally slice every pixel up into individual bits and have a PhD in Boolean logic to get them on the screen as you intended. It easily could take a newbie a whole day of reading manuals and hacking before they could get a single white dot on the screen.
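
    To make the footnote concrete, here is roughly what plotting that single white dot looked like in the 16-color planar modes. This is a hedged sketch, not anything from the article: it assumes a DOS real-mode compiler with Borland-style dos.h (MK_FP, outportb), uses the standard published VGA register numbers, and skips full register save/restore and error handling.

        #include <dos.h>   /* MK_FP(), outportb() in Borland-style compilers */

        #define GC_INDEX 0x3CE   /* VGA Graphics Controller index port */
        #define GC_DATA  0x3CF   /* VGA Graphics Controller data port  */

        /* Plot one pixel in mode 12h (640x480, 16 colors, 4 bit planes). */
        void put_pixel(unsigned x, unsigned y, unsigned char color)
        {
            volatile unsigned char far *p =
                (volatile unsigned char far *)MK_FP(0xA000, y * 80u + (x >> 3));
            unsigned char latch;

            outportb(GC_INDEX, 0x00);              /* Set/Reset = pixel color      */
            outportb(GC_DATA,  color & 0x0F);
            outportb(GC_INDEX, 0x01);              /* enable Set/Reset, all planes */
            outportb(GC_DATA,  0x0F);
            outportb(GC_INDEX, 0x08);              /* Bit Mask = just this pixel   */
            outportb(GC_DATA,  (unsigned char)(0x80 >> (x & 7)));

            latch = *p;   /* read first: loads the latches, preserving neighbors */
            *p = latch;   /* the write then touches only the masked bit          */

            outportb(GC_INDEX, 0x08);              /* put the registers back       */
            outportb(GC_DATA,  0xFF);
            outportb(GC_INDEX, 0x01);
            outportb(GC_DATA,  0x00);
        }

    Four port-register writes, a dummy read, and a restore pass, all to set one pixel; the linear 256-color mode 13h, by contrast, is a single byte store.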

  • by Isldeur ( 125133 ) on Monday November 10, 2003 @12:23PM (#7434519)
    I had a Sony VGA monitor, and my friends and I were blown away by some 320x200 x 256 color graphic of a parrot.

    Man - that brought a smile to my face. I remember that picture. And the clown? Remember the clown at 320x240? It could really show you what a good picture could look like even at that low resolution.

    I remember the first game I had that made use of that mode (on a PS/2 Model 70): Mean Streets. Anyone remember that? Had a blast playing that Christmas morning.

    And then there was Fractint, which could use "other" modes and draw fractals beyond 640x480, at things like 732x5??. Of course it took hours on the 16 MHz 386DX in that box.

    Ah, times were simpler back then... :)
  • by gfxguy ( 98788 ) on Monday November 10, 2003 @12:32PM (#7434583)
    Hercules was the de facto "professional" monochrome card. I remember my first PC: a buddy had a Hercules card and I got some EGA card. He had to use some sort of CGA emulator to play games.

    It was a 10 MHz XT. Sweeeeeet!
  • by Animedude ( 714940 ) on Monday November 10, 2003 @12:35PM (#7434614)
    As others said, it's sad that the "video card history" on that page only starts in '96. There were several other important 3D cards before the Voodoo1. Of course the Voodoo1 really DID revolutionize the way games were played. As soon as the first serious 3D cards came out, you basically selected the card based on which one had working "3D patches" for the games you wanted to play. I remember back then buying a Matrox Mystique because they offered a working 3D patch for Tomb Raider. I already had the game and it played halfway decent on my S3 card (in 2D mode). Then I plugged in the Mystique, applied the 3D patch and whoa - smooooooooooooth graphics :-)

    The only problem was that not enough games HAD 3D patches. A standard was missing. No game company wanted to write 3D patches for ALL the cards out there. Then the Voodoo1 came along, and it was WAY faster than anything else, and they had Glide (which apparently was pretty easy to program for). Suddenly, almost all new games came out with 3dfx support - and you had games you NEVER could have played on the old 2D hardware. The funny thing was, once you had a 3dfx card in your machine, the processor power was not that important anymore. The only thing which mattered was that you HAD a Voodoo card in there. No voodoo - no serious gaming. Voodoo in there - happiness :)

    Well, then Quake and Quake2 came along, and you all know the rest.

    The only thing to remember is that the Voodoo1 DID revolutionize gaming. It was a quantum leap. Either you had one, then you could game. Or you did not have one, then you wanted one.
  • by Shivetya ( 243324 ) on Monday November 10, 2003 @12:45PM (#7434690) Homepage Journal
    Yes, he should have listed Matrox.

    However, Matrox has shown one thing since the days of the Millennium cards... and that is that they don't care about the consumer market.

    They left the consumer behind to go after the business market, and it has done them well. As for 3D gaming, they were irrelevant back in the days of 3dfx, because 3dfx marketed their cards and their API to the people who mattered: developers.

    Matrox has had superior technology a lot of the time; their problem is that it rarely does anything people really want (and a handful of geeks doesn't count).
  • by exp(pi*sqrt(163)) ( 613870 ) on Monday November 10, 2003 @02:43PM (#7435723) Journal
    For example, there was a fascinating pre-history of graphics cards from before they were released to the general public. Many developers on /. were surely involved in developing for these things even though they never made it to market. Many companies were involved before the appearance of the 3dfx chipset: Cirrus Logic, Yamaha, LSI, Oak, 3Dlabs, Nvidia and so on.

    Some of my favorite cards were the 'decelerators' such as the Yamaha device. They hadn't yet figured out how to do 'perfect scan', so if you rendered a pair of triangles with a common edge then the pixels on that edge would be rendered in both triangles (see the fill-rule sketch below). If you rendered a square tessellated into triangles in the obvious way, the corner pixels were rendered 6 times. I had arguments with the guys about performance. They told me my drivers sucked as I couldn't match their laboratory performance. It's astonishing that a company could bring a device as far as first silicon without knowing how to rasterize a triangle correctly! Even without such mistakes they were still slow, as the PCI bus was no way to send 3D instructions, geometry and textures anywhere. It would often take longer to format the data and send it to the device than to simply rasterize directly to screen memory in software, even on early Pentiums!

    Then there was the first Nvidia card, which you may or may not know about. My God, this thing was bad. Now, I can't remember the exact details (this was many years ago), but it was very much like the Sega Saturn's 3D system. (I don't think that's a coincidence: the card came with a PC version of Virtua Fighter, so I guess Sega and Nvidia were working together.) Basically it was a sprite renderer. A square sprite renderer. But it had been hacked so the spans of the sprites could be lines that weren't raster aligned. So you could render twisted rectangles. With some deviousness you could render polygons with perspective, and you had a 3D renderer. But it basically always 'pushed' an entire sprite, so it was pretty well impossible to do any kind of clipping. It was next to impossible to map the functionality to any kind of 3D API, and so it could only run applications dedicated to it. Again they complained that we were unable to write proper 3D drivers for their card. Admittedly their design did at least allow some games to run fast, but I'm still amazed by the lack of understanding by the early Nvidia guys. So when they eventually overtook 3dfx I was completely blown away.

    And then there was the question of APIs. In the old days there was no Direct3D. There was OpenGL, but most PCs were a long way from having the power for a full OpenGL implementation. Early on only one company was interested in OpenGL: 3Dlabs. They were the only company who understood what they were doing on PCs in those early days. So there was a variety of APIs: RenderWare, RenderMorphics, and BRender among others. RenderMorphics was eventually bought by MS and became Direct3D. The first few revisions were terrible, but as they always do, MS eventually 'got it'. RenderWare [renderware.com] is still going; they are part of Canon. Anyone who knows Canon will be aware that they patent everything. If you dig out the early Canon patents you'll find they patented fast rendering of speculars by a technique which meant the highlights didn't actually move as the viewpoint moved. (If you know 3D you should be laughing loudly right about now.) But RenderWare did get their act together and now have a 3D API that runs on a few consoles. And some of the earliest and coolest 3D hacks were first patented by them. BRender just disappeared, though Jez San, the guy behind it, recently received an OBE for his influence on the British computer games industry. (Gossip tidbit: at one point SGI were looking for a 3D API for PCs and chose BRender over OpenGL for their FireWalker game development system.) If you dig into the pre-pre-history of 3D accelerators you'll see that San's company, Argonaut, developed the first commercial 3D accelerator (though not a PC card): the Super FX chip for the SNES, used for Star Fox.

    And this is all from memory so please accept my apologies for errors and post corrections!
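
    The 'perfect scan' issue mentioned above is what is usually called a fill convention. Purely as an illustration (nothing to do with the Yamaha part's actual hardware, and written with modern hindsight), here is a minimal edge-function rasterizer with a top-left rule: a pixel that lands exactly on a shared edge is claimed by exactly one of the two triangles, so nothing gets rendered twice and shared corners are not rendered six times.

        #include <stdio.h>
        #include <string.h>

        typedef struct { int x, y; } Pt;   /* pixel coordinates, y grows downward */

        /* Edge function: > 0 when c lies on the inner side of directed edge a->b,
           for triangles wound clockwise as seen on screen. */
        static int edge(Pt a, Pt b, Pt c)
        {
            return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
        }

        /* Top-left rule for clockwise screen-space triangles: an edge owns the
           pixels that land exactly on it if it is a top edge (horizontal, pointing
           right) or a left edge (pointing up). Any edge shared by two triangles
           is top-left for exactly one of them. */
        static int top_left(Pt a, Pt b)
        {
            return (a.y == b.y && b.x > a.x) || (b.y < a.y);
        }

        static void fill_tri(Pt v0, Pt v1, Pt v2, unsigned char *hits, int w, int h)
        {
            int x, y;
            for (y = 0; y < h; y++)
                for (x = 0; x < w; x++) {
                    Pt p = { x, y };
                    int w0 = edge(v1, v2, p), w1 = edge(v2, v0, p), w2 = edge(v0, v1, p);
                    int in0 = top_left(v1, v2) ? w0 >= 0 : w0 > 0;
                    int in1 = top_left(v2, v0) ? w1 >= 0 : w1 > 0;
                    int in2 = top_left(v0, v1) ? w2 >= 0 : w2 > 0;
                    if (in0 && in1 && in2)
                        hits[y * w + x]++;   /* count how often each pixel is drawn */
                }
        }

        int main(void)
        {
            /* A 4x4 quad split into two clockwise triangles sharing a diagonal. */
            Pt a = {0, 0}, b = {4, 0}, c = {4, 4}, d = {0, 4};
            unsigned char hits[8 * 8];
            int i, twice = 0;

            memset(hits, 0, sizeof hits);
            fill_tri(a, b, c, hits, 8, 8);
            fill_tri(a, c, d, hits, 8, 8);
            for (i = 0; i < 64; i++)
                if (hits[i] > 1)
                    twice++;
            printf("pixels drawn more than once: %d\n", twice);  /* prints 0 */
            return 0;
        }

    Replace the top_left() test with a plain >= on every edge and the pixels along the shared diagonal come back with a hit count of 2, which is exactly the double-rendered-edge behaviour described above.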

  • by Enonu ( 129798 ) on Monday November 10, 2003 @03:09PM (#7435906)
    With today's CPUs having more than enough power for most tasks done by the average user, when will we get to the point where we don't need a video accelerator? The six-month cycle makes upgrading to a new video card an expensive and risky proposition, after all.

    For example, I wonder how many FPS a P4 3.2 or an Opteron could pump out at 1024x768 in 16-bit color in Q3 with only software rendering.
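
    A back-of-the-envelope guess only, with every constant made up for illustration (nothing here is measured, and Q3 ships no software renderer to measure against):

        #include <stdio.h>

        /* Rough fill-bound estimate of software-rendering frame rate.
           All constants below are assumptions, not measurements. */
        int main(void)
        {
            const double clock_hz   = 3.2e9;           /* "P4 3.2"-class CPU         */
            const double pixels     = 1024.0 * 768.0;  /* target resolution          */
            const double overdraw   = 3.0;             /* guessed scene overdraw     */
            const double cyc_per_px = 40.0;            /* guessed cost of a textured,
                                                          lit, z-tested pixel        */

            double frame_cycles = pixels * overdraw * cyc_per_px;
            printf("~%.0f fps (ignoring geometry, game logic, memory stalls)\n",
                   clock_hz / frame_cycles);           /* roughly ~34 fps here       */
            return 0;
        }

    Change any of those guesses by 2x and the answer moves by 2x, which is really the point: a software renderer lives or dies by its per-pixel inner loop.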
