Video Card History

John Mathers writes "While searching the net for information on old Voodoo Video Cards, I came across this fabulous Video Card History article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the Video Cards that revolutionized the personal computer. Here's a quote: "When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as piggy backing."
This discussion has been archived. No new comments can be posted.

  • Revisionist History? (Score:5, Interesting)

    by Maradine ( 194191 ) * on Monday November 10, 2003 @10:15AM (#7433993) Homepage
    I note that the history of this article starts in 1996 . . . one year after Rendition's Verite chip became the first consumer add-on 3D accelerator.
    • by merlin_jim ( 302773 ) <James@McCracken.stratapult@com> on Monday November 10, 2003 @10:37AM (#7434152)
      I note that the history of this article starts in 1996 . . . one year after Rendition's Verite chip became the first consumer add-on 3D accelerator

      And absolutely no mention of Matrox whatsoever... despite the fact that their add-on 3D accelerator was arguably superior to the voodoo, and the parhelia is the ONLY 3d solution to support 3 display devices.
      • by sznupi ( 719324 ) on Monday November 10, 2003 @10:59AM (#7434322) Homepage
        And I'll add only that Matrox basically invented (or at least first implemented in a commercial product) video RAM something like a quarter century ago, and that they had an API capable of hardware-accelerating 3D apps in a window back in the Win 3.11 days (several years later the Voodoo still couldn't do that).
        And no mention of that company whatsoever :/
        But hey, what can you expect from a (probably) FPS kiddie biased against Matrox, among others - because if he were JUST an FPS kiddie (and not anti-Matrox too) he'd mention the fact that for ~half a year in '99 Matrox was the leader in BOTH performance and quality... too bad that since then only the second holds true.
        • by Shivetya ( 243324 ) on Monday November 10, 2003 @11:45AM (#7434690) Homepage Journal
          Yes he should have had Matrox listed.

          However, Matrox has shown one thing since the days of Millenium cards... and that is they don't care about the consumer market.

          They left the consumer behind to go for the business market and it has done them well. As for 3d gaming, they were irrelevant back in the days of 3dfx because 3dfx marketed their cards and their API to the people that mattered; developers.

          Matrox has had superior technology a lot of times, their problem is it rarely does anything people really want. (and a handful of geeks doesn't count)
      • by Skuld-Chan ( 302449 ) on Monday November 10, 2003 @04:41PM (#7437603)
        Talk about revising history here's a few facts.

        A) The Verite chip had no OpenGL support - it could only run vQuake, a specially made version of Quake for the Verite. And even then it was slow. It was also kinda pokey for Direct3D stuff as well.

        B) S3 also had no OpenGL support - and limited Direct3D support - most Direct3D games did not support it (for instance, it didn't support uploading textures...)

        C) Matrox - except for its high-end equipment - also wasn't nearly fast enough to play GLQuake. The Mystique was not nearly fast enough to play actual video games.

        Why do I keep mentioning Quake? I think in '96 it was the defining game. If your card could run GLQuake smoothly you were in the zone. And the only cards that could run it even near smoothly cost well over 2000 dollars. Don't believe me? This is actually all in the GLQuake readme (more or less).

        When my Intel P-120 first started GLQuake on the Voodoo 1 I just about crapped my pants. It was smooth, fluid, and it looked awesome! No other video card at the time for 150-200 dollars could deliver those kinds of results.
    • by Anonymous Coward on Monday November 10, 2003 @10:48AM (#7434241)
      Check this one:

      http://www.accelenation.com/?ac.id.123.1
    • by rsmith-mac ( 639075 ) * on Monday November 10, 2003 @10:56AM (#7434295)
      Agreed. You can't have a serious history piece without also including S3 (and the world's first 3D decelerator) and Rendition. The article pooped out before I got to page 3, but I'm willing to bet they've also managed to skip the Intel i740, another decent but notable product.
    • I had one of those Verite-based cards. It was a Diamond Stealth S220, based on the Verite 2100. It had performance almost equal to the Voodoo 1 card that my friend had, but much better image quality. I remember when I finally got Quake 2 working in hardware mode. I was completely blown away.
    • "I note that the history of this article starts in 1996 . . . one year after Rendition's Verite chip became the first consumer add-on 3D accelerator."

      Just playing Devil's Advocate here, but it wasn't until 3DFX hit the market that it became a mainstream gaming card. Creative Labs didn't invent the sound card, but they sure made that market blossom.
  • by dada21 ( 163177 ) <adam.dada@gmail.com> on Monday November 10, 2003 @10:16AM (#7433998) Homepage Journal
    I was excited to load this article up, hoping to see my first VGA card by Paradise. I believe it was called the Paradise 256K or something like that. I had a Sony VGA monitor, and my friends and I were blown away by some 320x200 x 256 color graphic of a parrot. Then we found a nude GIF. Whoa. I think I had that card about 2 years before any game really supported it, although Police Quest in EGA mode was nothing like we could imagine.

    I'd love to see a history of all video cards for the PC platform...
    • I had a Paradise card. Wow, that takes me back. When pictures almost looked like real pictures.
    • What were the big players back then? Paradise, Trident, and Tseng, right? Man. MCGA rocked.
      • What were the big players back then? Paradise, Trident, and Tseng, right? Man. MCGA rocked.

        Hmmm yeah, Trident and Tseng go way back. Oak Technology made a lot of cards... Diamond was hot for a while, but in my memory, at least, they came into the game pretty late; at least I didn't see them till the mid '90s.

      • Hercules was the de facto "professional" monochrome card. I remember my first PC: a buddy had a Hercules card and I got some EGA card. He had to use some sort of CGA emulator to play games.

        It was a 10 MHz XT. Sweeeeeet!
    • I remember when the PeeCees had EGA or lowly CGA (which looked terrible, by the way) or even no graphics at all other than the graphics characters available to MS-DOS. PeeCee graphics cards were expensive to get even rudimentary high-res and color (16 if you were lucky) whereas "home" computers like the Amiga and ST had higher resolution, greater colour depth and some hardware acceleration (blitting). These machines were never taken seriously because their advanced graphics and sound capabilities were consid
      • I remember when the PeeCees had EGA or lowly CGA (which looked terrible, by the way) or even no graphics at all other than the graphics characters available to MS-DOS. PeeCee graphics cards were expensive to get even rudimentary high-res and color (16 if you were lucky) whereas "home" computers like the Amiga and ST had higher resolution, greater colour depth and some hardware acceleration (blitting). These machines were never taken seriously because their advanced graphics and sound capabilities were consi

      • You beat me to it. It's so sad when people got excited about PC graphics cards. It wasn't/isn't because they were good, it's because they were /finally/ able to start doing what other platforms had been doing for years. Even then, the performance was poor - that's just when they started being able to display the same number of colours. The lowly Commodore 64 had better graphics than a PC with CGA graphics!
        • by Waffle Iron ( 339739 ) on Monday November 10, 2003 @11:05AM (#7434363)
          It's so sad when people got excited about PC graphics cards. It wasn't/isn't because they were good, it's because they were /finally/ able to start doing what other platforms had been doing for years.

          OTOH, most of the peers of the early PCs had total crap text modes; they couldn't do what the PC could do. (Yes, this includes the Apple. There were no Macs yet.) This is one of the major reasons the PC ended up dominating; text mode was simply more important. Remember that back then most all business use and a good amount of home use was in text mode (word processing, spreadsheets, financial, etc.).

          The original IBM PC and its clones usually came with a specially designed monochrome text mode monitor with relatively high resolution (720 x something, no dot pitch to worry about). The monitors had a very long persistence phosphor that totally eliminated flicker. The monochrome text-mode video cards had a very nice serif font stored in their ROMs. IBM's intent was to recreate the feel of their expensive dedicated mainframe text terminals.

          This setup had a very high quality feel, and you could stare at it all day without getting eye strain. Early color graphics monitors, OTOH, were horrible at showing text. This was compounded by the crappy fonts that were shipped with most early graphic OSes. This made most of the PC's early competitors pretty useless for doing long stretches of serious work.

          IBM's attempt to provide color graphics did suck big time [*]. Originally, you had to buy two graphics adapters and two separate monitors to get text and graphics on the same machine. One of Compaq's claims to fame was getting the patent on unifying the PC's high-quality text mode and its graphics modes on a single graphics adapter and monitor.

          [*] The original 16-color mode of the EGA cards and VGA cards must have been designed by somebody who was high on crack. You can't get at the pixel memory without setting up a bewildering array of registers that control mandatory and mostly non-useful logic operations on your bits. The memory is accessed as 4 independent planes, so you have to unnaturally slice every pixel up into individual bits and have a PhD in boolean logic to get them on the screen as you intended. It easily could take a newbie a whole day of reading manuals and hacking before they could get a single white dot on the screen.
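
          For the curious, here is a minimal sketch of what that planar access looks like, assuming Borland-style real-mode C (outportb/MK_FP), the standard 640x480 16-color mode 0x12, and the default write mode 0; the register indices follow the usual VGA documentation:

            /* Plot one pixel in 16-color planar VGA: four bit planes, driven
               through the Sequencer and Graphics Controller registers. */
            #include <dos.h>

            void put_pixel_planar(int x, int y, unsigned char color)
            {
                volatile unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
                unsigned int offset = y * 80 + (x >> 3);    /* 80 bytes per scan line */
                unsigned char mask  = 0x80 >> (x & 7);      /* one bit per pixel, per plane */

                outportb(0x3CE, 8); outportb(0x3CF, mask);  /* GC 8: bit mask - touch only our pixel  */
                outportb(0x3C4, 2); outportb(0x3C5, 0x0F);  /* SEQ 2: map mask - write all 4 planes   */
                outportb(0x3CE, 0); outportb(0x3CF, color); /* GC 0: set/reset = the 4-bit color      */
                outportb(0x3CE, 1); outportb(0x3CF, 0x0F);  /* GC 1: enable set/reset on all planes   */

                (void)vram[offset];                         /* dummy read loads the VGA latches       */
                vram[offset] = 0;                           /* any write: the masked bit takes the
                                                               color, the rest come from the latches  */
            }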

      • lowly CGA (which looked terrible, by the way)

        What are you talking about? On CGA, you had complete choice! You could use either the rasta red/green/yellow palette, or the nuclear pink/cyan/white palette! What more do you want?

    • by vasqzr ( 619165 ) <vasqzr@@@netscape...net> on Monday November 10, 2003 @10:32AM (#7434111)

      Back then, the hardware specs (so you could program the device) came with all the accessories you bought for your PC. Imagine that.

      Printers had a book with all the Escape codes, Video cards told you which modes they supported, modems had AT command set references...

      Try getting the specs to a PCI card nowadays....
    • I had a Sony VGA monitor, and my friends and I were blown away by some 320x200 x 256 color graphic of a parrot.

      Man - that brought a smile to my face. I remember that picture. And the clown? Remember the clown at 320x240? It could really show you what a good picture could look like even at that low resolution.

      I remember the first game I had to make use of that mode (on a PS/2 Model 70): Mean Streets. Anyone remember that? Had a blast playing that Christmas morning.

      And then there was fractint, which coul
    • That must have been back around 1989? I remember seeing the first 386's with VGA graphics, and the demos featuring 320x200 256 colour graphics of a castle. By 1990, my Paradise card was superseded by a $700 Hercules Graphics Station Card (TMS34010 processor) which came with 1 megabyte of VRAM (the double-buffering option cost another $300) and four 24-bit colour images: a head-scan, a party-pup (don't even ask!!!), a fashion model and somebody leaning out of a window. I managed to write a SGI image format vi
  • Nostalgia (Score:3, Funny)

    by CrayHill ( 703411 ) on Monday November 10, 2003 @10:16AM (#7434000)
    Aahh, 1996...the good old days...

    I remember when we would write ASCII graphics contouring programs for line printers!
    • Re:Nostalgia (Score:2, Interesting)

      by Lumpy ( 12016 )
      HUH? In 1996 I was playing 3D accelerated games on my Virge 3D card. 3D gaming had been around for a while at that point...

      Bring me back to 1994 when the real 3d cards were $3000.00 and only in the CAD workstations.

      1996.. not long ago at all.
      • His point is that fondly looking back on graphics cards in '96 is like anxiously checking Rolling Stone every issue to find out if 90's music is retro yet. Computers without monitors... now that's history.
  • Well, sort of. (Score:5, Informative)

    by ultramk ( 470198 ) <ultramk@pacbel l . net> on Monday November 10, 2003 @10:18AM (#7434008)
    The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as piggy backing.

    This isn't entirely correct, as any Voodoo 1 user could tell you. The card took up its own slot, and used a pass-through video cable to connect the monitor: When a Voodoo-compliant video signal was detected, it hijacked the output to the monitor and took over.
    Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.

    M-
    • Comment removed based on user account deletion
    • Re:Well, sort of. (Score:4, Informative)

      by daBass ( 56811 ) on Monday November 10, 2003 @10:34AM (#7434133)
      What they also forgot to mention was that you could daisy chain cards to get even better performance.

      At the ISP I worked at I had two Voodoo 2 cards, which, on a lowly PII-350, ran Unreal Tournament with full detail in 1024*768 at a massive framerate!
    • I owned a Voodoo 1 and Voodoo 2 card. Didn't the Voodoo2 series have the ability to be cabled *directly* to another Voodoo2 card for greater performance? I forget what they called this piggybacking, but maybe he's confusing the passthrough video cabling with this ability.

      • It was called SLI... and basically the cards interleaved, one doing the odds and the other doing the evens.

        Which is a pretty simple way to get double the performance. I wonder why no one's done this recently...
          Which is a pretty simple way to get double the performance. I wonder why no one's done this recently

          They are, in the chip itself, sorta. Modern all-in-one GPUs have multiple texture pipelines, which does split some of the load on the silicon level. It's not SLI, but it's the same concept.

          The problem is SLI only doubles the fillrate. Both 3D chipsets need to work on the exact same dataset. SLI was a great boost back when 3D hardware was slow at texturing. These days the hardware can pump a couple thousand
          • SLI was a great boost back when 3D hardware was slow at texturing. These days the hardware can pump a couple thousand frames per second worth of textures, it's fancy multipass rendering and dynamic shaders (and to some extent, the geometry) that take up all of the frame generation time. SLI could speed some of this up, but it wouldn't help with most of the bottlenecks. It would be like putting new tires on a car that needs an engine tuneup.

            But if both cards are on the AGP bus (I know, only one client devi
            • by green pizza ( 159161 ) on Monday November 10, 2003 @11:33AM (#7434586) Homepage
              It is possible to scale performance that way, but the result will be less than double the frame rate, simply because the time to generate a frame does not scale linearly with resolution.

              To do 60 frames per second, you have roughly 16ms to generate a frame. A couple of those ms will be gobbled up with I/O transactions and various wait states, so you're already at the point where double the power is only going to result in 1.75x the performance. This will also be highly dependent on how well the 3D code can be parallelized (are there a lot of read_pixels callbacks that require both GPUs and both banks of memory to talk to each other? etc).

              This has actually been done by SGI for a while now. A couple years ago they took their Origin 3000 architecture and stuck on dozens of V12 GPUs and tiled the graphics for higher performance. That concept has been tweaked for their Onyx4 systems... one large single computer with up to 34 ATI FireGL X1 GPUs. 16 GPUs work on each display in a 4x4 grid. Each GPU generates its 400x300 piece and 16 of those are composited in real time to make up a 1600x1200 display. I believe the biggest such machine to date has 32 GPUs powering two zippy fast 1600x1200 displays and 2 GPUs driving an additional 4 lesser powered displays. SGI gets quite a speedup by doing it that way, with 16 GPUs per display, but there's also a lot of overhead (even more in SGI's case, with 34 AGP 8X busses in said system). Their implementation of OpenGL and OpenGL Performer is tweaked for this, though.

              So yeah, it can be done, but the fact that the GPUs will spend a significant amount of time doing non-rendering tasks (I/O, waiting for data, copying the final result to frame buffer, etc) means that you won't see a nice linear scaling. The cost of making custom hardware and custom drivers also adds up. With top-end PC 3D accelerators costing $400 already, I can't picture many users shelling out $1000+ for a dual GPU card.
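
              To put rough numbers on that argument, here is a back-of-the-envelope sketch (the 2 ms of fixed overhead is just an assumed figure, in the spirit of the comment above): only the rendering share of the ~16.7 ms frame budget scales with extra GPUs, so the speedup flattens out well below 2x per added GPU.

                #include <stdio.h>

                int main(void)
                {
                    const double frame_ms    = 1000.0 / 60.0;  /* ~16.7 ms per frame at 60 fps        */
                    const double overhead_ms = 2.0;            /* assumed fixed I/O, waits, compositing */
                    const double render_ms   = frame_ms - overhead_ms;
                    int gpus;

                    for (gpus = 1; gpus <= 4; gpus++) {
                        double new_frame_ms = overhead_ms + render_ms / gpus;
                        printf("%d GPU(s): %.1f ms/frame, speedup %.2fx\n",
                               gpus, new_frame_ms, frame_ms / new_frame_ms);
                    }
                    return 0;   /* two GPUs come out around 1.8x, close to the 1.75x figure above */
                }
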
      • ScanLine Interleaving. To get over the fillrate bottleneck, one card pumped pixels for even-numbered scan lines, the other worked on the odd-numbered scan lines. Back in the days when a dual PII/400 and dual Voodoo2 was the gamer's ultimate machine. There were even a few companies that stuck two Voodoo2 chipsets on a single card.

        A lot of professional/expensive 3D systems before the Voodoo2 used a similar technique. If one of the texture ram modules comes loose on an SGI Indigo2 MaximumImpact, textured mode
    • Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.

      The Voodoo2 cards started at $250 and $350 (or somewhere around there) for the 8MB and 12MB models, respectively. The only way to get the 1024x768 mentioned in the article was to have 2 12MB cards in SLI mode (which meant connecting the 2 V2 cards with a small ribbon cable between the two cards inside the case). Additionally, the pass-through cables that came with most V2 cards cause
    • Didn't they have relays too? When a 3dfx-supported game started you'd get a 'clunk-clunk' as the relay clicked in? I'm sure they did.
    • Re:Well, sort of. (Score:2, Redundant)

      by N Monkey ( 313423 )
      The card took up its own slot, and used a pass-through video cable to connect the monitor: When a Voodoo-compliant video signal was detected, it hijacked the output to the monitor and took over.
      Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.


      I personally think (but I am biased) that the PowerVR PCX1/2 solution was nicer. It also piggy-backed off the 2D card but, instead of using cables, it squirted pixels (in packets) across the P
  • by fault0 ( 514452 ) on Monday November 10, 2003 @10:18AM (#7434009) Homepage Journal
    but it'd be nice to have a history of things before 1996 (i.e., pre-Voodoo cards). Video card history between 1996-2000 was very well documented, perhaps thanks to all of the articles that came out near/after 3dfx's demise, and most of us remember everything within the last three years.
  • by dolo666 ( 195584 ) * on Monday November 10, 2003 @10:19AM (#7434016) Journal
    I remember my first Voodoo cardie. I was playing TWCTF [thunderwalker.net] a lot with my buds, and many of them had fast systems (at the time) running glquake/glqw [gamespy3d.com]. Finally, after being a software user for so long, getting decent lag-frags, I did the unthinkable and ditched the software client for some better visuals with my very own piggybacked Voodoo card, from 3dfx. Gaming has changed quite a bit since then, but you have to understand how much fun it was playing Quake in software mode. The mods were cool too, but everything about that experience was killer fast. Since then, games have mostly slowed down on PC. Quake 2 and Quake 3 were much slower. The speed of play for TW back in software was intense. You had to hold your adrenaline rush to the bitter end of any match. By the time I was playing for ZFA [142.179.67.73], everyone had a 3D card. I can remember the Q2 LAN parties when guys would show up with their configs all set for zero textures and no coloured lighting. The levels would all be just plain white, and guys would be saying how awesome it was they could get 100fps doing this. To me, it always took something away from the game to run configs like that, even if it could give you an edge in matches.

    When I saw Quake 2 CTF for the first time at the Respawn LAN party, Zoid [iwarp.com] showed us on this decked out system how totally amazing it was. I remember how gorgeous q2ctf1 looked the first time my eyes caught it. It was magic. I even wrote about it. You could never have seen it if it wasn't for the people at 3dfx, who pretty much paved the way for all the gaming rigs we've got now. It's a shame that the same people who built this dream had to shut their business down.

    I guess that's how we reward our innovators today... with steady, constant competition, or you're out. Seems cold, doesn't it?
    • "Gaming has changed quite a bit since then, but you have to understand how much fun it was playing Quake [...] Now since then, games have mostly slowed down on PC. Quake 2 and Quake 3 were much slower."

      What are you talking about? Wing Commander's a bit choppy on my machine, but F19 Stealth Fighter screams along. So smooth. I tried that Quake game, but it was hard finding space on my 80MB hard drive. Then it was unplayably slow. How can these other games you mention be even slower - you can't get slow
      • "What are you talking about? Wing Commander's a bit choppy on my machine, but F19 Stealth Fighter screams along."

        My point was specifically about gameplay, which has slowed in FPS games since the early ones. I think I was trying to correlate this slowed gameplay to the raised detail levels in games, and that the raised detail levels were only a result of faster, better 3d cards.

        Don't believe me? Load up a game of TW and see what I mean, with the harpoon gun. You can't get that kind of speed from any other
  • by loconet ( 415875 ) on Monday November 10, 2003 @10:19AM (#7434019) Homepage
    FastSilicon.com - not so fast anymore.
  • I remember when I was craving a Voodoo card so that I could run Quake better. I finally sprang for a Voodoo II card when they had a $50 rebate. I was so excited to get online with my ISDN line and frag everyone in OpenGL graphics that I threw away my Voodoo II box along with its product bar code. No proof of purchase, no $50 rebate. D'oh! Damn, that hurt my wallet.
  • by Acidic_Diarrhea ( 641390 ) on Monday November 10, 2003 @10:20AM (#7434029) Homepage Journal
    From the article: "The cards released then were rather nuke warm. Nothing really special, nothing too different brought to the table..."

    Nuke warm cards huh? How many fans do you need for one of those?

    The Internet needs an editor or two hanging around.

  • Slashdotted (Score:2, Funny)

    by kinnell ( 607819 )
    Maybe they should change their name from fastsilicon to smokingsilicon.
  • XGL? (Score:4, Interesting)

    by Doc Ruby ( 173196 ) on Monday November 10, 2003 @10:23AM (#7434042) Homepage Journal
    How long until history catches up with the X Window System, and I can get an X server that renders entirely to the OpenGL API? I'd love all those panel edges, drop shadows, animated buttons, textured skins, and other 3D "embossed" window decorations to come from my video card. The X server code could be much smaller, factoring out all the rendering code that it could reuse by calling the OpenGL API. And the X graphics primitives could become unified behind a widely cross-platform API, already implemented by blazingly fast hardware in the most competitive market in computing. And once XGL implemented the current style of X server display, we'd have an open, popular, and modular platform for experimenting with 3D spaces for "desktop" visualization. Let a thousand raytraced xscreensavers bloom!
    • by Malc ( 1751 )
      This is /. ... I'm surprised you haven't been modded down for trolling with flamebait. If what you suggest were to happen, rabid /.ers would look at the screenshots and complain about bloat and how it's all unnecessary! ;)
    • Or you could just get a Mac and run OSX, which is pretty close to the same thing.

      Nathan
    • You look like a man that could use Mac OS X. This is what Quartz Extreme does. People say that the OS X GUI takes up too much CPU, while in fact it takes up almost none. All of the windows, shadows, etc., are being done with the video card through Quartz Extreme, with no programming necessary from the app writer to take advantage of this either.

      It's pretty nifty; using Exposé can be quite addicting. Big..small..big..small..big..
      • Re:XGL? (Score:2, Informative)

        People say that the OS X GUI takes up too much CPU, while in fact it takes up almost none. All of the windows, shadows, etc., are being done with the video card through Quartz Extreme, with no programming necessary from the app writer to take advantage of this either.

        This isn't quite true. Most of the desktop rendering is still done by the CPU in the same way that it was done before QE was added to OS X. It's simply the individual windows that are rendered by QE, and OpenGL handles the 'surface' which is handed
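
        In other words, the compositing step is roughly what a toy compositor would do with fixed-function OpenGL: each window's contents are still drawn by the CPU into a plain pixel buffer, and the GPU only uploads those buffers as textures and blends them. A schematic sketch follows; the structure and names are illustrative, not Apple's actual API, and an orthographic projection in screen coordinates plus textures created with glGenTextures are assumed.

          #include <GL/gl.h>

          typedef struct {
              int x, y, w, h;
              unsigned char *pixels;   /* RGBA contents the app rendered on the CPU */
              GLuint tex;              /* texture object for this window            */
          } CompositedWindow;

          void composite(CompositedWindow *wins, int n)
          {
              int i;
              glClear(GL_COLOR_BUFFER_BIT);
              glEnable(GL_TEXTURE_2D);
              glEnable(GL_BLEND);
              glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

              for (i = 0; i < n; i++) {
                  CompositedWindow *win = &wins[i];
                  glBindTexture(GL_TEXTURE_2D, win->tex);
                  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
                  /* upload the CPU-rendered window; blending and shadows happen on the GPU */
                  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, win->w, win->h, 0,
                               GL_RGBA, GL_UNSIGNED_BYTE, win->pixels);
                  glBegin(GL_QUADS);
                  glTexCoord2f(0, 0); glVertex2i(win->x,          win->y);
                  glTexCoord2f(1, 0); glVertex2i(win->x + win->w, win->y);
                  glTexCoord2f(1, 1); glVertex2i(win->x + win->w, win->y + win->h);
                  glTexCoord2f(0, 1); glVertex2i(win->x,          win->y + win->h);
                  glEnd();
              }
          }
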
  • by MtViewGuy ( 197597 ) on Monday November 10, 2003 @10:23AM (#7434045)
    I think what finally brought 3-D graphics acceleration into the mainstream was the introduction of graphics card chipsets that could combine decent 3-D acceleration with fast 2-D graphics all at once.

    nVidia's pioneering RIVA 128 chipset was the first chipset that could compare itself in 3-D mode against the vaunted Voodoo cards of that period; once nVidia unveiled the groundbreaking TNT chipset it was pretty much over for Voodoo's separate board approach. This is what spurred ATI into developing the Rage Pro and later Rage 128 chipsets in the late 1990's, starting the competition between ATI and nVidia that has lasted to this day.
    • I think what finally brought 3-D graphics acceleration into the mainstream was the introduction of graphics card chipsets that could combine decent 3-D acceleration with fast 2-D graphics all at once.

      ATI's RageII/RageII+/RageII+DVD, the forerunner to the RagePro, was a decent all-in-one performer for certain games. The drivers were AWFUL, though, so it's hard to tell if the problems were with the silicon itself. I don't think there was ever even a DVD player that made use of the RageII+DVD. ATI's head was
  • I had one (Score:4, Interesting)

    by chunkwhite86 ( 593696 ) on Monday November 10, 2003 @10:24AM (#7434050)
    I had one of these original Voodoo I PCI boards. It had a VGA passthru connector on the back. The card didn't even have any heatsink or fan at all on it! I remember it ran at 43 MHz or something like that, but I had overclocked mine to a whopping 47 MHz! I glued a motherboard northbridge heatsink to the Voodoo chip to dissipate the extra heat, but I lost the neighboring PCI slot due to the size of the heatsink.

    Ah... those were the days.
    • One more thing... The card had 6 MB of RAM on it. 4 MB main memory and a 2 MB texture buffer I think.
      • You're probably thinking of the Canopus Pure 3D. It was a good card and had TV out well before it was a standard feature. I remember hooking Quake ][ up to the big screen in the fraternity house and watching people ogle over it. Also, it was a 2MB framebuffer (like all Voodoo I cards) and a 4MB texture buffer.
  • Memories... (Score:3, Interesting)

    by vasqzr ( 619165 ) <vasqzr@@@netscape...net> on Monday November 10, 2003 @10:24AM (#7434051)

    We sold Diamond Monster 3D's like hotcakes back at Best Buy in the mid 90's.

    Then the Voodoo Rush came out. All in one. It stunk.

    Then the Voodoo II came out. Remember, you could buy 2 of the cards (SLI) and some games would run faster than with just one!

    Then they did the combination card again... Voodoo Banshee. Worked pretty well.

    Then NVIDIA wiped them off the face of the earth.
  • At some point shouldn't we just have really versatile CPUs? All these 3D cards are just kludges that happen to be tuned for 3D processing. They can do other general purpose processing as well. Thus the CPU can do their processing, given enough versatility.
    • sure, any modern cpu can perform any task, handle any input or output, but the question is how fast can it do it, and how much are you paying for the cpu.

      speed/cheap/general purpose

      pick two

    • When your CPU's floating-point throughput is a factor of 1000 better, that's when. In other words, at the rate at which general-purpose CPU technology advances, you'll be at that level of performance in about 15 years.
    • Thus the CPU can do their processing, given enough versatility.

      That's a good point. Video cards, NICs, etc... all with their own processors and RAM. This is out of control! I long for the days when downloading a file or viewing a graphic would actually tie up your machine.

      Oops, did I type that out loud?
  • by Bagels ( 676159 ) on Monday November 10, 2003 @10:25AM (#7434064)
    From the article...
    The GF2MX was a small step down, it cut off two of the pixel pipelines, and took the fill rates down to 350 pixels per second.

    Erm. That's not even enough to fill in a single horizontal bar of the screen (unless you're running in 320*240 resolution). Perhaps they meant megapixels? This was hardly the only such error that I noticed, though - these guys really need to have someone proofread their articles.
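
    As a quick sanity check on that point (assuming a single pass with no overdraw, purely for scale): a 1024x768 screen at 60 fps already needs roughly 47 million pixels of fill per second, so "350 pixels per second" is about five orders of magnitude too low and, as the parent suggests, presumably megapixels per second was the intended unit.

      #include <stdio.h>

      int main(void)
      {
          const double pixels_per_frame = 1024.0 * 768.0;  /* ~786k pixels per screen */
          const double fps = 60.0;                         /* assumed refresh target  */
          printf("Fill needed: %.1f Mpixels/s\n", pixels_per_frame * fps / 1e6);
          return 0;                                        /* prints about 47.2       */
      }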

  • by malf-uk ( 456583 ) on Monday November 10, 2003 @10:31AM (#7434097)
    Now there was a card that announced it was taking over the monitor - the not-so-delicate *clang* as its mechanical switch moved.
  • by boogy nightmare ( 207669 ) on Monday November 10, 2003 @10:32AM (#7434117) Homepage
    I remember plugging a 'piggy back' 12MB Voodoo2 into my 4MB (or was it 8MB) Hercules graphics card. I remember installing Unreal and firing it up; when you get out of the ship for the first time and see that waterfall with the music playing, I thought it was the most amazing thing I had ever seen. To this day it still ranks up there with the first time I saw a dinosaur in Jurassic Park and thinking 'this is the way to go' and being seriously in awe of all things to do with computer graphics.

    Now I have a 256MB GeForce FX 5600 (some letters after it) and all games look amazing; in my other PC I have a 64MB GeForce2 4400 (I think) and all games look good. Shame they don't play like Unreal did :(

    PS: that Voodoo2 is still going; it's running on a P3 500 with an 8MB Rage card, and I can still use it to play Quake 3 at 800x600 with pretty good texturing, and fast as well :)

    ahhh any other memories or first time looks at the games that made you go 'ohhhhh that's pretty'?
    • That very same moment getting out of the ship, I had to stop playing and just *look*. I'm sure there weren't any badguys there so you could just have your mind blown.

      I was so taken by the 'great outdoors' that I grabbed the cheats off the web and just flew around that area. Absolutely one of the most impressive games I have played, eyecandy-wise. Not to mention the run down the hallway after the 'creature' and *hearing* your shipmates get torn apart on the other side of the door.

      All with the magic of a

  • I first saw this card when a friend bought one to play Tomb Raider. I was blown away; the game went from chunky, halting software-rendered 3d to beautiful, smooth, detailed hardware 3d. I immediately bought one of my own (from Canopus, the Canopus Pure 3D), which I proceeded to use for several years. I can remember the big pit in Tomb Raider where a couple of lions and gorillas were running around in the fringes of darkness. I thought it was so cool that you could see these animals from far away and rat
  • There's nothing in there about chips from PowerVR, S3, Rendition, etc.

  • by alpharoid ( 623463 ) on Monday November 10, 2003 @10:37AM (#7434153)
    Video card history going back to 1996 isn't really necessary -- if you're around 25 and bought the Voodoo 1 back when it came out, you can probably recite all the facts from 1996-2003 from the back of your head.

    And if it's just 3D chipsets that count, what about the [near useless] S3 Virge, before the Voodoo? What about the extra details, like 3dfx buying out STB to manufacture its own integrated 2D/3D solutions (Voodoo3 onwards), effectively pissing off an entire industry?

    Oh well. Maybe next time.
  • by green pizza ( 159161 ) on Monday November 10, 2003 @10:37AM (#7434154) Homepage
    Does anyone else notice that this "Video Card" history starts off with about the 3rd consumer 3D accelerator? They didn't even mention the groundbreaking Rendition Verite. Nor any of the non-PC 3D systems that came before it (Jim Clark / SGI's Geometry Engine based systems in 1983 or the image processors from Evans & Sutherland).

    And if it's a Video Card history, why no mention of EGA/CGA?

    Sounds more like "the 3D accelerator world since the Voodoo" history. It's articles like this that make me wish the slashdot editors would remember they have some readers that are older than high school age.

    [end rant]
  • These days... (Score:5, Interesting)

    by dark-br ( 473115 ) on Monday November 10, 2003 @10:38AM (#7434159) Homepage
    I have never understood how this breed of cards exists to this day. Really... the difference between a "stock" GeForce and a workstation-class Quadro GeForce... just doesn't justify the cost difference anymore.

    When you go back about 3 or 4 years... when you contrasted an Oxygen video card, or a FireGL vs a TNT or 3DFX card, you could see where the extra money went. But now, today's commercial-grade video cards are more than capable. In fact, a lot of people I know that work as graphic artists use traditional Radeon or GeForce 4's in their workstation machines. Outside of, say... Pixar, I just don't understand people buying the workstation-class cards.

    • Actually, the cost difference is easily justified. A workstation class card may use the same GPU as a gaming card, and the gaming card may be faster (in games). The work done on a workstation card is on the drivers. They are MUCH more stable, and designed to work with apps like Lightwave and 3DStudio MAX.
  • by pegr ( 46683 ) * on Monday November 10, 2003 @10:38AM (#7434160) Homepage Journal
    What? No mention of the IBM CGA card that you could destroy by putting it into video modes it didn't support? One of the few circumstances in which PC hardware could be broken by software. That in itself should be worth mentioning!
  • by jest3r ( 458429 ) on Monday November 10, 2003 @10:38AM (#7434163)
    Sadly, the last card 3dfx constructed was the Voodoo 5 6000, which was rarely seen at all. That is rather hard to believe seeing that it's one of the biggest graphics cards I have ever seen. It's equipped with 4 GPUs (That's right, 4.) and 128 megabits of memory. This card was mostly only seen in servers though.

    This card was massive and would never have been used in a server.

    • Gee, I just came back to criticize that statement too. Who is the moron that wrote that "history"?!

      First, the Voodoo 5 6000 was NEVER sold at retail or OEM. Accordingly, how could anyone buy and install it into a server?! Second, why would anyone put a 3D card in a server?! Heck, do server boards even have AGP slots?!

      The "history" is not worth the code it was written in.

  • That was a horrid timeline.

    Historical inaccuracies aside, the guy writes like he's in the 4th grade.

    Here's my favorite typo, "As time went on, ATI and NVIDIA battled between themselves, releasing card after card. The cards released then were rather nuke warm."

    Yeah. We wish.

    Tal
    • Or, my personal favorite:

      Sadly, the last card 3dfx constructed was the Voodoo 5 6000, which was rarely seen at all. That is rather hard to believe seeing that it's one of the biggest graphics cards I have ever seen. It's equipped with 4 GPUs (That's right, 4.) and 128 megabits of memory. This card was mostly only seen in servers though

      WHAT? In servers? OK buddy.

      Not to mention the fact that he completely missed the original TNT. What a dipshit.
  • There was a patch or something for Quake 1 that let you run it with a Voodoo card, and it's why I bought a Voodoo card to begin with.

    I still have the Q1 CD, but it occurred to me -- can I even run it and get good graphics without a Voodoo card, or am I stuck with software rendering? IIRC the Q1 patch was Voodoo-specific.

    I also wonder if Q1 wasn't a DOS game as well, which might make it impossible to run on XP, unless a subsequent Windows version was released.
    • id released a Win32/OpenGL version of Quake as unsupported, though free, software a couple of years after the original, called GLQuake [bluesnews.com].
    • i'm aware of two 3d accelerated versions of quake for windows back in the day: vquake, for rendition's verite line of cards, and glquake, which worked with any opengl compliant video card. glquake at first was used almost exclusively with 3dfx voodoo boards and so people thought it was 3dfx specific. in fact, 3dfx's voodoo card only supported a subset of the opengl api, hence they provided a "mini gl" driver that implemented only so much of the spec as glquake required. if you're fiddling with it today and
  • by Graemee ( 524726 ) on Monday November 10, 2003 @10:41AM (#7434189)
    What about the early cards, TIGA, 8514/A & other 3D attempts like RIVA, Mystique, Virge? What about the cheats on PC benchmarks, back in VGA, now in 3D tests? What happened to Number 9, ELSA and other "Big" names in cards that are no longer around? Reads more like a Time magazine article than a serious attempt at a history of video cards. Most glaring to me is the ATI 8500/Nvidia GF3 omission.
  • by Obiwan Kenobi ( 32807 ) * <evan@@@misterorange...com> on Monday November 10, 2003 @10:46AM (#7434226) Homepage
    I had a 3dfx Monster3d (Voodoo 1) back in Winter 1996, when it first came out. I remember the passthru cable that connected to my turbocharged 2MB video card and my overclocked P150 (to a P166, yeah baby!), and I certainly recall the brilliance of GL Quake and the absolutely gorgeous Grand Theft Auto (1!) after it supported Glide.

    I also recall the controversy of transparent water in Quake and how that was considered "cheating" by and large. Those poor non-accelerated folks had to get in the water first to see anything!

    Me, I'd just wait until they all jumped in the water and fire off that Lightning Gun. Sure it's suicide, but is it really suicide when you get to roast at least 5 or more people at the same time? DM3, how I miss thee.
  • Not accurate. (Score:2, Informative)

    by El_Ge_Ex ( 218107 )
    NVIDIA bought them out in December of 2000.

    This text has some flaws... Nvidia didn't buy 3dfx or its assets. It won them in a lawsuit with 3dfx.

    -B
  • Matrox (Score:3, Funny)

    by mr.henry ( 618818 ) * on Monday November 10, 2003 @10:59AM (#7434321) Journal
    If you're trying to decide between an ATI and an Nvidia, don't forget Matrox! Both ATI and Nvidia have been busted for pumping frame rates, but not Matrox! Sure, you may only get 15-20 fps, but at least you know your Matrox got them honestly. They will look really beautiful too.
  • I was wondering, which was the first option for 24-bit color (truecolor)?

    Both on high-end workstations, and for the home desktop? I remember seeing ads for true color boards in 1989 Mac magazines. When were they available for the PC? Were they available at all for Intel PCs or Amigas earlier than for Macs?

    If someone is kind enough to answer in a nice way (I could not find an answer on Google), please consider making it ready for a write-up at E2 [everything2.net].
  • I wonder what HEMOS called those cards that were needed to supply pictures for CGA (Cool Graphics Adapter) back in the early '80s. Wait! That's when about half of you guys were born!
  • by johnthorensen ( 539527 ) on Monday November 10, 2003 @11:46AM (#7434694)
    Hi, just got done instant messaging with the editor of Fastsilicon.com, Nathan Odle. He asked me to post here that he's pretty frustrated that the article was released without his editing; it wasn't ready yet and would have been quite different had his red pen gotten ahold of it. You can read the other articles on the site and see that this is the case - Nathan's standards for the content are VERY high, and some heads are going to roll because of what happened here. I ask you ALL to check out the other content there; it's definitely well worth reading.

    -JT
  • by exp(pi*sqrt(163)) ( 613870 ) on Monday November 10, 2003 @01:43PM (#7435723) Journal
    For example, there was a fascinating pre-history of graphics cards from before they were released to the general public. Many developers on /. were surely involved in developing for these things even though they never finally made it to market. Many companies were involved before the appearance of the 3dfx chipset: Cirrus Logic, Yamaha, LSI, Oak, 3dlabs, Nvidia and so on.

    Some of my favorite cards were the 'decelerators' such as the Yamaha device. They hadn't yet figured out how to do 'perfect scan' so if you rendered a pair of triangles with a common edge then the pixels on that edge would be rendered in both triangles. If you rendered a square tessellated as triangles in the obvious way then the corner pixels were rendered 6 times. I had arguments with the guys about performance. They told me my drivers sucked as I couldn't match their laboratory performance. It's astonishing that a company could bring a device as far as first silicon without knowing how to rasterize a triangle correctly! Even without such mistakes they were still slow as the PCI bus was no way to send 3D instructions, geometry and textures anywhere. It would often take longer to format the data and send it to the device than simply rasterize directly to screen memory in software - even on early Pentiums!
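
    What 'perfect scan' amounts to is just a fill convention. A minimal sketch of the usual one: with pixel centers at integer coordinates, treat each span as half-open so that a pixel on an edge shared by two triangles is written exactly once (the function names here are illustrative, not from any particular driver):

      #include <math.h>

      /* Fill the half-open span [x_left, x_right) on scan line y. Because the
         right end is excluded, the triangle on the other side of a shared edge
         picks that column up instead, and nothing is rasterized twice. The same
         rule is applied vertically to decide which scan lines a triangle owns. */
      void fill_span(float x_left, float x_right, int y, void (*plot)(int x, int y))
      {
          int x, x_start = (int)ceilf(x_left);   /* first pixel center >= x_left         */
          int x_end = (int)ceilf(x_right);       /* first pixel center >= x_right        */
          for (x = x_start; x < x_end; x++)      /* x_end itself belongs to the neighbor */
              plot(x, y);
      }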

    Then there was the first nvidia card that you may or may not know about. My God this thing was bad. Now I can't remember the exact details (this is many years ago) but it was very like the Sega Saturn's 3D system. (I don't think there's a coincidence here, the card came with a PC version of Virtua Fighter so I guess Sega and Nvidia were working together). Basically it was a sprite renderer. A square sprite renderer. But it had been hacked so the spans of the sprites could be lines that weren't raster aligned. So you could render twisted rectangles. With some deviousness you could render polygons with perspective and you had a 3D renderer. But it basically always 'pushed' an entire sprite. So it was pretty well impossible to do any kind of clipping. It was next to impossible to map the functionality to any kind of 3D API and so could only run applications dedicated to it. Again they complained that we were unable to write proper 3D drivers for their card. Admittedly their design did at least allow some games to run fast but I'm still amazed by the lack of understanding by the early nvidia guys. So when they eventually overtook 3dfx I was completely blown away.

    And then there was the question of APIs. In the old days there was no Direct3D. There was OpenGL but most PCs were a long way from having the power for a full OpenGL implementation. Early on only one company was interested in OpenGL - 3dLabs. They were the only company who understood what they were doing on PCs in those early days. So there was a variety of APIs: Renderware, Rendermorphics, and BRender among others. Rendermorphics was eventually bought by MS and became Direct3D. The first few revisions were terrible but, as they always do, MS eventually 'got it'. Renderware [renderware.com] is still going. They are part of Canon. Anyone who knows Canon will be aware that they patent everything. If you dig out the early Canon patents you'll find they patented fast rendering of speculars by a technique which meant they didn't actually move as the viewpoint moved. (If you know 3D you should be laughing loudly right about now.) But Renderware did get their act together and now have a 3D API that runs on a few consoles. And some of the earliest and coolest 3D hacks were first patented by them. BRender just disappeared, though Jez San, the guy behind it, recently received an OBE for his influence on the British computer games industry. (Gossip tidbit: at one point SGI were looking for a 3D API for PCs and chose BRender over OpenGL for their FireWalker game development system.) If you dig into the pre-pre-history of 3D accelerators you'll see that San's company, Argonaut, developed the first commercial 3D accelerator (though not a PC card) - the FX chip for the SNES, used for Starfox.

    And this is all from memory so please accept my apologies for errors and post corrections!

  • by Enonu ( 129798 ) on Monday November 10, 2003 @02:09PM (#7435906)
    With today's CPUs having more than enough power for most tasks done by the average user, when will we get to the point where we don't need a video accelerator? The six-month cycle makes upgrading to a new video card an expensive and risky proposition, after all.

    For example, I wonder how many FPS a P4 3.2 or an Opteron can pump out @ 1024x768x16bit in Q3 with only software rendering.
