Video Card History

John Mathers writes "While searching the net for information on old Voodoo video cards, I came across this fabulous Video Card History article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the video cards that revolutionized the personal computer. Here's a quote: 'When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as piggy backing.'"
  • Well, sort of. (Score:5, Informative)

    by ultramk ( 470198 ) <{ultramk} {at} {pacbell.net}> on Monday November 10, 2003 @11:18AM (#7434008)
    The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as piggy backing.

    This isn't entirely correct, as any Voodoo 1 user could tell you. The card took up its own slot, and used a pass-through video cable to connect the monitor: When a Voodoo-compliant video signal was detected, it hijacked the output to the monitor and took over.
    Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.

    M-
  • by MtViewGuy ( 197597 ) on Monday November 10, 2003 @11:23AM (#7434045)
    I think what finally brought 3-D graphics acceleration into the mainstream was the introduction of graphics card chipsets that could combine decent 3-D acceleration with fast 2-D graphics all at once.

    nVidia's pioneering RIVA 128 was the first chipset that could hold its own in 3-D against the vaunted Voodoo cards of that period; once nVidia unveiled the groundbreaking TNT chipset, it was pretty much over for the Voodoo's separate-board approach. This is what spurred ATI into developing the Rage Pro and later the Rage 128 chipsets in the late 1990s, starting the competition between ATI and nVidia that has lasted to this day.
  • Here's the text. (Score:1, Informative)

    by Anonymous Coward on Monday November 10, 2003 @11:24AM (#7434053)
    When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; a practice known as piggy backing. (Oh the days of piggy backing, I remember them well. Having that black cord sticking out of the back of your case was especially annoying). A few months later a new card was introduced, the Voodoo Rush. It was the 3D and 2D card all rolled into one. However, it ran significantly slower than the normal Voodoo. This, combined with driver issues caused the Rush to be seen as a flop by the community.

    In all races, there must be competitors. ATI (A company that has been around for twice as long as NVIDIA or 3DFX.) and NVIDIA had cards out shortly after that to compete with 3dfx. They were named Rage, and RiVA 128. This was long before they both took the 3D giant 3DFX completely out of the race though. They were just tiny blips on the radar for 3DFX at the time. To counter the new competition, 3DFX released the Voodoo2 in March of 1998. It was a vast improvement over the Voodoo, having a 90 MHz core clock and a whopping 12 Mb of video memory. Voodoo2 could produce a resolution up to 1024 x 768, and had a blistering fast 3.6 Gb memory bandwidth - top of the line back then. As before, the Voodoo Banshee came out after the Voodoo 2, and like the Voodoo Rush; it was a waste of money due to performance issues. Incidentally, the Voodoo2 was still a piggy-backer; they did not drop that method of 3D graphics card integration until later.

    In March of 1999, 3dfx came out with the Voodoo3. This time, the Voodoo 3 was separated into different steps to cover different consumer needs (sound familiar?). The Voodoo3 2000 was the low-end budget card, and it had a core speed of 143 MHz to offer. On the next rung was the Voodoo3 3000, which offered up a 166 MHz clock speed. At the top was the 3500 version, which featured a TV-out port, and a 183 MHz clock speed. All these cards were offered in PCI and AGP versions (a new concept, also shared with an ATI card called the 3D Rage Pro).

    Like many underdogs, the competing companies started catching up to the hardware giant. NVIDIA released a card around the same time as the Voodoo3, called the TNT 2. The TNT 2 was the successor of the TNT, and upped the ante from 8 million to 10.5 million transistors - a huge jump in complexity. It also offered 32-bit color support, and digital flat panel support. The Voodoo3 barely beat the TNT2 in pure FPS, but the TNT2 had much higher visual quality, so people started checking out the competition. It didn't cripple 3dfx, but it let them know that they better have something groundbreaking with their next release. ATI, possibly one of the cleverest (or maybe luckiest) of all three companies was content to sit in the corner and watch NVIDIA and 3dfx battle it out. ATI still released new cards - they weren't spectacular, but by no means were they horrible. The cards were just enough to keep them in the race. ATI's strategy seemed to be to lie in wait for their time to strike, which wouldn't come until later.

    On October of 1999, NVIDIA dealt the final blow to the 3D giant, with the introduction of the Geforce 256, 3dfx didn't have anything to combat the new card with, so they took the blow right to the face (we saw this same situation happen to NVIDIA later on). The revolutionary Geforce 256 brought much to the table, including four pixel pipelines at 120 megahertz, DDR ram support, along with many other new features. 3dfx had two cards that were very highly anticipated but delayed long past the original schedule (Sound familiar?). But once the voodoo4 and 5 did come out, they were well accepted, but far too late to do damage to NVIDIA. Basically, they just added more GPUs and more RAM to beef up the new cards. Which was fine and dandy, but it made the cards about twice as big as the previous models, fo
  • by Bagels ( 676159 ) on Monday November 10, 2003 @11:25AM (#7434064)
    From the article...
    The GF2MX was a small step down, it cut off two of the pixel pipelines, and took the fill rates down to 350 pixels per second.

    Erm. That's not even enough to fill in a single horizontal bar of the screen (unless you're running in 320*240 resolution). Perhaps they meant megapixels? This was hardly the only such error that I noticed, though - these guys really need to have someone proofread their articles.
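    A quick sanity check of the fill-rate figure being questioned here; the pipeline count and clock below are the commonly cited GeForce2 MX specs from memory, not numbers taken from the article:

        # The GeForce2 MX is usually listed as 2 pixel pipelines at a 175 MHz core clock.
        pipelines = 2
        core_clock_hz = 175e6
        fill_rate = pipelines * core_clock_hz
        print(f"theoretical fill rate: {fill_rate / 1e6:.0f} Mpixels/s")   # ~350 Mpixels/s

        # Even just refreshing a modest screen needs vastly more than 350 pixels/s:
        needed = 640 * 480 * 60            # 640x480 at 60 Hz
        print(f"640x480 @ 60 Hz: {needed / 1e6:.1f} Mpixels/s")            # ~18.4 Mpixels/s
        # So "350 megapixels per second" is almost certainly what the article meant.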

  • Forgotten cards (Score:1, Informative)

    by Anonymous Coward on Monday November 10, 2003 @11:25AM (#7434067)
    He totally forgot ATI's Rage, which, if I remember right, was one of the first cards, and it supported SEGA Saturn games.

    He also slights 3dfx a bit. The Voodoo 2 was huge; although I had the original TNT, everyone I knew was running a Voodoo 2.
  • by Anonymous Coward on Monday November 10, 2003 @11:27AM (#7434078)
    What bullshit! The Rendition Verite supported bilinear and trilinear filtering, 32-bit color and a whole bunch of other 'now common' 3D features. The chip was well ahead of its time. It had the same problem nvidia first had: great features, but not as fast as 3dfx. If Rendition had released another card during the Riva 128/TNT days (they did release the V2200, which was nice, but a bit slow) with a tad more speed, we might be talking about Rendition, Nvidia, and ATi instead of just the latter two. All in all, I can still remember playing VQuake, the first 3D version of Quake, back when Carmack fully supported Verite and their far superior 3D technology.

    Any article that tries to encapsulate the history of 3D cards but fails to mention the Verite cards is a piss-poor article right from the get-go.
  • by turgid ( 580780 ) on Monday November 10, 2003 @11:33AM (#7434125) Journal
    When your CPU's floating-point throughput is a factor of 1000 better, that's when. In other words, at the rate at which general-purpose CPU technology advances, you'll be at that level of performance in about 15 years.
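    The arithmetic behind that estimate, a sketch assuming (as the parent seems to) a Moore's-law-style doubling of general-purpose floating-point throughput roughly every 18 months:

        import math

        target_speedup = 1000
        months_per_doubling = 18                       # assumed cadence
        doublings = math.log2(target_speedup)          # ~10 doublings
        years = doublings * months_per_doubling / 12
        print(f"~{doublings:.1f} doublings -> ~{years:.0f} years")   # roughly 15 years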
  • Re:Well, sort of. (Score:4, Informative)

    by daBass ( 56811 ) on Monday November 10, 2003 @11:34AM (#7434133)
    What they also forgot to mention was that you could daisy chain cards to get even better performance.

    At the ISP I worked at, I had two Voodoo 2 cards which, on a lowly PII-350, ran Unreal Tournament with full detail at 1024x768 at a massive framerate!
  • by Malc ( 1751 ) on Monday November 10, 2003 @11:34AM (#7434135)
    You beat me to it. It's sad how excited people got about PC graphics cards. It wasn't (and isn't) because they were good; it's because they were finally able to start doing what other platforms had been doing for years. Even then the performance was poor - that's just when they started being able to display the same number of colours. The lowly Commodore 64 had better graphics than a PC with CGA graphics!
  • by merlin_jim ( 302773 ) <.James.McCracken. .at. .stratapult.com.> on Monday November 10, 2003 @11:37AM (#7434152)
    I note that the history in this article starts in 1996... one year after Rendition's Verite chip became the first consumer add-on 3D accelerator.

    And there is absolutely no mention of Matrox whatsoever... despite the fact that their add-on 3D accelerator was arguably superior to the Voodoo, and the Parhelia is the ONLY 3D solution to support three display devices.
  • by jest3r ( 458429 ) on Monday November 10, 2003 @11:38AM (#7434163)
    Sadly, the last card 3dfx constructed was the Voodoo 5 6000, which was rarely seen at all. That is rather hard to believe seeing that it's one of the biggest graphics cards I have ever seen. It's equipped with 4 GPUs (That's right, 4.) and 128 megabits of memory. This card was mostly only seen in servers though.

    This card was massive and would never have been used in a server.

  • by Graemee ( 524726 ) on Monday November 10, 2003 @11:41AM (#7434189)
    What about the early cards (TIGA, 8514/A) and other 3D attempts like the RIVA, Mystique and ViRGE? What about the cheats on PC benchmarks, once in VGA, now in 3D tests? What happened to Number Nine, ELSA and the other "big" names in cards that are no longer around? This reads more like a Time magazine article than a serious attempt at a history of video cards. Most glaring to me is the omission of the ATI 8500/Nvidia GF3 generation.
  • Re:Well, sort of. (Score:3, Informative)

    by PainKilleR-CE ( 597083 ) on Monday November 10, 2003 @11:46AM (#7434221)
    Nice design, for the time. The best thing was, it was CHEAP for the time (considering the performance). I think I paid $199.

    The Voodoo2 cards started at roughly $250 and $350 for the 8MB and 12MB models, respectively. The only way to get the 1024x768 mentioned in the article was to run two 12MB cards in SLI mode (which meant connecting the two V2 cards with a small ribbon cable inside the case); a rough memory-budget sketch of why follows this comment. Additionally, the pass-through cables that came with most V2 cards caused some degradation of the signal going to the monitor, so the graphics tended to be a bit dark, but this was easily fixed by buying a better cable.

    The performance was definitely solid, though. The V2 cards I had were originally passing through the 2D signal of a Riva 128, then a TNT, and it was finally a TNT2 Ultra that made me decide to pull the V2 cards (not to mention that the V2s I owned had no fans on the boards, which meant one of them burned up within about six months).

    The combination of the lack of real OpenGL support, the lack of 32-bit colour, and the speed of the TNT2 Ultra was what finally put 3dfx to bed. The Voodoo 3 couldn't keep up, and the Voodoo 4 was delayed far too long while 3dfx kept insisting that raw framerates mattered more than features and that no one could see the difference between 24-bit (the V3 supposedly output 24-bit colour through some tricks) and 32-bit colour anyway. Quake 3 proved them wrong quite quickly, as anyone could show with a few screenshots at the time.
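    A rough memory-budget check of why 1024x768 needed two 12MB cards in SLI; this is a sketch assuming the commonly cited split of a 4MB frame-buffer partition plus 8MB of texture memory on the 12MB Voodoo2, which is my assumption rather than something stated in the thread:

        # Assumed numbers: 16-bit colour, front + back + 16-bit Z buffers,
        # and a 4 MB frame-buffer partition on the 12 MB Voodoo2.
        BYTES_PER_PIXEL = 2
        BUFFERS = 3
        FRAME_BUFFER_BYTES = 4 * 1024 * 1024

        def fits(width, height, cards=1):
            # Under SLI each card stores and renders only alternate scanlines.
            per_card = width * (height // cards) * BYTES_PER_PIXEL * BUFFERS
            return per_card <= FRAME_BUFFER_BYTES

        print(fits(800, 600))            # True  - a single card handles 800x600
        print(fits(1024, 768))           # False - ~4.5 MB needed, over the 4 MB budget
        print(fits(1024, 768, cards=2))  # True  - the requirement halves per card in SLI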
  • by green pizza ( 159161 ) on Monday November 10, 2003 @11:47AM (#7434233) Homepage
    Which is a pretty simple way to get double the performance. I wonder why no one's done this recently

    They are, in the chip itself, sorta. Modern all-in-one GPUs have multiple pixel and texture pipelines, which splits some of the load at the silicon level. It's not SLI, but it's the same concept.

    The problem is that SLI only doubles the fillrate; both 3D chipsets need to work on the exact same dataset. SLI was a great boost back when 3D hardware was slow at texturing. These days the hardware can pump out a couple thousand frames per second worth of textures; it's the fancy multipass rendering and dynamic shaders (and to some extent the geometry) that take up all of the frame-generation time. SLI could speed some of that up, but it wouldn't help with most of the bottlenecks. It would be like putting new tires on a car that needs an engine tuneup.
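    A minimal sketch of the scan-line interleave idea behind SLI: each card fills alternate scanlines of the same frame, so the fill work is split, but both cards still process every triangle, which is why only fill rate scales. The function below is purely illustrative, not any real driver API:

        def split_scanlines(height, num_cards=2):
            """Which scanlines each card rasterizes under scan-line interleave."""
            return {card: list(range(card, height, num_cards)) for card in range(num_cards)}

        work = split_scanlines(768)
        print(len(work[0]), len(work[1]))   # 384 384: fill work halves per card,
                                            # but geometry and state are duplicated on both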
  • by Anonymous Coward on Monday November 10, 2003 @11:48AM (#7434241)
    Check this one:

    http://www.accelenation.com/?ac.id.123.1
  • by akiro ( 645099 ) on Monday November 10, 2003 @11:50AM (#7434257) Homepage
    id released a Win32/OpenGL version of Quake as unsupported, though free, software a couple of years after the original, called GLQuake [bluesnews.com].
  • Re:Bullshit (Score:5, Informative)

    by Malc ( 1751 ) on Monday November 10, 2003 @11:55AM (#7434294)
    EGA with 16 colours better than a Commodore Amiga? HAHAHAHA. In HAM mode the Amiga was kicking out 4096! 16 colours is just garish. The Atari ST led the charge, then the Commodore Amiga. The performance of the VGA graphics on my 386DX25 was dreadful. I added extra memory to my Paradise card so that it could handle 256 colours at 640x480 under Windows, and you had to watch it draw the screen line by line. The Commodore Amiga had been blowing it away for years by then. And of those who cared about improving the image on the Amiga, most went for a SCART connection rather than wasting their money on a monitor. PC owners didn't have a choice.
  • Not accurate. (Score:2, Informative)

    by El_Ge_Ex ( 218107 ) on Monday November 10, 2003 @11:57AM (#7434309) Journal
    NVIDIA bought them out in December of 2000.

    This text has some flaws... Nvidia didn't buy 3dfx or its assets; it won them in a lawsuit with 3dfx.

    -B
  • by sznupi ( 719324 ) on Monday November 10, 2003 @11:59AM (#7434322) Homepage
    And I'll add only that Matrox basically invented (or at least was the first to implement in a commercial product) video RAM something like a quarter century ago, and that they had an API capable of hardware-accelerating 3D apps in a window back in the Windows 3.11 days (several years later, the Voodoo still couldn't do that).
    And no mention of that company whatsoever :/
    But hey, what can you expect from a (probably) FPS kiddie biased against Matrox, among others - because if he were JUST an FPS kiddie (and not anti-Matrox), he'd mention that for about half a year in '99 Matrox was the leader in BOTH performance and quality... too bad only the second has held true since then.
  • by green pizza ( 159161 ) on Monday November 10, 2003 @12:09PM (#7434394) Homepage
    http://www.ati.com/support/driver.html [ati.com]

    ATI generally releases a new WHQL Windows driver about once a month and a new Linux driver about every 6 weeks. I've had no problems with their XFree86 4.3 driver. They don't have a FreeBSD driver, though, but I guess a PowerBook would give somewhat the same experience (BSD-based OS, XFree86-based X environment, Radeon 9600, plus Quartz/DisplayPDF and access to Mac apps). Mac OS X also has the ATI (and nVIDIA) drivers built in, and they are updated with the software update utility.

    ATI's Windows drivers are officially updated once in a while and are generally rock solid, but there are occasionally problems that aren't resolved for months at a time.
  • Re:XGL? (Score:2, Informative)

    by PainKilleR-CE ( 597083 ) on Monday November 10, 2003 @12:10PM (#7434404)
    People say that the OS X GUI takes up too much CPU, while in fact it takes up almost none. All of the windows, shadows, etc, are being done by the video card through Quartz Extreme, with no programming necessary from the app writer to take advantage of this either.

    This isn't quite true. Most of the desktop rendering is still done by the CPU in the same way it was done before QE was added to OS X. It's simply the individual windows that are rendered by QE, and OpenGL handles the 'surface' that is handed to it by Quartz and QE. So OpenGL mostly comes in to handle effects (like Expose, the fast-user-switching animation, and the opening/closing animations) and shadows; QE handles the windows and passes the textures to OpenGL when an effect is needed, and the CPU still does its thing for the desktop (until you hand the desktop off as a surface to OpenGL for the user-switching animation).

    In other words, things that were basically eye-candy and were really slowing OS X down quite badly before QE are now handled by QE, but the base 2D engine that utilizes the CPU is still working the same way it does with most other operating systems.

    That being said, free eye candy is free eye candy, and although there are many people out there who prefer things to be stripped down, I'd rather have something with a little flash that can be done just as quickly.
  • by B1ood ( 89212 ) on Monday November 10, 2003 @12:11PM (#7434413) Homepage
    I'm aware of two 3D-accelerated versions of Quake for Windows back in the day: VQuake, for Rendition's Verite line of cards, and GLQuake, which worked with any OpenGL-compliant video card. GLQuake at first was used almost exclusively with 3dfx Voodoo boards, so people thought it was 3dfx-specific. In fact, 3dfx's Voodoo card only supported a subset of the OpenGL API, hence they provided a "mini GL" driver that implemented only as much of the spec as GLQuake required. If you're fiddling with it today and don't have a Voodoo card, you should remove the opengl32.dll in your Quake directory - if memory serves, the one that came with GLQuake was 3dfx-specific and will load before your systemwide OpenGL library. Both VQuake and GLQuake were true Win32 apps, so nothing of the DOS-based origins of Quake should keep them from running on XP. And if they don't, I'm sure somebody out there has fixed it (the source is GPL'd now, after all) and provides a patch and binaries for XP.
  • by green pizza ( 159161 ) on Monday November 10, 2003 @12:33PM (#7434586) Homepage
    It is possible to scale performance that way, but the result will be less than double the frame rate, simply because the time to generate a frame does not scale linearly with resolution.

    To do 60 frames per second, you have roughly 16 ms to generate a frame. A couple of those ms will be gobbled up by I/O transactions and various wait states, so you're already at the point where double the power is only going to result in about 1.75x the performance. This will also be highly dependent on how well the 3D code can be parallelized (are there a lot of read_pixels callbacks that require both GPUs and both banks of memory to talk to each other? etc.).

    This has actually been done by SGI for a while now. A couple of years ago they took their Origin 3000 architecture, bolted on dozens of V12 GPUs, and tiled the graphics for higher performance. That concept has been tweaked for their Onyx4 systems... one large single computer with up to 34 ATI FireGL X1 GPUs. Sixteen GPUs work on each display in a 4x4 grid: each GPU generates its 400x300 piece, and 16 of those are composited in real time to make up a 1600x1200 display. I believe the biggest such machine to date has 32 GPUs powering two zippy-fast 1600x1200 displays and 2 GPUs driving an additional 4 lesser-powered displays. SGI gets quite a speedup by doing it that way, with 16 GPUs per display, but there's also a lot of overhead (even more in SGI's case, with 34 AGP 8X buses in said system). Their implementation of OpenGL and OpenGL Performer is tweaked for this, though.

    So yeah, it can be done, but the fact that the GPUs will spend a significant amount of time doing non-rendering tasks (I/O, waiting for data, copying the final result to frame buffer, etc) means that you won't see a nice linear scaling. The cost of making custom hardware and custom drivers also adds up. With top-end PC 3D accelerators costing $400 already, I can't picture many users shelling out $1000+ for a dual GPU card.
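    A worked version of the frame-budget argument above; the 2 ms overhead figure is an assumption chosen only to illustrate why doubling the GPUs gives less than double the frame rate:

        frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 fps
        overhead_ms = 2.0                    # I/O, wait states, compositing (assumed)
        render_ms = frame_budget_ms - overhead_ms

        # A second GPU halves only the render portion; the overhead stays fixed.
        two_gpu_ms = overhead_ms + render_ms / 2
        print(f"effective speedup: {frame_budget_ms / two_gpu_ms:.2f}x")   # ~1.8x, not 2x

        # The SGI-style tiling mentioned above: 16 GPUs in a 4x4 grid per 1600x1200 display.
        tile_w, tile_h = 1600 // 4, 1200 // 4
        print(f"each GPU renders a {tile_w}x{tile_h} tile")                # 400x300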
  • DirectFB - OpenGL? (Score:2, Informative)

    by Doc Ruby ( 173196 ) on Monday November 10, 2003 @12:33PM (#7434593) Homepage Journal
    Looks like DirectFBGL [directfb.org] is 16 months old, although XDirectFB [directfb.org] has a v1.0rc5 that's only 6 months old. I can't tell how you'd "make install" the two together, and whether existing apps would "just work", but someone else seems to be working on it [directfb.org].
  • by johnthorensen ( 539527 ) on Monday November 10, 2003 @12:46PM (#7434694)
    Hi, I just got done instant messaging with the editor of FastSilicon.com, Nathan Odle. He asked me to post here that he's pretty frustrated that the article was released without his editing; it wasn't ready yet and would have been quite different had his red pen gotten hold of it. You can read the other articles on the site and see that this is the case - Nathan's standards for the content are VERY high, and some heads are going to roll because of what happened here. I ask you ALL to check out the other content there; it's definitely well worth reading.

    -JT
  • by QuietYou ( 629140 ) on Monday November 10, 2003 @02:44PM (#7435726)
    Actually there were 4 planes, and you could set a variety of resolutions besides 320x240 by tweaking the VGA registers. One of my favorites was 512x384, because it was a relatively high resolution with square pixels, but unfortunately it didn't work on all monitors. 320/360x400/480 were pretty stable and were used in Quake; 320x400 was used for the Win9x startup screen.
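    Some arithmetic behind those tweaked planar modes, a sketch of why unchaining the VGA's four planes is what makes 256-colour resolutions beyond 320x200 possible (the 64 KB figure is the size of the host video window at A000h; everything else is standard VGA, not something specific to the comment above):

        SEGMENT_BYTES = 64 * 1024            # chained mode 13h can't address more than this
        PLANES = 4                           # unchained ("Mode X") modes spread pixels across 4 planes

        for w, h in [(320, 200), (320, 240), (320, 400), (360, 480), (512, 384)]:
            total = w * h                    # bytes at 8 bits per pixel
            per_plane = total // PLANES
            chained = "ok" if total <= SEGMENT_BYTES else "impossible"
            print(f"{w}x{h}: chained {chained}, unchained needs {per_plane} bytes/plane "
                  f"({'fits' if per_plane <= SEGMENT_BYTES else 'too big'})")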
  • by SmackCrackandPot ( 641205 ) on Monday November 10, 2003 @03:27PM (#7436081)
    That must have been back around 1989? I remember seeing the first 386s with VGA graphics, and the demos featuring 320x200 256-colour graphics of a castle. By 1990, my Paradise card was superseded by a $700 Hercules Graphics Station Card (TMS34010 processor) which came with 1 megabyte of VRAM (the double-buffering option cost another $300) and four 24-bit colour images: a head scan, a party pup (don't even ask!!!), a fashion model and somebody leaning out of a window. I managed to write an SGI image format viewer, and thought viewing SGI's "helping build a better dinosaur" advert was the coolest image in my collection. At that time, Windows 3.1 didn't support 24-bit colour 3D graphics, so the only way was to write your own extensions using TIGA.

    A rough sequence of the video standards would be:

    1981 MDA, Hercules, CGA (IBM)
    1983 PCjr (IBM)
    1984 EGA (IBM)
    1986 TIGA (Texas Instruments), 8514/A (IBM)
    1987 MCGA, VGA (IBM)
    1990 XGA, XGA-2 (IBM)
    1990+ SVGA, XGA, SXGA, UXGA (various manufacturers)
    1990+ VESA (manufacturers form consortium)

    The early 1990s were probably the time of greatest change, when all the manufacturers were trying to outperform each other on resolution, refresh rate, video memory size, and finally 2D acceleration. The resulting chaos and incompatibility between cards led to the formation of the VESA consortium.

    The mid-1990s were when many of the innovators formed: 3Dfx (1994, from MediaVision), nVidia (1993), and Rendition (1993).
    NVidia introduced the NV-1 in May 1995.
    3Dfx introduced the Voodoo card in October 1996.
    ATI had been around since 1985, but didn't introduce the 3D Rage until September 1996 (and with no Z-buffer).
    Matrox introduced the m3D in 1997 (piggybacking onto a VGA card). ...
    Rendition introduced one of the first MiniGL drivers in 1998.
    By 1999, each company was releasing new graphics cards every six months.
    Any good gaming web site will give you the product history of each manufacturer.
