A History of 3D Cards From Voodoo To GeForce

Ant sends us to Maximum PC for an account of the history and current state of 3D video cards (single print page). "Try to imagine where 3D gaming would be today if not for the graphics processing unit, or GPU. Without it, you wouldn't be [trudging] through the jungles of Crysis in all its visual splendor, nor would you be fending off endless hordes of fast-moving zombies at high resolutions. For that to happen, it takes a highly specialized chip designed for parallel processing to pull off the kinds of games you see today... Going forward, GPU makers will try to extend the reliance on videocards to also include physics processing, video encoding/decoding, and other tasks that [were] once handled by the CPU. It's pretty amazing when you think about how far graphics technology has come. To help you do that, we're going to take a look back at every major GPU release since the infancy of 3D graphics. Join us as we travel back in time and relive releases like 3dfx's Voodoo3 and S3's ViRGE lineup. This is one nostalgic ride you don't want to miss!"
  • Thanks (Score:3, Interesting)

    by arizwebfoot ( 1228544 ) * on Tuesday May 19, 2009 @02:10PM (#28015697)
    Personally, I found the article quite nice - it was a nice trip.
    • Re:Thanks (Score:4, Interesting)

      by vertinox ( 846076 ) on Tuesday May 19, 2009 @02:45PM (#28016281)

      Yeah. I started to get misty eyed seeing all the S3 and Matrox cards.

      I used to work in a computer shop back in the late '90s, and for home users who didn't play 3D games, we'd always suggest the S3 cards over ATI, simply because of stability issues with Win95 and 98.

      I mean, back then no one really needed the 3D part except gamers, who were kind of rare.

      Now 3d is integrated with the desktop. How times have changed.

    • Re:Thanks (Score:4, Informative)

      by The Grim Reefer2 ( 1195989 ) on Tuesday May 19, 2009 @03:17PM (#28016773)

      Personally, I found the article quite nice - it was a nice trip.

      Me too, but it also made me realize that I've spent way too much money on video cards over the years. My first 3D card was a Monster Voodoo 1 w/ 4 MB of RAM, which I returned when I found a 6 MB Voodoo 1 from Canopus for the same price. It paired nicely, at the time, with a 4 MB Matrox Millennium.

      I was kind of surprised that they missed quite a few cards though. There was a company named Obsidian (or maybe that was the name of their cards) that made $1000+ cards with at least 4 (I think they had an 8 GPU board) Voodoo 1 chips at the time.

      Since they also mentioned some other flops, I thought they'd have mentioned the Matrox Mystique and some of the other cards that were more CPU dependent.

    • I agree. At one point or another I owned an S3 Virge, Verite 1000, Matrox G200, and an original Voodoo.

      I can distinctly remember playing GLQuake for the first time on the Voodoo card. I was completely amazed at the speed it ran at that resolution. Very few times have I personally witnessed such a big leap forward in a technology. It was similar to going from analog TV to 1080p HD, where you just mumble incoherently, thinking 'WOW!'

  • At Best Buy a couple weeks ago... too bad the box was supposed to contain an Nvidia 260... S3, 3dfx, all kinds of old-ass graphics boards in the box... but no 260...

  • Had beautiful graphics and ran on a 386sx with a 128 MB VGA card and a 2D GPU.

    So I call Bullshit- the only reason a high powered GPU is necessary is because game programmers have become LAZY.

    • Re:7th Guest (Score:5, Insightful)

      by Thornburg ( 264444 ) on Tuesday May 19, 2009 @02:26PM (#28015953)

      Had beautiful graphics and ran on a 386sx with a 128 MB VGA card and a 2D GPU.

      So I call Bullshit- the only reason a high powered GPU is necessary is because game programmers have become LAZY.

      I call bullshit. 128MB "VGA" cards never existed. The only reason for a card to have more than a few MB of RAM (back in the day) was 3D graphics (i.e. textures). Even today, 16MB of VRAM should be enough for 32bit color depth at 2560x1600. In the days of the 386sx, having 4MB of VRAM was quite a lot. Heck, having 4MB of system RAM wasn't too bad, in those days.
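
      For what it's worth, the parent's arithmetic checks out; here is a rough back-of-the-envelope sketch (one frame only, ignoring double buffering, Z-buffers, and textures):

          def framebuffer_bytes(width, height, bits_per_pixel):
              # One frame needs X * Y * D bits, i.e. width * height * depth / 8 bytes.
              return width * height * bits_per_pixel // 8

          mib = framebuffer_bytes(2560, 1600, 32) / 2**20
          print(f"{mib:.1f} MiB")  # prints 15.6 MiB, so one 32-bit 2560x1600 frame does fit in 16 MB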

      • You're right. I'm so used to today's supercomputers that I forgot that standard VGA was 128K, not 128MB.

      • Re: (Score:2, Insightful)

        by machine321 ( 458769 )

        Get off my lawn. Back in the days of the 386sx, the only reason for more video RAM was so you could get more color depth at a certain resolution (which is X * Y * D bits). There was no 3D, there were no textures.

        • True indeed. Ahhh, the memories - getting a new screen & popping a hernia to lift it onto the desk, then having to buy a new card to be able to experience the highest resolution.

          Then - discovering - after getting updated drivers via the (snail) mail - that your card could not support a decent refresh rate at the highest resolution...

          Then...finding out that your PC could not actually keep up with the data that certain apps wanted to write.

          Then, oh the joy of AMD 486 overclocked Intel clones that drove t

          • Re: (Score:3, Informative)

            Then, oh the joy of AMD 486 overclocked Intel clones that drove the VGA straight off the CPU pins - what was that called again? -

            VESA Local Bus (VLB). They had it on Intel DX/2s as well. Adaptec (and others) made SCSI controllers for the VLB also.

      • Re: (Score:3, Insightful)

        by Burning1 ( 204959 )

        In the days of the 386, you would be lucky to have 16MB of main memory. I suspect that the GP meant 128KB, which was a relatively common quantity of video memory in the day.

        I think it's also a bit of a troll. The 7th guest was a pre-rendered multimedia game, and came out some time after the heyday of the 386. The nature of multimedia games grants good visuals with little overhead at the cost of a lot of interactivity. Calling a modern game programmer lazy when they've bettered the visuals of T7G in a first

    • 7th guest was pre-rendered.
    • Re:7th Guest (Score:4, Insightful)

      by vertinox ( 846076 ) on Tuesday May 19, 2009 @02:38PM (#28016149)

      Lol. I think the mods missed your humor, but yeah, before Quake... The games technically "looked" better because they were pre-rendered cut scenes.

      Remember:

      Under a Killing Moon
      Phantasmagoria
      7th Guest
      Myst

      I could go on but before Quake there were a lot of games that ran on a 386/486 (actually I don't know if Killing Moon ran on a 386) and looked good because they were pre-rendered.

      The real reason for the advent of the 3d card was to allow user interaction with the game world. I mean it looked like you were interacting with those games but it was just all pre-rendered.

      • Re: (Score:3, Informative)

        by rhyder128k ( 1051042 )
        The first I-War game was designed for software rendering and the 3D was bolted on afterwards. This meant that the ingame rendering on a 3DFX card was noticeably higher quality than the pre-rendered cut scenes.
    • Had beautiful graphics and ran on a 386sx with a 128 MB VGA card and a 2D GPU.

      As beautiful as any prerendered game back in the day.

    • You're confusing full motion video with real-time 3D.

    • "So I call Bullshit- the only reason a high powered GPU is necessary is because game programmers have become LAZY."

      The 7th Guest was pre-scripted and pre-rendered; you didn't have *any* freedom. The 7th Guest was entirely on rails, so it's not a fair comparison at all.

  • by D Ninja ( 825055 ) on Tuesday May 19, 2009 @02:15PM (#28015771)

    Without it, you wouldn't be [trudging] through the jungles of Crysis in all its visual splendor

    Hmmm...is anybody able to play Crysis in all its visual splendor?

    • Yes. It's been a while since the game came out. Very few graphics cards could handle it when it was released, but many of the current-generation cards can handle it just fine.

    • Re: (Score:3, Funny)

      by laiquendi ( 688177 )
      Yeah, but if I try to run my microwave at the same time it trips the breaker.
    • is anybody able to play Crysis in all its visual splendor?

      I'm getting 1000fps on my 4x3 pixel monitor with all the eye-candy turned on.

  • The 'MeTaL' acceleration was bullshit. On UT99 I think the software renderer looked about the same as the MeTaL renderer. When I popped in a 12 meg Voodoo2, I promptly tore the S3 board apart and threw it away, and popped in a Matrox Millennium MGA card for 2D stuff.

    • I disagree with this.

      Sure, Virge "Accelerated" games ran slower than software rendering :)

      But boy, they were pretty. The ViRGE had 24-bit rendering and decent filtering, so going to a Voodoo was quite a step back in image quality (for example in Descent). But the things you do to get 5 times the framerate...

      • by afidel ( 530433 )
        The S3 Virge GX ran faster than software rendering unless you had a top-of-the-line Intel CPU; if you were running Cyrix or AMD with their weaker FPUs, then the GX won out. I ultimately bought a Voodoo Monster and then a Voodoo 3 3000, but that was because they were all but a requirement for Diablo 2, due to the fact that GLIDE was way faster AND better looking. Today I use a Glide->D3D wrapper to play and get a better framerate than even a Voodoo 5 could have achieved =)
        • I had the very, very first version of the Virge (which wasn't even able to use all of its 4 MB outside of 3D...)

          Back then I think I had a Pentium 133.

          Of course this wasn't a fair comparison. Descent, for example, ran on the CPU at 320x200 in 256 colours, while the Virge version was running at 640x480, 32-bit.

          But this was WAY before resolution played a part in performance considerations (as _everything_ was running at 320)...

          • Re: (Score:3, Interesting)

            by vadim_t ( 324782 )

            Resolution played a huge part in performance considerations. But there was also the problem with card support.

            Remember UniVBE? You needed that to get high res display, but it was very, VERY slow. I managed to get 1024x768 with 256 colors on my 386, and about all it was good for was viewing photos because they took about a second to draw. I managed to get Win 3.1 run at that resolution and could watch the windows slowly appear on the screen.

            One of the first games I remember playing in high res graphics was a

            • Re: (Score:3, Interesting)

              by Cimexus ( 1355033 )

              Wow! Thanks for reminding me about UniVBE. I used that a lot back in the day but had completely forgotten about its existence ;)

              Pretty sure I used it on my 486 DX4/100 to get all manner of games running at 800x600 (which was the preferred resolution on the 15" monitor I had at the time).

        • Which wrapper are you using? psVoodoo [sourceforge.net], or some other one?

          I searched out a few dozen, but most of them haven't been updated in ages, or only work with specific programs.

        • Hrm... I currently play Diablo II in Wine... wonder if I could configure a GLIDE wrapper for that.
    • Re:Ugh, s3 Virge... (Score:4, Informative)

      by UncleFluffy ( 164860 ) on Tuesday May 19, 2009 @02:36PM (#28016113)

      The 'MeTaL' acceleration was bullshit.

      Given that "MeTaL" was for Savage3D, not Virge, it's not surprising that it didn't do very much for you.

      • by Khyber ( 864651 )

        S3 Virge, not regular Virge. There was a difference. S3 Virge used MeTaL. Regular Virge/VX/DX/Trio3D did not use metal. S3Virge cards did.

        • Re:Ugh, s3 Virge... (Score:5, Informative)

          by UncleFluffy ( 164860 ) on Tuesday May 19, 2009 @03:18PM (#28016787)

          S3 Virge, not regular Virge. There was a difference. S3 Virge used MeTaL. Regular Virge/VX/DX/Trio3D did not use metal. S3Virge cards did.

          Sorry, I think your memory is somewhat faulty there. MeTaL was definitely Savage series only, I know because I helped write it.

          • Re: (Score:3, Informative)

            by Khyber ( 864651 )

            Umm.

            http://www.savagenews.com/drivers/s3/s3metal.php [savagenews.com]

            MeTaL Drivers for the S3D ViRGE GX2 AND Savage cards. The S3D ViRGEGX2 was an AGP card that used MeTaL. I used it for UT'99 and UT'99 recognized it as a MeTaL device.

            • Re: (Score:3, Informative)

              by UncleFluffy ( 164860 )
              Those drivers are all for Savage: it says "Supported Savage Cards" at the top. You're correct, there were AGP Virge cards. The native API for these cards was called "S3D" not "MeTaL" and was a different (and older) codebase. UT99 would have definitely blown goats on any of the Virge series.
        • Re: (Score:3, Funny)

          by Ecuador ( 740021 )

          WTH are you talking about? S3 is the manufacturer of ViRGE. There is only S3 ViRGE, and it is the "regular" ViRGE.
          Since you seem to be at an automobile-analogy comprehension level, I will make it even easier: the "regular" Prius is also the "Toyota Prius".

  • by Anonymous Coward on Tuesday May 19, 2009 @02:17PM (#28015811)

    But the market never accepted them because no matter how thin they made the peripheral slots, the damn things would just fall through the case.

  • In fact, I've got 3 old 3dfx Voodoo cards I'm willing to part with... cheap! 2 of them complete with TV tuners. Good luck finding Vista drivers for them!
  • I used to have an old IBM CGA color monitor that I used on an IBM PC-XT. 8 colors IIRC, including lovely shades of magenta and "brown". I was the envy of everyone on the block. When the first addon graphics cards came out, I got a Hercules card and that absolutely ruled for running Flight Simulator.

    These days I'm content to run a year or two behind state of the art. The cards and games (the few I play) are cheaper that way. I think it's pretty much a no-brainer to say that gamers have driven this indus

    • Re: (Score:3, Informative)

      by Yvan256 ( 722131 )

      Actually CGA was only 4 simultaneous colors [wordpress.com], from only two color palettes [wikipedia.org].

      P.S.: read the "160x100 16 color mode" part. Interesting stuff.

    • CGA was only able to display 4 colors (from a palette of 16) in graphics mode. Usually white, black, cyan and magenta, though sometimes programs used the alternate palette which had brown and some other color I don't remember (red or green maybe).
  • I remember that I actually had *one* game that supported my Virge/DX - Descent 2.

    It did look a bit better, but was slow enough to make you want to switch to software rendering immediately.

    The name "Diamond Multimedia Stealth 3D 2000 PRO" did sound rather impressive on paper.

    The article would have been more impressive with screenshots of the games, though.

    • by anss123 ( 985305 )

      It did look a bit better, but was slow enough to make you want to switch to software rendering immediately.

      Never owned a Virge, but I remember getting X-Wing Alliance to run in OpenGL mode on my ATI 3D Charger. It was the first and only time I ran anything 3D on that card. It's a tad ironic that the first popular 3D card, the Voodoo, sacrificed image quality for speed, but since it ran games at a fluid framerate it looked better anyhow.

      The name "Diamond Multimedia Stealth 3D 2000 PRO" did sound rather impressive on paper.

      Heh. 3D 2000 Pro. I got this image of the marketing department "we need to convey that the card is more than just 3D, but futuristic and professional too!"

    • That's ViRGE, "Virtual Reality Graphics Engine". I had one, and I recall that on the MS flight-sim of the day, software rendering was actually faster, although only by a couple of percentage points.

  • Graphics and Stuff (Score:5, Insightful)

    by D Ninja ( 825055 ) on Tuesday May 19, 2009 @02:20PM (#28015859)

    So, don't get me wrong. I love beautiful graphics. I love the immersive environments that they create. The atmosphere of games like Bioshock is great. Even WoW, which arguably has very scaled-down graphics, is extremely involved and really pulls you into the game.

    HOWEVER...

    For as much as I like these graphics, games just do not hold my attention like they used to. I know I'm going to sound like "The Old Guy" with his nostalgic memories, but I spent hours and hours on games where graphics weren't the primary draw (even for that time period). Heck, I didn't get Legend of Zelda (the original) until well after the Super NES had been out for quite some time. But I spent so much time on that game, my original Nintendo practically burned itself up.

    Basically, the point I'm trying to make is that, while graphics are important to the gaming experience, if a company really spends time on the storyline (Fallout 3, or Bioshock for example), or focuses on the fun factor (Smash Brothers!) games can be just as awesome and fun. It's not just about (or at least should not be just about) the "visual splendor."

    • by snarfies ( 115214 ) on Tuesday May 19, 2009 @02:35PM (#28016091) Homepage

      Wait, Fallout 3 had a story? I thought it was just pointless wandering and about two hours worth of fetching stuff for your father.

    • Same story here. You're just growing up, that's all. Other things have more value to you.

    • Re: (Score:3, Insightful)

      by Burning1 ( 204959 )

      Perhaps it isn't that gaming has changed. Perhaps it's you.

      I went back and played through a lot of old Super Nintendo games. What I discovered in the process is that many older games greatly extended their playtime through drudgery. As soon as you have the reload and rewind keys, Contra 3 became a much shorter game. Final Fantasy III (6) was fantastic on its own, but the fast-forward key really cut down on a lot of drudgery.

      What's changed the most about gaming in the last 15 years? Me.

      My willingness to r

      • by Hatta ( 162192 )

        As soon as you have the reload and rewind keys, Contra 3 became a much shorter game

        Well yeah, if you cheat it's going to be easier. Playing the same section of a game over and over again because you keep dying is how you get better at it. When you put in the hours, and actually beat it fairly, it's a much bigger accomplishment than when you abuse save states. If you're not going to beat the game on its own terms, why bother playing?

        Final Fantasies of course, are a different issue. There's not really much

  • Remember when Intel was claiming that you couldn't run an i740 on anything other than an Intel chipset, due to "incompatibilities"? Didn't stop me from using it with a K6, and since Intel did provide documents for that chip, it ran in Linux, too.

    • Given that the i740 was perhaps the earliest card to make aggressive use of system RAM over AGP (which turned out to be a terrible plan, since the PCI version that didn't do that was often faster, but it was a fairly novel one), and that AGP was then comparatively new, it wouldn't surprise me if there was a long list of chipsets that would generate huge delays, lock up, or otherwise fall into a variety of screaming heaps.

      The other possibility, of course, was that Intel was hoping to push more sales of their own core logic
  • Kind of an ironic title when you take into consideration that on Black Friday, October 15, 2002, 3dfx's assets were purchased by Nvidia. The GeForce FX was built using a lot of ex-3dfx engineers, so there was a very literal translation from voodoo to 3dfx. PS, I used to LOVE 3dfx cards, still would, but I've been running Radeons since the 9700 Pro beat the living snot out of the entire GeForce FX line.
    • Bah, me again, fail. s/voodoo to 3dfx/voodoo to geforce/
    • by afidel ( 530433 ) on Tuesday May 19, 2009 @02:37PM (#28016127)
      Then you've lived through some really terrible drivers and I'm sure more than your share of BSODs. ATI might make great hardware, but they don't seem to be able to write a decent driver to save their life.
      • Surprisingly, no. I've had very few BSODs, probably because I would only upgrade every year or so. If a new game didn't work unless it took a bad set of drivers, I'd wait a few revisions until there was one set that was accepted by the community. In fact, in the 5, count 'em, 5 years I ran my 9700 Pro, the only, ONLY blue screens of death I'd get were when I plugged in an S-Video cable when the computer was on. And that was NEVER fixed; to this day, the system will BSOD if you plug the cable in while Windows is
        • by afidel ( 530433 )
          Meh, I'm not a fan of anyone really; all vendors suck, it's just a matter of degrees. For my personal preference I could put up with slightly slower framerate but the BSOD's I experienced and helped troubleshoot that were the result of bad ATI drivers just weren't acceptable to me.
          • For my personal preference I could put up with slightly slower framerate but the BSOD's I experienced and helped troubleshoot that were the result of bad ATI drivers just weren't acceptable to me.

            Amen to that. ATI drivers are crap on all platforms except OS X, and one suspects that's because Apple helped. A lot. This has ALWAYS been true. I had some Intel boards with Mach64 onboard and even THOSE caused me problems.

          • I think that the BSODs came about because people were just installing and installing, and ATI didn't have a good cleanup process for driver upgrades. On the systems (I created a server with leftover parts from an upgrade) running the old 9700 Pro with semi-recent drivers, there were very few, if any, BSODs that weren't created by me. So my anecdote negates yours, but I realize that I tend to have better luck with hardware because I run my systems leaner (fewer resident processes) and do research into what I'm installing
      • Re: (Score:3, Informative)

        by MrHanky ( 141717 )

        ATI's driver quality is fairly decent these days. I used to get hard locks when playing Oblivion on my old Radeon 9800 Pro, but those disappeared as soon as I got a better cooler and fan on it. I don't think I've ever had a bluescreen or crash in Windows XP with my current X1950 Pro. The Linux drivers have been a different matter, but at least the open drivers seem stable enough.

  • The intro says to include ... other tasks that [were] once handled by the CPU.

    In fact, there is a regular cycle of inventing video add-on processors, seeing them spread, then seeing the CPUs catch up and make the older video processor technology obsolete, moving the work back to the CPU. Then, of course, someone invents a new video co-processor (;-))

    Foley and Van Dam, in Fundamentals of Interactive Computer Graphics, called this "the wheel of karma" or the "wheel of reincarnation", and described three generations before 1984.

    I suspect the current effort is more directed toward building fast vector processors, rather than short-lived video-only devices. Certainly that's the direction one of the Intel researchers suggested she was headed.

    --dave

  • I think I have one of each of these in a desk drawer in my house. Every time I stick my hand in there I get cut.

  • My first computer was a 233 MHz Pentium with a crazy case layout that wouldn't allow a graphics card to be installed. I remember playing Half-Life and Quake 1 & 2 using software rendering at something like 400x320 resolution, and thinking it looked amazing. How times have changed since then, and my first computer is downright modern compared to many others' here.
    • Re: (Score:3, Interesting)

      by imsabbel ( 611519 )

      I remember buying a Voodoo for my P133. I had a LAN party the same day (just 10 or so of us guys from school).

      I built it in during the LAN party, and the first thing to try was glQuake.
      I had run it before a couple of parties back, and people were like "AWESOME how this looks. Too bad there is only a frame every 5 seconds" (no joke, the GL software wrapper was slow as fuck. But pretty).

      Well, it ran on the Voodoo, just as nice looking, at 30fps.
      Even though we were all kids without income, the majority of peopl

  • I still remember my first Diamond Monster 3dfx video card. I bought it moments after seeing a demo, because it was just that awesome.

    I then remember downloading the 3dfx patches for games like Tomb Raider and Interstate '76 (what a great game that was)...

    Good times. We take so much for granted these days.

  • Matrox Millenium (Score:4, Interesting)

    by Ngarrang ( 1023425 ) on Tuesday May 19, 2009 @02:51PM (#28016355) Journal

    I remember the days of my trusty Matrox card playing Descent and Duke Nukem. Anything that ran on DOS seemed fast.

    For sheer enjoyment, Rise of the Triad and all of its 2D-ness still gets my vote for all-time game. Who can forget such classic weapons as the Drunk Missile and the Fire Wall? Just pray you don't cross into a hallway that someone had targeted with the firewall at the wrong time.

    Good times.

    • Re: (Score:3, Informative)

      Those games you mentioned didn't utilize any 3D acceleration.

  • ... There was not a single VLB (VESA Local Bus) accelerator on that list. As I recall, the VLB slot was made for video acceleration, so they rather missed the boat by omitting those cards. Starting at Voodoo (except they started with ViRGE) is not a very comprehensive history of 3d acceleration.
    • by 0racle ( 667029 )
      They also didn't have any ISA cards there. VLB existed for a short time between E/ISA and PCI. The article was about 3d accelerator/accelerated cards. By the time they came along, PCI was king of the hill. There may have been some VLB cards made after that in the same way that AGP held on, but PCI was where it was at and VLB cards would have just been the same chip set on another bus.

      VESA cards were solidly 2d cards with 3d effects being software rendered.
  • From the section on the Voodoo2

    this time the image quality was improved, particularly at higher resolutions (1024x768) where the Voodoo1 struggled.

    Interesting, considering the Voodoo2 had an 800x600 [wikipedia.org] resolution limitation

  • GOD was that thing a damn oven. But damn if the games didn't look (comparatively) sweet on it!

  • Ok Ok (Score:5, Insightful)

    by ericrost ( 1049312 ) on Tuesday May 19, 2009 @03:12PM (#28016713) Homepage Journal

    I can live with bad grammar in the submissions, and of course in the comments, but can Technical Journalists PLEASE take a few goddamned English courses?

    ...causing the ViRGE to be unaffectionate dubbed the first 3D decelerator.

    Just how far has graphic cards come in the past 15 years?

    the original Rage 3D didn't have a whole going for it

    The last official drive update for the Savage 3D was posted in 2007, though the modding community has continued to support the card with most recently release (2007) showing support for Vista.

    Canadian-based Matrox first got start producing graphic solutions in 1978, ...

  • I owned the Voodoo1 piggyback and it was good, but didn't satisfy me. First chance I had, I got a hold of the 5500 beast and had to use the Dremel on my case to squeeze that mother in.

    But it was a killer card, giving killer frame rates at high quality.

  • My first 3D video card was a Rendition Vérité 1000; IIRC it was the first card that could do transparent water effects in Quake. Truly a defining influence on my college career.

  • I had an AGP Permedia 2 with 16MB when everyone else was dicking around with PCI Voodoo 2s with 12. The V2 was slightly faster, but the Permedia would not only let you do larger-window OpenGL (XGA, for example), it also had superior lighting effects. And, you know, actual OpenGL, not just MiniGL. I kept that until the GeForce 2 came out and have only deviated once... to an ATI Radeon 9600 XT that was a total lemon.

  • by logicassasin ( 318009 ) on Tuesday May 19, 2009 @03:52PM (#28017355)

    There are a couple of blatant omissions from this article:

    1. Matrox Millennium I/II - Matrox's best card until the G200 came along. The Millennium II was, at the time, one of very few cards that could be bought with up to 32MB of RAM. Many entry-level 3D workstations running NT4 shipped with such a configuration.

    2. Matrox Mystique/Mystique 220 - I STILL have one of these AND it's in service. Matrox believed that speed was king, so they designed the MGA1064SG chip for just that, but failed to add features like bilinear filtering, transparency, and mip mapping. As a result, games flew on these cards, but tended to look like utter crap. Both versions had the ability to be upgraded to 8MB of RAM or to full-on video capture and compression using the Rainbow Runner capture daughtercard (which is why I still have/use mine).

    3. PowerVR PCX2 - Superseded the PCX1; faster than the original and also an add-in accelerator like the Voodoo1/2, with two major differences: 1. It didn't require a pass-through cable for operation. 2. It could render 3D in a window as well as full screen. It was also one of two 3D chipsets with native API support in Unreal at its launch (the other being the Voodoo chipsets). It had, in my eyes, only one major problem - no alpha blend transparency. It could do transparencies, just not alpha-blended. It did have its own API, PowerSGL, and games coded for it (like Unreal and a Japanese game called "Pure Vex") could look quite good and were pretty fast as well. A few games had after-the-fact patches that added PowerVR support (Mechwarrior 2). Interestingly, the PCX2 could scale much better than the faster cards of the day. I'm not sure what its upper limit was, since most reviewers stopped testing it after a while.

    4. Savage4 - The Savage series of chips from S3 had their own API called MeTaL. Unknown to many, Unreal (in later patches) and Unreal Tournament both supported MeTaL, and through it S3TC. Unreal Tournament 99 looked its absolute best when run with a Savage4 and the extra textures installed from the second CD. The S4 also had full-scene AA, though I doubt anyone ever bothered using it.

    5. S3 Virge - The 3D image quality of the S3 Virge was rivaled only by the Voodoo (this was repeated several times in magazine reviews). No other card delivered 3D that looked as good at the time... It was still unbearably slow.

    6. i740 - The Intel chip was one of VERY few that could run Quake III Test when it first appeared thanks to its complete OpenGL ICD.

    7. 3DLabs Permedia 2 - Known, but not known... The Permedia 2 was everywhere for a minute. Most card companies were pushing this entry-level 3D workstation chip as a 3D gaming platform. Performance-wise... well... it kinda sucked. It was missing some features, but thanks to 3DLabs' bulletproof OpenGL ICD, it was one of few cards on the market that could properly render the particle effects in Quake II AND could run Q3T on arrival. Superseded by the Permedia 3, which WAS a better chipset in every way, but still not competitive against the likes of Nvidia and 3Dfx.

    There's also the Matrox G400/450, which I still have 4 of in service at home (DH for the wife and 450s for three of my kids).

  • by Francis ( 5885 ) on Tuesday May 19, 2009 @05:40PM (#28018945) Homepage

    I have to say, this article didn't sufficiently emphasize the importance of the introduction of the GeForce and the GeForce 3. Almost every other graphics card was just "more" and "faster", but not the huge game-changing revolution that these two graphics cards represented.

    Before GeForce, everything was all about accelerating rasterization - the act of filling in triangles.

    With the first GeForce, transform and lighting were put into silicon. This was *huge* - it meant that real math processing units were put into hardware. Scene complexity went up drastically, since we were finally able to push a lot of the more expensive operations into hardware.

    With the GeForce 3, we had the introduction of the *programmable* graphics pipeline. This was a huge game changer - for the first time, the developer was limited only by their own intellect and creativity as to what kinds of things could go into the hardware. This was the beginning of what could be considered the first mass-produced commercial stream processing unit. The graphics card has become a general-purpose computational unit, a blazingly fast one, with applications in fields that have absolutely nothing to do with computer graphics.

    I'm not sure what the ultimate evolution of the stream processor will be, but it still has the potential to really change the fundamental architecture of how future computers will be designed. Stream processors might eventually displace CPUs as the main computational workhorse in a computer.
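
    To make the "stream processing" idea concrete, here is a tiny, purely illustrative Python sketch of the programming model being described (the function names are made up; on a real GPU each element would be handled by its own shader/stream core in parallel):

        def saxpy_kernel(a, x, y):
            # One "thread" of work: operates on a single element of the stream.
            return a * x + y

        def run_over_stream(a, xs, ys):
            # A GPU would launch one kernel instance per element in parallel;
            # this loop only shows the programming model, not the performance.
            return [saxpy_kernel(a, x, y) for x, y in zip(xs, ys)]

        print(run_over_stream(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]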
