A History of 3D Cards From Voodoo To GeForce

Ant sends us to Maximum PC for an account of the history and current state of 3D video cards (single print page). "Try to imagine where 3D gaming would be today if not for the graphics processing unit, or GPU. Without it, you wouldn't be [trudging] through the jungles of Crysis in all its visual splendor, nor would you be fending off endless hordes of fast-moving zombies at high resolutions. For that to happen, it takes a highly specialized chip designed for parallel processing to pull off the kinds of games you see today... Going forward, GPU makers will try to extend the reliance on videocards to also include physics processing, video encoding/decoding, and other tasks that [were] once handled by the CPU. It's pretty amazing when you think about how far graphics technology has come. To help you do that, we're going to take a look back at every major GPU release since the infancy of 3D graphics. Join us as we travel back in time and relive releases like 3dfx's Voodoo3 and S3's ViRGE lineup. This is one nostalgic ride you don't want to miss!"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Graphics and Stuff (Score:5, Insightful)

    by D Ninja ( 825055 ) on Tuesday May 19, 2009 @03:20PM (#28015859)

    So, don't get me wrong. I love beautiful graphics. I love the immersive environments that they create. The atmosphere of games like BioShock is great. Even WoW, which arguably has very scaled-down graphics, is extremely involved and really pulls you into the game.

    HOWEVER...

    For as much as I like these graphics, games just do not hold my attention like they used to. I know I'm going to sound like "The Old Guy" with his nostalgic memories, but I spent hours and hours on games where graphics weren't the primary draw (even for that time period). Heck, I didn't get The Legend of Zelda (the original) until well after the Super NES had been out for quite some time. But I spent so much time on that game, my original Nintendo practically burned itself up.

    Basically, the point I'm trying to make is that, while graphics are important to the gaming experience, if a company really spends time on the storyline (Fallout 3 or BioShock, for example), or focuses on the fun factor (Smash Brothers!), games can be just as awesome and fun. It's not just about (or at least should not be just about) the "visual splendor."

  • Re:7th Guest (Score:5, Insightful)

    by Thornburg ( 264444 ) on Tuesday May 19, 2009 @03:26PM (#28015953)

    > Had beautiful graphics and ran on a 386sx with a 128 MB VGA card and a 2D GPU.

    > So I call Bullshit- the only reason a high powered GPU is necessary is because game programmers have become LAZY.

    I call bullshit. 128MB "VGA" cards never existed. The only reason for a card to have more than a few MB of RAM (back in the day) was 3D graphics (i.e., textures). Even today, 16MB of VRAM is enough for a 32-bit framebuffer at 2560x1600. In the days of the 386sx, having 4MB of VRAM was quite a lot. Heck, having 4MB of system RAM wasn't too bad in those days.

  • Re:7th Guest (Score:4, Insightful)

    by vertinox ( 846076 ) on Tuesday May 19, 2009 @03:38PM (#28016149)

    Lol. I think the mods missed your humor, but yeah, before Quake, games technically "looked" better because they were pre-rendered cut scenes.

    Remember:

    Under a Killing Moon
    Phantasmagoria
    7th Guest
    Myst

    I could go on but before Quake there were a lot of games that ran on a 386/486 (actually I don't know if Killing Moon ran on a 386) and looked good because they were pre-rendered.

    The real reason for the advent of the 3d card was to allow user interaction with the game world. I mean it looked like you were interacting with those games but it was just all pre-rendered.

  • Re:7th Guest (Score:2, Insightful)

    by machine321 ( 458769 ) on Tuesday May 19, 2009 @03:53PM (#28016381)

    Get off my lawn. Back in the days of the 386sx, the only reason for more video RAM was so you could get more color depth at a certain resolution (which is X * Y * D bits). There was no 3D, there were no textures.
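    The rule of thumb in this comment (framebuffer memory = X * Y * D bits) is easy to check with a few lines of Python; the function name here is purely illustrative:

    ```python
    def framebuffer_bytes(width, height, depth_bits):
        """Memory for a single framebuffer: width * height * depth bits, in bytes."""
        return width * height * depth_bits // 8

    # A 386-era SVGA mode: 1024x768 at 8-bit color needs 768 KB of VRAM.
    print(framebuffer_bytes(1024, 768, 8) // 1024)    # 768 (KB)

    # 32-bit color at 2560x1600 comes to 15.625 MB, so it fits in 16 MB.
    print(framebuffer_bytes(2560, 1600, 32) / 2**20)  # 15.625 (MB)
    ```

    Note this covers only a single framebuffer; double buffering or a Z-buffer multiplies the requirement accordingly.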

  • Ok Ok (Score:5, Insightful)

    by ericrost ( 1049312 ) on Tuesday May 19, 2009 @04:12PM (#28016713) Homepage Journal

    I can live with bad grammar in the submissions, and of course in the comments, but can Technical Journalists PLEASE take a few goddamned English courses?

    > ...causing the ViRGE to be unaffectionate dubbed the first 3D decelerator.

    > Just how far has graphic cards come in the past 15 years?

    > the original Rage 3D didn't have a whole going for it

    > The last official drive update for the Savage 3D was posted in 2007, though the modding community has continued to support the card with most recently release (2007) showing support for Vista.

    > Canadian-based Matrox first got start producing graphic solutions in 1978, ...

  • by Chrutil ( 732561 ) on Tuesday May 19, 2009 @04:18PM (#28016791)

    > It's also very noisy.

    What does a vacuum do? It sucks.

  • by Burning1 ( 204959 ) on Tuesday May 19, 2009 @04:23PM (#28016873) Homepage

    Perhaps it isn't that gaming has changed. Perhaps it's you.

    I went back and played through a lot of old Super Nintendo games. What I discovered in the process is that many older games greatly extended their playtime through drudgery. As soon as you have the reload and rewind keys, Contra 3 becomes a much shorter game. Final Fantasy III (6) was fantastic on its own, but the fast-forward key really cut down on a lot of drudgery.

    What's changed the most about gaming in the last 15 years? Me.

    My willingness to replay the same part of a game over and over as I lose lives, my willingness to devote vast amounts of time to collecting every last trinket, and my patience for boring games.

  • by yoshi_mon ( 172895 ) on Tuesday May 19, 2009 @05:21PM (#28017799)

    The AC troll here forgets to mention that that table is benchmarked at 1680x1050 with 4x AA and 16x AF.

    Crysis can work just fine on all types of video cards when you're not trying to run it at the highest settings. Just like nearly every damn FPS since we started this video card race.

    I don't know why Crysis has gotten such a rep as being unplayable unless you have a supercomputer but my guess is that it has to do with epeening.

    I played Crysis from start to finish on my (I'd say) average gaming machine at modest settings, which still looked damn good, and it was just fine. Only a few noticeable slowdowns. That's versus something like Fallout 3, which very much did slow down when I would go into VATS.

  • Re:Thanks (Score:5, Insightful)

    by uberjack ( 1311219 ) on Tuesday May 19, 2009 @05:40PM (#28018085)
    In the words of Bender, "that's not ironic - it's coincidental."
  • Re:7th Guest (Score:3, Insightful)

    by Burning1 ( 204959 ) on Tuesday May 19, 2009 @06:27PM (#28018793) Homepage

    In the days of the 386, you would be lucky to have 16MB of main memory. I suspect that the GP meant 128KB, which was a relatively common quantity of video memory in the day.

    I think it's also a bit of a troll. The 7th Guest was a pre-rendered multimedia game, and came out some time after the heyday of the 386. The nature of multimedia games grants good visuals with little overhead at the cost of a lot of interactivity. Calling modern game programmers lazy when they've bettered the visuals of T7G in a first-person shooter (Crysis) is a bit... loopy?

  • by Francis ( 5885 ) on Tuesday May 19, 2009 @06:40PM (#28018945) Homepage

    I have to say, this article didn't sufficiently emphasize the importance of the introduction of the GeForce and the GeForce 3. Almost every other graphics card was just "more" and "faster", but not the huge game-changing revolution that these two graphics cards represented.

    Before GeForce, everything was all about accelerating rasterization - the act of filling in triangles.

    With the first GeForce, transform and lighting were put into silicon. This was *huge* - it meant that real math processing units were put into hardware. Scene complexity went up drastically, since we were finally able to push a lot of the more expensive operations into hardware.

    With the GeForce 3, we had the introduction of the *programmable* graphics pipeline. This was a huge game changer - for the first time, the developer was limited only by their own intellect and creativity in what they could run on the hardware. This was the beginning of what could be considered the first mass-produced commercial stream processing unit. The graphics card became a general-purpose computational unit, a blazingly fast one with applications in fields that have absolutely nothing to do with computer graphics.

    I'm not sure what the ultimate evolution of the stream processor will be, but it still has the potential to really change the fundamental architecture of how future computers will be designed. Stream processors might eventually displace CPUs as the main computational workhorse in a computer.
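    As a rough illustration of the stream-processing model described in this comment: a stream processor applies the same small "kernel" program independently to every element of a data stream, which is why it parallelizes so well. A toy serial sketch in Python (the kernel and function names are illustrative, not any real GPU API):

    ```python
    def brighten_kernel(pixel):
        """Per-element program: scale brightness by 1.5x, clamped to 8 bits."""
        return min(255, int(pixel * 1.5))

    def run_stream(kernel, stream):
        """A GPU would run the kernel over all elements in parallel; we loop."""
        return [kernel(p) for p in stream]

    pixels = [10, 100, 200, 250]
    print(run_stream(brighten_kernel, pixels))  # [15, 150, 255, 255]
    ```

    Because each element is processed independently, the same kernel can be dispatched across hundreds of hardware lanes at once - which is exactly what makes GPUs useful far beyond graphics.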

  • by cptnapalm ( 120276 ) on Tuesday May 19, 2009 @07:23PM (#28019465)

    Speaking for the other 99.9% of 6 digit id people: GLQuake.

  • by Nom du Keyboard ( 633989 ) on Tuesday May 19, 2009 @08:11PM (#28020039)

    > is anybody able to play Crysis in all its visual splendor?

    I'm getting 1000fps on my 4x3 pixel monitor with all the eye-candy turned on.

  • by wildstoo ( 835450 ) on Wednesday May 20, 2009 @06:08AM (#28023225)

    I played (I don't play WoW any more) a Warrior - a tank. On my old PC I had abysmal framerates on raids to the point I was endangering the raid because my FPS had dropped to 5 and just moving and targeting was a major problem. I had to turn eye-candy down in raids just to be an effective tank. Being an eye-candy fan, this irritated me.

    If you are playing a class that has to move around a lot and timing is a major factor, then a massive FPS drop does affect your gameplay. If you're willing to accept such FPS drops, I would guess you play a mage or something, and I would also guess you don't PvP much.

    On my new PC my framerates rarely dropped below 60 with EVERYTHING on (except in Dalaran, of course). The game just looked much better when everything was moving smoothly.

    I might be talking out of my ass here, but I think high framerates even help the "sharpness" of the display on LCD monitors thanks to their innate response time. More frames = greater "interpolation" of the moving image = neighbouring pixels changing less dramatically = better image quality/less "smearing". Hard to describe what I mean, but I felt the image quality increase when my framerate did.
