Graphics Software

7 Years of 3D Graphics

xtra writes "At Accelenation they are running a nice timeline covering 7 years of PC 3D graphics. It contains a lot of info and even covers some of the not-so-well-known players. Anyone still remember Rendition? Or BitBoys?" How many cards on their timeline chart have you used?
  • Number Nine (Score:1, Interesting)

    by guamman ( 527778 )
    I remember when Number Nine came out with the first consumer-level graphics card to use 128MB of memory. There was no GPU, but you had an almost infinite number of resolution settings. And that was over two years ago.
    • Re:Number Nine (Score:2, Interesting)

      by ackthpt ( 218170 )
      No mention of Tseng Labs. Maybe the ET4000 chipset wasn't considered a graphics accelerator, but having one of those in a new 386DX25 was pretty cool, and it's what I first played Return to Castle Wolfenstein and a few id games on.
      • Wow, so you are still running that 386DX/25MHz? The only way I could see you even attempting to play Return to Castle Wolfenstein on it is if you still had it now. Of course, I would think a 386DX/25MHz would be incapable of playing RTCW, seeing as how the CPU is so slow, it likely wouldn't support enough memory, and no graphics card it could take would be supported for 3D...
    • Re:Number Nine (Score:5, Insightful)

      by Jeffrey Baker ( 6191 ) on Tuesday March 19, 2002 @01:10PM (#3188157)
      Uh? I don't remember any #9 cards with 128MB of memory. In fact, #9 was out of business two years ago. You may remember the #9 i128 series of cards, but those are very old and do not have 128MB memory.

      You only need 16MB to handle the highest resolution computer graphics displays ever made.

      • Re:Number Nine (Score:2, Informative)

        by Zurk ( 37028 )
        Actually I think Number Nine had the first 128-bit graphics chip on board, not 128 megs of memory.
        Then again, I could be wrong.
        • Yeah, that was it: a 128-bit memory path to the RAMDAC or some such thing. The early '90s, when graphics cards were measured in WinMarks or the like, some kind of benchmark of how much they accelerated basic geometry operations. I also remember the arguments over card drivers that would report an operation done even though it was only queued, not completed. That gave insanely high 'WinMarks' or whatever, but some people didn't agree with the buffering concept.

          I also remember the horrid Windows 3.1 drivers. S3 was known for the best drivers; my Trident card had lousy ones. ATI was notorious for bad drivers, and it's funny how that reputation lingers.

      • Re:Number Nine (Score:4, Informative)

        by Milalwi ( 134223 ) on Tuesday March 19, 2002 @01:38PM (#3188380)

        You only need 16MB to handle the highest resolution computer graphics displays ever made.

        This is true for 2D displays, but when you start having double and triple buffering plus z-buffers it starts to add up. Then add the texture requirement and you can see why most newer cards have 64-128MB of memory on the cards.

        Milalwi
        • You only need 16MB to handle the highest resolution computer graphics displays ever made.

          This is true for 2D displays, but when you start having double and triple buffering plus z-buffers it starts to add up. Then add the texture requirement and you can see why most newer cards have 64-128MB of memory on the cards.

          Technically, though, he's still correct. Sure, your card would be awfully slow playing something like Quake III without on-card memory for textures, etc. :) However, any 3d rendering could theoretically be done in software and then (slowly) displayed on a 2d-only card, right?
      • More memory (Score:4, Informative)

        by Traa ( 158207 ) on Tuesday March 19, 2002 @02:06PM (#3188628) Homepage Journal
        > You only need 16MB to handle the highest
        > resolution computer graphics displays ever made

        You will always need more memory (in 3D graphics accelerators), even if display resolutions don't increase. Let's say we settle for a nice 2000x2000-ish display. That's 4M pixels, which at 32 bits is 16MB for the display.
        At least double (32MB), but preferably triple (48MB), buffer this so you can create a new frame while the old one is being displayed. Then we need a Z-buffer (or W-buffer) to hold the depth values (24-bit values) for each pixel, so we know what is in front of what. Typically you might want to do some stencil effects too (8 bits, which can be packed with the Z-buffer); that would be another 16MB. Now we have the basics for a 3D graphics display and are at 48-64MB.

        But we are not done yet, now for some more interesting effects:
        - Texture memory. Typically this uses the leftover graphics memory and swaps the rest from host memory (but we don't like swapping, so preferably all textures should be in onboard mem): 2-64MB
        - 2x antialiasing (1 back buffer + 1 Z-buffer at 2*2* the size of the display buffer) = 64MB (4x antialiasing = 256MB)
        - Shadow buffer (rendering into a kind of Z-buffer from the light source to create realistic shadows): 16MB
        - Accumulation buffer effects like motion blur (very expensive; a good blur could take 4 to 32 frames) or depth of field could make us want another 4-32*16 = 64-512MB

        I for one could easily use more than 1GB of onboard graphics memory. (Quick tally below.)
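
        For the curious, the arithmetic above is easy to sanity-check. A back-of-the-envelope sketch in C, using this comment's assumptions (a 2048x2048 display, 32-bit color, packed 24/8 Z/stencil), not any real card's memory layout:

        /* vram.c -- rough VRAM budget for the figures above.
         * All sizes are this comment's assumptions, not any real
         * card's layout. Build: cc -o vram vram.c */
        #include <stdio.h>

        int main(void)
        {
            const long w = 2048, h = 2048;  /* "2000x2000-ish" display  */
            const long px = w * h;          /* ~4M pixels               */
            const long color = px * 4;      /* 32-bit color buffer      */
            const long depth = px * 4;      /* 24-bit Z + 8-bit stencil */
            const long MB = 1024 * 1024;

            printf("single color buffer: %ld MB\n", color / MB);               /* 16 */
            printf("triple buffered:     %ld MB\n", 3 * color / MB);           /* 48 */
            printf("Z + stencil:         %ld MB\n", depth / MB);               /* 16 */
            printf("2x AA back + Z:      %ld MB\n", 2 * (color + depth) / MB); /* 64 */
            printf("basic 3D total:      %ld MB\n", (3 * color + depth) / MB); /* 64 */
            return 0;
        }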
    • Re:Number Nine (Score:3, Informative)

      by ram.loss ( 151102 )
      I think you mean the first consumer-level graphics card to use a 128-*bit* data path. I remember seeing them bundled with Dells on the back cover of Byte a few years ago.
  • Only 7 years? (Score:2, Interesting)

    by Thnurg ( 457568 )
    I was playing Elite in 1984. Damn, that was a fine game.
    • Re:Only 7 years? (Score:2, Interesting)

      by Bilestoad ( 60385 )
      Elite lives on: if you have a PDA there are three or four clones, a couple of them GREAT, right down to the 3D combat. When I did my knee in on a snowboard trip a year ago, the Visor Prism with an Elite clone helped me forget about the snow.
  • by ackthpt ( 218170 ) on Tuesday March 19, 2002 @12:58PM (#3188059) Homepage Journal
    At the going rate, the board with CPU and chipset will be a daughterboard of the graphics motherboard. :]
    • Or perhaps, better yet, the motherboard will just be a backplane for Graphics, Sound, Physics, AI, Network modules and such.

      Some machines might just need Math modules (all you SETI junkies.)

      Though unfortunately computing for the home seems to be moving towards "all-in-one" motherboards and such. :/
      • Physics-specialized processors? Can anyone show some nice linkage for them? That sounds like the next step for games today: completely and utterly lifelike physics engines instead of scripted crap. It would make mapping much easier as well, imho. I know of the GeoMod tech from the people [volition-inc.com] that did Red Faction and Freespace, but what else is out there, up to and including programs or languages for astrophysics and geomorphology simulations?
        • The only reason I mentioned the engine is because I'd love to design/write (I'd rather design) games one day, and that seems like something that would be great for games: rather than re-implementing Newtonian physics over and over again, there would be something akin to OpenGL, with preset functions for these things.

          The processor itself would likely be a little specialized, to handle (x,y,z,vx,vy,vz) style location/velocity vectors and such.

          The closest thing I've seen to it was something one of my Materials Science TAs wrote to show and simulate forces on beams.
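
          Since we are speculating anyway, the inner loop such a physics unit would batch over thousands of bodies is tiny. A hypothetical sketch in C (the struct, names, and the simple Euler integrator are all invented for illustration, not any real API):

          /* Hypothetical (x,y,z,vx,vy,vz) state update a physics
           * coprocessor might batch-process: one explicit Euler step
           * under constant gravity. Everything here is made up. */
          typedef struct {
              float x, y, z;    /* position */
              float vx, vy, vz; /* velocity */
          } body_t;

          static void step_bodies(body_t *b, int n, float dt)
          {
              const float g = -9.81f;      /* gravity along -y, m/s^2 */
              int i;
              for (i = 0; i < n; i++) {
                  b[i].vy += g * dt;       /* integrate acceleration  */
                  b[i].x  += b[i].vx * dt; /* integrate velocity      */
                  b[i].y  += b[i].vy * dt;
                  b[i].z  += b[i].vz * dt;
              }
          }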
          • The processor itself would likely be a little specialized, to handle (x,y,z,vx,vy,vz) style location/velocity vectors and such.
            Well, I don't think it would be easy to store the exact values of x, y, z and vx, vy, vz. This processor would need lots of qubits, I'm afraid, and could be very expensive.
  • STB Velocity 4400... and no real need to upgrade yet!
  • Why 7 years? (Score:3, Interesting)

    by Jacek Poplawski ( 223457 ) on Tuesday March 19, 2002 @01:02PM (#3188089)
    Why only 7 years of 3D graphics on PC?
    What about Stunts, Elite, and other 3D games?
    • Re:Why 7 years? (Score:2, Informative)

      by Strog ( 129969 )
      From the article: "This article should really be called '7 Years of 3D Graphics' because 3D acceleration seemed a suitable starting point."

      The point was accelerated 3D, not "software" 3D. I still remember bombing runs in Ace of Aces on a Commodore 128, though it definitely wasn't a hardware-accelerated game.

      • I still remember bombing runs in Ace of Aces on a Commodore 128, though it definitely wasn't a hardware-accelerated game.

        It wasn't even 3D. Sublogic's Flight Simulator and "Jet" were 3D, which is why they ran at about 1fps on a C64!


      • The point was accelerated 3D, not "software" 3D.


        The Amiga had hardware polygon filling in 1985, using the blitter. $DFF058 was the register used to fire up that beast, if I remember correctly.
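
        For anyone who never got to poke the hardware: the custom chips are memory-mapped at $DFF000, and writing BLTSIZE ($DFF058) is what actually starts a blit. A heavily elided sketch in C; the real setup of BLTCON0/1, masks, pointers and moduli is omitted:

        /* Kicking off an Amiga blit: only the final BLTSIZE write is
         * shown. Bits 15-6 hold the height, bits 5-0 the width in
         * 16-bit words; the write itself fires up the blitter. */
        #include <stdint.h>

        #define CUSTOM  ((volatile uint16_t *)0xDFF000)
        #define BLTSIZE (0x058 / 2)   /* word offset of $DFF058 */

        static void start_blit(uint16_t height, uint16_t width_words)
        {
            /* ...BLTCON0/1, masks, source/dest pointers, moduli... */
            CUSTOM[BLTSIZE] = (uint16_t)((height << 6) | (width_words & 0x3F));
        }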
      • But there have been accelerated 3D boards for a lot longer. I was programming boards from SubLogic (remember them?) and then from SGI (IRISvision: Personal IRIS graphics on a PC, with a *blazing* 14k polygons/second and a hardware Z-buffer! :)

        The article might better be titled "7 Years of *Consumer* 3D Graphics Cards"...
      • What about Stunts, Elite, and other 3D games?

      Given that ZX81 (aka TS1000) emulators for the PC were around from day dot (or a ZX81 emulator running on an Atari ST emulator running on a PC...) how about 3D Monster Maze [u-net.com]? It's not that much more primitive than Doom. Run! Run from the scary Tyrannosaur!

      (Yes, I know, the article is probably about hardware that draws triangles real fast, but it's Slashdotted hard, so we may as well have some fun reminiscing. If nothing else, it'll confuse the young 'uns ;-) )

  • by TrollMan 5000 ( 454685 ) on Tuesday March 19, 2002 @01:02PM (#3188092)
    Can graphics technology possibly get any faster? Well the GeForce2 GTS chip ran Quake3 at 80fps in May of 2000. Just twenty-two months later a GeForce4 Ti4600 can run Quake3 over three times faster. On that reckoning the GeForce6 in two years time should be running Quake3 at over 700fps. Is that fast enough for you!

    Is there really much visual difference between 700 fps and 135 fps? I'm not really sure if the human eye can make the distinction. They're sure pretty-looking numbers, but do the results show for it?

    And how long before video cards can render essentially photo-realistic graphics? Soon games will be more like interactive movies.
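
    For what it's worth, the article's figure is just compound growth. A quick check in C, assuming the 3x-per-22-months rate actually holds (which, of course, it may not):

    /* Sanity check of the article's extrapolation: 3x every 22 months
     * from the GeForce2 GTS's 80 fps in May 2000. Build with -lm. */
    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double gf2 = 80.0;                        /* GeForce2 GTS       */
        double gf4 = gf2 * 3.0;                   /* 22 months later    */
        double gf6 = gf4 * pow(3.0, 24.0 / 22.0); /* two more years at  */
                                                  /* the same rate      */
        printf("GeForce4-class: %.0f fps\n", gf4);  /* ~240             */
        printf("GeForce6-class: %.0f fps\n", gf6);  /* ~795, "over 700" */
        return 0;
    }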
    • It's not the speed anymore, but the visual quality, that matters. The GeForce4 can play games at 1600x1200 with all effects maxed out at decent frame rates. Anything past 60 FPS doesn't really matter to a lot of real game players anymore. What matters is the visual quality of the game. How real does the smoke look? How realistically do the characters move? Etc.
    • > Is there really much visual difference between
      > 700 fps and 135 fps?
      1. 700 FPS means that when things get really hectic you drop to 300 FPS instead of 30 FPS (this is basically the argument for 150 FPS over 70 FPS: the average is fine, but the most detailed parts will start pushing it into being jerky).
      2. 700 FPS means you can up the detail ~7 times and still get ~100 FPS. Unreal 2 is already pushing about 4 times as many polygons as the original engine; you may well get 700 FPS now, but you certainly won't in the top-end games in 2 years' time.

      Remember, Quake 3 is fairly old now; already, games like MoH have parts that will make most above-average machines struggle (like that mission with all the trees). Newer engines, larger and cheaper monitors, etc. are only going to push that further.
    • For whatever reason, in Quake3, having >200 fps lets you jump a tiny bit higher than having 200 fps, probably because you actually hit the 'peak' of the jump on a frame instead of 'between frames' (i.e., the peak of your possible jump height lands on a frame instead of at a point interpolated between frames, so the physics engine picks up on it and lets you up there).

      So it may not help the visuals, but aside from using the extra frames for motion blur, etc., it also provides your physics engine with a more 'correct' version of your path... although this is probably dependent on how one implements the physics engine.
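
      A toy illustration of the effect: with a fixed-step integrator, the apex you actually sample depends on the timestep. The constants below are invented, not Quake 3's real movement code, and which direction the bias goes depends on the integrator's update order and rounding; the point is only that the sampled peak varies with dt:

      /* jump.c -- toy fixed-step jump simulation. The sampled peak
       * height differs with the frame time; constants are made up. */
      #include <stdio.h>

      static float peak_height(float fps)
      {
          float dt = 1.0f / fps;
          float y = 0.0f, v = 8.0f;   /* initial jump velocity  */
          const float g = -20.0f;     /* gravity                */
          float peak = 0.0f;

          while (y >= 0.0f) {         /* until back on ground   */
              y += v * dt;
              v += g * dt;
              if (y > peak) peak = y;
          }
          return peak;
      }

      int main(void)
      {
          printf("peak at 125 fps: %f\n", peak_height(125.0f));
          printf("peak at 700 fps: %f\n", peak_height(700.0f));
          return 0;
      }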
    • Is there really much visual difference between 700 fps and 135 fps? I'm not really sure if the human eye can make the distinction. They're sure pretty-looking numbers, but do the results show for it?

      Quake 3 just happens to be a benchmark (and an old one at that) whose numbers are relative, but not necessarily realistic. Imagine if hard drives were benchmarked by always saying "the IBM 75GXP can load 40,000 10KB hello.c files per second", to which everyone responds that they only ever need to load 1 hello.c file, etc. In other words, for a more demanding task like the new Doom, Quake 3 with complex mods like Urban Terror, or a much more complex game like Operation Flashpoint, 135 fps in stock Q3 equals ~15 fps in a complex outdoor scene in OpFlash. And as has been recapped many times in the past: we are just scratching the surface of realistic environments (i.e., try to model nature in a dynamic fashion and the best boards put out single-digit FPS, if that).

  • Only 7 Years? (Score:4, Interesting)

    by green pizza ( 159161 ) on Tuesday March 19, 2002 @01:08PM (#3188141) Homepage
    Heck, I still remember the "which is better, Silicon Graphics RealityEngine or Ferrari Testarossa?" threads on Usenet from the summer of 1992. Even the dual-pipe/dual-head SGI VGXT "SkyWriter" from 1989 was pretty damned impressive, even many, many years later.
  • The Rendition cards, like most cards, were quite good, presuming the folks coding the games knew how to utilize the power of the card. My personal favorite game (Grand Prix Legends) is a little over 3 years old but was written specifically for Rendition and Voodoo2 cards; OpenGL support was tacked on later. And of the three, there was no doubt that Rendition was the best for that game. Nothing inherently wrong with the cards, imo; they just got done in by the Voodoo cards of the time (which, of course, got done in themselves not much later).

    Anyway, thanks for asking if I remembered Rendition!

    Cheers.
    • Yah, GPL was pretty much the ONLY reason the Rendition cards were able to stay around so long.

      Pretty darn good game still, hell, damn GREAT game still, heh.

      I remember looking at a screenshot from GPL printed on the then Highest Resolution Printer In The World(tm), a Lexmark Z-something. It damn near looked like a photograph.

      When I saw screenshots of it directly, hell, it DOES look damn near like a photograph!

      Whatever API they used, and however it interacted with that chip, was damn powerful; it blew the living shit out of anything to come for another 2 years or so.
  • In a related note, what the hell ever happened to the Future Crew? Man, I remember waiting with bated breath for Second Reality to download over my 2400 baud modem.

    --saint
    • Re:Demo scene. (Score:3, Informative)

      by 2Flower ( 216318 )

      Future Crew reassembled, broke up, faded out, tried to get into the games biz, etc. Their site has been offline for a while.

      Fortunately the demo scene lives on; pouet [pouet.net] hosts links to nearly every demo in existence, across multiple platforms. And to keep us on topic, most demos nowadays are 3D accelerated. It's become less a game of "What techie tricks can you do?" and more a game of "How artistically can you use the technology?". There are some visually striking demos being made nowadays, and not just because they have shadebobs or glenz cubes.

    • Wow...haven't thought about Second Reality in forever. I remember getting a new Diamond video card and immediately playing Second Reality to see how much faster it ran -- it was my benchmark.

      Ok...anyone remember Kosmic Free Music Foundation's "Little Green Men"? What about good old Cubic Player and .mod's?

    • Re:Demo scene. (Score:2, Informative)

      by sph ( 35491 )
      In a related note, what the hell ever happened to the Future Crew?

      The actual Future Crew is no more, but many of the members have been active in various projects, for example in the gaming industry. You may have heard of Max Payne, made by Remedy Entertainment [remedy.fi], or 3DMark, made by MadOnion [madonion.com]. Though not really related to FC, they both employ former FC members, and may be the best-known examples. As for other demosceners, some Byterapers members were involved in Rally Trophy, made by Bugbear [bugbear.fi]. It also features some music by Purple Motion/FC.

      Any other examples of demosceners, perhaps from outside of Finland? ;-)
  • by Zapdos ( 70654 )
    History for Nerds.

    There just seems to be a lot of history stuff lately.
  • by carlcmc ( 322350 ) on Tuesday March 19, 2002 @01:21PM (#3188249)
    - Glaze3D announced, August 2, 1999 [shugashack.com]

    - Glaze3D claims 300fps performance, Sept 29, 1999: 'The Glaze3D chip is at the moment in the last stages of the silicon implementation, and when it is done, the database goes to Infineon for processing. We will get the first prototypes in the very beginning of the next year. We have at the moment a 100% accurate simulation model of the chip, which can do millisecond accurate runs. From these we have been able to 'calculate' that to draw one Quake3 frame takes 3-5 milliseconds, which means 200-300fps on behalf of the graphics card.'

    - Glaze3D announcement in March, Feb 15, 2000 [shacknews.com]

    - Glaze3D misses announcement date, Apr 1, 2000 [shacknews.com]

    - Glaze3D shooting for 2001 , Apr 10, 2000 [shacknews.com]

    - John Carmack on BitBoys, Oy?, Jan 9, 2001 [shacknews.com]

    "I have never had any contact with Bitboys, let alone seen their technology. I DO NOT ENDORSE THEM, and I am disturbed that they are saying that I do. There is room in the abridged quote to have this be a misunderstanding, perhaps something along the lines of me endorsing some particular track of future technology which they implement." -- John Carmack

    - BitBoys claim silicon being manufactured, Jan 9, 2001 [aceshardware.com]

    "I know we have been shallow in our comments, but I can reveal that chip design is complete and chip is being manufactured", says Juha Taipale, general manager.

    Much of this was shamelessly compiled from a great news source, Shacknews.com [shacknews.com].

  • When I look at the timeline on the first page I realize that every video card I've bought since the Voodoo1 was bought within two months of its release (Voodoo1, Matrox Millennium II, Voodoo2, and GeForce 256 DDR (which I still use)). I bought the Voodoo1 in November 1996, and that was a real quantum leap. I remember I had to crack the beta version of the Tomb Raider 3dfx patch to make it run with the CD rip :) And glQuake... ooohhh...

    • Trident 8900c 2mb (with a 386DX40)
      Upgraded to 4mb
      ATI All-in-Wonder Rage Pro (with a K6-200)
      Voodoo2 add-on (Needed this for Unreal)
      GeForce (with an Athlon 550) (Needed this for Unreal Tournament)

      I still use the GeForce Athlon combo.
  • ...servers that couldn't be slashdotted, rather than silly little graphics chips.

    Anyone got the story text cached? Because the server is El Hosed.

  • In fact, I think the first time I played accelerated Quake was on a Rendition 2100 card.
    I seem to remember there was even a custom Quake binary written for the Rendition API, because the card didn't have OpenGL drivers when it came out.

    Heh. Remember when a "good" ping was anything under 300? :)

    C-X C-S
  • Damn, I'm feeling old. I remember working at Evans and Sutherland in 1991, testing their ESV workstation. For about $100,000 you could get a machine the size of a dorm fridge that would push 100,000 Gouraud-shaded, untextured polygons per second. And we liked it. :)

    For fun, I hacked up xbiff to make pexbiff, complete with a 3D mailbox and 3 bouncing point lights. It pushed the limits of the machine: every time you got mail, the mailbox would animate and the lights would bounce. Aaahh... the days before spam...

  • by green pizza ( 159161 ) on Tuesday March 19, 2002 @01:36PM (#3188358) Homepage
    ... it's time for the next wave of 3D.

    I love playing with the SGIs at work and I enjoy playing with the whizbang PCs that my roommates and I have, but to be honest, I'm really not that impressed with modern gfx accelerators. The original GeForce was pretty neat, and SGI's last big leap (InfiniteReality in '95) was cool... but golly, things really haven't changed much since Clark and his gang from Stanford opened our eyes to 3D in '82.

    We've gone from cabinets to cards to chips to a single chip. We've added some gfx extensions and now do multiple rendering passes to make things look prettier... but really, nothing much has changed in recent years. It's smaller, faster, cheaper. Steady evolution... but so is the scum growing in my bathroom sink.
    Please excuse me while I yawn.
  • by Rogerborg ( 306625 ) on Tuesday March 19, 2002 @01:46PM (#3188437) Homepage

    I still have nightmares about developing for the Rendition Verite 1000, which was a lovely graphics *decelerator* on anything faster than a P100. When we got our first batch of Voodoo 1s delivered, there was a brief but very ugly struggle to get our clammy hands on them. You ain't seen pathetic until you've seen geeks wrestling and squealing like stuck pigs over 4MB graphics cards, let me tell you.

    Question to anyone else who has developed 3D graphics: who did you find driving the demand? In our games house, there was a running battle between the programmers and the artists. Us code monkeys were forever on at the artists to cut down the polygon counts, but they kept trying to slip in models that were barely stripped down from the FMV sequences. In the end, we came to an equitable solution: they won, the game ran at 10fps, and all the programmers left.

    I wonder how many other games were ahead of their time in that regard, and how many of them would be rescuable given cards that scoff at polygons and eat dozens of 256x256 textures before breakfast?

  • I was wondering if anyone had applied Moore's law to 3D graphics. A quick google search and...

    http://www.3dlabs.com/product/technology/mooresla.htm

    Unfortunately it's a company paper and very biased towards the 3Dlabs Wildcat. That, and it's a bit dated. Then I found a Microsoft Research pdf:

    http://amp.ece.cmu.edu/ECESeminar/slides/Whitted/F00_Whitted_slides.pdf

    it's an interesting read, but not 100% relevant. Anyone else have relevant info?
    • I don't remember where exactly I read it, but basically there was a statement that 3D acceleration was outpacing Moore's law by such a monumental amount that Moore's law was totally inapplicable. Maybe I can dig something up...

      ~GoRK
  • Yep. After all the fanboys started demanding games in 3D, and the game companies turned to supplying them, the effective graphic quality of computer games plummeted, and has only now maybe reached the beauty we had at the pinnacle of sprite-based games. Sure, you could only see one side of the monsters, etc., but they were good-looking monsters: none of these chunky triangular things that didn't even have fingers or toes and were plastered with dim-looking repetitive textures.

    3d is almost getting good enough that I can stand to look at it. But for a while there, it really made games look a lot worse, just for some undefined promise of realism that was never really satisfied until maybe recently - those early 3d games just looked unrealistic in different ways than the 2d ones had. It's like the gaming industry fired anyone with taste and just kept all the techs.

    OK, I think I'm done ranting now.

    • As an avid console gamer who stopped gaming for a couple of years after the PSX inherited the throne of console dominance from the Super NES, I understand _exactly_ what you're talking about. In my absence from the gaming world, I lamented the "death" of 2-D at the hands of ugly, boring, primitive 3-D graphics. Even within the SNES era, people raved over games like StarFox, a 3-D game which had very little appeal for me, but which for many was a vision of how games should look and play. Meanwhile, I foresaw that it would be many years before 3-D graphics would even approach the beauty of sprite-based graphics, and that's turned out to be true IMO.

      However, it should be noted that there are examples of games that utilize 3-D graphics while maintaining 2-D gameplay and feel to great effect. One example that comes to mind for no real reason is ThunderForce V, which is a great horizontal-scrolling shooter (aka shoot-em-up or "shmup [shmups.com]") that uses 3-D graphics which are small enough to be somewhat detailed. The kicker comes when encountering boss enemies, where the camera seamlessly zooms and rotates around the scenery from the standard side view, taking obvious advantage of the 3-D nature of the graphics.

      I now happen to enjoy a lot of games that have made the switch, in all sorts of genres. Some quick examples include Mario (platformer), Zelda (action RPG/platformer), Final Fantasy (RPG), Hundred Swords (SRPG), etc. Street Fighter EX in any incarnation will never be able to replace its 2-D progenitor for me, but in many other ways, I've come to tolerate 3-D graphics in games where its usage adds more to the gameplay than it detracts from the visual appeal.

      In this last regard, I think Nintendo's Legend of Zelda: Ocarina of Time was revolutionary for me (as it's the one game that brought me back to console gaming). It was full of rough graphical edges, but to be fair, its immediate 2-D predecessor on the Super NES used graphics that were small and fairly undetailed, and in comparison were less impressive overall at the time. Ocarina of Time is still beautiful, and 3-D graphics helped that game achieve incredible depth, as well as a fantastic sense of the sheer vastness of the game world.

      < tofuhead >

  • Interesting trend. (Score:3, Interesting)

    by Frag-A-Muffin ( 5490 ) on Tuesday March 19, 2002 @02:07PM (#3188641)
    The first thing that jumped out at me was this interesting trend I see on the chart.

    The companies with long red lines (representing the time it took them to ship after their announcements, i.e. HYPE) are all gone! :)

    The ones that kept a relatively consistent schedule are still around. Once again, a smart business plan wins, not super-hyped, unproven stuff [xbox.com]. ;) (Yeah, I know. I'm biased. Nintendo Forever! :)

    (On a side note, I wonder how long the line would've been for the xbox! :)
    • The XBox was announced in March, 2000. The Gamecube was announced in August of the same year. Both were launched in November last year. So the XBox line would have been a whopping 5 months longer :)
  • Note the downward trend in those product-life bars. They may not be entirely accurate, but note how the ATI and especially nVidia bars generally get shorter and shorter....

    Heh. Could this have any factor in their success?
  • I still remember the first 3D hardware, and I still have a PC running with a Matrox Mystique and a 3dfx Voodoo 1...

    I had been involved in demo coding for a while as a high school student, and we had managed to implement a software 3D engine which worked really well, even at higher resolutions. The most important thing is, we were having a lot of fun.

    Maybe I'm remembering wrong now, but I believe the first version of Quake came out without any kind of 3D acceleration (everything was done in software; they just wrote almost perfect code...).

    But one day, well, 3D hardware came out, and the whole thing wasn't fun any more. In the beginning it was very difficult for a single person to develop something decent using 3D hardware (because of a lack of good docs), while big companies started to produce lots of games using 3D acceleration that were very badly optimized.

    Well, I don't know. I still think that 3D acceleration took away a big part of the intellectual work, namely the optimization of game code. Of course there were, and still are, exceptions.
  • Someone mentioned Elite, etc. Yes, there were 3D graphics before there were dedicated processors on PCI/AGP cards for the purpose. Going by this ethos, shouldn't we also be celebrating the modularization of the sound and serial-line comms functions of the modern PC? Why is the birthday of the 3D card celebrated, and not that of the ISA/PCI/USB modem, or the sound card? Or perhaps Mac users should celebrate the day the monitor was split off from the case.

    Any processor intensive application will spawn modular add-ons to take some of the burden off the CPU. So long as the task itself, of course, is generic enough to have a sufficiently large market. Basic economics.

    By saying there was no proper 3D graphics before the advent of the accelerators, you are doing a great injustice to the demo [hornet.org] scene as it was back then. Remember the 256 byte competitions? The 1 kb and 4 kb competitions? Now here were people who knew how to milk code for every iota of juice that was there. The (almost) forgotten art of Code Optimisation.

    Heck, there were 3D graphics on my old Commodore 128; I still have Elite. And what do you call the original Battlezone? The only difference was that there wasn't any specialised add-on card on the market to do the task back then.

    I don't mean to disrespect current makers, researchers, coders, and gamers. I just think there's got to be many more significant birthdays to commemorate.

    How about a feature on the demo scene on slashdot? The younger crowd will appreciate the demos, and we'll get these funny comments from the war-torn 386 vets about how they used to make their own transistors out of sand...

    • And what do you call the original Battlezone? The only difference was that there wasn't any specialised add-on card on the market to do the task back then.
      FWIW, many Atari vector games included extra coprocessors called "mathboxes" for calculating 3D projections. Battlezone was one of the first. Admittedly, it had much more in common with modern FPUs than with modern 3D accelerators, but even back then people were dedicating silicon to 3D graphics.
  • OK, he's talking about mainstream gaming stuff...

    But there was some killer high-end stuff for the PC architecture!

    Anyone remember the Intergraph workstations? They had custom 3D hardware. In late '98 (or was it early '99?) we had an Intergraph with Wildcat graphics: a 16MB framebuffer and 64MB of texture memory (I may have it backwards). Highly accelerated, and killer. We used it to run ballistics and weather simulations.
  • Matrox isn't going away because of the following key features:

    1: The best multimonitor around
    They started it, they perfected it, and they can do different resolutions under Windows 2000 (they were the first to, if not the only ones).

    2: Excellent overlay characteristics
    Wanna use a TV tuner card at high resolutions? Ignore nVidia. In my experience, programs that use overlay really like Matrox's cards; with the DVDMax feature, which allows any overlay to be displayed on the secondary monitor, you can port DivX video out to the TV. Also, overlay works at much higher resolutions than on nVidia solutions. I don't want to turn my 19" down to 1024x768x16-bit just to watch a DVD; my 14" runs more than that.

    3: Acceptable 3D performance, exceptional 3D quality
    Although it's not the fastest card on the block, it will still play virtually all games at least acceptably. And when you are playing them, there are few artifacts and the textures are well drawn.

    4: 2D quality
    Although it's much overlooked, it's what most people stare at the majority of the time. Matrox makes their own boards, so they can keep tight control over the filtering components.

    I've used a couple of S3 cards (low end), Permedia 1 and 2 cards, a Riva128ZX, TNT, and TNT2, and Matrox MGA, G400 and G450 cards. So far I have to give props to Matrox for a product that matches my needs. Granted, my needs are different from most.
    (Triple monitors w/ a TV tuner and a lot of video player programs.)
    • Bravo. 2D really is better on Matrox cards, isn't it? Got my G200 back in 1999, my G400 MAX in 2000, and a G550 in 2001. I have a Radeon for those intensive games, but through an innovative solution (switching the monitor cable, hehe) I can still use my Matrox when I'm not playing games. DualHead is, of course, wonderful. I'm thinking of getting a G200 MMS (4-way), but I'd have to switch my nice 19" CRT for a TFT because of space constraints. Oh well.
    • As a current user of a Matrox G400 DualHead AGP 32 MB card, I can definitely say that the 2-D graphics quality of this card--like all Matrox cards--is second to none. Not even the latest ATI Radeon 8500 series comes close to the amazing sharpness of 2D graphics that the Matrox cards now offer. I've seen the output of the better quality cards that use the nVidia chipsets and they (on the average) don't even come close to the crisp display quality you get from Matrox cards.

      This is why I'm REALLY hoping that Matrox does make another stab at a high-end 3-D graphics card that can compete against the Radeon 8500 and GeForce4 Ti4200/Ti4600 series but still offer the unrivalled 2-D display quality Matrox is famous for. Using the modern 0.13 micron process to make the next-generation Matrox chip, they could easily offer industry-leading graphics acceleration and MPEG-2 decoding equal to that of the GeForce4 series. Such a card--even if it costs slightly more than the cards that use the GeForce4 Ti chipsets--would be instantly lapped up by gamers who want the clearest graphics display.
  • There are some serious omissions here.

    To ignore the early GLINT work from 3DLabs and not give them their own column in the table is a bit unfair.

    The Number Nine stuff is missing (no great loss).

    Other early work is missing too; for example, SGI's PC graphics card predates all of this by about 5 years.
  • So all these cards do high-speed polygon drawing plus fancy stuff.

    Has anyone tried to make a GPU for ray tracing? Good ray-traced scenes can be much better than the scenes drawn by polygon engines.
    Yes, it would mean a whole change of code for current software; D3D would have to change, or maybe have another API beside it, say DirectRay. But the rendering would really get better. Today's hardware should be able to handle the load. And it should scale well too: more GPUs equals more parallel rendering of pixels.

    Imagine a truly ray-traced virtual world. {shudder with anticipation}
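
    The primitive a hypothetical "DirectRay" part would spend its silicon on is easy enough to sketch. A minimal ray-sphere intersection in C, purely for illustration:

    /* Minimal ray-sphere intersection, the inner loop a ray tracing
     * chip would accelerate. dir is assumed to be normalized. */
    #include <math.h>

    typedef struct { float x, y, z; } vec3;

    static float dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* Distance along the ray to the nearest hit, or -1 on a miss. */
    static float ray_sphere(vec3 orig, vec3 dir, vec3 center, float radius)
    {
        vec3  oc = { orig.x - center.x, orig.y - center.y, orig.z - center.z };
        float b    = dot(oc, dir);
        float c    = dot(oc, oc) - radius * radius;
        float disc = b * b - c;          /* quadratic discriminant  */
        if (disc < 0.0f) return -1.0f;   /* ray misses the sphere   */
        float t = -b - sqrtf(disc);      /* nearer of the two roots */
        return (t >= 0.0f) ? t : -1.0f;
    }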

    • I fail to see how a ray tracing engine would produce significantly higher quality than today's best triangle rasterizers.

      (other than the vast decrease in memory bandwidth usage from a real scene description language rather than describing things in terms of triangles)

      Smoother silhouettes from using NURBS rather than triangles? Sure. But we have T&L; just increase the number of triangles in the scene.

      Different lighting effects? Pixel shaders can do that too.

      Refraction and reflection? Even Voodoos could fake those reasonably well with environment maps.

      Both approaches fail to render good-looking global illumination effects without faking them with textures or "ambient" lights.

      I say: add support for non-triangle scene descriptions, and back that up with hardware-accelerated meshing. Then the quest for the ultimate triangle rasterizer is complete.
  • by Animats ( 122034 ) on Tuesday March 19, 2002 @04:16PM (#3189563) Homepage
    The first real 3D hardware was the Evans and Sutherland Line Drawing System 1 [es.com], in 1969. This was the first graphics device with a hardware 4x4 matrix multiplier, the basic geometry engine component.

    I saw one once, at Case Tech, in 1969. About six racks worth of hardware. Nobody really knew what to do with it.
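
    That hardware 4x4 multiplier computed the operation every geometry pipeline since has been built around: a matrix times a homogeneous point. In C, for reference:

    /* The geometry-engine primitive: transform a homogeneous point
     * [x y z w] by a 4x4 matrix, as the LDS-1 wired up in hardware. */
    static void mat4_xform(const float m[4][4], const float in[4], float out[4])
    {
        int r;
        for (r = 0; r < 4; r++) {
            out[r] = m[r][0] * in[0] + m[r][1] * in[1]
                   + m[r][2] * in[2] + m[r][3] * in[3];
        }
    }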

  • My first "3D" graphic card was the Voodoo2. For a while I resisted buying a dedicated 3D card, that was until I saw glQuake ;)

    I have tried the ATi Rage 128, Voodoo2/3, nVidia GeForce 2 and nVidia GeForce MX. Without exception, all these cards perform better under Linux than under Windows.

    No, this is not a troll. I have got people to try linux after they have seen TacticalOps running on my slackware powered laptop :)

    The best example of this was running Return To Castle Wolfenstein on my GeForce 2. It played OK under Win2K (latest DirectX, drivers, etc.), but I had to run it at 800x600 for it to be playable. Running the same binaries under Transgaming's WineX, I could bump it up to 1280x1024 and get a better framerate than under Windows at 800x600!

    The best-supported cards I have owned have been the nVidia cards: regular driver releases available for both Windows and Linux from their web site. I challenge you to find another gfx card supplier that does the same!

    OK, so part of the drivers are binary-only. I say: so what? nVidia are good at maintaining them, they know the card best, and they seem happy to support us, so why moan?

    Now I'm waiting for my Geforce 4 Ti4600 to arrive... :)
  • Wasn't Mystery House on the Apple ][ the first 3D game of all time? It was published around 1980, I believe.

    PPA, the girl next door.
