Video Card History
John Mathers writes "While searching the net for information on old Voodoo video cards, I came across this fabulous Video Card History article up at FastSilicon.com. It's nice to see that someone has taken the time to look back on the video cards that revolutionized the personal computer. Here's a quote: 'When 3dfx released their first card in October of 1996, it hit the computer world like a right cross to the face. That card was called Voodoo. This new card held the door open for video game development. The Voodoo card was simply a 3D graphics accelerator, which was plugged into a regular 2D video card; known as piggy backing.'"
Well, sort of. (Score:5, Informative)
This isn't entirely correct, as any Voodoo 1 user could tell you. The card took up its own slot and used a pass-through video cable to connect to the monitor: when a Voodoo-compatible video signal was detected, the card hijacked the output to the monitor and took over.
Nice design for its day. Best of all, it was CHEAP considering the performance; I think I paid $199.
M-
An interesting tidbit. (Score:5, Informative)
nVidia's pioneering RIVA 128 was the first chipset that could compete in 3D against the vaunted Voodoo cards of that period; once nVidia unveiled the groundbreaking TNT chipset, it was pretty much over for Voodoo's separate-board approach. This is what spurred ATI into developing the Rage Pro and later the Rage 128 chipsets in the late 1990s, starting the competition between ATI and nVidia that has lasted to this day.
Here's the text. (Score:1, Informative)
In all races, there must be competitors. ATI (a company that has been around for twice as long as NVIDIA or 3dfx) and NVIDIA had cards out shortly after that to compete with 3dfx, named the Rage and the RIVA 128. This was long before they both took the 3D giant 3dfx completely out of the race, though; at the time they were just tiny blips on 3dfx's radar. To counter the new competition, 3dfx released the Voodoo2 in March of 1998. It was a vast improvement over the Voodoo, with a 90 MHz core clock and a whopping 12 MB of video memory. The Voodoo2 could produce resolutions up to 1024x768 and had a blistering 3.6 GB/s of memory bandwidth - top of the line back then. The Voodoo Banshee came out after the Voodoo2 and, like the Voodoo Rush, was a waste of money due to performance issues. Incidentally, the Voodoo2 was still a piggy-backer; 3dfx did not drop that method of 3D graphics card integration until later.
In March of 1999, 3dfx came out with the Voodoo3. This time the line was split into different tiers to cover different consumer needs (sound familiar?). The Voodoo3 2000 was the low-end budget card, offering a 143 MHz core speed. On the next rung was the Voodoo3 3000, with a 166 MHz clock. At the top was the 3500, which featured a TV-out port and a 183 MHz clock speed. All of these cards were offered in PCI and AGP versions (a new concept, also shared with an ATI card called the 3D Rage Pro).
Like many underdogs, the competing companies started catching up to the hardware giant. NVIDIA released a card around the same time as the Voodoo3, the TNT2. The successor to the TNT, it upped the ante from 8 million to 10.5 million transistors - a huge jump in complexity - and added 32-bit color support and digital flat-panel support. The Voodoo3 barely beat the TNT2 in raw FPS, but the TNT2 had much higher visual quality, so people started checking out the competition. It didn't cripple 3dfx, but it let them know that they had better have something groundbreaking in their next release. ATI, possibly the cleverest (or maybe luckiest) of the three companies, was content to sit in the corner and watch NVIDIA and 3dfx battle it out. ATI still released new cards - not spectacular, but by no means horrible - just enough to keep them in the race. ATI's strategy seemed to be to lie in wait for its moment to strike, which wouldn't come until later.
In October of 1999, NVIDIA dealt the final blow to the 3D giant with the introduction of the GeForce 256. 3dfx didn't have anything to combat the new card with, so they took the blow right to the face (we saw the same situation happen to NVIDIA later on). The revolutionary GeForce 256 brought much to the table, including four pixel pipelines at 120 MHz and DDR RAM support, along with many other new features. 3dfx had two highly anticipated cards in the works, but they were delayed long past the original schedule (sound familiar?). When the Voodoo4 and 5 finally did come out they were well received, but far too late to do any damage to NVIDIA. Basically, 3dfx just added more GPUs and more RAM to beef up the new cards - fine and dandy, but it made the cards about twice as big as the previous models.
Some egregious errors here... (Score:5, Informative)
Erm. That's not even enough to fill in a single horizontal bar of the screen (unless you're running in 320*240 resolution). Perhaps they meant megapixels? This was hardly the only such error that I noticed, though - these guys really need to have someone proofread their articles.
Forgotten cards (Score:1, Informative)
He also slights 3dfx a bit. The Voodoo2 was huge; although I had a TNT, everyone I knew was running a Voodoo2.
Re:Revisionist History? (Score:5, Informative)
Any article that tries to encapsulate the history of 3D cards but fails to mention the Verite cards is a piss-poor article right from the get-go.
Want to read more about older video cards? (Score:5, Informative)
Check out Tom's Hardware Guide [tomshardware.com]
http://www20.tomshardware.com/graphic/1997.html [tomshardware.com]
http://www20.tomshardware.com/graphic/1998.html [tomshardware.com]
http://www20.tomshardware.com/graphic/1999.html [tomshardware.com]
http://www20.tomshardware.com/graphic/2000.html [tomshardware.com]
http://www20.tomshardware.com/graphic/2001.html [tomshardware.com]
Re:When will it go back to the CPU? (Score:2, Informative)
Re:Well, sort of. (Score:4, Informative)
At the ISP where I worked I had two Voodoo 2 cards, which, on a lowly PII-350, ran Unreal Tournament with full detail in 1024x768 at a massive framerate!
Re:Ah... those were the days :-) (Score:3, Informative)
Re:Revisionist History? (Score:5, Informative)
And absolutely no mention of Matrox whatsoever... despite the fact that their add-on 3D accelerator was arguably superior to the Voodoo, and the Parhelia is the ONLY 3D solution to support three display devices.
gaming hardware in servers? (Score:3, Informative)
This card was massive and would never have been used in a server.
Not a complete history by any means... (Score:3, Informative)
Re:Well, sort of. (Score:3, Informative)
The Voodoo2 cards started at around $250 and $350 for the 8MB and 12MB models, respectively. The only way to get the 1024x768 mentioned in the article was to run two 12MB cards in SLI mode (which meant connecting the two V2 cards with a small ribbon cable inside the case). Additionally, the pass-through cables that came with most V2 cards caused some degradation of the signal going to the monitor, so the graphics tended to be a bit dark, but that was easily fixed by buying a better cable.
The performance was definitely solid, though. My V2 cards originally passed through the 2D signal of a Riva 128, then a TNT; a TNT2 Ultra was the card that finally made me pull the V2s out (not to mention that the V2s I owned had no fans on the boards/chips, which meant that one of them burned up within about six months).
The combination of the lack of real OpenGL support, the lack of 32-bit colour, and the speed of the TNT2 Ultra was what finally put 3dfx to bed. The Voodoo3 couldn't keep up, and the Voodoo4 was delayed far too long while 3dfx kept insisting that raw framerates mattered more than features, and that no one could see the difference between 24-bit (the V3 supposedly output 24-bit colour through some tricks) and 32-bit colour anyway. Quake 3 proved them wrong quite quickly, as anyone could show with a few screenshots at the time.
Re:Confusion with later Voodoo cards? (Score:3, Informative)
They are, in the chip itself, sorta. Modern all-in-one GPUs have multiple texture pipelines, which splits some of the load at the silicon level. It's not SLI, but it's the same concept.
The problem is that SLI only doubles the fillrate; both 3D chipsets need to work on the exact same dataset. SLI was a great boost back when 3D hardware was slow at texturing. These days the hardware can pump out a couple thousand frames per second worth of textures; it's fancy multipass rendering and dynamic shaders (and to some extent the geometry) that take up all of the frame generation time. SLI could speed some of this up, but it wouldn't help with most of the bottlenecks. It would be like putting new tires on a car that needs an engine tuneup.
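A toy cost model makes the point. In scanline-interleave SLI, every chip still processes the full geometry; only the per-scanline fill work is split. The numbers below are invented for illustration, not real Voodoo2 timings:

```python
# Toy model of 3dfx-style scanline-interleave SLI (illustrative only).
# Both chips must process the full dataset; only per-scanline fill
# work is split, so geometry/shader-bound scenes see little speedup.

def frame_time(geometry_ms, fill_ms_per_line, lines, chips=1):
    geometry = geometry_ms                      # duplicated on every chip
    fill = fill_ms_per_line * lines / chips     # each chip fills every Nth line
    return geometry + fill

# Fill-bound 1996-era workload: SLI nearly doubles throughput.
single = frame_time(2.0, 0.05, 480, chips=1)    # about 26 ms
sli    = frame_time(2.0, 0.05, 480, chips=2)    # about 14 ms

# Shader/geometry-heavy modern workload: barely any gain.
single_m = frame_time(12.0, 0.005, 480, chips=1)  # about 14.4 ms
sli_m    = frame_time(12.0, 0.005, 480, chips=2)  # about 13.2 ms
```

With the invented modern numbers, adding a second chip shaves barely a millisecond: the new-tires-on-an-untuned-engine effect.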
Re:Revisionist History? (Score:4, Informative)
http://www.accelenation.com/?ac.id.123.1
Re:Quake 1 without Voodoo? (Score:2, Informative)
Re:Bullshit (Score:5, Informative)
Not accurate. (Score:2, Informative)
This text has some flaws... NVIDIA didn't buy 3dfx or its assets. It won them in a lawsuit with 3dfx.
-B
Re:Revisionist History? (Score:5, Informative)
And no mention of that company whatsoever.
But hey, what can you expect from a (probable) fps kiddie biased against Matrox, among others - because if he were JUST an fps kiddie (and not anti-Matrox), he'd have mentioned that for roughly half a year in '99 Matrox was the leader in BOTH performance and quality... too bad only the second has held true since then.
I'm happy with ATI & Linux (Score:3, Informative)
ATI generally releases a new WHQL Windows driver about once a month and a new Linux driver about every six weeks. I've had no problems with their XFree86 4.3 driver. They don't have a FreeBSD driver, though; I guess a PowerBook would give somewhat the same experience (a BSD-based OS, an XFree86-based X environment, a Radeon 9600, plus Quartz/DisplayPDF and access to Mac apps). Mac OS X also has the ATI (and nVIDIA) drivers built in, and they are updated through the software update utility.
ATI's Windows drivers are officially updated once in a while, and are generally rock solid, but there are occasionally problems that aren't resolved for months at a time.
Re:XGL? (Score:2, Informative)
This isn't quite true. Most of the desktop rendering is still done by the CPU, the same way it was done before QE was added to OS X. It's only the individual windows that are composited by QE, and OpenGL handles the 'surface' handed to it by Quartz and QE. So OpenGL mostly comes in to handle effects (like Expose, the fast-user-switching animation, and the opening/closing animations) and shadows; QE handles the windows and passes the textures to OpenGL when an effect is needed; and the CPU still does its thing for the desktop (until you hand the desktop off as a surface to OpenGL for the user-switching animation).
In other words, things that were basically eye-candy and were really slowing OS X down quite badly before QE are now handled by QE, but the base 2D engine that utilizes the CPU is still working the same way it does with most other operating systems.
That being said, free eye candy is free eye candy, and although there are many people out there that prefer things to be stripped down, I'd rather have something with a little flash that can be done just as quickly.
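The division of labor described above can be sketched in a few lines. This is not Apple's API - every class and method name here is invented - just a minimal model of "CPU draws window contents, GPU-side compositor only blends finished buffers":

```python
# Hypothetical sketch of a QE-style split (all names invented):
# the CPU rasterizes each window's contents into a backing buffer;
# a compositor blends those finished buffers, never drawing content itself.

class Window:
    def __init__(self, name):
        self.name = name
        self.buffer = None            # backing store, filled by the CPU

    def draw_cpu(self):
        # Software 2D rendering: still CPU work, per window.
        self.buffer = f"pixels({self.name})"

class Compositor:
    """Stand-in for Quartz Extreme: composites textures, draws nothing."""
    def composite(self, windows, effect=None):
        layers = [w.buffer for w in windows if w.buffer is not None]
        if effect:                    # e.g. an Expose-style GPU transform
            layers = [f"{effect}({layer})" for layer in layers]
        return " over ".join(layers)

wins = [Window("terminal"), Window("browser")]
for w in wins:
    w.draw_cpu()                      # content rendering stays on the CPU
screen = Compositor().composite(wins)
```

The key property the sketch captures: making a window's contents is unchanged CPU work, while effects touch only the already-rendered buffers.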
Re:Quake 1 without Voodoo? (Score:2, Informative)
parallel graphics pipelines (Score:4, Informative)
To do 60 frames per second, you have roughly 16 ms to generate a frame. A couple of those milliseconds will be gobbled up by I/O transactions and various wait states, so you're already at the point where double the power only yields about 1.75x the performance. It will also be highly dependent on how well the 3D code can be parallelized (are there a lot of read_pixels callbacks that require both GPUs and both banks of memory to talk to each other? etc.).
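That "1.75x, not 2x" figure is just Amdahl's law applied to the frame budget; here's the arithmetic with an assumed 2 ms of serial overhead per frame:

```python
# Amdahl-style estimate: a fixed serial slice (I/O, wait states)
# caps the speedup from doubling GPU power.

def speedup(frame_ms, serial_ms, gpus):
    parallel = frame_ms - serial_ms          # the part extra GPUs can share
    return frame_ms / (serial_ms + parallel / gpus)

budget = 1000 / 60                           # ~16.7 ms per frame at 60 fps
two_gpu = speedup(budget, 2.0, 2)            # roughly 1.8x, not 2x
```

Push the serial slice higher (more readbacks, more synchronization) and the payoff from the second GPU shrinks further.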
This has actually been done by SGI for a while now. A couple of years ago they took their Origin 3000 architecture, stuck on dozens of V12 GPUs, and tiled the graphics for higher performance. That concept has been refined in their Onyx4 systems: one large single computer with up to 34 ATI FireGL X1 GPUs. Sixteen GPUs work on each display in a 4x4 grid: each GPU generates its own 400x300 piece, and the 16 pieces are composited in real time to make up a 1600x1200 display. I believe the biggest such machine to date has 32 GPUs powering two zippy-fast 1600x1200 displays and 2 GPUs driving an additional 4 lesser-powered displays. SGI gets quite a speedup by doing it that way, with 16 GPUs per display, but there's also a lot of overhead (even more in SGI's case, with 34 AGP 8X busses in said system). Their implementations of OpenGL and OpenGL Performer are tweaked for this, though.
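The tiling itself is simple to picture: the screen is carved into a fixed grid and each GPU owns one rectangle. A quick sketch of the 4x4 split described above (the compositing hardware is of course the hard part, not this bookkeeping):

```python
# Sort-first tiling as described: a 1600x1200 display split into a
# 4x4 grid, one 400x300 tile per GPU, composited every frame.

GRID = 4
W, H = 1600, 1200
TW, TH = W // GRID, H // GRID        # 400 x 300 per GPU

def tiles():
    for row in range(GRID):
        for col in range(GRID):
            yield (col * TW, row * TH, TW, TH)   # x, y, width, height

regions = list(tiles())              # 16 regions, one per GPU
```

Each GPU renders only the geometry that lands in its rectangle, which is where the speedup comes from - and also where the overhead comes from, since geometry straddling tile edges gets processed more than once.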
So yeah, it can be done, but the fact that the GPUs spend a significant amount of time on non-rendering tasks (I/O, waiting for data, copying the final result to the frame buffer, etc.) means you won't see nice linear scaling. The cost of custom hardware and custom drivers also adds up. With top-end PC 3D accelerators already costing $400, I can't picture many users shelling out $1000+ for a dual-GPU card.
DirectFB - OpenGL? (Score:2, Informative)
Please Mod Up - Fastsilicon.com Response (Score:5, Informative)
-JT
Re:Ah... those were the days :-) (Score:2, Informative)
Re:Only 1996 to the Present (Score:3, Informative)
A rough sequence of the video standards would be:
1981: MDA, Hercules, CGA (IBM)
1983: PCjr (IBM)
1984: EGA (IBM)
1986: TIGA (Texas Instruments), 8514/A (IBM)
1987: MCGA, VGA (IBM)
1990: XGA, XGA-2 (IBM)
1990+: SVGA, XGA, SXGA, UXGA (various manufacturers)
1990+: VESA (manufacturers form consortium)
The early 1990s were probably the time of greatest change, when all the manufacturers were trying to outdo each other on resolution, refresh rate, video memory size, and finally 2D acceleration. The resulting chaos and incompatibility between cards led to the formation of the VESA consortium.
The mid-1990s were when many of the innovators got started:
3Dfx was founded in 1994 (out of MediaVision),
NVidia introduced the NV-1 in May 1995,
3Dfx introduced the Voodoo card in October 1996,
ATI had been around since 1985 but didn't introduce the 3D Rage until September 1996 (and with no Z-buffer),
Matrox introduced the m3D in 1997 (piggybacking onto a VGA card), and
Rendition introduced one of the first MiniGL drivers in 1998.
By 1999, each company was releasing new graphics cards every six months.
Any good gaming web site will give you the product history of each manufacturer.