The Last Days at 3dfx

sand writes "FiringSquad has a detailed account of what happened in the final days at 3dfx. Every 3dfx product that was released or upcoming is discussed by a former 3dfx employee with inside knowledge of what caused the product delays (including an employee who forgot to fly to Asia to pick up the first Voodoo5 chips). He also discusses money mismanagement and the STB merger. It's a very enlightening article for anyone who's interested in 3D graphics and what goes on inside these companies."
  • Re:Voodoo cards (Score:2, Informative)

    by Anonymous Coward on Thursday September 26, 2002 @09:35AM (#4335697)
    Voodoo II SLI completely creamed all the competition at the time. It took the TNT2 chipset before there was a serious competitor.
  • Re:Glide emulator? (Score:3, Informative)

    by mccalli ( 323026 ) on Thursday September 26, 2002 @10:00AM (#4335902) Homepage
    Doh! :-)

    OK, that search led me here [clara.net], where a good few are around.

    Sentinel Returns can live again...

    Cheers,
    Ian

  • by Anonymous Coward on Thursday September 26, 2002 @10:28AM (#4336114)
    Actually, they weren't the first makers of the modern soundcard. I think that distinction would go to Adlib [oldskool.org], not Creative Labs. Creative Labs' first showing of a soundcard was the incredibly mediocre Game Blaster. We also can't forget the Roland MT-32, incredible for its time, which still has kick-ass MIDI today.
  • Re:GLIDE (Score:2, Informative)

    by SScorpio ( 595836 ) on Thursday September 26, 2002 @10:29AM (#4336120)

    Sure, Direct3D is a closed-source API, but there is always OpenGL if you want to use only open-source APIs.

    The main problem with Glide was that it was created by one company, and only that company's products could support it.

    Yeah, Direct3D isn't open for anyone to change, but it is a standard that anyone can create a product to adhere to. Microsoft also seems to be very attuned to market demands and is keeping good relationships with both nVidia and ATI. These relationships allow Microsoft to know about and implement the newly desired features in Direct3D.

    These new features can be added to OpenGL via extensions; however, the extensions become proprietary, and you end up with different companies' extensions doing the same thing but being incompatible. At least with Direct3D this doesn't happen.
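
    For what it's worth, here's a minimal C sketch (my own illustration, not from the article) of what those incompatible extensions looked like in practice around 2002. The extension names are real; the renderer and fallback logic are just placeholders. A game that wanted per-pixel shading had to probe the extension string and maintain a separate code path per vendor.

        /* Pick a per-pixel shading path based on which vendor extension is present.
           Requires a current OpenGL context; link against the system GL library. */
        #include <GL/gl.h>
        #include <stdio.h>
        #include <string.h>

        /* Crude check: is `name` present in the space-separated extension string?
           (strstr can false-positive on prefixes, but it's the common idiom.) */
        static int has_extension(const char *name)
        {
            const char *ext = (const char *)glGetString(GL_EXTENSIONS);
            return ext != NULL && strstr(ext, name) != NULL;
        }

        void pick_pixel_shading_path(void)
        {
            if (has_extension("GL_NV_register_combiners"))
                printf("using the NVIDIA register combiner path\n");
            else if (has_extension("GL_ATI_fragment_shader"))
                printf("using the ATI fragment shader path\n");  /* same feature, different API */
            else
                printf("falling back to plain multitexturing\n");
        }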

  • by Forkenhoppen ( 16574 ) on Thursday September 26, 2002 @11:38AM (#4336706)
    Simple; Creative does it with Creative marketing.

    The Audigy, for instance, is little more than a gamer's card. Any serious review of the card that you come across on the internet will tell you this, or if you bought it hoping for some advanced features, you'll find it out for yourself.

    Here are some examples of this Creative marketing:

    - The Audigy does support 24-bit/96kHz sound playback, as advertised, but does not actually play it at that rate. The second it hits the main chip, it's downmixed to 16/44. So while you can send sound to it at the higher rate, you're not actually going to hear it at that quality. (This is what they mean when they plaster 24/96 all over the box art.)

    - The Audigy does not have independent recording and playback volume controls on the line-in. If you wish to record something from a TV tuner, for instance, you'll have to either listen to it while it records or turn off the global volume on your soundcard. (Or turn off the speakers.) This makes it impossible to use an Audigy in a PVR setup.

    - The much-touted sub-100dB SNR is only on playback. On recording, the noise is much higher.

    I haven't been this disappointed in a card since my SB 128 upgrade ran slower than my SB 64. (I suspect the 64 did the soundfonts in hardware; the 128 did them in software.) Looking at the new Audigy 2, it appears that they'll be offering the 24/96 functionality that was insinuated to be present in the original Audigy, but I don't think I'll bite. I think my next card will be a Hoontech.

    And, of course, this is all off-topic...
  • by travail_jgd ( 80602 ) on Thursday September 26, 2002 @12:32PM (#4337208)
    Sorry, but you're completely wrong.

    1. The Voodoo 3, 4, and 5 all had integrated 2D and 3D.

    2. If OEMs didn't like add-on cards, why did they sell them preinstalled? I was shopping online for my PC way back when, and Voodoo 1 (and eventually Voodoo 2) cards were offered as (overpriced) options, just like you can get NICs and CD-RWs as options now.

    3. The GeForce and Radeons weren't the main killers of 3dfx. The other contributing factors were:

    a. Technical limitations. The Voodoo 3 and 4 lines weren't much more than fast Banshees. My Voodoo 3 card has most of the same limitations as a Voodoo 1 (16-bit color, 256x256 textures) and almost no additional 3D features beyond higher screen resolutions.

    b. Marketing. The Voodoo 1 and 2 lines were always the fastest in benchmarks. NVidia's TNT line was slower (but had more stable framerates), and Matrox was known for picture quality. When the Voodoo 3/4 came out, 3dfx lost the speed crown, and started talking about "image quality".

    c. NVidia's 6-month release cycle. 3dfx couldn't keep up, and their "older" cards had an outdated feature-set. The GeForce was a big advance, but only in terms of fill-rate; there weren't any games (at that time) taking advantage of the new features. 3dfx lost a lot of goodwill among gamers and enthusiasts when they started pushing back release dates.

    d. Buying STB. I don't think that the purchase was the final nail in 3dfx's coffin, but it certainly didn't provide the desired benefits.
  • by Anonymous Coward on Thursday September 26, 2002 @01:05PM (#4337494)
    The Audigy does support 24bit/96kHz sound playback, as advertised, but does not actually play it at that. The second it hits the main chip, it's downmixed to 16/44. So while you can play sound at the higher frequency to it, you're not actually going to hear it. (This is what they mean when they plaster 24/96 all over the boxart.)
    It's actually worse than that. One of the Winamp developers (Brennan Underwood?) did an analysis of the Audigy's audio capabilities, and found that:
    1. The Audigy's driver rejects all audio streams above 16/48. It doesn't even downmix it for you -- it just rejects everything above that outright.
    2. The DACs are 24/96 capable, but the DSP doesn't seem to be. That's how they get away with advertising 24/96.
    3. You could, in theory, get 24/96, but only if you had a 24/96 digital source and output it directly to the SPDIF port, bypassing the DSP entirely.
    4. Creative is full of shit.
    Although the last one is not much of a surprise to people who dealt with the SB Live! fiasco. (The SB Live!, due to not being 100% PCI compliant, couldn't share IRQs correctly; but ACPI requires that all PCI devices share the same IRQ, so if you had an ACPI-compliant board and OS, you were screwed. Creative's tech support blamed the motherboards, telling people that their boards were unsupported and that they should build new computers with motherboards that didn't enable ACPI's IRQ-sharing feature.)
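
    If you want to check what your own card's driver accepts, here's a rough C sketch (mine, not from the Winamp analysis) that asks the Windows wave driver whether it will take a given PCM format, using waveOutOpen's WAVE_FORMAT_QUERY flag. A serious 24-bit test would normally go through WAVE_FORMAT_EXTENSIBLE rather than plain PCM; this is only meant to show the idea.

        /* Probe the default wave-out device for format support (link with winmm.lib). */
        #include <windows.h>
        #include <mmsystem.h>
        #include <stdio.h>

        static int format_supported(DWORD rate, WORD bits, WORD channels)
        {
            WAVEFORMATEX fmt;
            fmt.wFormatTag      = WAVE_FORMAT_PCM;
            fmt.nChannels       = channels;
            fmt.nSamplesPerSec  = rate;
            fmt.wBitsPerSample  = bits;
            fmt.nBlockAlign     = (WORD)(channels * bits / 8);
            fmt.nAvgBytesPerSec = rate * fmt.nBlockAlign;
            fmt.cbSize          = 0;
            /* WAVE_FORMAT_QUERY asks "would this open?" without opening the device. */
            return waveOutOpen(NULL, WAVE_MAPPER, &fmt, 0, 0, WAVE_FORMAT_QUERY)
                   == MMSYSERR_NOERROR;
        }

        int main(void)
        {
            printf("16/44.1: %s\n", format_supported(44100, 16, 2) ? "accepted" : "rejected");
            printf("24/96:   %s\n", format_supported(96000, 24, 2) ? "accepted" : "rejected");
            return 0;
        }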
  • by xmnemonic ( 603000 ) <xmnemonic@@@softhome...net> on Thursday September 26, 2002 @04:31PM (#4339269) Journal
    There was also...
    -the first use of an accumulation buffer (the "T-buffer") on a consumer video card, creating the anti-aliasing craze that we have today (sketched below)
    -a very fast memory architecture, courtesy of the Gigapixel subsidiary, said to have influenced the creation of LMA in the Geforce cards
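
    For anyone who never ran into it, the effect the T-buffer accelerated is basically the old accumulation-buffer antialiasing trick: render the scene several times with a sub-pixel jitter and average the passes. Here's a minimal OpenGL sketch in C (my own illustration; set_jittered_view and draw_scene stand in for the application's own code) of the multi-pass version the T-buffer effectively did per frame in hardware.

        /* Classic accumulation-buffer full-scene antialiasing.
           The context's pixel format must include an accumulation buffer. */
        #include <GL/gl.h>

        #define PASSES 4

        /* Sub-pixel jitter offsets, in pixels. */
        static const float jitter[PASSES][2] = {
            { 0.25f, 0.25f }, { 0.75f, 0.25f }, { 0.25f, 0.75f }, { 0.75f, 0.75f }
        };

        extern void set_jittered_view(float dx, float dy);  /* shifts the projection by a sub-pixel offset */
        extern void draw_scene(void);                        /* the application's normal rendering */

        void draw_antialiased_frame(void)
        {
            int i;
            glClear(GL_ACCUM_BUFFER_BIT);
            for (i = 0; i < PASSES; i++) {
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                set_jittered_view(jitter[i][0], jitter[i][1]);
                draw_scene();
                glAccum(GL_ACCUM, 1.0f / PASSES);   /* add 1/PASSES of this pass */
            }
            glAccum(GL_RETURN, 1.0f);               /* write the averaged image back */
        }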

    Don't forget the Rampage (a Geforce 3 killer, taped out days before 3dfx was bought by nvidia; some pictures of it in a lab are floating around on the net), which would have had some features that are only now being explored, such as:

    -the ability to accelerate Photoshop filters (a potential of 3dlabs' new "P10" architecture)
    -a maximum memory capacity of 256MB
    -4-way onboard SLI, i.e. a scalable multi-chip architecture
    -~12GB/s of memory bandwidth, compared to the Geforce 3's ~7GB/s
