Graphics Software

Rumors of a GeForceFX 5800 Ultra Cancelation? 342

chris_oat writes "It seems that nVidia's GeForceFX 5800 Ultra may never see the light of day after months of super-hype and annoying delays. This article on megarad.com suggests that poor manufacturing yields are causing nVidia to rethink plans for its (new?) flagship part. Lack of an "Ultra" type solution from nVidia would leave ATI's Radeon9700 uncontested as the de facto performance part."
This discussion has been archived. No new comments can be posted.

  • by RobertTaylor ( 444958 ) <roberttaylor1234.gmail@com> on Sunday February 09, 2003 @01:50PM (#5265489) Homepage Journal
    Glad to see that the difference between 2467 and 2550 frames per second is still very important...

    fp?
    • Important? (Score:2, Insightful)

      Of course it's important.

      I'm sick && tired of reading people say "oh, well, the human eye only sees 30 fps, so anything else is overkill".

      That's a bunch of baloney (pardon my language). People want *clarity* and *SMOOTHNESS* in their gaming performance, and although 30 fps delivers clarity from frame to frame, the transition between frames only achieves good smoothness above 60 fps.

      Most Linux apps aim for >= 60 fps. Go check out SourceForge for more details.
      • by Patrick May ( 305709 ) on Sunday February 09, 2003 @02:03PM (#5265579)
        I agree that it's important. When I'm MUDding, I need to see that next line of text come up instantly! Otherwise, the next thing I see will be You are looking down at your body from above....
      • Re:Important? (Score:5, Insightful)

        by Jugalator ( 259273 ) on Sunday February 09, 2003 @02:06PM (#5265598) Journal
        It's said that you stop seeing the difference at around 30 fps, but so far I've only heard comparisons with movies and TV. Other "rules" might be in effect when you display the same graphics on a screen much sharper than a movie, where each frame transitions cleanly to the next without any blur in between, unlike in a movie (at least I assume the transitions between frames on a movie screen aren't as defined as on a monitor).

        It would be interesting to know if 30 fps *is* enough (as a *minimum*, of course) or if monitors need an even higher frame rate for humans not to see the transitions.

        What I'm more annoyed about is those who must run games in 1600x1200 rather than 1024x768 on a typical 19" monitor, and then complain that a gfx card sucks since it doesn't perform well enough in 1600x1200. It's not like you have enough time to spot the microscopic pixels anyway. :-) And then there's FSAA to remove the pixelation even more.
        • Re:Important? (Score:5, Informative)

          by Camulus ( 578128 ) on Sunday February 09, 2003 @02:30PM (#5265758) Journal
          Just a couple of things to add. For me at least, 30 to 40 fps is a minimum requirement; that is the lowest it should possibly go. If it gets below 30 fps it starts turning into a slide show.

          Also, aside from just the visual effects, more powerful hardware literally gives you better performance in game. For example, the Quake 3 engine: you can jump much further at 150 fps than you can at 30 fps. The way it was coded, when you jump the game checks on a frame-by-frame basis where the jump is going. I think it was designed with a baseline of around 90 fps, if I remember right, which means that if you are running under that your jumps will be shorter, and over it, longer. Also, in Return to Castle Wolfenstein, if your fps ever drops below 30, your rate of fire actually slows down. So, just FYI, fps can mean more than simply "how pretty" it looks.
          • Re:Important? (Score:2, Insightful)

            by CircaX ( 318527 )
            If a higher framerate allows a player to jump farther, run faster, or shoot more rounds per second in a game, it means that the programmers have no clue as to how to properly implement a physics engine. Frame rate ought to be completely independent of any other function the game engine has to handle; a player should jump the same distance no matter how well the graphics card can keep up with the game's world environment. Having a physics engine be dependent on the current framerate shows a flaw in the game's design, and it is just one more reason to stop using the sorely outdated Q3 engine to benchmark new hardware.

            Just imagine if this 'physics tied to framerate' applied to connection speed: people with Radeon 9700s would have gigabits of bandwidth to play around with, while people stuck with a RagePro would have to deal with 28.8K rates.
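            To make the frame-rate-independence point concrete, here is a minimal, purely illustrative sketch in Python (nothing here is taken from Quake 3 or any real engine; the constants and function name are made up) of the usual fix: scale every physics step by the elapsed time, so the simulated jump comes out essentially the same whether the renderer manages 30 or 150 fps.

```python
# Hypothetical illustration of frame-rate-independent physics: every update
# is scaled by the elapsed time (dt), so the result barely depends on fps.
GRAVITY = -9.8      # m/s^2, arbitrary illustrative value
JUMP_SPEED = 5.0    # m/s, arbitrary illustrative value


def jump_apex(fps):
    """Integrate a simple jump at a given frame rate and return its apex."""
    dt = 1.0 / fps
    y, vy, peak = 0.0, JUMP_SPEED, 0.0
    while y >= 0.0:
        vy += GRAVITY * dt   # acceleration scaled by elapsed time
        y += vy * dt         # velocity scaled by elapsed time
        peak = max(peak, y)
    return peak


if __name__ == "__main__":
    for fps in (30, 60, 150):
        print(f"{fps:3d} fps -> apex {jump_apex(fps):.3f} m")
```

            Even this naive integrator drifts slightly with the step size, which is exactly the kind of subtlety that can turn into a frame-rate-dependent jump bug; a fixed simulation timestep decoupled from the render rate removes it entirely.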
            • Re:Important? (Score:4, Informative)

              by error0x100 ( 516413 ) on Sunday February 09, 2003 @07:06PM (#5267360)

              Having a physics engine be dependent on the current framerate shows a flaw in the game's design, and it is just one more reason to stop using the sorely outdated Q3 engine to benchmark new hardware.

              Just to clarify, again .. this WAS a bug in Quake3Arena. However, it WAS NOT a bug in the "Quake 3 engine". The "Quake 3 game" is separate from (and built on top of) the "Quake 3 engine". The engine is the basic graphics and networking system, whose source code is NOT available, while the Quake 3 game itself was built essentially as the "default mod" on top of that engine, and its source code is available.

              The slightly-frame-rate-dependent jumping in Quake3 was a bug in the game code, and ONLY affected the jumping. The bug was fixed in one of the Quake3 patches. The game was intended to be designed so the physics were NOT frame-rate dependent. As you said, this would be a major flaw in a game design.

              If the physics in a game were frame-rate dependent, you would see a HUGE difference in physics performance between 30, 60 and 90 fps. These sorts of rates affect (badly designed) game physics in a big way - you would notice it quickly. No major commercial game intentionally has such flaws.

          • Re:Important? (Score:5, Informative)

            by error0x100 ( 516413 ) on Sunday February 09, 2003 @06:57PM (#5267319)

            Just to clarify, Quake3Arena wasn't specifically coded to do this; it was actually a bug, and it only affected the jumping physics. Nothing else in the game was affected (it was not intentional behaviour; in fact the game was specifically designed to try NOT to have the physics depend on the frame rate). You could jump a little bit higher, and in some maps this gave a big advantage, e.g. DM13, since you could take a shortcut to the megahealth. The bug was fixed in one of the last patches (I think they made it optional though).

            The jumping performance also wasn't proportional to the frame rate, the bug occurred around specific frame rates, such as 120 fps.

        • Re:Important? (Score:5, Interesting)

          by ergo98 ( 9391 ) on Sunday February 09, 2003 @02:33PM (#5265770) Homepage Journal
          It's said that you stop seeing the difference at around 30 fps

          And 16 million colours is more than the eye can see, and 44,100 samples per second is more than the ear can hear. Throughout the march of technology we've heard these ridiculously arbitrary "limits" of our senses, and invariably they are discounted at a future time. In essence you can consider them a sort of justification.

          but so far I've only heard comparisons with movies and TV

          Actually I've been paying attention at movies, having heard the "well, movies are 24fps and they look perfect" line: MOVIES LOOK LIKE TRASH. Seriously, the next time you go to the movies pay close attention to any large movements on the screen and you'll be surprised how horrendous 24fps really is. For instance, my wife recently dragged me to see "Two Weeks Notice" and there is a scene where the camera pans laterally across a shelf full of shoes at a rate of about a screen width per 1/2 second -- it looks absolutely atrocious. For fast action most filmmakers either resort to the action taking a small portion of the screen, or they use slow-motion effects, again because the action simply looks terrible at 24fps.

          However when you get down to it the root of the "X FPS is more than anyone can see" is people's astoundingly self-centered claims that no-one else can see more than 30fps, or some other metric. This can be disproved instantly via the Q3 command cg_maxfps. Set it to 30 and it looks like a horrendous slideshow. Set it to 45 and it looks like a 1998 computer. Set it to 80 and it feels smooth with a bit of jaggedness. Set it to 90 and it feels nice. You'd think this would disprove the 30fps'ers in an instant, but amazingly they persist.

          and then complain that a gfx card sucks since it doesn't perform good enough in 1600x1200. It's not like you have enough time to spot the microscopic pixels anyway. :-) And then there's FSAA to remove the pixelation even more.

          1600x1200 on a 19" monitor is hardly "microscopic" pixels. However, to consider this in a forward-thinking manner, think of the heavyweight video card required to do 1080p resolutions on an HDTV set: 1920x1080.

          FSAA, BTW, is tremendously difficult for video-cards to do (because they're actually rendering at 2x or greater resolutions): There is no current video card that could dream of doing even Urban Terror (a Q3 mod) at 1600x1200 with FSAA at acceptable frame-rates.
          • Re:Important? (Score:2, Insightful)

            by Jugalator ( 259273 )
            Actually I've been paying attention at movies having heard the "well movies are 24fps and they look perfect": MOVIES LOOK LIKE TRASH.

            Yes, I know what you're saying and have noticed it myself, although I'm sure it doesn't look as bad as 24 fps on a monitor would. Again, perhaps the only reason movies are watchable at all is that the blurriness at the frame transitions makes it easier for the brain to interpolate the missing information. Yes, movies look kinda jerky to me, but at least I tend to forget about it after a short while once I get into the movie's story line. I think I'd have a harder time with a monitor at 24 fps.

            I didn't know that Q3 had such a setting and if it properly fixes the frame rate it might be a decent tool to see the actual "when-you-don't-notice-the-difference" rate, although I'm sure it's individual.

            1600x1200 on a 19" monitor is hardly "microscopic" pixels

            Wow, I'd like to have your eye sight. :-)

            I use 1280x1024 on my 19" usually and even then the pixels are pretty small to me. :-) Sure, they are noticeable on a static display, but I wouldn't notice them if they changed at a rate of something like 70 fps. But that's just me of course. I guess ATi and the likes fear me since I don't need a 1600x1200 res on 19" to not get disturbed by the graphics. :-)
            • 1600x1200 on a 19" monitor is hardly "microscopic" pixels

              Wow, I'd like to have your eye sight. :-)

              I use 1280x1024 on my 19" usually and even then the pixels are pretty small to me. :-) Sure, they are noticeable on a static display, but I wouldn't notice them if they changed at a rate of something like 70 fps.


              In first-person shooters, you're typically looking for small visual details in known locations (when you're not just in a twitch-reflex situation). In Tribes 2, at least, it's nice to be able to spot an enemy without having to pick out the one off-colour pixel in a grainy mountainside texture map, and even better to see what kind of gun he's holding, or that he's repairing something.

              Features like zooming help you with the latter case but not the former (noticing the enemy in the first place).

              While high-resolution displays aren't vital, they definitely are helpful.
            • Re:Important? (Score:3, Informative)

              by JebusIsLord ( 566856 )
              Actually, the reason 60 Hz is hard to view on a monitor while 24 fps on a movie screen is fine is that the entire movie frame is drawn at once, while each of those 60 Hz refreshes consists of a tiny dot drawing out the whole screen. Flicker is therefore much more pronounced on a TV or monitor. LCDs, by contrast, don't do line-drawing and therefore look smashing at 60 Hz.
            • Re:Important? (Score:2, Informative)

              I'm sure it doesn't look as bad as 24fps on a monitor. Again, perhaps the only reason movies are watchable at all is that the bluriness at the frame transitions might make it easier for the brain to "add in" the extra information to interpolate

              Actually, one of the reasons why movies are so horribly jerky is that the actual refresh rate is 48 fps, even though the frame rate is 24 fps. Each frame is projected twice. The reason for this is to reduce flickering and to protect the film (projector lights are HOT). Unfortunately, this double exposure messes up the brain's visual prediction system, much in the same way a 30 fps game on a 60 Hz screen does, only more so. Since there is a tangible delay between capturing an image at the optic nerve and feeding it to the brain, a lot of prediction is carried out to anticipate what things are going to look like when you receive the visual stimuli.

              I agree that even a monitor at 48Hz would look worse than a movie theatre, but I expect this has something to do with the relatively low contrast movie screens have. A darker image takes longer to "see" than a bright one, not unlike how a photographer needs a longer exposure to take a picture in a dark environment.

              Ever seen "3d-glasses" that have one dark glass and one perfectly transparent, instead of the normal red and blue/green? Those work on that principle, and the effect is best when the camera rotates clockwise around an object or pans across a landscape from right to left. If you reverse the direction, the 3D-effect is also reversed.

              But I digress:

              My point is that human vision is incredibly advanced, with a lot of special adaptations. There is no single "frame rate of the eye". Fighter pilots have been shown to be able to not only see but also correctly identify a picture of a plane even when the image is flashed for just 1/200th of a second.

              The ideal is a frame rate that matches the monitor refresh and stays constant. I'd much rather have 75 fps at 75 Hz than 80 fps at 85 Hz.
              • If you enable vsync (all MS-certified drivers do by default, btw) then the system only draws frames in sync with the monitor, so tearing isn't apparent. So the best idea is to set your monitor at the highest refresh it will do and then let vsync make things look pretty.
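                For illustration only, here is a rough Python sketch of what "only draw frames in sync with the monitor" amounts to: a render loop that never presents more often than an assumed 85 Hz refresh allows. The refresh value and the empty render_frame() are placeholders, not anything from a real driver, which does this pacing in hardware.

```python
# Toy frame limiter: pace a render loop to an assumed monitor refresh rate.
# Real vsync blocks on the display hardware; time.sleep() is only a rough
# stand-in, but it shows the idea of never outrunning the refresh.
import time

REFRESH_HZ = 85                 # assumed refresh rate for illustration
FRAME_TIME = 1.0 / REFRESH_HZ


def render_frame():
    """Placeholder for the actual drawing work."""
    pass


def run(seconds=1.0):
    frames = 0
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        # Sleep off the rest of this refresh interval so we never present
        # more frames than the monitor can actually display.
        leftover = FRAME_TIME - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
        frames += 1
    print(f"presented ~{frames} frames in {seconds:.1f} s")


if __name__ == "__main__":
    run()
```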
          • by nunofgs ( 636910 ) on Sunday February 09, 2003 @04:04PM (#5266309)
            HAH!!! between all your wisdom I found an error!!! the proper q3 command for capping the frame rates is actually com_maxfps! :)
          • And 16 million colours is more than the eye can see, and 44,100 samples per second is more than the ear can hear . Throughout the march of technology we've heard these ridiculously arbitrary "limits" of our senses, and invariably they are discounted at a future time. In essence you can consider them a sort of justification.

            These limits aren't arbitrary. You can test them the same way you proposed that frame rate limits be tested.

            For colour gradations, make a picture that has a very gradual colour ramp from 0-255 in each colour (or one that sweeps across colour tones, but that changes at most one component by at most one between adjacent bands).

            When I tried this with an old VGA card that used 18-bit colour, I could see banding. I had to stare for a while to let my eyes adjust, but I could see it.

            When I try it on a modern card with 24-bit colour, I see no bands if the monitor's gamma correction is properly adjusted.

            A monitor without gamma correction will end up expanding some brightness ranges and compressing others, with the result that gradations will not be visible at all in some areas and will be (barely) visible in others. Check your configuration before complaining.

            The 24-bit argument applies to distinguishing colours. Similar experiments (not performed by me) have shown that you get about 10 bits of depth in greyscale, as humans have more sensitive black and white vision than colour (which is why everything appears in shades of grey at night with poor lighting; go for an evening walk and look for badly-lit stop signs some time).

            You can do the same kind of tests with sound. It's actually more difficult with modern sound cards, as they have low-pass filters that cut off everything above about 22 kHz (the Nyquist frequency for 44 kHz sampling), but a PC speaker works. Or use a piezo buzzer and a signal generator if you're worried about the speaker efficiency dropping at high frequencies. My hearing, last time I tested it (and last time it was tested by a doctor), dropped out at about 18 kHz.

            The reason why higher frequencies are relevant at all is because of nonlinear behavior both in the speakers and in the human ear. Beat frequencies between high-frequency tones can turn into audible frequencies when interacting with nonlinear systems (this is how that two-tone ultrasonic speaker linked to a while back worked). However, the key is that the final tone you hear is in the audible frequency range. This means you can duplicate the sound perfectly by using a microphone that acts more like the human ear when recording (i.e. that has similar nonlinear effects), or by recording at high frequencies and applying appropriate transformations before downsampling.
            The fact remains that if I played a 20 kHz pure tone at you right now, you wouldn't hear it. And this is easy to verify by experiment.

            In summary, while you're most definitely right about frame rates, your other objections about limits are unfounded.
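            As a concrete version of the colour-ramp test described a few paragraphs up, here is a small Python sketch (the file name, image size, and choice of the plain ASCII PGM format are arbitrary, not from the original post) that writes a smooth 0-255 grey ramp you can inspect for visible banding.

```python
# Write a horizontal 0-255 grey ramp as a plain-text PGM image, one grey
# level per column.  On a well-adjusted 24-bit display the ramp should look
# continuous; visible stair-steps ("banding") mean adjacent levels are
# still distinguishable, as described above for an 18-bit VGA card.
WIDTH, HEIGHT = 256, 64

with open("ramp.pgm", "w") as f:
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for _ in range(HEIGHT):
        for x in range(WIDTH):
            f.write(f"{x * 255 // (WIDTH - 1)}\n")   # one sample per line
```

            Stretched across the screen, an 18-bit path (6 bits per channel) would collapse this ramp to only 64 distinct bands, which is why the banding was so easy to spot on the old card.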
            • Actually, 44.1 kHz was chosen for more practical reasons: early digital audio was recorded onto video tape, so the sample rate had to fit into a video signal. To squeeze audio into a video stream:

              3 samples * 490/2 lines (interlaced) * 60 Hz = 44100 Hz.

              How is that for arbitrary?
              • Actually, 44.1 kHz was chosen for more practical reasons: early digital audio was recorded onto video tape, so the sample rate had to fit into a video signal. To squeeze audio into a video stream:

                3 samples * 490/2 lines (interlaced) * 60 Hz = 44100 Hz.

                How is that for arbitrary?


                Like 8-bit colour components, it represents a convenient value. But, like 8-bit colour components, it hasn't been replaced because it's close enough to perception limits to be indistinguishable for practical purposes. This is especially true for sound, as there would be no reason not to go to 4 samples per line if needed (while higher-fidelity colour components require sacrificing either ease of use of graphics cards (non-power-of-two sizes for RGBA pixels) or the alpha channel).

                [ObDisclaimer about high-fidelity equipment being needed for sound/image processing/compositing, where errors stack and sounds and colour values are rescaled/resampled.]
          • Re:Important? (Score:3, Interesting)

            by Dr. Spork ( 142693 )
            Just a simple question: What is the refresh rate of your monitor? If you can do 1600x1200 at more than 85Hz and notice the improvement, I'll be impressed. The difference you report between 80 and 90 fps almost certainly has to do with the fact that your monitor refresh was set so that you show exactly five "dead" screens per second. When you set it above 85, each of your screen refreshes shows a different frame, so of course it looks better, but it's not for the reason you think. Try running at a constant 75 fps with your refresh set to 75, and you will see it also looks great.
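            The arithmetic behind the "five dead screens per second" claim is easy to check with a toy simulation. This sketch assumes perfectly evenly spaced frames (which real games never deliver) and simply counts how many refreshes of an 85 Hz monitor end up repeating the previous frame at various constant frame rates; the function and values are illustrative, not from any benchmark.

```python
# Count the refreshes in one second that have no new frame to show, for an
# idealised game producing frames at a perfectly constant rate.
def dead_refreshes(fps, refresh_hz):
    frames_shown = 0
    dead = 0
    for r in range(1, refresh_hz + 1):
        frames_ready = (r * fps) // refresh_hz   # frames completed so far
        if frames_ready > frames_shown:
            frames_shown += 1                    # present the newest frame
        else:
            dead += 1                            # previous frame repeats
    return dead


if __name__ == "__main__":
    for fps in (60, 75, 80, 85):
        print(f"{fps} fps on an 85 Hz monitor ->",
              f"{dead_refreshes(fps, 85)} repeated refreshes per second")
```

            At a constant 80 fps on 85 Hz this gives exactly 5 repeats per second; with real, uneven frame times the repeats cluster, which is presumably why the stutter is so noticeable.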
          • "FSAA, BTW, is tremendously difficult for video-cards to do (because they're actually rendering at 2x or greater resolutions): There is no current video card that could dream of doing even Urban Terror (a Q3 mod) at 1600x1200 with FSAA at acceptable frame-rates."

            No. This is bogus. The Kryo2 chip, along with the GF1/2, had a form of FSAA where it basically rendered the image at a much higher res than it was going to be displayed, but these don't exist anymore (at least not in the R300/NV30/NV25/R200). I BELIEVE this is called supersampling--don't quote me on this, I'm not a coder and don't care too much about FSAA modes (I have a GF3. I can't use AA in ANYTHING but the oldest games.). Supersampling takes a much larger performance hit, but a lot of people regard it as looking better than the newer method. This newer method is called multisampling--it actually renders the image multiple times, offsetting it each time. This is why color compression has become so important. 4x MSAA COULD take up to four times the memory bandwidth of normal rendering, but with adequate color compression, you could get it down to two times or 1.5 times the bandwidth. This is part of the reason why nVidia went with a 128-bit bus on the GFFX--it thought it had good enough color compression.

            Anyway, moving right along, there are two forms of MSAA (multisampling antialiasing)--ordered grid and rotated grid (once again, do not quote me on this).

            So basically, FSAA ain't as simple as rendering at 3200x2400 and reducing that to 1280x960 anymore.
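            To show just the basic supersampling idea mentioned above (and only that; this is a toy Python sketch, not how any real chip implements FSAA, and the sample values are invented), here is a 2x box-filter downsample: render at twice the target resolution, then average each 2x2 block into one output pixel.

```python
# Toy 2x supersampling: average each 2x2 block of a high-res grayscale
# buffer into one output pixel.  It costs 4x the pixels to render, which is
# why this style of FSAA is so expensive.
def downsample_2x(hi):
    """hi is a 2H x 2W grid of grayscale values; returns an H x W grid."""
    H, W = len(hi) // 2, len(hi[0]) // 2
    return [[(hi[2 * y][2 * x] + hi[2 * y][2 * x + 1] +
              hi[2 * y + 1][2 * x] + hi[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(W)]
            for y in range(H)]


if __name__ == "__main__":
    # A hard black/white edge in the high-res buffer...
    hi_res = [[255 if x >= 3 else 0 for x in range(8)] for _ in range(8)]
    # ...comes out with an intermediate grey along the edge, i.e. the edge
    # has been anti-aliased.
    for row in downsample_2x(hi_res):
        print(row)
```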

          • FSAA, BTW, is tremendously difficult for video-cards to do (because they're actually rendering at 2x or greater resolutions

            While this was true until recently, there has been some significant progress in this area. The Parhelia, for instance (bad example, I know, but stick with me), only anti-aliases the edge pixels of triangles that aren't joined to other triangles - spectacularly reducing the quantity of work needed to at least get the beneficial effects of AA, if not true AA itself. There's also been some work on using non-uniform sample grids; I believe this is how the GF4's AA works.

            Go look at some benchmarks: the more modern cards still sweat with AA, but not nearly to the hernia-inducing loads that my GF3 would be put under should I ever run AA on it... And this, to me, is what the latest generation of video cards is all about - much the same in terms of framerate, but better visual quality.

            Dave
          • FSAA, BTW, is tremendously difficult for video-cards to do (because they're actually rendering at 2x or greater resolutions): There is no current video card that could dream of doing even Urban Terror (a Q3 mod) at 1600x1200 with FSAA at acceptable frame-rates.

            Sure. Just keep in mind that real-life video images automatically get "FSAA" when displayed on a TV screen, which is one reason why TV phosphors are nigh-invisible at much lower resolution than computer monitor pixels. The other reason is that you typically sit much closer to your computer monitor.

            You're absolutely right about movies though. The pan-strobe effect is most noticeable if you're sitting in the first ten rows or so.
        • Seems to me that I heard a number more like 60 fps for simulations.

          I know these guys shot for 60 Hz...

          http://www.nads-sc.uiowa.edu/multimedia.htm

      • significant digits.

        Please note the extra zeros at the end of 2000 as opposed to 30.

        Believe it or not, they make a difference.

        No, not the sort of difference you're talking about, the sort of difference that means the difference between 2467 and 2556 doesn't make a difference, even though the difference between 30 and 60 does.

        Get the difference?

        KFG
      • The number is higher than 60, and varies depending upon the individual. For an explanation, go to:

        http://www.penstarsys.com/editor/30v60/30v60p1.htm

        Further, it's not the average frames/second that informed gamers are interested in as much as the minimum fps, which, if it ever drops below the optimal amount, will mean the display goes from smooth to jerky.

        My flamebait comment of the day: sure, all the half-blind programmers and sysadmins here on /. may not notice the individual pixels on the screen when running 1600x1200 on a 19" screen, but other people can, and it does not look "real".

        Until I can't differentiate visually between what's on my screen and what I see in the big blue room outside my house, 3D graphics are not good enough.
      • Yes this is definitely true. A few years ago I was an (almost) tournament-level Quake 2 player, and I definitely performed MUCH better with a 60fps system than a 30fps system.

        One wrinkle that creeps in at high frame rates is synchronization with the monitor refresh. Typical monitors run 75-100Hz, and when the video card starts breaking into that range of frame rates, you can get bad stuttering effects where a few consecutive refreshes show different frames and then one refresh repeats a frame because the video card couldn't quite catch up. (Or you can turn off "vsync" and get tearing instead of stuttering.)

        There is definitely a benefit to an "overpowered" graphics system that can deliver 200fps, because it gives extra headroom for when the action gets intense and the frame rate drops a bit. You might not notice a drop from 200-150fps but you will REALLY notice 60fps-30fps...

        (John Carmack talks about this in a .plan from a few years ago, check the plan archives if you are interested)
      • Of course its important.

        Do any of you insightful people understand that you just can't display 2000+ fps on a video monitor (and LCD displays are even slower than CRTs)? OK, just maybe a frame rate above 30 fps might help a little, but if your system is actually spending CPU power on rendering more than the useful number of video frames, then it's really wasting time that it could be better spending on user input or data transfer or something else that really does matter in the game. Of course, this delay is also very small, so only hair-splitting fanatics would care about it, but those are just the people going after unrealistically high frame rates.

        Of course, more video power can be applied in other ways that do help the user, such as higher resolution, better lighting effects, and so on, but that isn't the issue that many here seem to care about - they just want frame rates that their video display is never going to show. So even if they foolishly think their eyes can see extremely high frame rates, they miss the basic truth that video cards could get 100 times faster, but those extra frames will never reach their eyes.

    • Hmm. Take a look at the next story in the list about 14-minute orgasms and an alleged compound called Retalanaline. Strange thing is that the term Retalanaline only appears twice in Google even though the article that is linked to [globalcomment.com] says it was written in 2001. One would think that such a compound would have attracted more comment than that if it actually existed.

      While one fake story does not prove the other is fake it does indicate that the journalistic standards are somewhat lax.

    • We'll be pushing the performance barrier for a long time to come. Sure, nobody needs to run GlQuake at 300 fps, or Quake 3 at 120 fps, but Doom 3 will only run at 30 fps (for example.)

      New video card technology means speed in old games and features/visual quality in new games. I can guarantee you that no matter how advanced video cards have gotten in 5 years, artists and designers will still be able to generate content that slows the game to a crawl ;)
  • Does this thing run Linux/OpenGL stuff?
  • Aww nuts (Score:5, Funny)

    by Sayten241 ( 592677 ) on Sunday February 09, 2003 @01:52PM (#5265495)
    Now what am I gonna do with the extra card bay I've been saving below my AGP slot?
    • Nvidia will be doing MBs from now on, not graphics cards. Really, it's that simple.

      They now have a license for the Intel P4 bus and can do AMD too.

      Really, this is the way to go, as you could do funky things and farm the chipsets off to OEMs.

      And all the problems of cooling go away!

      regards

      John Jones
  • damn (Score:5, Funny)

    by s0rbix ( 629316 ) on Sunday February 09, 2003 @01:52PM (#5265501)
    This is disappointing. I was hoping to get at least 10 fps in Doom 3, but I suppose I'll have to settle for 6...
    • Doom III has something in common with Doom I & II. When you wander into the room full of monsters, your first thought isn't "Oh my god I'm going to get blown to bits" it's "Oh my god I'm going to lag to hell and I won't be able to see myself get blown to bits"
  • Yeah... (Score:2, Informative)

    by rastachops ( 543268 )
    This is at Neowin.net [neowin.net] too. It's sad that, if true, it's never gonna make it to retail. I have been waiting and hoping for this card for at least a year. I'm guessing I'll wait and see what the Ati Radeon 9900 is gonna be like.
  • ATI taking the lead (Score:2, Interesting)

    by trmj ( 579410 )
    I've always been a fan of ATI to start out with, so this will probably be biased. Fair warning.

    I know that nVidia cards process OpenGL better than ATI (from what I have seen), however ATI has the advantage of being compatible with almost everything that uses DirectX. This may not seem like much since it's a MS product, but more and more are using DirectX, especially since WinXP comes with default drivers for DX and not OpenGL.

    Plus ATI got their name on the GameCube, which makes them the coolest company ever.
    • Not that great (Score:2, Interesting)

      by CeZa ( 562197 )
      If nVidia falls behind, the "quality" of ATI would degrade. ATI is locked primarily to "Windows DirectX" right now, and if they don't see a need to produce drivers for other OSes due to increasing sales and weakening competitors, it is a loss for all consumers. Consider that 1.5 years from now, Microsoft will not try to support the same card anymore. In order to have any support, you would have to use generic drivers. Linux is much better at moving forward while still maintaining previous hardware support.
      • Hey, you can get ATI cards bundled with Desktop Macs, and the Radeon Mobility sets are used in a variety of Apple laptops. So either ATI with Apple's help, or Apple with ATI's help, is definitely producing MacOS X, PowerPC, OpenGL drivers for ATI.
      • Well, one look at Apple, and one can see ATI cannot and will not abandon OpenGL any time soon. That is a likely lucrative contract.

        Plus, the FireGL cards (which are somewhat souped up Radeons) are important to them as well. As popular as DirectX is for gaming, it is worthless in the realm of professional 3D graphics.

        Now what nVidia does spur on is the now fledgling Linux OpenGL support for consumer cards. If nVidia were not in the market touting their Linux drivers, you can bet ATI would go back to not giving a rat's ass about official driver support for consumer level cards. They might even cease what cooperation they have been doing with the open source community if nVidia dropped out. GL support is not threatened, but the 'fringe' users out there would have a harder time getting support.
    • by fault0 ( 514452 ) on Sunday February 09, 2003 @02:13PM (#5265648) Homepage Journal
      > I know that nVidia cards process OpenGL better than ATI (from what I have seen),

      How so?

      > however ATI has the advantage of being compatible with almost everything that uses DirectX.

      What do you mean? NVIDIA is basically as compatible with DirectX as ATI is, and vice versa with OpenGL.

      > This may not seem like much since it's a MS product, but more and more are using DirectX

      More innovation has taken place in DirectX than OpenGL (discounting vendor-specific extensions) in the last few years. However, there are important OpenGL-only games on the horizon, such as Doom3. Other newer games such as UT2k3 can run in either OpenGL or DirectX modes.

      > especially since WinXP comes with default drivers for DX and not OpenGL.

      WinXP also comes with opengl32.dll.
      • Actually, ATI currently has better DirectX support, since it's the only one of the two shipping DirectX 9 parts (Radeon 9500/9700) while NVidia's DirectX 9 part seems to be vapourware.

        NVidia's OpenGL ICD is more stable than the Radeon's, but nowhere near as good as the FireGL's (and FireGLs bootstomp Quadros).

      • Yes, OpenGL did stagnate for a while when SGI had the equivalent of a mid-life crisis. However, OpenGL has been picking up the pace over the last year, and OpenGL 2.0 does look very promising.
    • by Anonymous Coward on Sunday February 09, 2003 @03:45PM (#5266218)
      I am a graphics programmer at a "leading" PC game dev studio. I just wanted to say you're basically full of bunk. :P

      "ATI has the advantage of being compatible with almost everything that uses DirectX" Hmm.. nVidia, SIS, Trident, Matrox, S3, all are too so ATI doesn't have an advantage, they're simply on par with the competition. Don't forget the Xbox with the nv2a chipset (basically GF3 with better DXT1 support and 2 vertex pipelines) is only programable via DirectX 8.

      I'm not sure what you mean by nVidia cards processing OpenGL better than ATI. I think you mean they get more fps in Quake III. Maybe so, but this would happen if id used Direct3D as well. If you mean ATI's drivers have traditionally been poor (poor not just in terms of running applications slowly, but in terms of running applications incorrectly or not at all), then that is unfortunately very true. ATI has always struggled to deliver more than 50% of a product (they make good hardware, but without good software drivers you've got a bad product; 3D chip companies have more people on their software teams than their hardware teams!), but that is more of a 2001 issue than a 2003 issue. They've really come a long, long way.

      BTW, every major game, even those from id, uses DirectX. You're actually talking about Direct3D, not DirectX, from what I can tell.
  • Shacknews (Score:5, Informative)

    by Deltan ( 217782 ) on Sunday February 09, 2003 @01:54PM (#5265521)
    Shacknews and HardOCP have news on its demise as well.

    HardOCP: http://www.hardocp.com/index.html#6494-1 [hardocp.com]

    Shacknews: http://www.shacknews.com/onearticle.x/24906 [shacknews.com]
    • Re:Shacknews (Score:5, Insightful)

      by ergo98 ( 9391 ) on Sunday February 09, 2003 @02:08PM (#5265606) Homepage Journal
      Have you noticed that they're all just using each other as "sources", and at the root of the entire debacle is a single posting by an anonymous user on a message board? I'm not saying that this isn't true, and really the reviews that have come out for the FX have been pretty lousy (especially given the fan solution), but this looks like manufactured news, and we all flock to read some other guy give a single paragraph and a link to some other guy with a single paragraph.

      Where are the cold, hard facts?
      • Would that be the posting by the anonymous ATI rumour-mill-starting-dept user?

        It wouldn't be the first time ;-)

        (moderating comment - never bought nVidia, big ATI fan)
    • Is it just me, or does anyone else always think of Robocop when hearing about the site HardOCP? :-P I'm not even a Robocop fan but that damn abbreviation got stuck in my head. :-(
  • Vaporware (Score:4, Funny)

    by Anonymous Coward on Sunday February 09, 2003 @01:54PM (#5265522)
    And they say vaporware only applies to software.....
  • by Bullseye_blam ( 589856 ) <bullseye_1.yahoo@com> on Sunday February 09, 2003 @01:55PM (#5265532) Journal
    The story linked in this article, as well as others I've seen on the 'net, has stated that at least the pre-orders of the GeForce FX are being fulfilled, but that the card will never make it to retail. A subtle difference, perhaps, but a difference nonetheless.
  • Rumors, rumors (Score:5, Interesting)

    by eamber ( 121675 ) on Sunday February 09, 2003 @01:57PM (#5265547) Homepage Journal
    This story originated in a message board somewhere (can't recall where) - then it got posted to the Inquirer (and we all know how reputable they are)... now it's here on Slashdot.

    Gotta love the grapevine.
    • TheInquirer (Score:3, Insightful)

      by bstadil ( 7110 )
      the Inquirer (and we all know how reputable they are)

      You are wrong here. The Inquirer [theinquirer.net] has one of the best track records of any tech site that cares to think rather than regurgitate press releases from corporations.

        You are wrong here. The Inquirer [theinquirer.net] has one of the best track records of any tech site that cares to think rather than regurgitate press releases from corporations.

        Yeah...... With headlines like [nationalenquirer.com]:

        OJ Attacks Daughter!
        Britney love triangle turning violent
        Motley Crue rocker is deadbeat dad

        They sure seem reputable to me!
  • ...and...? (Score:5, Interesting)

    by ottffssent ( 18387 ) on Sunday February 09, 2003 @02:02PM (#5265567)
    I see lots of comments predicting doom and gloom for nVidia already. The GFFX has been somewhat of a disappointment, both for consumers and for NV - it's too slow, too hot, and too hard to make. nVidia is not going to go into bankruptcy because of this however - they will still sell a few and will work madly on the next generation aimed for smaller design rules and will learn from their mistakes this time around. The GFFX isn't the death knell for the company, it's just an unpleasant reminder of what minor manufacturing difficulties can do in a nasty business like video card manufacture. They're already hard at work on the next-gen part, and I'm sure they've learned a lot with this one.

    Meanwhile ATI will enjoy higher profits and will have a bit of breathing room. Hopefully, they will use this time to extend their product offerings with the R350 core, continue pouring money into driver development, and keep working on R400 or whatever their next-gen core ends up being called. In any event, 6-9 months from now we will see these next-generation parts coming to market, and they will be just that much better.
    • Re:...and...? (Score:5, Interesting)

      by fault0 ( 514452 ) on Sunday February 09, 2003 @02:16PM (#5265674) Homepage Journal
      > nVidia is not going to go into bankruptcy because of this however - they will still sell a few and will work madly on the next generation aimed for smaller design rules and will learn from their mistakes this time around.

      People said this about 3dfx right when it released the long-delayed, big, noisy, power-hungry Voodoo 5 5500 (while Nvidia had long taken the lead).

      Same thing seems to be happening to Nvidia, only this time with ATI taking the lead.
      • As another poster mentioned, nVidia are branching out. The GFFX may well be a failure (or not even turn up, according to this article), but the nForce2 is already a big success, and they have a contract with M$ to supply the GPU for the Xbox.

        I may be a little biased, just finished building my new compy with an NForce2 based mobo and quite simply it rocks.
      • Hardly 3dfx (Score:5, Interesting)

        by Namarrgon ( 105036 ) on Sunday February 09, 2003 @04:02PM (#5266294) Homepage
        3dfx alienated their OEM customers, and had no other income apart from their consumer gfx cards. When that was delayed & out-competed (again), it was inevitable that they'd go under.

        nVidia are a larger company with a string of huge successes to date. They have a much more diversified income, including some very popular OEM chips, the successful nForce2 (and less-successful Xbox) chipsets, a well-regarded pro card line, and a significant share of the Apple market too. Not to mention quite a bit of cash in the bank.

        A single high-end chip (which is a small % of their total revenue anyway), even if it failed completely, is not going to impact their bottom line that much. It'll have more impact on their image as graphics leader, but they have the resources to learn, move on, redesign and try again.

      • But did 3DFX have guaranteed revenue from sales of MS consoles? That may actually end up saving Nvidia in the short term, giving them enough time to get their act together.
    • nVidia's plans for the FX were greater than what actually happened. If this had been released with support for 256 bit memory, I think it would've stomped ATi big time.

      Sometimes it takes a brilliant failure like this to catapult R&D to the next level. Let's hope that happens here.

    • You are right that their demise is not just around the corner, but nVidia does have only a limited time to respond.

      This year's performance card is next year's value card, so it is not just a question of whether they will be able to produce a competitive performance card within a reasonable time. They need to produce a value card that is about as good as the Radeon 9700 within about a year ( a year and a half might be good enough). If nVidia is still playing catch-up a year from now then they are doomed.
  • So even if the rumours are true and this manufacturing process isn't working, they still have completed months of research, and still have the experience of trial and error. There's no reason they can not take another approach at a similar type of card if the current model doesn't work out. Sure it costs a lot of money, but I assume they have lots of money comparatively. ATI can stay on top for a while. There's no reason that nVidia can't overtake them at another point in time.
  • by MarcoAtWork ( 28889 ) on Sunday February 09, 2003 @02:05PM (#5265589)
    I mean, the Geforce 5 6000^H^H^H^H^H^H5800, while having A LOT of drawbacks (noise, takes up two slots, probably lots of heat), doesn't seem to have very much going for it over ATI's offerings. The only thing right now is driver quality, but from what I'm hearing ATI is getting better at this lately...

    Note, I'm not an ATI fanboy (actually I'm running a GeForce1 right now) but I'm really appalled at what 3dfx^H^H^H^HNVidia was thinking when they created this card...
  • by Quaoar ( 614366 ) on Sunday February 09, 2003 @02:07PM (#5265603)
    Video card giant nVidia officially changed its name today to 3dfx. Five minutes later the company declared bankruptcy.
  • It gets worse... (Score:5, Informative)

    by waytoomuchcoffee ( 263275 ) on Sunday February 09, 2003 @02:12PM (#5265640)
    Lack of an "Ultra" type solution from nVidia would leave ATI's Radeon9700 uncontested as the defacto performance part

    The Radeon 9900 [theinquirer.net] is expected out next month, with the new R350 core.

    I am glad I don't have Nvidia stock right about now.
  • by macado ( 114240 ) on Sunday February 09, 2003 @02:25PM (#5265726)
    I work at Best Buy (unfortunately) and we were instructed to stop selling all pre-order GeForce FXs, destroy the boxes, and give all the free stuff to the employees or whoever wanted it. Apparently at least the pre-orders will be fulfilled, but I don't think the card is going to make it to the stores for quite some time due to "extremely limited supplies" (according to the store memo). At least I got a free Nvidia t-shirt and hat out of it. :o)

    -macado
  • by doormat ( 63648 ) on Sunday February 09, 2003 @02:30PM (#5265755) Homepage Journal
    http://www.visiontek.com [visiontek.com]

    Make sure you have your speakers on..
  • Vendor Confirmation (Score:5, Interesting)

    by l33t-gu3lph1t3 ( 567059 ) <arch_angel16@NOspAM.hotmail.com> on Sunday February 09, 2003 @02:31PM (#5265762) Homepage
    [H]ardOCP [hardocp.com] has confirmation that GeforceFX5800 Ultra graphics cards will not make it to retail, and are available as pre-order items only, for a limited time. However, the GeforceFX5800 non-ultra model *will* make it to retail, sans the elaborate cooling mechanism, and running at 400MHz GPU / 800MHz RAM.

    Additionally, it seems the "Radeon9900" information at Xbitlabs might be less accurate than it appears.

    This isn't the greatest news for Nvidia, but it doesn't exactly break the bank: Nvidia still has the lion's share of the graphics market, and will probably keep it simply due to Tier 1/2 OEM sales, as well as their reputation - even though ATI has faster hardware, Nvidia has had a history of rock-solid drivers going back 4 generations. Although ATI's driver quality has improved significantly in recent times, they're still not up to par with Nvidia's. And be sure that Nvidia will capitalize on that, since they don't have bragging rights for their hardware currently.
  • Does this have anything to do with the Low-K dielectric yield problems that many (all?) fab vendors have been having in their .13u processes?

  • not good for nvidia (Score:4, Interesting)

    by MagicMerlin ( 576324 ) on Sunday February 09, 2003 @02:49PM (#5265848)
    NVidia's future looks quite uncertain. It appears they might be headed for a free fall. You can blame their problems on some bad business decisions, like backing AMD, but the real problem is that ATI's tech team is pulling ahead. The 9700 simply had a better-designed core. Their position is remarkably similar to that of 3dfx during the introduction of the TNT2. The handwriting was on the wall, and there was nothing they could do about it, having sunk millions into technology consumers were just not interested in.
    • Backing AMD a bad business decision? If they are hurting in that area (I have not heard), it is not because they make chipsets for AMD motherboards, it is because their chipset is nothing special. Their experience with graphics card hasn't translated that well to motherboards, so the chipsets aren't worth any premium. Maybe if they put GPUs on the motherboards more frequently that weren't underpowered, they would have done better. I really haven't been keeping close track of nVidia and the nForce (hell, I still have a Voodoo3 running in this thing), but AMD has been doing well as far as I can tell, so there is certainly a market to exploit.
  • ATI Still Not There (Score:2, Interesting)

    by N8F8 ( 4562 )
    My most recent video card purchase, an ATI OEM 8500LE, just died a little over a year after I bought it. Add to that the fact that the Mobility M4 in my $3K laptop still doesn't do all the tricks ATI promised. Not very impressive considering my TNT2 has been chugging away for years.
  • I kinda figured that with the way nVidia's hardware likes to pop like lightbulbs, eventually they'd release a card that was dead out of the box. To date, I've had a grand total of seven nVidia products die on me. The first was a TNT2 Ultra, followed by a TNT, followed by the replacement Ultra, then two TNT2 Ultras at work, then my GeForce4 4400, then its replacement, another 4400. This doesn't include the number of cards that my friends have had fail. That number would be into the twenties.

    If the 4200 I'm using as a replacement for the 4400 dies, I'm going to ATI, and not looking back.
  • by Anonymous Coward on Sunday February 09, 2003 @03:34PM (#5266144)
    Everyone and their brother seems to be bellyaching about "losing the PCI slot" to the cooling system, and how horrible this is.

    How stupid.

    On practically every motherboard out there today, PCI 1 and the AGP slot share resources, so you're crippling your system performance by putting a card in each.

    As I remember it, PCI has four specific special IRQ channels allocated for it, and thus the original spec is for one IRQ for each. Modern motherboards get away with this by having different slots share the bus mastering, so that two devices can piggyback on one slot. Usually, the onboard IDE controller piggybacks on one slot, and the last two slots (usually PCI 5 and 6) are often coupled together. By the same token, the AGP slot often shares an IRQ with PCI 1.

    So, in short, if you're going to complain about the cooling system, complain about it being loud. You weren't losing anything on your motherboard that you could even use to begin with.
  • by The Baron (nV News) ( 649004 ) on Sunday February 09, 2003 @03:40PM (#5266187) Homepage Journal
    Yes, the 5800 Ultra is gone. The 5800 will be for sale, but at 400/800 it's not going to win any speed awards (unless drivers manage to improve its performance by 20%, which won't happen by the time the R350 (successor to the Radeon 9700) comes out). The chips used for the Ultra will still be used, however, in the QuadroFX 2000. ATI's R300-based FireGL cards are still poor performers, and even with relatively poor drivers from nVidia, the FX 2000 will beat them handily (the Quadro4 is beating the FireGL cards as well, according to Tom's Hardware).

    But, it's not just a rumor anymore. When it first came to [H], everyone regarded it as BS. It was a rumor posted on a board that spread incredibly rapidly. But, apparently it's been confirmed by either OEMs or nVidia itself to those with good contacts. BFG has stopped taking preorders, AFAIK, because...

    "According to an e-mail John Malley sent out a couple of days ago, BFG is concerned that pre-sales may exceed their allocation of units."

    So, yes, the 5800Ultra is gone. Oh well. NV35 in June, according to some.

  • nooo (Score:3, Funny)

    by handsome_robot ( 649001 ) on Sunday February 09, 2003 @04:20PM (#5266406)
    Damn! I had several dozen of these things on pre-order, too.

    My plans to build a hovercraft are moot!!

    *shakes fist upwardly*
  • by X-Guy ( 459026 )
    Nothing but rumors. Sites like the Inquirer post every rumor they hear, even when it's ridiculous. Remember when they were saying NV30 was definitely a two-chip solution? Remember people saying it definitely had a 256-bit memory interface? All it takes is one bozo posting to a forum and claiming he has inside information, and the Inquirer will post it and you get dozens of fan sites acting like it was true.
  • So this is why they bought 3DFX's IP. Presumably 3DFX had patented this business model? :-)

    Maybe this way they can change name to 3DFX as well before the end...

    - Chris
  • Pure poetry (Score:5, Informative)

    by badasscat ( 563442 ) <basscadet75@@@yahoo...com> on Sunday February 09, 2003 @05:25PM (#5266789)
    Am I the only one that sees how freakin' poetic this is? This card was touted as the first real tangible result of the marriage between NVidia and 3Dfx (one of the reasons for the "FX" moniker, supposedly), and the company's having the exact same problems as 3Dfx did with their Voodoo 4 and 5's. Namely, that they're not as fast as people expected, they use too much power and generate too much heat. And their competition is passing them by.

    Still, I don't see NVidia in the same precarious position as 3Dfx was at the time. NVidia likes to point out that after the latest Radeons were released by ATI, NVidia's market share actually went up, not down. The super-performance market is actually a very small market, and NVidia still offers the best value out there for mainstream users in the GeForce 4 Ti4200. For most people, the extra $250 they'd spend on a Radeon 9700 Pro vs. a Ti4200 is just not worth it - the extra few frames per second you'd get in most games are generally not even that noticeable, and there are a lot of better ways to spend that money. I don't really think NVidia's got a lot to worry about, then - unless the performance gulf and manufacturing problems become so pronounced that public perception (or misperception) filters down to even the mainstream products (as has been ATI's bugaboo over the years).

    Still, it looks like the GeForce FX has been NVidia's first real dud in some time. No doubt the "stock" FX 5800's will be a good value once the NV35 is released (just as the Ti4200's are a good value now), but at the moment the card doesn't seem to really fit in any niche. Performance gamers will choose the Radeon 9700 Pro, mainstream gamers will choose the Ti4200, and low-end or business users will continue choosing ultra low-cost but perfectly capable cards like the GeForce 2 Ti.
  • by Graymalkin ( 13732 ) on Sunday February 09, 2003 @06:39PM (#5267235)
    If this rumor mongering is all true (and I'm not convinced it is), it's yet another eerie 3Dfx parallel attached to the GFFX (E3DP?). Since the Radeon 9700 was released I've been really anxious to see what nVidia was going to answer with in the form of the NV30. I'm not one to buy the high-end, obsolete-within-a-week video cards, but I really want to know what chip I'm going to see in discounted cards in six months.

    I was seriously unimpressed with the GFFX. This is an odd feeling as new nVidia cards have in the past been truly impressive and something to lust after.

    "I sense something. A presence I've not felt since..."

    While 3Dfx was not in the exact same position as nVidia is, market-penetration-wise and financially, it seems nVidia is pulling a technological page from their book. The GFFX 5800 Ultra Megazord seems a great deal like the Voodoo 5. It is a power-hungry beast of a video card that doesn't live up to all of the hype that's been surrounding it since August, when the Radeon 9700 needed an answer from nVidia.

    Of course the GFFX will improve and in six more months they'll have a GFFXMXKY that comes as the toy in a box of Count Chocula. Sharing many similarities with the Voodoo 5 isn't going to necessarily Doom the card (get it?) but it is giving ATi a huge shot in the arm. They've got a 5 month old card that performs about as well as nVidia's latest offering, that is something they haven't been able to boast before. All ATi has to do is not screw up and they will get back a bunch of users who abandoned them when the GeForce smoked the Radeons like fat chronic blunts with a mere driver upgrade.

    Even though ATi has the advantage now, I think nVidia will come back with a really strong chip PDQ. They aren't going to accept defeat because their card requires an onboard RTG to run decently. If ATi keeps their momentum going they could top even the next NV chip nVidia releases. Do I care one way or the other? Hell no. I don't want to see either of them lose out; I want as much competition as possible so I get more frames with excellent visual quality for the buck. It will be great to be able to enable all of Doom 3's visual effects with AA and still be able to play the game, especially after people like Raven or Rogue license the engine and build the next Jedi Knight or Alice with it.
