Rumors of a GeForceFX 5800 Ultra Cancelation?
chris_oat writes "It seems that nVidia's GeForceFX 5800 Ultra may never see the light of day after months of super-hype and annoying delays. This article on megarad.com suggests that poor manufacturing yields are causing nVidia to rethink plans for its (new?) flagship part. Lack of an "Ultra" type solution from nVidia would leave ATI's Radeon 9700 uncontested as the de facto performance part."
bloody 'leet gamerz' (Score:5, Funny)
fp?
Important? (Score:2, Insightful)
I'm sick && tired of reading people say "oh, well the human eye only sees 30 fps, so anything else is overkill".
That's a bunch of baloney (pardon my language). People want *clarity* and *SMOOTHNESS* in their gaming performance, and although 30 fps delivers clarity from frame to frame, the frame-to-frame transitions only achieve good smoothness above 60 fps.
Most Linux apps aim for >= 60 fps. Go check out SourceForge for more details.
Re:Important? (Score:5, Funny)
Re:Important? (Score:5, Insightful)
Would be interesting to know if 30 fps *is* enough (of course only as a *minimum* of 30 fps) or if monitors need an even higher frame rate for humans to not see the transitions.
What I'm more annoyed about is those who must run games in 1600x1200 rather than 1024x768 on a typical 19" monitor, and then complain that a gfx card sucks because it doesn't perform well enough at 1600x1200. It's not like you have enough time to spot the microscopic pixels anyway.
Re:Important? (Score:5, Informative)
Also, aside from just the visual effects, more powerful hardware literally gives you better performance in-game. Example: the Quake 3 engine. In the Quake 3 engine, you can jump much further at 150 fps than you can at 30 fps. The way it was coded, when you jump, the game checks on a frame-by-frame basis to see where the jump is going. I think it was designed with a baseline of around 90 fps, if I remember right, which means that if you are running under that, your jumps will be shorter, and over it, longer. Also, in Return to Castle Wolfenstein, if your fps ever drops below 30, then your rate of fire actually slows down. So, just FYI, fps can mean more than simply "how pretty" it looks.
Re:Important? (Score:2, Insightful)
Just imagine if this 'physics tied to framerate' applied to connection speed: people with Radeon 9700s would have gigabits of bandwidth to play around with, while people stuck with a RagePro would have to deal with 28.8K rates.
Re:Important? (Score:4, Informative)
Having a physics engine be dependent on the current framerate shows a flaw in the game's design, and it is just one more reason to stop using the sorely outdated Q3 engine to benchmark new hardware.
Just to clarify, again: this WAS a bug in Quake3Arena. However, it WAS NOT a bug in the "Quake 3 engine". It was a bug in the Quake3 game code. The "Quake3 game" is separate from (and built on top of) the "Quake 3 engine". The engine is the basic graphics and network system, whose source code is NOT available, while the Quake 3 game itself was built essentially as the "default mod" for that engine, and the source code for it is available.
The slightly frame-rate-dependent jumping in Quake3 was a bug in the game code, and it ONLY affected the jumping. The bug was fixed in one of the Quake3 patches. The game was designed so that the physics would NOT be frame-rate dependent. As you said, that would be a major flaw in a game's design.
If the physics in a game were frame-rate dependent, you would see a HUGE difference in physics performance between 30, 60 and 90 fps. These sorts of rates affect (badly designed) game physics in a big way - you would notice it quickly. No major commercial game intentionally has such flaws.
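To make the distinction concrete, here's a minimal sketch (generic Python, not id's code; the gravity and jump-velocity constants are made up for illustration): integrating the jump once per rendered frame makes the peak height depend on fps, while a fixed simulation timestep keeps it identical at any frame rate.

# Generic sketch, not id's code: gravity and jump velocity are made-up numbers.
GRAVITY = -800.0
JUMP_VELOCITY = 270.0

def peak_height_variable_step(fps):
    """Integrate once per rendered frame: dt = 1/fps, so the result drifts with fps."""
    dt = 1.0 / fps
    y, vy, peak = 0.0, JUMP_VELOCITY, 0.0
    while vy > 0.0 or y > 0.0:
        vy += GRAVITY * dt
        y = max(0.0, y + vy * dt)
        peak = max(peak, y)
    return peak

def peak_height_fixed_step(fps, sim_hz=125):
    """Accumulate real frame time but always step the simulation at sim_hz."""
    frame_dt, step_dt = 1.0 / fps, 1.0 / sim_hz
    y, vy, peak, acc = 0.0, JUMP_VELOCITY, 0.0, 0.0
    for _ in range(fps * 2):                  # two seconds of rendered frames
        acc += frame_dt
        while acc >= step_dt:
            vy += GRAVITY * step_dt
            y = max(0.0, y + vy * step_dt)
            peak = max(peak, y)
            acc -= step_dt
    return peak

for fps in (30, 90, 150):
    print("%3d fps: per-frame peak %.2f, fixed-step peak %.2f"
          % (fps, peak_height_variable_step(fps), peak_height_fixed_step(fps)))
# The per-frame peak rises with fps; the fixed-step peak is identical at all three.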
Re:Important? (Score:5, Informative)
Just to clarify: Quake3Arena wasn't specifically coded to do this, it was actually a bug, and it only affected the jumping physics; nothing else in the game was affected. (It was not intentional behaviour; in fact the game was specifically designed to try NOT to have the physics depend on the frame rate.) You could jump a little bit higher, and in some maps this gave a big advantage, e.g. DM13, since you could take a shortcut to the megahealth. The bug was fixed in one of the last patches (I think they made the fix optional, though).
The jumping performance also wasn't proportional to the frame rate; the bug only occurred around specific frame rates, such as 120 fps.
Re:Important? (Score:2)
Re:Important? (Score:5, Insightful)
One reason Quake3 suffers far worse at 24 fps than a movie does is that your camera pan rate is typically MUCH quicker than anything you'll ever see in a movie. When playing Q3, you often need to pan your camera up to 180 degrees horizontally in less than a quarter of a second (and I'm being generous; that's if you're slow). So that's a camera pan rate of 720 degrees/sec. At 24 fps, that means a delta of 30 degrees per frame; those are pretty big jumps, each image will be quite different, and your brain has to work pretty hard to perceive the motion. I doubt you'll ever really see a camera pan that fast in a movie, except in very rare and particular cases.
In Quake, your brain is also trying to do a lot more work to analyse the image it's getting, while in a movie you are normally fairly relaxed and don't concentrate that hard on the image.
It's easier to pick up image "choppiness" in your peripheral vision. If you sit fairly close to the screen in the cinema, and you're looking at the center, you can fairly easily pick up jerkiness in motion at the sides of the screen (out of the "corners of your eyes").
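Just to put numbers on it, here's a trivial sketch (Python) using the 180-degrees-in-a-quarter-second figure from above:

# Per-frame pan delta at various frame rates, using the
# 180-degrees-in-a-quarter-second figure from above.
pan_rate = 180 / 0.25                          # 720 degrees per second
for fps in (24, 30, 60, 90, 120):
    print("%3d fps -> %4.1f degrees between frames" % (fps, pan_rate / fps))
# 24 fps: 30.0, 60 fps: 12.0, 90 fps: 8.0, 120 fps: 6.0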
Re:Important? (Score:5, Interesting)
And 16 million colours is more than the eye can see, and 44,100 samples per second is more than the ear can hear. Throughout the march of technology we've heard these ridiculously arbitrary "limits" of our senses, and invariably they are debunked later on. In essence you can consider them a sort of after-the-fact justification.
but so far I've only heard comparisons with movies and TV
Actually, I've been paying attention at movies, having heard "well, movies are 24fps and they look perfect": MOVIES LOOK LIKE TRASH. Seriously, the next time you go to the movies pay close attention to any large movements on the screen and you'll be surprised how horrendous 24fps really is. For instance, my wife recently dragged me to see "Two Weeks Notice", and there is a scene where the camera pans laterally across a shelf full of shoes at a rate of about a screen width per half second. It looks absolutely atrocious. For fast action most filmmakers either resort to keeping the action to a small portion of the screen, or they use slow motion effects, again because the action simply looks terrible at 24fps.
However, when you get down to it, the root of the "X FPS is more than anyone can see" line is people's astoundingly self-centered claim that no one else can see more than 30fps, or some other metric. This can be disproved instantly via the Q3 command cg_maxfps. Set it to 30 and it looks like a horrendous slideshow. Set it to 45 and it looks like a 1998 computer. Set it to 80 and it feels smooth with a bit of jaggedness. Set it to 90 and it feels nice. You'd think this would disprove the 30fps'ers in an instant, but amazingly they persist.
and then complain that a gfx card sucks because it doesn't perform well enough at 1600x1200. It's not like you have enough time to spot the microscopic pixels anyway.
1600x1200 on a 19" monitor is hardly "microscopic" pixels. However, to think about this in a forward-looking manner, consider the heavyweight video card required to drive 1080p resolution on an HDTV set: 1920x1080.
FSAA, BTW, is tremendously difficult for video cards to do (because they're actually rendering at 2x or greater resolutions): there is no current video card that could dream of doing even Urban Terror (a Q3 mod) at 1600x1200 with FSAA at acceptable frame rates.
Re:Important? (Score:2, Insightful)
Yes, I know what you're saying and have noticed it myself, although I'm sure it doesn't look as bad as 24fps on a monitor would. Again, perhaps the only reason movies are watchable at all is that the blurriness at the frame transitions makes it easier for the brain to "add in" the extra information and interpolate. Yes, movies look kinda jerky to me, but at least I tend to forget about it after a short while once I get into the movie's story line. I think I'd have a harder time with a monitor at 24 fps.
I didn't know that Q3 had such a setting, and if it properly fixes the frame rate it might be a decent tool for finding the actual "when you don't notice the difference" rate, although I'm sure it varies from person to person.
1600x1200 on a 19" monitor is hardly "microscopic" pixels
Wow, I'd like to have your eyesight.
I use 1280x1024 on my 19" usually and even then the pixels are pretty small to me.
Gaming resolution. (Score:3, Insightful)
Wow, I'd like to have your eyesight.
I use 1280x1024 on my 19" usually and even then the pixels are pretty small to me.
In first-person shooters, you're typically looking for small visual details in known locations (when you're not just in a twitch-reflex situation). In Tribes 2, at least, it's nice to be able to spot an enemy without having to pick out the one off-colour pixel in a grainy mountainside texture map, and even better to see what kind of gun he's holding, or that he's repairing something.
Features like zooming help you with the latter case but not the former (noticing the enemy in the first place).
While high-resolution displays aren't vital, they definitely are helpful.
Re:Important? (Score:3, Informative)
Re:Important? (Score:2, Informative)
Actually, one of the reasons movies are so horribly jerky is that the actual refresh rate is 48fps, even though the frame rate is 24fps. Each frame is projected twice. The reason for this is to reduce flickering and to protect the film (projector lamps are HOT). Unfortunately, this double exposure messes up the brain's visual prediction system, much in the same way a 30fps game on a 60Hz screen does, only more so. Since there is a tangible delay between capturing an image at the optic nerve and feeding it to the brain, a lot of work goes into predicting what things will look like by the time you receive the visual stimuli.
I agree that even a monitor at 48Hz would look worse than a movie theatre, but I expect this has something to do with the relatively low contrast movie screens have. A darker image takes longer to "see" than a bright one, not unlike how a photographer needs a longer exposure to take a picture in a dark environment.
Ever seen "3d-glasses" that have one dark glass and one perfectly transparent, instead of the normal red and blue/green? Those work on that principle, and the effect is best when the camera rotates clockwise around an object or pans across a landscape from right to left. If you reverse the direction, the 3D-effect is also reversed.
But I digress:
My point is that human vision is incredibly advanced, with a lot of special adaptations. There is no "frame rate of the eye". Fighter pilots have been shown to be able not only to see but also to correctly identify a picture of a plane even when the image is flashed for just 1/200th of a second.
The ideal frame rate is the same as the monitor refresh rate, held constant. I'd much rather have 75fps at 75Hz than 80fps at 85Hz.
Re:Important? (Score:2)
Re:Important? (Score:4, Funny)
Limits to human perception. (Score:3, Informative)
These limits aren't arbitrary. You can test them the same way you proposed that frame rate limits be tested.
For colour gradations, make a picture that has a very gradual colour ramp from 0-255 in each colour (or one that sweeps across colour tones, but that changes at most one component by at most one between adjacent bands).
When I tried this with an old VGA card that used 18-bit colour, I could see banding. I had to stare for a while to let my eyes adjust, but I could see it.
When I try it on a modern card with 24-bit colour, I see no bands if the monitor's gamma correction is properly adjusted.
A monitor without gamma correction will end up expanding some brightness ranges and compressing others, with the result that gradations will not be visible at all in some areas and will be (barely) visible in others. Check your configuration before complaining.
The 24-bit argument applies to distinguishing colours. Similar experiments (not performed by me) have shown that you get about 10 bits of depth in greyscale, as humans have more sensitive black and white vision than colour (which is why everything appears in shades of grey at night with poor lighting; go for an evening walk and look for badly-lit stop signs some time).
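If you want to try the gradient test yourself, here's a minimal sketch (Python, standard library only; the image size and filename are arbitrary choices of mine) that writes a grey ramp as a PPM file you can view full-screen:

# Minimal sketch: write a 1024x256 horizontal grey ramp as a binary PPM.
# Viewed full-screen on a properly gamma-corrected display, 8 bits per channel
# should show no visible bands; 6 bits per channel (18-bit colour) usually does.
WIDTH, HEIGHT = 1024, 256          # arbitrary sizes, 4 pixels per grey level

with open("ramp.ppm", "wb") as f:
    f.write(("P6\n%d %d\n255\n" % (WIDTH, HEIGHT)).encode("ascii"))
    row = bytearray()
    for x in range(WIDTH):
        v = x * 255 // (WIDTH - 1)  # 0..255 ramp across the image width
        row += bytes((v, v, v))
    f.write(bytes(row) * HEIGHT)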
You can do the same kind of tests with sound. It's actually more difficult with modern sound cards, as they have low-pass filters that cut off everything above about 22 kHz (the Nyquist limit for a 44 kHz sampling rate), but a PC speaker works. Or use a piezo buzzer and a signal generator if you're worried about the speaker's efficiency dropping at high frequencies. My hearing, last time I tested it (and last time it was tested by a doctor), dropped out at about 18 kHz.
The reason why higher frequencies are relevant at all is because of nonlinear behavior both in the speakers and in the human ear. Beat frequencies between high-frequency tones can turn into audible frequencies when interacting with nonlinear systems (this is how that two-tone ultrasonic speaker linked to a while back worked). However, the key is that the final tone you hear is in the audible frequency range. This means you can duplicate the sound perfectly by using a microphone that acts more like the human ear when recording (i.e. that has similar nonlinear effects), or by recording at high frequencies and applying appropriate transformations before downsampling.
The fact remains that if I played a 20 kHz pure tone at you right now, you wouldn't hear it. And this is easy to verify by experiment.
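And if you don't have a signal generator handy, a rough at-home substitute is a sketch like this (Python standard library only; it assumes your sound card and speakers can actually reproduce these frequencies, and the amplitude is deliberately modest):

# Sketch: write short pure-tone WAV files at several frequencies.
# Play them back and note where they disappear for you.
import math, struct, wave

RATE = 44100                       # CD sample rate
DURATION = 2.0                     # seconds per tone (arbitrary)
AMPLITUDE = 12000                  # well below 16-bit full scale

for freq in (8000, 12000, 16000, 18000, 20000):
    with wave.open("tone_%dHz.wav" % freq, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit samples
        w.setframerate(RATE)
        n = int(RATE * DURATION)
        samples = (int(AMPLITUDE * math.sin(2 * math.pi * freq * i / RATE))
                   for i in range(n))
        w.writeframes(b"".join(struct.pack("<h", s) for s in samples))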
In summary, while you're most definitely right about frame rates, your other objections about limits are unfounded.
Re:Limits to human perception. (Score:2)
3 samples * 490/2 lines (interlaced) * 60 Hz = 44,100 Hz.
How is that for arbitrary?
Re:Limits to human perception. (Score:2)
3 samples * 490/2 lines (interlaced) * 60 Hz = 44,100 Hz.
How is that for arbitrary?
Like 8-bit colour components, it represents a convenient value. But, like 8-bit colour components, it hasn't been replaced because it's close enough to the limits of perception to be indistinguishable for practical purposes. This is especially true for sound, as there would be no reason not to go to 4 samples per line if needed, whereas higher-fidelity colour components require sacrificing either ease of use of graphics cards (non-power-of-two sizes for RGBA pixels) or the alpha channel.
[ObDisclaimer about high-fidelity equipment being needed for sound/image processing/compositing, where errors stack and sounds and colour values are rescaled/resampled.]
Re:Limits to human perception. (Score:2)
Not when I tried it. These were images that didn't cover the whole gradient, and had tens of pixels per band (Mandelbrot sets with fun colour maps to iteration counts, if you're curious).
Regarding the Nyquist rate of CD audio:
How do you represent a 22kHz sawtooth wave?
A sawtooth wave is a sine wave at the fundamental frequency plus a bunch of harmonics at higher frequencies. If you can hear the difference between a 22 kHz sawtooth wave and a 22 kHz sine wave, you're not from this planet, because the harmonics are about a factor of two out of human hearing range.
By all means get a signal generator and try it. Or bring a piezo speaker into your local university's electronics lab to perform the test.
If you can hear a 22 kHz *anything*, you have exceptional hearing to begin with.
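For reference, the Fourier series of a sawtooth makes this explicit (a standard result, nothing specific to CD audio):

$$\mathrm{saw}(t) = \frac{2}{\pi}\sum_{k=1}^{\infty}\frac{(-1)^{k+1}}{k}\,\sin(2\pi k f t)$$

With f = 22 kHz the fundamental sits at 22 kHz and the next component is already at 44 kHz, so everything that distinguishes the sawtooth from a plain sine lies far above the range of human hearing.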
How can you accurately represent an 18kHz wave without harmonic distortions? Answer: You can't, because you get an interference
Nope. Break out that signal theory book - as long as the original signal was a pure tone, you have no aliasing. Sampling aliases higher frequencies down to lower on recording (which is why you need a low-pass input filter on any digital recording device). No high frequencies, no aliased signals at lower frequencies. Reconstruction causes higher-frequency harmonics if your output filtering is bad, but a) output filtering on sound cards is decent, and b) the harmonics are above your hearing range unless you're playing back at less than 10 kHz.
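If you'd rather see it than take the math on faith, here's a quick sketch (it assumes NumPy is available; the 18 kHz / 44.1 kHz figures match the discussion above): sample a pure 18 kHz tone at 44.1 kHz and look at the spectrum; the only peak is at 18 kHz, with nothing folded down into the audible range.

# Sketch (assumes NumPy): a pure 18 kHz tone sampled at 44.1 kHz has a single
# spectral peak at 18 kHz; nothing is folded down into lower frequencies,
# because 18 kHz is below the Nyquist limit of 22.05 kHz.
import numpy as np

RATE, FREQ, N = 44100, 18000, 1 << 16
t = np.arange(N) / RATE
signal = np.sin(2 * np.pi * FREQ * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(N, d=1.0 / RATE)
print("dominant component: %.0f Hz" % freqs[np.argmax(spectrum)])   # ~18000 Hz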
In summary, you made an assumption that I didn't regarding colour, and you don't seem to be working through the math regarding sound.
Re:Limits to human perception. (Score:2)
There are three types of artifact that could be imposed: quantization, low-pass filtering, and slew-rate limiting. Quantization would result if the DAC on the video card could generate fewer than 256 levels. While this could make any given pair of bands the same colour, contrast between other bands would have been worse, so I think it can be dismissed without further consideration.
Slew-rate limiting occurs if there is a limit to the speed at which a signal can be changed. However, this would affect high-contrast edges first. The fact that I can see a grid pattern or the text I'm typing suggests that it is not a factor in seeing or not seeing banding on smooth gradients.
The more important artifact is low-pass filtering. This tends to average the colours of adjacent pixels, smoothing gradients. This would indeed remove or reduce banding. As the eye does differential processing, the perceived effect would be worse for low-contrast edges (the type I'm trying to measure). However, two factors suggest that this is not what limits my ability to perceive banding. Firstly, the DACs on modern video cards are rated to 300+ MHz sampling rates, while conventional desktop modes use far lower sampling rates (1024x768x85 has a dot clock of around 95 MHz, even after adding the horizontal and vertical blanking interval contributions). This means that if there is filtering, it's unlikely to be in the graphics card (leaving the monitor and monitor cable as options). Secondly, I can see banding with 18-bit colour even at high resolutions (where filtering problems are worst, due to high dot clocks), but I can't see banding with 24-bit colour even at low resolutions (where filtering problems are least severe). If low-pass filtering was limiting contrast at low resolutions, the problem should be bad enough at high resolutions to be noticeable with higher-contrast edges. Ditto for varying refresh rates with a low-resolution mode instead of varying resolution, though that only lets you change the dot clock by a factor of two or three due to monitor vsync limits.
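For the curious, the dot-clock figure is easy to sanity-check (a back-of-the-envelope sketch in Python; the ~40% blanking overhead is a typical ballpark, not an exact VESA timing):

# Back-of-the-envelope dot clock for 1024x768 at 85 Hz. The ~40% blanking
# overhead is a typical ballpark, not an exact VESA timing.
width, height, refresh = 1024, 768, 85
visible = width * height * refresh             # ~66.8 Mpixels/s
with_blanking = visible * 1.4                  # ~94 MHz dot clock
print("%.1f MHz visible, ~%.0f MHz with blanking"
      % (visible / 1e6, with_blanking / 1e6))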
The fact that I can't see banding in low resolution modes also suggests that dot pitch of the monitor is not an issue (I buy monitors with overspecced resolution capability out of habit anyways).
In short, I'm not convinced it's the hardware that prevents me from seeing banding. If you can think of a source of smoothing that I'm missing, I'll certainly reconsider my view.
I can believe that there are exceptional humans able to perceive contrasts more accurately, but I'm not one of them, and I doubt most other people are either.
Although humans have more sensitivity to grayscale, the dynamic range of color vision is higher than for distinguishing shades of gray. Sensitivity has nothing to do with dynamic range in this context.
Good point. However, I do recall being told that the dynamic range for greyscale was higher as well.
I am too lazy to look it up, but the reference is in the color chapter in Computer Graphics.
Thank you for the reference. I'll look it up if this thread drags on for more than a week.
24 bits is used because there are three channels and a byte of 8 bits is convenient on most architectures.
If the quality change was noticeable enough to be useful for marketing, I'm confident we'd be using an 11/11/10 mode for 32-bit colour by now (with alpha channels stored somewhere else, or not used at all, outside of game mode). Remember the whole 32-vs-16 PR battle between 3dfx and nVidia a while back.
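Packing the bits is the trivial part, for what it's worth (a hypothetical sketch; the function names are mine): the cost shows up in hardware and API support, not here.

# Hypothetical sketch (names are mine): packing and unpacking an 11/11/10
# integer pixel into 32 bits. The packing itself is a few shifts and masks.
def pack_11_11_10(r, g, b):
    """r, g in 0..2047 (11 bits each); b in 0..1023 (10 bits)."""
    return ((r & 0x7FF) << 21) | ((g & 0x7FF) << 10) | (b & 0x3FF)

def unpack_11_11_10(pixel):
    return (pixel >> 21) & 0x7FF, (pixel >> 10) & 0x7FF, pixel & 0x3FF

packed = pack_11_11_10(2047, 1024, 512)
assert unpack_11_11_10(packed) == (2047, 1024, 512)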
Re:Important? (Score:3, Interesting)
Re:Important? (Score:2, Interesting)
No. This is bogus. The Kyro II chip, along with the GF1/2, had a form of FSAA where it basically rendered the image at a much higher resolution than it was going to be displayed at, but that approach doesn't exist anymore (at least not in the R300/NV30/NV25/R200). I BELIEVE this is called supersampling -- don't quote me on this, I'm not a coder and don't care too much about FSAA modes (I have a GF3; I can't use AA in ANYTHING but the oldest games). Supersampling takes a much larger performance hit, but a lot of people regard it as looking better than the newer method. That newer method is called multisampling -- it samples the image multiple times, offsetting each sample slightly. This is why color compression has become so important: 4x MSAA COULD take up to four times the memory bandwidth of normal rendering, but with adequate color compression you could get it down to two times or 1.5 times the bandwidth. This is part of the reason why nVidia went with a 128-bit bus on the GFFX -- it thought it had good enough color compression.
Anyway, moving right along, there are two forms of MSAA (multisampling antialiasing)--ordered grid and rotated grid (once again, do not quote me on this).
So basically, FSAA ain't as simple as rendering at 3200x2400 and reducing that to 1280x960 anymore.
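For reference, the downsample step of that older supersampling approach really is just a box filter (a simplified sketch, not any vendor's actual implementation):

# Simplified sketch of 2x2 supersampling's final step: average each 2x2 block
# of the high-resolution render into one output pixel (a plain box filter).
def downsample_2x(image):
    """image: list of rows of (r, g, b) tuples; dimensions divisible by 2."""
    out = []
    for y in range(0, len(image), 2):
        row = []
        for x in range(0, len(image[y]), 2):
            block = [image[y][x], image[y][x + 1],
                     image[y + 1][x], image[y + 1][x + 1]]
            row.append(tuple(sum(channel) // 4 for channel in zip(*block)))
        out.append(row)
    return out

# A hard white/black edge averages to grey where it crosses a 2x2 block:
hi_res = [[(255, 255, 255)] * 3 + [(0, 0, 0)]] * 2
print(downsample_2x(hi_res))       # [[(255, 255, 255), (127, 127, 127)]]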
Re:Important? (Score:2)
While this was true until recently, there has been some significant progress in this area. The Parhelia, for instance (bad example, I know, but stick with me), only anti-aliases the edge pixels of triangles that aren't joined to other triangles, spectacularly reducing the quantity of work needed to get at least the beneficial effects of AA, if not true AA itself. There's also been some work on using non-uniform sample grids; I believe this is how the GF4's AA works.
Go look at some benchmarks: the more modern cards still sweat with AA, but not nearly to the hernia-inducing loads my GF3 would be put under should I ever run AA on it... And this, to me, is what the latest generation of video cards is all about - much the same in terms of framerate, but better visual quality.
Dave
Re:Important? (Score:2)
Sure. Just keep in mind that real-life video images automatically get "FSAA" when displayed on a TV screen, which is one reason why TV phosphors are nigh-invisible at much lower resolutions than computer monitor pixels. The other reason is that you typically sit much closer to your computer monitor.
You're absolutely right about movies though. The pan-strobe effect is most noticeable if you're sitting in the first ten rows or so.
Re:Important? (Score:2)
I know these guys shot for 60 Hz:
http://www.nads-sc.uiowa.edu/multimedia.htm
You need to check out the concept of. . . (Score:2)
Please note the extra zeros at the end of 2000 as opposed to 30.
Believe it or not, they make a difference.
No, not the sort of difference you're talking about, the sort of difference that means the difference between 2467 and 2556 doesn't make a difference, even though the difference between 30 and 60 does.
Get the difference?
KFG
Re:Important? (Score:2)
http://www.penstarsys.com/editor/30v60/30v60p1.
Further, it's not the average frames/second that informed gamers are interested in so much as the minimum fps, which, if it ever drops below the optimal amount, will mean the display goes from smooth to jerky.
My flamebait comment of the day: sure, all the half-blind programmers and sysadmins here on
Until I can't differentiate visually between what's on my screen and what I see in the big blue room outside my house, 3D graphics are not good enough.
Re:Important? (Score:2)
One wrinkle that creeps in at high frame rates is synchronization with the monitor refresh. Typical monitors run at 75-100Hz, and when the video card starts breaking into that range of frame rates, you can get bad stuttering effects where a few consecutive refreshes show different frames and then one refresh repeats a frame because the video card couldn't quite catch up (or you can turn off "vsync" and get tearing instead of stuttering).
There is definitely a benefit to an "overpowered" graphics system that can deliver 200fps, because it gives extra headroom for when the action gets intense and the frame rate drops a bit. You might not notice a drop from 200 to 150 fps, but you will REALLY notice 60 fps to 30 fps...
(John Carmack talks about this in a
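A toy model of the double-buffered vsync case makes the cliff obvious (a sketch; the numbers are illustrative, not measured): once a frame takes even slightly longer than one refresh interval, the swap waits for the next one, so the displayed rate snaps down to refresh/2, refresh/3, and so on.

# Toy model: with vsync and double buffering a finished frame is shown only on
# a refresh boundary, so the displayed rate snaps to refresh/1, refresh/2, ...
import math

REFRESH_HZ = 75
refresh_interval = 1.0 / REFRESH_HZ

for render_fps in (200, 150, 76, 74, 40):
    render_time = 1.0 / render_fps
    intervals = math.ceil(render_time / refresh_interval)   # wait for next refresh
    print("renders at %3d fps -> displayed at %.1f fps"
          % (render_fps, REFRESH_HZ / intervals))
# 200 and 150 fps both display at 75; slipping from 76 to 74 halves it to 37.5.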
Re:Important? (NOT) (Score:2)
Do any of you insightful people understand that you just can't display 2000+ fps on a video monitor (and LCD displays are even slower than CRTs)? OK, just maybe a frame rate above 30 fps might help a little, but if your system is actually spending CPU power on rendering more than the useful number of video frames, then it's really wasting time that it could better spend on user input or data transfer or something else that really does matter in the game. Of course, this delay is also very small, so only hair-splitting fanatics would care about it, but those are just the people going after unrealistically high frame rates.
Of course, more video power can be applied in other ways that do help the user, such as higher resolution, better lighting effects, and so on, but that isn't the issue that many here seem to care about - they just want frame rates that their video display is never going to show. So even if they foolishly think their eye can see extremely high frame rates, they miss the basic truth that video cards could get 100 times faster, but those extra frames will never reach their eyes.
Hoax? (Score:2)
While one fake story does not prove the other is fake, it does indicate that the journalistic standards are somewhat lax.
Re:Hoax? (Score:2)
New video cards = quality first, *then* speed (Score:2)
New video card technology means speed in old games and features/visual quality in new games. I can guarantee you that no matter how advanced video cards get in 5 years, artists and designers will still be able to generate content that slows the game to a crawl.
Linux? (Score:2)
Re:Linux? (Score:3, Insightful)
OpenGL is still alive largely because of id. Quake does not use DirectX, so everyone who wants to use their 3D accelerator to play it needs an OpenGL driver. (Of course, OpenGL lives on in the high-end workstation sector for other reasons.)
Re:Linux? (Score:2)
Not sure where you are getting "nvidia saved opengl" from. Nvidia and ATI (among others) helped OpenGL become popular in the first place, mainly in competition with 3dfx's GLIDE.
Re:Linux? (Score:2)
Aww nuts (Score:5, Funny)
don't worry, you need a new MB (Score:2)
they now have a license for the Intel P4 bus
and can do AMD
really this is the way to go, as you could do funky things
and farm the chipset off to OEMs
and all the problems of cooling go away!
regards
John Jones
damn (Score:5, Funny)
Re:damn (Score:2)
Re:damn (Score:3, Interesting)
BTW, I don't think video card memory is a good measure of R&D. Cramming more memory onto a card doesn't require much thought or design on nVidia's or ATI's part, except maybe on the part of the memory manufacturers in reducing latency and cost.
Re:damn (Score:4, Interesting)
128MB is only 4 times more than 32MB.
Comparing absolute numbers may not be the most valid comparison.
Also, the growth in the RAM size of video cards should be attributed to improvements in memory density, not really to improvements in video card technology, although video cards and memory technology sort of feed on each other, since video cards tend to use the latest and greatest memory chips, at least one generation ahead of what mainboards use.
Yeah... (Score:2, Informative)
Re:Yeah... (Score:2)
Of course, the 9900 looks yummy with DDRII support (in the future).
ATI taking the lead (Score:2, Interesting)
I know that nVidia cards process OpenGL better than ATI's (from what I have seen); however, ATI has the advantage of being compatible with almost everything that uses DirectX. This may not seem like much since it's an MS product, but more and more games are using DirectX, especially since WinXP comes with default drivers for DX and not OpenGL.
Plus ATI got their name on the GameCube, which makes them the coolest company ever.
Not that great (Score:2, Interesting)
Mac drivers! (Score:2)
Re:Not that great (Score:2)
Plus, the FireGL cards (which are somewhat souped up Radeons) are important to them as well. As popular as DirectX is for gaming, it is worthless in the realm of professional 3D graphics.
Now what nVidia does spur on is the now fledgling Linux OpenGL support for consumer cards. If nVidia were not in the market touting their Linux drivers, you can bet ATI would go back to not giving a rat's ass about official driver support for consumer level cards. They might even cease what cooperation they have been doing with the open source community if nVidia dropped out. GL support is not threatened, but the 'fringe' users out there would have a harder time getting support.
Re:ATI taking the lead (Score:5, Informative)
How so?
> however ATI has the advantage of being compatible with almost everything that uses DirectX.
What do you mean? NVIDIA is basically as compatible with DirectX as ATI is, and vice versa with OpenGL.
> This may not seem like much since it's a MS product, but more and more are using DirectX
More innovation has taken place in DirectX than in OpenGL (discounting vendor-specific extensions) in the last few years. However, there are important OpenGL-only games on the horizon, such as Doom3. Other newer games, such as UT2k3, can run in either OpenGL or DirectX modes.
> especially since WinXP comes with default drivers for DX and not OpenGL.
WinXP also comes with opengl32.dll.
Re:ATI taking the lead (Score:2)
NVidia's OpenGL ICD is more stable than the Radeon's, but nowhere near as good as the FireGL's (and FireGLs bootstomp Quadros).
Re:ATI taking the lead (Score:2)
Re:ATI taking the lead (Score:5, Informative)
"ATI has the advantage of being compatible with almost everything that uses DirectX" Hmm.. nVidia, SIS, Trident, Matrox, S3, all are too so ATI doesn't have an advantage, they're simply on par with the competition. Don't forget the Xbox with the nv2a chipset (basically GF3 with better DXT1 support and 2 vertex pipelines) is only programable via DirectX 8.
I'm not sure what you mean by nvidia cards process OpenGL better than ATI. I think you mean they more fps in QuakeIII. Maybe so, but this would happen if id used Direct3D as well. If you mean ATI's drivers have traditionally been poor (poor not just in terms of running applications slowly, but poor in terms of running applications incorrectly or not at all) then that is unfortunatly very true. ATI has always struggled to deliver more than 50% of a product (they make good hardware, but without good software drivers you've got a bad product. 3D chip companies have more people on their software teams than their hardware teams!), but that is more of a 2001 issue than a 2003 issue. They've really come a long long way.
BTW every major game, even those from id, use DirectX. You're actually talking about Direct3D, not DirectX, from what I can tell.
Re:ATI taking the lead (Score:3, Interesting)
In reality, top-end games currently run better on OpenGL (when properly configured): for example, Half-Life (and mods), Battlefield 1942, Return to Castle Wolfenstein (which only runs on OpenGL), and others that run on the Quake 2 & 3 engines.
So yes, you are correct, OpenGL isn't going anywhere any time soon. I didn't say it was. I simply pointed out the benefits of ATI and why it is more likely to succeed on the newbie's desktop.
Re:ATI taking the lead (Score:2)
Shacknews (Score:5, Informative)
HardOCP: http://www.hardocp.com/index.html#6494-1 [hardocp.com]
Shacknews: http://www.shacknews.com/onearticle.x/24906 [shacknews.com]
Re:Shacknews (Score:5, Insightful)
Where's the cold, hard facts?
Re:Shacknews (Score:2)
It wouldn't be the first time
(moderating comment - never bought nVidia, big ATI fan)
Re:Shacknews (Score:2, Funny)
Vaporware (Score:4, Funny)
Re:Vaporware (Score:3, Insightful)
Just for clarification... (Score:4, Informative)
Rumors, rumors (Score:5, Interesting)
Gotta love the grapevine.
TheInquirer (Score:3, Insightful)
You are wrong here. TheInquirer [theinquirer.net] has one of the best track records of any tech site; they actually care to think rather than regurgitate press releases from corporations.
Re:TheInquirer (Score:2, Funny)
Yeah...... With headlines like [nationalenquirer.com]:
OJ Attacks Daughter!
Britney love triangle turning violent
Motley Crue rocker is deadbeat dad
They sure seem reputable to me!
...and...? (Score:5, Interesting)
Meanwhile ATI will enjoy higher profits and will have a bit of breathing room. Hopefully, they will use this time to extend their product offerings viz the R350 core, continue pouring money into driver development, and keep working on R400 or whatever their next-gen core ends up being called. In any event 6-9 months from now we will see these next-generation parts coming to market, and they will be just that much better.
Re:...and...? (Score:5, Interesting)
People said this about 3dfx right when it released the long-delayed, big, noisy, power-hungry Voodoo 5 5500 (while Nvidia had long since taken the lead).
The same thing seems to be happening to Nvidia, only this time with ATI taking the lead.
Re:...and...? (Score:2)
I may be a little biased, just finished building my new compy with an NForce2 based mobo and quite simply it rocks.
Hardly 3dfx (Score:5, Interesting)
nVidia are a larger company with a string of huge successes to date. They have a much more diversified income, including some very popular OEM chips, the successful nForce2 (and less-successful Xbox) chipsets, a well-regarded pro card line, and a significant share of the Apple market too. Not to mention quite a bit of cash in the bank.
A single high-end chip (which is a small % of their total revenue anyway), even if it failed completely, is not going to impact their bottom line that much. It'll have more impact on their image as graphics leader, but they have the resources to learn, move on, redesign and try again.
Re:...and...? (Score:2)
Remember the Apple Lisa? (Score:2, Interesting)
Sometimes it takes a brilliant failure like this to catapult R&D to the next level. Let's hope that happens here.
Re:...and...? (Score:2)
This year's performance card is next year's value card, so it is not just a question of whether they will be able to produce a competitive performance card within a reasonable time. They need to produce a value card that is about as good as the Radeon 9700 within about a year (a year and a half might be good enough). If nVidia is still playing catch-up a year from now, then they are doomed.
Re:...and...? (Score:2)
how about round 2 (Score:2)
Hardly surprising... (Score:5, Funny)
Note, I'm not an ATI fanboy (actually I'm running a GeForce1 right now) but I'm really appalled at what 3dfx^H^H^H^HNVidia was thinking when they created this card...
In Business News Today... (Score:3, Funny)
It gets worse... (Score:5, Informative)
The Radeon 9900 [theinquirer.net] is expected out next month, with the new R350 core.
I am glad I don't have Nvidia stock right about now.
Best Buy stores only fulfilling pre-orders (Score:5, Informative)
-macado
Visiontek jabs from beyond the grave (Score:5, Funny)
Make sure you have your speakers on..
Vendor Confirmation (Score:5, Interesting)
Additionally, it seems the "Radeon9900" information at Xbitlabs might be less accurate than it appears.
This isn't the greatest news for Nvidia, but it doesn't exactly break the bank: Nvidia still has the lion's share of the graphics market, and will probably keep that market simply due to Tier 1/2 OEM sales, as well as their reputation - even though ATI has faster hardware, Nvidia has a history of rock-solid drivers going back 4 generations. Although ATI's driver quality has improved significantly in recent times, it's still not up to par with Nvidia's. And be sure that Nvidia will capitalize on that, since they don't currently have bragging rights for their hardware.
What is the yield problem caused by? (Score:2, Insightful)
not good for nvidia (Score:4, Interesting)
Re:not good for nvidia (Score:2)
ATI Still Not There (Score:2, Interesting)
hmph. not really surprised (Score:2, Flamebait)
If the 4200 I'm using as a replacement for the 4400 dies, I'm going to ATI, and not looking back.
About the loss of that PCI slot... (Score:3, Interesting)
How stupid.
On practically every motherboard out there today, PCI 1 and the AGP slot share resources, so you're crippling your system performance by putting a card in each.
As I remember it, PCI has four specific IRQ channels allocated to it, and thus the original spec provides one IRQ for each slot. Modern motherboards get around this by having different slots share bus mastering, so that two devices can piggyback on one slot. Usually, the onboard IDE controller piggybacks on one slot, and the last two slots (usually PCI 5 and 6) are often coupled together. By the same token, the AGP slot often shares an IRQ with PCI 1.
So, in short, if you're going to complain about the cooling system, complain about it being loud. You weren't losing anything on your motherboard that you could even use to begin with.
Yes, it's real, but NV30 lives on (Score:3, Informative)
But, it's not just a rumor anymore. When it first came to [H], everyone regarded it as BS. It was a rumor posted on a board that spread incredibly rapidly. But, apparently it's been confirmed by either OEMs or nVidia itself to those with good contacts. BFG has stopped taking preorders, AFAIK, because...
"According to an e-mail John Malley sent out a couple of days ago, BFG is concerned that pre-sales may exceed their allocation of units."
So, yes, the 5800Ultra is gone. Oh well. NV35 in June, according to some.
nooo (Score:3, Funny)
My plans to build a hovercraft are smoot!!
*shakes fist upwardly*
These are just rumors (Score:2, Insightful)
Finally, the secret's out! (Score:2)
Maybe this way they can change their name to 3DFX as well before the end...
- Chris
Pure poetry (Score:5, Informative)
Still, I don't see NVidia in the same precarious position as 3Dfx was at the time. NVidia likes to point out that after the latest Radeons were released by ATI, NVidia's market share actually went up, not down. The super-performance market is actually a very small market, and NVidia still offers the best value out there for mainstream users in the GeForce 4 Ti4200. For most people, the extra $250 they'd spend on a Radeon 9700 Pro vs. a Ti4200 is just not worth it - the extra few frames per second you'd get in most games are generally not even that noticeable, and there are a lot of better ways to spend that money. I don't really think NVidia's got a lot to worry about, then - unless the performance gulf and manufacturing problems become so pronounced that public perception (or misperception) filters down to even the mainstream products (as has been ATI's bugaboo over the years).
Still, it looks like the GeForce FX has been NVidia's first real dud in some time. No doubt the "stock" FX 5800's will be a good value once the NV35 is released (just as the Ti4200's are a good value now), but at the moment the card doesn't seem to really fit in any niche. Performance gamers will choose the Radeon 9700 Pro, mainstream gamers will choose the Ti4200, and low-end or business users will continue choosing ultra low-cost but perfectly capable cards like the GeForce 2 Ti.
Canned milk for canned cows (Score:5, Insightful)
I was seriously unimpressed with the GFFX. This is an odd feeling as new nVidia cards have in the past been truly impressive and something to lust after.
"I sense something. A presence I've not felt since..."
While 3Dfx was not in the exact same position as nVidia is, market-penetration-wise and financially, it seems nVidia is pulling a technological page from their book. The GFFX 5800 Ultra Megazord seems a great deal like the Voodoo 5. It is a power-hungry beast of a video card that doesn't live up to all of the hype that's been surrounding it since August, when the Radeon 9700 demanded an answer from nVidia.
Of course the GFFX will improve, and in six more months they'll have a GFFXMXKY that comes as the toy in a box of Count Chocula. Sharing many similarities with the Voodoo 5 isn't necessarily going to Doom the card (get it?), but it is giving ATi a huge shot in the arm. They've got a 5-month-old card that performs about as well as nVidia's latest offering; that is something they haven't been able to boast before. All ATi has to do is not screw up and they will get back a bunch of users who abandoned them when the GeForce smoked the Radeons like fat chronic blunts with a mere driver upgrade.
Even though ATi has the advantage now, I think nVidia will come back with a really strong chip PDQ. They aren't going to accept defeat because their card requires an onboard RTG to run decently. If ATi keeps their momentum going they could top even the next NV chip nVidia releases. Do I care one way or the other? Hell no. I don't want to see either of them lose out; I want as much competition as possible so I get more frames with excellent visual quality for the buck. It will be great to be able to enable all of Doom 3's visual effects with AA and still be able to play the game, especially after people like Raven or Rogue license the engine and build the next Jedi Knight or Alice with it.
Re:Sa da tay? (Score:3, Insightful)
Hiccups such as these (if it is true in the first place) are only a part of progress... remember the Pentium bug? How many said Intel would kill itself...
Re:Sa da tay? (Score:2, Informative)
I agree. I just read that they have been having problems with M$ over the Xbox, where one issue was that they would have had to cut production of some chips to keep up with the pace of Xbox chipsets. Since M$ has revised its projections down, this isn't a consideration anymore.
They make good products. I'm a firm believer that if you don't fuck up every now and then, you aren't taking enough chances. All things considered, I think I will cut them some slack this time and see what comes out next.
Besides, my 4200 is all I can afford for at least a year anyway
Re:Honoring preorders (Score:2)
Re:Annoyances (Score:2)
But hey, I'll reiterate: nvidia cards get hot, to the detriment of the general stability of their host computers. They seem to do this in 2D or 3D modes. I've replaced a lot of M64s, Vantas and GF2MXs because of this, and found that those replacements FIXED PROBLEMS.
Re:Annoyances (Score:2, Funny)
How much less?
Re:"Ultra Cancelation" is lame (Score:4, Funny)
I actually had to read the topic a few times before I got that the "Cancellation" wasn't part of the product name.
"Rumors of a GeForceFX 5800 Ultra Cancelation?"
Wow, that's an innovative product name. I wonder how good it is? lol
Re:Who cares about a video card when... (Score:3, Informative)
http://www.megarad.com/modules.php?name=News&fi
Or this:
http://www.globalcomment.com/science&technology
It's a spoof article unfortunately, but a pretty good one.
HH