GeForce FX Architecture Explained
Brian writes "3DCenter has published one of the most in-depth articles on the internals of a 3D graphics chip (the NV30/GeForce FX in this case) that I've ever seen. The author has based his results on a patent NVIDIA filed last year, and he has turned up some very interesting revelations regarding the GeForce FX that go a long way toward explaining why its performance is so different from the recent Radeons. Apparently, optimal shader code for the NV30 is substantially different from what is generated by the standard DX9 HLSL compiler. A new compiler may help to some extent, but other performance issues will likely need to be resolved by NVIDIA in the driver itself."
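The scheduling point above can be made concrete with a toy model. This is not NVIDIA's actual compiler or the real NV30 pipeline; it's a minimal sketch assuming a hypothetical architecture where back-to-back instructions of the same type (texture fetch or ALU math) stall for one extra cycle, so a compiler that emits all texture ops first and all math last produces slower code than one that interleaves them. The function names `cost` and `interleave` are invented for illustration.

```python
from collections import deque

def cost(schedule):
    """Cycles for a schedule on the toy pipeline: every op costs 1 cycle,
    plus a 1-cycle stall when it follows an op of the same type."""
    cycles, prev = 0, None
    for kind in schedule:
        cycles += 1 + (1 if kind == prev else 0)
        prev = kind
    return cycles

def interleave(ops):
    """Greedy reorder: alternate texture and ALU ops where possible.
    (Real compilers must also respect data dependencies, ignored here.)"""
    tex = deque(k for k in ops if k == "tex")
    alu = deque(k for k in ops if k == "alu")
    out, prev = [], None
    while tex or alu:
        if prev != "tex" and tex:
            out.append(tex.popleft())
        elif alu:
            out.append(alu.popleft())
        else:
            out.append(tex.popleft())
        prev = out[-1]
    return out

# A compiler that groups ops by type (all fetches, then all math):
naive = ["tex", "tex", "tex", "alu", "alu", "alu"]
better = interleave(naive)   # ['tex', 'alu', 'tex', 'alu', 'tex', 'alu']
```

On this toy pipeline `cost(naive)` is 10 cycles while `cost(better)` is 6, even though both schedules perform identical work, which is the general shape of the article's claim: the same DX9 shader can run at very different speeds depending on how the compiler orders it for a given chip.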
Re:Say what (Score:5, Insightful)
It keeps getting excused away by "architecture changes" or "early driver issues" or "the full moon."
Go go ATI! You brought competition back to the consumer 3D board scene, thank you!
Re:Say what (Score:5, Insightful)
Know that there are many ways to do one thing, and each has its pros and cons. In this case, it seems NVidia's way was not the one chosen, and the way DX9 handles things undermines NVidia's method. It's not necessarily because NVidia sucks. Remember the political struggles among Microsoft, NVidia, and ATI during the inception of DX9? I think NVidia has now fallen victim to them.
Re:Say what (Score:3, Insightful)
It sounds like a monopolist helping out whoever it wants and letting the 'other guys' get screwed. Suck.
Re:3dcenter.org is not registered? (Score:2, Insightful)
whois 3dcenter.org@whois.pir.org
Re:Blantantly Off-topic (Score:2, Insightful)
Matrox may have had an advantage a while back, but it's nothing conclusive nowadays.
Re:All I heard was BLAH BLAH BLAH Nvidia sucks (Score:2, Insightful)
http://www.rage3d.com
Re:Say what (Score:3, Insightful)
- NVidia makes drivers for linux, and they don't suck
- NVidia works hard on making sure their cards support OpenGL, which is the only means through which linux can really have 3D, AND it's the only 3D alternative to DirectX
- John Carmack (and the rest of id) develops some of the best games in the industry, and he develops using OGL, as well as for multiple platforms
- ATI has traditionally been a very compliant OEM-type company that loves to bundle its stuff with anything it can to make a buck.
Re:Say what (Score:2, Insightful)
Yeah, right.
(Puts on tinfoil hat) My theory is that MS was annoyed with NVidia after the negotiations over XBox v2 broke down... so they communicated a little better with ATI than NVidia over DX9.
Re:But can you hack a GeForce like you can hack Ra (Score:1, Insightful)
The 9700 was meant to use a fully functional R300 with all 8 PS pipelines. (The Pro with faster clock speeds; both with a 256-bit memory bus.)
The 9500 was meant to get a "half-broken" R300, with just 4 functional PS pipelines. (The PS pipes take up more silicon area than anything else on the die, so a fabrication flaw is statistically likely to land there -- ATI anticipated that.) The Pro has faster clock speeds and a 256-bit memory bus; the non-Pro a 128-bit bus.
They didn't get enough half-broken chips from the fab to satisfy the 9500 demand, so sometimes they had to put fully functional R300 chips on the 9500 cards. Those are exactly the ones that can be software-converted to 9700 cards. The other 9500 cards just can't be converted in software or hardware.
I'd say you were luckier than 1337 there... assuming you didn't start with the non-Pro 9500, in which case the narrower memory bus cripples your card regardless.
Of course, I'm also slightly jealous.