GeForce FX Architecture Explained
Brian writes "3DCenter has published one of the most in-depth articles on the internals of a 3D graphics chip (the NV30/GeForce FX in this case) that I've ever seen. The author has based his results on a patent NVIDIA filed last year and he has turned up some very interesting relevations regarding the GeForce FX that go a long way to explain why its performance is so different from the recent Radeons. Apparently, optimal shader code for the NV30 is substantially different from what is generated by the standard DX9 HLSL compiler. A new compiler may help to some extent, but other performance issues will likely need to be resolved by NVIDIA in the driver itself."
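To make the compiler point concrete, here is a minimal, hypothetical HLSL sketch (the sampler names and the shader itself are invented for illustration, not taken from the article). One widely reported NV30 trait is that full-precision float registers are expensive for it while FP16 half registers are cheap; the same math declared at half precision compiles to _pp (partial precision) PS2.0 instructions, which the NV30 executes at a much higher rate:

sampler2D baseMap;   // hypothetical texture samplers
sampler2D lightMap;

// Straightforward DX9 HLSL, the way most shaders are written --
// every register at full (float) precision, which is slow on NV30:
float4 psMain(float2 uv : TEXCOORD0) : COLOR
{
    float4 base  = tex2D(baseMap, uv);
    float4 light = tex2D(lightMap, uv);
    return base * light * 2.0;   // simple modulate-2x lighting
}

// The same math at half precision. The compiler emits _pp-modified
// instructions, easing the NV30's register pressure considerably:
half4 psMainPP(half2 uv : TEXCOORD0) : COLOR
{
    half4 base  = tex2D(baseMap, uv);
    half4 light = tex2D(lightMap, uv);
    return base * light * 2.0;
}

Precision is only one knob, of course; per the article, register count and instruction ordering also matter on the NV30 in ways a generic DX9 compiler does not model.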
Say what (Score:5, Funny)
Is that the politically correct way of saying "performance sucks"?
Re:Say what (Score:5, Insightful)
It keeps getting excused away by "architecture changes" or "early driver issues" or "the full moon."
Go go ATI! You brought competition back to the consumer 3D board scene, thank you!
Re:Say what (Score:5, Insightful)
Re:Say what (Score:2, Insightful)
Yeah, right.
(Puts on tinfoil hat) My theory is that MS was annoyed with NVidia after the negotiations over XBox v2 broke down... so they communicated a little better with ATI than NVidia over DX9.
Re:Say what (Score:4, Interesting)
The most complex part of a DX8 or DX9 chip is the Pixel Shader, so I'll concentrate on it. Nvidia spearheaded the development of PS1.1 for DX8.
Then ATI stole the show with PS1.4 (DX8.1), which is much closer to PS2.0 than PS1.1. At this point, ATI got Microsoft's ear -- ATI was ahead of Nvidia in implementing programmable shaders in graphics hardware.
So Microsoft had good reason to pay attention to ATI's ideas for DX9 (including what the HLSL should look like and what kind of assembly it should output), long before any Xbox 1 money issues with Nvidia, long before choosing the designer for Xbox 2 graphics/chipset.
I guess
Re:Say what (Score:2)
It doesn't matter what MS is going to use in Xbox 2, 3, whatever. It's that if I want to play the new games, I now have a choice of brands, and pricing is a lot better now too. (I do admit that the boards from either company are very expensive when they are new, but the competition factor brings those prices down quickly.)
Re:Say what (Score:2, Interesting)
Re:Say what (Score:5, Insightful)
Know that there are many ways to do one thing, and there are pros and cons to each of them. In this case, it seems that NVidia's way was not chosen, and the way DX9 handles things undermines NVidia's method. It's not necessarily because NVidia sucks. Remember the political struggles among Microsoft, NVidia, and ATI during the inception of DX9? I think NVidia has now fallen victim to them.
Re:Say what (Score:3, Insightful)
It sounds like a monopolist helping out whoever they want to, and then the 'other guys' get screwed. It sucks.
Re:Say what (Score:2)
I agree, this sounds like a big brouhaha between ATI, Nvidia and Microsoft.
I remember not so long ago how Rambus was the black sheep, and how Intel was the one pushing the new evil Rambus. Well, did you know that AMD was one of the companies that helped define Rambus?
This is business, boys... not kindergarten. In this arena, bending down to get the soap gets you an assload. It's reality. Face it. As Linus said: grow up.
Ohh, I feel so proud to apply my first propagan [propagandacritic.com]
Re:Say what (Score:2)
Re:Say what (Score:1, Funny)
I can see why he didn't, though; no one really does that. Especially using the last name too. That is just way too melodramatic, and unrealistic.
Re:Say what (Score:4, Informative)
p.s.
If you don't get this: MS was losing money on the Xbox for a long time (some analysts say they still are). To minimize those losses they asked Nvidia to take a hit on the contract terms for the Xbox hardware agreement. Nvidia, being a relatively small company, said no thanks, and that effectively ended their relationship for now.
Re:Say what (Score:5, Funny)
Up-up-down-down-left-right-left-right-B-A-select(I have a brother)-start
Lucky for me, I have 100 lives!
Re:Say what (Score:2)
I am so honored to actually get this post and hope that at least 4 +1 funny mods also get it. I tip my hat to you, sir.
Re:Say what (Score:3, Insightful)
- NVidia makes drivers for Linux, and they don't suck
- NVidia works hard on making sure their cards support OpenGL, which is the only means through which Linux can really have 3D, AND it's the only 3D alternative to DirectX
- John Carmack (and the rest of id) develops some of the best games in the industry, and he develops using OGL, as well as for multiple platforms
- ATI has traditionally been a very compliant OEM-type company that loves to bundle
Re:Say what (Score:2, Informative)
GO VOODOO and GLIDE.
Creative was supposed to help 3Dlabs pump out consumer-level cards, yet I haven't seen them at the retail store.
Re:Say what (Score:1, Informative)
DX5 was mostly okay to develop for, DX6 offered some cool features (bump mapping, texture compression), and DX7 finally caught up with OGL 1.3 features (if not ease of programming).
Re:Say what (Score:2)
- ATI makes drivers for Linux, and they don't suck
- ATI works hard on making sure their cards support OpenGL, because it's an industry standard, particularly in the commercial (CAD, 3D rendering) world.
- Carmack has repeatedly stated that the nVidia shader implementation is inferior to the ATI implementation, requiring an NV3X-specific path that uses much lower precision while still not having as much performance.
- Your last "point" is wholly
Re:Say what (Score:2)
That's not quite how I read it. I read it as "for the money, you can get a lot more performance. Games optimized for it will scream, tho..."
I guess it's hard to say its performance sucks if it plays today's games just fine anyway.
But can you hack a GeForce like you can hack Radeo (Score:2, Informative)
+5 insightful, I love you guys (Score:1, Informative)
As for you haters out there: it has nothing to do with the memory speeds; memory can be overclocked independently of the core. And no, my Infineon 3.3 does not overclock too much. As for the hack itself, it involves opening up all 8 pipelines, as opposed to the 4 enabled by default on the 9500. The core can be overclocked through the roof.
Re:+5 insightful, I love you guys (Score:2)
Re:But can you hack a GeForce like you can hack Ra (Score:1, Informative)
Re:But can you hack a GeForce like you can hack Ra (Score:1, Insightful)
The 9700 was meant to have an R300 with 8 PS pipelines. (The Pro with faster clockspeeds, both with 256-bit memory bus.)
The 9500 was meant to have a "half-broken R300", with just 4 functional PS pipelines. (The PS pipes take up more silicon area than anything else in there, so a fabbing flaw is statistically likely to appear there -- ATI anticipated that.) (The Pro with faster clockspeeds and 256-bit memory bus, the non-Pro with a 128-bit memory bus.)
They didn
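To put a rough number behind the "statistically likely" claim above, here is a minimal sketch using the textbook Poisson yield model (the symbols are illustrative assumptions, not ATI figures). With random defects of density D per unit area, a block of area A comes out clean with probability

P(\text{block defect-free}) = e^{-D A}

so the chance that a given die's flaw lands in a particular block scales with that block's share of the die area. Since the PS pipes are the largest block on the R300, they are the most likely casualty, and a die whose only flaws sit in the PS pipes can still be salvaged and sold as a 9500.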
Re:But can you hack a GeForce like you can hack Ra (Score:2, Interesting)
Notice in the picture the arrangement of the memory chips AROUND the core.
http://www.newegg.com/app/ViewProduct.asp?
SoftQuadro (Score:2)
I still won't buy ATI. Sure it's faster, but given their driver quality track record, it's like swapping the engine from a Viper into a Yugo. Wicked fast until you crash.
Yes, I'm an *extremely* unhappy former ATI customer. I will NEVER buy one of their cards again.
Re:SoftQuadro (Score:2)
You should do a little research on your statement here. This is, and always has been, a realm dominated by ATi. And not too long ago, when Nvidia got caught "cheating" in their drivers so their MadOnion scores wouldn't suck, ATi got caught either right before or right after that "cheating
relevations? (Score:1, Funny)
"he has turned up some very interesting rasing or lifting up regarding the GeForce FX" ?
probly revelations would be better.
Re:relevations? (Score:2)
maybe, just maybe... (Score:5, Funny)
Re:maybe, just maybe... (Score:2)
Re:maybe, just maybe... (Score:2, Funny)
3dcenter.org is not registered? (Score:1)
http://www.3dcenter.org/artikel/cinefx/inde
www.3dcenter.org does not resolve and whois shows no registration for 3dcenter.org. googling for 3dcenter shows no entry that looks like the right site.
Re:3dcenter.org is not registered? (Score:1)
Re:3dcenter.org is not registered? (Score:2)
For the record and the karma, dig shows...
dig 3dcenter.org

; <<>> DiG 9.2.2rc1 <<>> 3dcenter.org
;; global options:  printcmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 33775
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 2

;; QUESTION SECTION:
;3dcenter.org.                 IN      A

;; ANSWER SECTION:
3dcenter.org.          86400
Re:3dcenter.org is not registered? (Score:2, Insightful)
whois 3dcenter.org@whois.pir.org
Re:3dcenter.org is not registered? (Score:2)
Here's the real story: (Score:5, Funny)
GeForce FX is really noisy
Explanation:
It sucks in large amounts of air to keep it cool. This is one of two ways a GeForce FX sucks. The other way is beyond the scope of this post.
And here's the proper explanation (Score:1)
The original GeForce FX cards were noisy because the fan cowling was misdesigned and the fan blades rubbed against it. The later versions have fixed this problem, and thus no longer sound like a leaf blower.
Re:And here's the proper explanation (Score:2)
now they only look like one
Re:Here's the real story: (Score:1)
The FX 5900 makes less noise than my GF3 Ti 500, and much less heat. Just for that I am happy with the card, since I have spent a great deal of time lowering the noise level of my PC and the GF3 was the loudest part in it. Now all I need is a better cooler for the CPU.
[OT] Still no FOSS drivers (Score:3)
Not interested in anything NVidia do or say until they strike some agreement with the people whose IP they license for their drivers & open them...
Two years since I bought my GeForce & I still can't have 3D acceleration, TV out and framebuffer all working at once.
ATI is no better. (Score:2)
I believe S3TC is one of the major factors in why BOTH ATI and NV are binary-only. I know it's the reason given for ATI's open-source drivers not being able to run UT2K3. Sadly, there aren't really any acceptable alternatives to S3TC.
Re:ATI is no better. (Score:2)
Neither one can open source their drivers because there are large chunks of code they don't own. In nVidia's case it's even worse -- a large amount of their codebase has the letters S-G-I all over it due to legal issues dating back to the origin of nVidia.
There's other reasons, but they're just icing on the cake -- there's simply no way for either company to open source their drivers even if they wanted to.
I wonder what a structured classroom approach... (Score:5, Interesting)
Re:I wonder what a structured classroom approach.. (Score:2, Informative)
Owens @ UC Davis [ucdavis.edu]
Akeley and Hanrahan @ Stanford [stanford.edu]
Re:I wonder what a structured classroom approach.. (Score:3, Interesting)
It's all obsolete and legacy now, but it gives you a good idea of how a current-day graphics card is designed. Back then, the various components had to be implemented on separate chips (e.g. RAMDACs, clock oscillators, memory decoding, graphics).
TI also had the TMS34082 vector processor. You could have up to four of those in a slave/master configuration (a bit like the
Lies! (Score:5, Funny)
Re:Lies! (Score:2)
Yes, it does... Makes it sound more and more similar to the Radeon, doesn't it?
An Example.. (Score:2)
Yeah, they dissected an old ATI and found out how they were cheating on Q3 numbers, so they figured they'd better join the race and start cheating.
I must say, they are doing an Excellent Job [hardocp.com]
This absolutely has to be one of the best examples of how the graphics card companies are using the ignorant "tech" sites to spread false stats.
Those guys got an offer from nVidia to do the benchmarking along with "a new and unreleased nVidia driver" (yeah, right!).
And even when the nVidia card smoked the Radeon
Re:An Example.. (Score:1)
Made me laugh. "Thanks for letting us be dumb patsies for your lies!"
I stopped reading HardOCP a while ago, when there seemed to be more chest-beating about war and jingoism than hardware reviews. I can see I haven't missed out.
Re:An Example.. (Score:2)
On the other hand... (Score:4, Interesting)
Re:On the other hand... (Score:2)
Re:On the other hand... (Score:2)
Re:On the other hand... (Score:1)
Re:On the other hand... (Score:1)
Re:On the other hand... (Score:2, Flamebait)
Linux support and a native port of Doom 3
or
shocking Linux support and dual-booting just to play Half-Life 2...
Then screw Half-Life 2, Microsoft and ATI; I'll do without any of them easily... Give me NV, id and Tux any day...
Re:On the other hand... (Score:1)
That reminds me. I need to go purchase Opera to let them know I appreciate them as well...
But ATI Linux drivers are getting better... (Score:2)
Re:But ATI Linux drivers are getting better... (Score:2)
Saw where? What are you talking about?
Re:But ATI Linux drivers are getting better... (Score:2)
Re:On the other hand... (Score:1)
ATI has historically made good hardware that was crippled by buggy, poorly written, limited drivers.
They still have yet to convince me to buy an ATI product. They seem to be having a good run at the moment, but given their many years of incompetence, I'm sure this will pass as quickly as it came to be.
Re:On the other hand... (Score:1)
Stop complaining about your 'historical' (i.e. anecdotal) experiences with ATI hardware. I own an R200 and am extremely satisfied with it (using DRI, thank you very much). I haven't used the closed-source driver, but feedback [linuxgames.com] seems to be very positive.
If it's Windows you're referring to, then ATI has extremely competitive, high-quality drivers. The CATALYST crew claim they produce the industry standard in gra
Re:On the other hand... (Score:1)
Re:On the other hand... (Score:2)
Re:On the other hand... (Score:1)
Re:On the other hand... (Score:1)
for an ati linux developer's post
6500 fps in glxgears with my Radeon 9700 Pro (Score:2)
One assumption is probably wrong (Score:5, Interesting)
I understand that the article writers are trying to come up with reasons that the Nvidia part is wasting performance, but this doesn't make sense. No architect in his right mind would ever design a pipeline that becomes full before the first instruction can exit. That means you are fetching much faster than you are retiring instructions, so you will always have a pipeline stall at the front end and you will always be wasting cycles. I think the designers would have checked something like that. You can't afford pipeline stalls to happen regularly.
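For what it's worth, the parent's objection can be made quantitative with a standard back-of-the-envelope model (a sketch under a constant-latency assumption; f, L and N are illustrative symbols, not NV30 numbers). By Little's law, to sustain an issue rate of f instructions per cycle through a pipeline with latency L cycles, you need room for at least f \cdot L instructions in flight:

N \ge f L \quad \text{(no front-end stalls)}, \qquad \text{throughput} = \min(f, N/L) \ \text{otherwise}

A pipeline that "becomes full before the first instruction can exit" is exactly the case N < f L: the front end fills all N slots after N/f cycles, then stalls until the first instruction retires at cycle L, and steady-state throughput drops from f to N/L. That is why a designer would normally size N to match f and L rather than ship a permanently stalling front end.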
Re:One assumption is probably wrong (Score:1)
Re:One assumption is probably wrong (Score:1)
Still, AMD does it better, no matter the frequency...
Blatantly Off-topic (Score:1)
Re:Blatantly Off-topic (Score:2, Insightful)
Matrox may have had an advantage a while back, but it's nothing conclusive nowadays.
Re:Blatantly Off-topic (Score:2)
Re:Blatantly Off-topic (Score:1)
Of course, ELSA wasn't exactly no-name - a bunch of Germans who went bankrupt rather than cheat on the assembly. And I don't think ATI-built cards ever had 2D quality issues.
Anand tells the tale (Score:2, Informative)
ATI 9x owners rejoice, indeed! Even the budget 9200 smokes the 5600 Ultra!
Re:Anand tells the tale (Score:2, Interesting)
Linux Drivers (Score:4, Informative)
ATI or NVIDIA, it's just a matter of taste and/or faith.
But in the Linux world NVIDIA still rules.
And it's not that NVIDIA's cards are better, but they at least have a decent Linux driver.
The bottom line is: "If you use Linux, the best choice is still an NVIDIA card!"
Re:Linux Drivers (Score:3)
"3D acceleration works perfectly" (Score:2)
Due to intellectual property issues, there are no open-source drivers that support S3TC.
"working perfectly" implies that it can run a modern game like UT2K3 - Which the open-source drivers can't.
Your only options for UT2K3 (and likely Doom 3 when it comes out) are either NV's or ATI's closed-source drivers. And NV's Linux drivers are FAR better.
Re:"3D acceleration works perfectly" (Score:2)
Really? Damn, then I am screwed. The nVidia drivers I am using now totally screw with my kernel latencies when doing 2D rendering. I hate them with a passion. They're huge, they sit in my kernel space, and so far they seem to be the only cause of my machine dying.
My next card will not be an nVidia. At the moment, that means ATi. Good for them.
I have learnt
All I heard was BLAH BLAH BLAH Nvidia sucks (Score:2)
The key thing ATI did, besides great cost/performance, was get drivers out the door that didn't totally suck. For the first time in memory I have the original video driver and am not forced to download a patch!
Good job ATI!
Re:All I heard was BLAH BLAH BLAH Nvidia sucks (Score:2, Insightful)
http://www.rage3d.com
Re:All I heard was BLAH BLAH BLAH Nvidia sucks (Score:2)
A friend and I (Score:2)
"Optimization" the NVidia way (Score:2)
I'd like to see that list.
Their bias is showing (Score:2, Funny)
From the article:
Er, oh wait, it's in German as well...
GeforceFX (Score:5, Interesting)
Honestly, I thought nVidia learned their lesson with the NV1 - don't make weird hardware.
Now, what has to be making GeforceFX owners worried is Gabe Newell's warning that the new Detonator drivers might be making illegitimate 'optimizations' and, furthermore, covering them up by rendering screen captures at higher quality than the actual in-game output.
Re:GeforceFX (Score:2)
Now wait a minute. The GeForceFX is essentially faster than anything out there, except for the newest Radeon cards. That makes it the second-fastest 3D hardware solution for PCs. And it is certainly faster than past nVidia cards, cards which were alrea
Re:GeforceFX (Score:2)
If all you play is Q3, that's true.
If you play some of the newer games, a GF3 isn't adequate. If you want to play the newest DX9 games then a GF3 is completely inadequate (for the full experience). Go look at the framerates coming out of HL2 -- AnandTech has a good article this morning.
The FX is only "slow" in the minds of fanboys who live for incremental performance increases without regard to power consumption or expense
Ri
Re:GeforceFX (Score:2)
You can't talk about performance of games that haven't been released yet, like HL2. That's a total fanboy realm.
My point is that we're essentially talking about a handful of games here, and these are not games that are particularly well optimized. If 3D game XYZ was targeted for the Xbox, it would ro
Re:GeforceFX (Score:2)
Really nice to look at. If you only have a GeForce4 (like me) you can still get some of the effects, just no depth of field or advanced
Re:GeforceFX (Score:2)
Fine. We'll talk about Tomb Raider [beyond3d.com] then. Sorry, but when your top-of-the-line card has half (or less) of the performance of the competitor's card -- with the difference being between a playable frame rate and an unplayable one -- then your card is indeed slow.
My point is that we're essentially talking about a handful of games here, and these are not games that are particularly well optimized.
A handful of games, yes. But
It's really very simple ... (Score:3, Informative)
The proof is in the pudding.
Biased article... (Score:1)
DX9 aside (Score:2)
Re:3 Nvidia Articles?!! (Score:5, Funny)
Re:Are you sure? (Score:1, Funny)