Graphics Software

GeForce FX Architecture Explained 185

Brian writes "3DCenter has published one of the most in-depth articles on the internals of a 3D graphics chip (the NV30/GeForce FX in this case) that I've ever seen. The author has based his results on a patent NVIDIA filed last year and he has turned up some very interesting relevations regarding the GeForce FX that go a long way to explain why its performance is so different from the recent Radeons. Apparently, optimal shader code for the NV30 is substantially different from what is generated by the standard DX9 HLSL compiler. A new compiler may help to some extent, but other performance issues will likely need to be resolved by NVIDIA in the driver itself."
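
A rough way to see why compiler output matters so much here: the article argues that the chip keeps the temporary registers of every quad in flight in one shared register file, so a shader that burns many full-precision temporaries leaves fewer quads available to hide texture latency. What follows is only a toy Python sketch of that trade-off; the register-file size, the latency figure, and the two example "shaders" are invented for illustration and are not NVIDIA's real numbers or compiler logic.

# Toy model (invented numbers) of the trade-off the article describes:
# all in-flight quads share one temp-register file, so fewer live FP32
# temps per quad means more quads in flight and more texture latency hidden.

REGISTER_FILE_SLOTS = 64    # hypothetical: shared FP32 temp slots
TEXTURE_LATENCY = 100       # hypothetical: cycles of fetch latency to hide

def estimated_throughput(fp32_temps_per_quad, instructions):
    """Relative pixel throughput: latency hiding first, instruction count second."""
    quads_in_flight = REGISTER_FILE_SLOTS // max(1, fp32_temps_per_quad)
    latency_hidden = min(1.0, quads_in_flight * 4 / TEXTURE_LATENCY)  # 4 pixels per quad
    return latency_hidden / instructions

# "Generic DX9 HLSL" style: few instructions, many FP32 temps.
generic = estimated_throughput(fp32_temps_per_quad=8, instructions=20)
# "NV30-tuned" style: a few more instructions, but fewer live temps.
nv_tuned = estimated_throughput(fp32_temps_per_quad=3, instructions=24)

print(f"generic : {generic:.4f}")   # 0.0160
print(f"nv-tuned: {nv_tuned:.4f}")  # 0.0350 -- ahead despite more instructions

In this toy model the hand-tuned variant wins even though it executes more instructions, which is the shape of the argument the article makes about why a different compiler back end matters.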
This discussion has been archived. No new comments can be posted.

  • Say what (Score:5, Funny)

    by Anonymous Coward on Thursday September 11, 2003 @11:22PM (#6939680)
    performance is so different from

    Is that the politically correct way of saying "performance sucks"?

    • Re:Say what (Score:5, Insightful)

      by cbreaker ( 561297 ) on Thursday September 11, 2003 @11:25PM (#6939699) Journal
      haha, yeah, I guess so. It'll be a while before it's considered "okay" for any sort of media to say that an nVidia board has sucky performance.

      It keeps getting excused away by "architecture changes" or "early driver issues" or "the full moon."

      Go go ATI! You brought competition back to the consumer 3D board scene, thank you!
      • Re:Say what (Score:5, Insightful)

        by Anonymous Coward on Friday September 12, 2003 @12:00AM (#6939866)
        The fun thing about HL-2 is it'll likely be the first game where you WON'T have to install a patch or updated Catalyst driver to actually play it with your ATI. WOOT. Go driver development...
      • Re:Say what (Score:2, Insightful)

        by kubrick ( 27291 )
        Competition here being determined by choosing features to mesh best with whatever Microsoft specifies?

        Yeah, right.

        (Puts on tinfoil hat) My theory is that MS was annoyed with NVidia after the negotiations over XBox v2 broke down... so they communicated a little better with ATI than NVidia over DX9.
        • Re:Say what (Score:4, Interesting)

          by Anonymous Coward on Friday September 12, 2003 @03:45AM (#6940607)
          No need for the tinfoil hat.

          The most complex part of a DX8 or DX9 chip is the Pixel Shader, so I'll concentrate on it. Nvidia spearheaded the development of PS1.1 for DX8.

          Then ATI stole the show with PS1.4 (DX8.1), which is much closer to PS2.0 than PS1.1. At this point, ATI got Microsoft's ear -- ATI was ahead of Nvidia in implementing programmable shaders in graphics hardware.

          So Microsoft had good reason to pay attention to ATI's ideas for DX9 (including what the HLSL should look like and what kind of assembly it should output), long before any Xbox 1 money issues with Nvidia, long before choosing the designer for the Xbox 2 graphics/chipset.

          I guess ;-)
        • No. To me, it's competition being determined by the speed and features of the graphics boards.

          It doesn't matter what MS is going to use in Xbox 2, 3, whatever. It's that if I want to play the new games, I now have a choice of brands, and pricing is a lot better now too. (I do admit that the boards from either company are very expensive when they are new, but the competition factor brings those prices down quickly.)
      • Re:Say what (Score:2, Interesting)

        ATI also brings a new level to "Abandoning support for hardware as soon as we think we can get away with it". I've had one video card and one tv tuner from ATI, each of which I bought right around the introduction of a new Windows version. In both cases, ATI dropped support for the card shortly after. In the case of the video card, they never did release WDM drivers. In the case of the tuner, they released only a version or two that "kind of" worked under 2000 pro and XP, then decided the remaining major bu
    • Re:Say what (Score:5, Insightful)

      by robbyjo ( 315601 ) on Thursday September 11, 2003 @11:39PM (#6939764) Homepage

      Know that there are many ways to do one thing and there are pros and contras in each of them. In this case, it seems that NVidia's approach was not chosen, and the way DX9 handles things undermines NVidia's method. It's not necessarily because NVidia sucks. Remember that there were political struggles among Microsoft, NVidia, and ATI during the inception of DX9? I think NVidia has now fallen victim to them.

      • Re:Say what (Score:3, Insightful)

        by dieman ( 4814 ) *
        I agree, this sounds like a big ATI vs. nVidia brouhaha, with Microsoft choosing who they want by getting both of them to put the crap 'in silicon' and then choosing which standard to use.

        It sounds like a monopolist helping out whoever they want to and then making the 'other guys' get screwed. Suck.
        • correction!!! Your post should read:

          I agree, this sounds like a big brouhaha between ATI, nVidia, and Microsoft

          I remember not so long ago how Rambus was the black sheep, and how Intel was the maker of the new evil Rambus. Well, did you know that AMD was one of the companies that helped define Rambus?

          This is business, boys... not kindergarten. In this arena, bending down to get the soap gets you an ass load. It's reality. Face it. As Linus said, Grow up.

          Ohh, I feel so proud to apply my first propagan [propagandacritic.com]

          • you really need Linus to help you say "grow up"? Poor guy gets his words pulled out of context for just about anything these days.
            • Re:Say what (Score:1, Funny)

              by Anonymous Coward
              Wrong Linus. If he'd used the full quote it would have been clear, "Grow up, Charlie Brown."

              I can see why he didn't though, no one really does that. Especially using the last name too. That is just way too melodramatic, and unrealistic.
        • Re:Say what (Score:4, Informative)

          by afidel ( 530433 ) on Friday September 12, 2003 @02:01AM (#6940283)
          Basically it comes down to this: MS partnered with Nvidia for DX8 and XBox-1, Nvidia asked MS to use some KY, so MS chose ATI for DX9 and XBox-2.

          p.s.
          If you don't get this: MS was losing money on the XBox for a long time (some analysts say they still are). To minimize those losses, they asked Nvidia to take a hit on the contract terms for the XBox hardware agreement. Nvidia, being a relatively small company, said no thanks, and that effectively ended their relationship for now.
      • Re:Say what (Score:5, Funny)

        by jjeffries ( 17675 ) on Friday September 12, 2003 @12:43AM (#6940035)
        > Know that there are many ways to do one thing and there are pros and contras in each of them.

        Lucky for me, I have 100 lives!
        Up-up-down-down-left-right-left-right-B-A-select(I have a brother)-start

        • > Lucky for me, I have 100 lives!

          I am so honored to actually get this post and hope that at least 4 +1 funny mods also get it. I tip my hat to you, sir.
      • Re:Say what (Score:3, Insightful)

        by CAIMLAS ( 41445 )
        Well, it makes sense, when you consider the following facts:

        - NVidia makes drivers for linux, and they don't suck
        - NVidia works hard on making sure their cards support OpenGL, which is the only means through which linux can really have 3D, AND it's the only 3D alternative to DirectX
        - John Carmack (and the rest of id) develops some of the best games in the industry, and he develops using OGL, as well as for multiple platforms
        - ATI has traditionally been a very compliant OEM-type company that loves to bundle
        • Re:Say what (Score:2, Informative)

          by Anonymous Coward
          Um... Nvidia didn't buy out 3dlabs. Creative bought out 3dlabs. I think you mean Nvidia bought out 3dfx. 3dfx made voodoo and glide.

          GO VOODOO and GLIDE.

          Creative was supposed to help 3dlabs pump out consumer-level cards, yet I haven't seen them at the retail store.
        • Re:Say what (Score:1, Informative)

          by Anonymous Coward
          The 3Dlabs/3dfx mix-up got corrected... But the reason DX didn't get adopted (while Glide was around) was that every version before DX5 utterly, completely sucked. It was useless. Glide was the easy (and only) choice when Voodoo was technically superior (not just an API matter) and few if any IHVs had half-decent OpenGL drivers.

          DX5 was mostly okay to develop for, DX6 offered some cool features (bump mapping, texture compression), and DX7 finally caught up with OGL 1.3 features (if not ease of programming).
        • It makes no sense when you bother to list the rest of the facts.

          - ATI makes drivers for linux, and they don't suck
          - ATI works hard on making sure their cards support OpenGL, because it's an industry standard, particularly in the commercial (CAD, 3D rendering) world.
          - Carmack has repeatedly stated that the nVidia shader implementation is inferior to the ATI implementation, requiring an NV3X-specific path that uses much lower precision while still not having as much performance.
          - Your last "point" is wholly
    • "Is that the politically correct way of saying "performance sucks"?"

      That's not quite how I read it. I read it as "for the money, you can get a lot more performance. Games optimized for it will scream, tho..."

      I guess it's hard to say its performance sucks if it plays today's games just fine anyway.
  • by Anonymous Coward
    Nothing beats my 9500-to-9700 card. It's a simple driver hack. Now my lowly $130 budget card can whoop any GeForceFX garbage. Plus the overclockability after it's a 9700. You just don't get any sweeter.
    • by Anonymous Coward
      Wow, I didn't expect my Anonymous Coward post to get modded up. It really warms my heart that the Slashdot crowd listened to me for once. I might have to sign up for a real account now.

      As for you haters out there: it has nothing to do with the memory speeds, memory can be overclocked independently of the core. And no, my Infineon 3.3 does not overclock too much. As for the hack itself, it involves opening up all 8 pipelines, as opposed to the 4 default in the 9500. The core can be overclocked through the roof :) Check out
    • by Anonymous Coward
      Well, you can make the GeForce an nVidia Quadro and gain additional OpenGL functionality you generally won't need :-) Speed will stay at the same level, though.
    • by Anonymous Coward
      Do you at all realise you were very lucky?

      The 9700 was meant to have an R300 with 8 PS pipelines. (The Pro with faster clockspeeds, both with 256-bit memory bus.)

      The 9500 was meant to have a "half-broken R300", with just 4 functional PS pipelines. (The PS pipes take up more silicon area than anything else in there, so a fabbing flaw is statistically likely to appear there -- ATI anticipated that.) (The Pro with faster clockspeeds and 256-bit memory bus, the non-Pro with a 128-bit memory bus.)

      They didn
    • 'nuff said.

      I still won't buy ATI. Sure it's faster, but given their driver quality track record, it's like swapping the engine from a Viper into a Yugo. Wicked fast until you crash.

      Yes, I'm an *extremely* unhappy former ATI customer. I will NEVER buy one of their cards again.
  • by Anonymous Coward
    "he has turned up some very interesting relevations regarding the GeForce FX".

    "he has turned up some very interesting rasing or lifting up regarding the GeForce FX" ?

    probly revelations would be better.
  • by PoPRawkZ ( 694140 ) on Thursday September 11, 2003 @11:28PM (#6939715) Homepage
    someone programmed the shaders to work with Glide... I can't help hoping 3Dfx will perform some voodoo and resurrect from nVidia's ashes. Excuse me now, I must go stroke my Voodoo5.
  • Did cowboyneal even try to load the URL?
    http://www.3dcenter.org/artikel/cinefx/index_e.php
    www.3dcenter.org does not resolve and whois shows no registration for 3dcenter.org. googling for 3dcenter shows no entry that looks like the right site.
    • Err, I don't know what's up with your DNS... but the links work just fine for me. It appears to be a mostly German site, though the article linked to was translated into English.
    • your DNS blows, dude :) Time to stop using Win2k Server's DNS (which blows chunks, I know -- I've run it in the past, before I came to the light, errr, Linux).

      For the record and the karma, dig shows...

      dig 3dcenter.org

      ; <<>> DiG 9.2.2rc1 <<>> 3dcenter.org
      ;; global options: printcmd
      ;; Got answer:
      ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 33775
      ;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 2, ADDITIONAL: 2

      ;; QUESTION SECTION:
      ;3dcenter.org.            IN      A

      ;; ANSWER SECTION:
      3dcenter.org.      86400
    • by Anonymous Coward
      By default "whois" won't show registration for ANY .org's--remember, .org has a different registrar now, and whois uses "whois.internic.net" by default, which only serves .com and .net. For .org, you need to do a query at whois.pir.org:

      whois 3dcenter.org@whois.pir.org
  • by Anonymous Coward on Thursday September 11, 2003 @11:35PM (#6939746)
    Experience:

    GeForce FX is really noisy

    Explanation:

    It sucks in large amounts of air to keep it cool. This is one of two ways a GeForce FX sucks. The other way is beyond the scope of this post.

    • The original GeForce FX cards were noisy because the fan cowling was misdesigned and the fan blades rubbed against it. The later versions have fixed this problem, and thus no longer sound like a leaf blower.

    • My Intel P4's stock fan is louder than my MSI FX5900 card. When I go into a game the graphics fan speeds up, and I can hear that because of the change, but once the CPU gets hot, the CPU fan is much noisier.
      The FX5900 makes less noise than my GF3 Ti500 and puts out much less heat, and just for that I am happy with the card, since I have spent a great deal of time lowering the noise level of my PC and the GF3 was the loudest part in it. Now all I need is to get a better cooler for the CPU.
  • by sould ( 301844 ) on Thursday September 11, 2003 @11:38PM (#6939761) Homepage
    Bah!

    Not interested in anything NVidia do or say until they strike some agreement with the people whose IP they license for their drivers and open them...

    2 years since I bought my GeForce and I still can't have 3D acceleration, TV-out, and framebuffer all working at once.
    • The only drivers for ATI that allow you to play modern games are their binary-only Catalyst drivers (one acronym: S3TC), which blow chunks compared to NV's binary-only drivers.

      I believe S3TC is one of the major factors in why BOTH ATI and NV are binary-only. I know it's the reason given for ATI's open-source drivers not being able to run UT2K3. Sadly, there aren't really any acceptable alternatives to S3TC.
      • Bravo! Someone who has a clue!

        Neither one can open source their drivers because there are large chunks of code they don't own. In nVidia's case it's even worse -- a large amount of their codebase has the letters S-G-I all over it due to legal issues dating back to the origin of nVidia.

        There are other reasons, but they're just icing on the cake -- there's simply no way for either company to open source their drivers even if they wanted to.
  • by tloh ( 451585 ) on Thursday September 11, 2003 @11:42PM (#6939777)
    Weird timing. I'm currently writing code for a class on microcontrollers. Most electrical engineering students at some point come across an advanced digital course on microprocessors where one learns about different machine architectures and how to write assembly code for them. Are there any /.ers who have systematically studied GPU chips as part of a class, say on graphics algorithms or DSP?
  • Lies! (Score:5, Funny)

    by zapp ( 201236 ) on Thursday September 11, 2003 @11:45PM (#6939797)
    Does the FX architecture involve cheating on benchmarks? :)
    • Does the FX architecture involve cheating on benchmarks?

      Yes, it does... Makes it sound more and more similar to the Radeon doesn't it? ;-)
  • On the other hand... (Score:4, Interesting)

    by La Temperanza ( 638530 ) <temperanza@@@softhome...net> on Thursday September 11, 2003 @11:58PM (#6939858)
    NVidia has much better Linux drivers than ATI. Support 'em.
  • by hbog ( 463004 ) <hbog1 AT hotmail DOT com> on Thursday September 11, 2003 @11:59PM (#6939860)
    From the article - "Because of the length of the pipeline and the latencies of sampling textures it is possible that the pipeline is full before the first quad reaches its end. In this case the Gatekeeper has to wait as long as it takes the quad to reach the end. Every clock cycle that passes means wasted performance then. An increased number of quads in the pipeline lowers the risk of such pipeline stalls."

    I understand that the article writers are trying to come up with reasons that the Nvidia part is wasting performance, but this doesn't make sense. No architect in his right mind would ever design a pipeline that becomes full before the first instruction can exit. That means you are fetching much faster than you are retiring instructions, so you will always have a pipeline stall at the front end and you will always be wasting cycles. I think the designers would have checked something like that. You can't afford pipeline stalls to happen regularly. (A toy simulation of this trade-off is sketched below, after this thread.)
    • Hmmm, I remember Intel had the same problem, or something like it (AKA the long pipeline), on the early P4. I remember in those days AMD beat Intel under any conditions. Then Intel jumped to the 478 form factor and pushed the limits above 2 GHz, and the problem ended...
      • I remember Intel saying the P4 architecture would only show its real muscle past 2GHz, and that was even before it came out. It came true, since any P4 below that mark sucks goats compared to a P3 at the same frequency.

        Still, AMD does it better, no matter the frequency...
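
To put a number on the Gatekeeper stall quoted a few comments up: if the maximum number of quads allowed in flight is smaller than the number of cycles a quad needs to travel the pipeline (texture latency included), the front end has to idle until the first quad comes out. Below is a minimal simulation sketch in Python; the capacities and the 220-cycle latency are invented for illustration and do not come from the article or from NVIDIA.

# Tiny model of the stall: one quad may enter per cycle, each quad takes
# pipe_latency cycles to reach the end, and at most max_quads fit in the pipe.
# All numbers are made up; only the shape of the effect is the point.

def simulate(max_quads, pipe_latency, total_quads=1000):
    """Return quads retired per cycle (1.0 = a perfectly fed pipeline)."""
    cycle, in_flight, issued, retired = 0, [], 0, 0
    while retired < total_quads:
        # drop quads that have spent pipe_latency cycles in the pipe (retired)
        in_flight = [t for t in in_flight if cycle - t < pipe_latency]
        retired = issued - len(in_flight)
        # issue a new quad only if there is room; otherwise this is a stall cycle
        if issued < total_quads and len(in_flight) < max_quads:
            in_flight.append(cycle)
            issued += 1
        cycle += 1
    return total_quads / cycle

for n in (16, 64, 256):
    print(f"{n:3d} quads in flight -> {simulate(n, pipe_latency=220):.2f} quads/cycle")

With enough quads in flight the pipe approaches one quad per cycle; starve it and throughput degrades to roughly max_quads / pipe_latency, which is the effect the article attributes to register pressure.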
  • I would just like to add that Matrox cards have by far the best image quality in 2D-land. Maybe 3D-land as well. They are slower, but still snazzy.
    • Maybe I'm just blind, but I can't tell the difference between my Matrox G400 Max, ELSA Gladiac 920 (nVidia GF3), and ATI 9700 Pro. IBM P260 Monitor.

      Matrox may have had an advantage a while back, but it's nothing conclusive nowadays.
      • could be your monitor... Matrox cards do use high-end components in their filters/DACs that produce the best signal for CRTs. I've seen lots of best-case vs. worst-case graphs of SNR for Matrox vs. no-name ATI/nVidia manufacturers, and the difference is pretty substantial. Of course, if you're using an LCD it's not even applicable.
        • My monitor good, or my monitor bad? It's a Trinitron CRT.

          Of course, ELSA wasn't exactly no-name -- a bunch of Germans who went bankrupt rather than cheat on the assembly. And I don't think ATI-built cards ever had 2D quality issues.
  • Anandtech's article clearly shows that ATI's DX9 totally pwnz0rz nVidia's. And probably will at least until nv40 is released.

    ATI 9x owners rejoice, indeed! Even the budget 9200 smokes the 5600 Ultra!

  • Linux Drivers (Score:4, Informative)

    by maizena ( 640458 ) on Friday September 12, 2003 @12:13AM (#6939922)
    In the Windows(argh) world I really couldn't care less about what card to use.
    ATI or NVIDIA, it's just a matter of taste and/or faith.

    But in the Linux world NVIDIA still rules.
    And it's not that NVIDIA's cards are better, but at least they have a decent Linux driver.

    The bottom line is: "If you use Linux, the best choice is still an NVIDIA card!"
    • Don't forget how far the open-source DRI 3D acceleration has come in the last year alone; most people use it for the ATI Radeons that ATI did not make Linux drivers for. About a year ago, I bought an ATI Radeon 7500 PCI, and I couldn't use it at all because there were no suitable Linux drivers for it (DRI only supported AGP at that time, I think). Just two months ago, I decided to check it out again, and with the new 4.3 version of XFree86, 3D acceleration works perfectly,
      • No it doesn't.

        Due to intellectual property issues, there are no open-source drivers that support S3TC.

        "working perfectly" implies that it can run a modern game like UT2K3 - Which the open-source drivers can't.

        Your only option for UT2K3 (And likely Doom3 when it comes out) are either NV's or ATI's closed-source drivers. And NV's Linux drivers are FAR better.
        • Your only option for UT2K3 (And likely Doom3 when it comes out) are either NV's or ATI's closed-source drivers. And NV's Linux drivers are FAR better.

          Really? Damn, then I am screwed. The nVidia drivers I am using now totally screw with my kernel latencies when doing 2D rendering. I hate them with a passion. They're huge, they sit in my kernel space, and so far they seem to be the only cause of my machine dying.

          My next card will not be an nVidia. At the moment, that means ATi. Good for them.

          I have learnt
  • I bought nVidia 5 times in a row, so really I find it quite cool to be back to ATI. I was worried the competition in the market was over.

    The key thing ATI did, besides great cost/performance, was get drivers out the door that didn't totally suck. For the first time in memory I have the original video driver and am not forced to download a patch!

    Good job ATI!
  • ...found out the hard way that a more recent model number in no way makes something a better product. After suffering through the miserable performance of the 5200fx, and even the 5600fx, we came to the conclusion that NVidia should have just skipped the NV stuff and applied that effort to something more worthwhile. I ended up with a Ti4200, and my friend ended up with a Radeon card, and as far as I'm concerned, most of the fx line just plain sucks.
  • If it belongs to a group of selected programs nVidia considers important, the driver already contains an optimized version of (the shader) which is used instead. All other shaders are handled as-is.

    I'd like to see that list.
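
For what it's worth, the mechanism quoted above is easy to sketch: the driver fingerprints each shader a game submits and, if the hash is on a built-in list, silently hands back a hand-tuned replacement. This is only a guess at how such a substitution could be wired up; the hash table, its contents, and the function name below are hypothetical, not NVIDIA's actual driver code.

import hashlib

# Hypothetical sketch of per-application shader substitution: hash whatever
# shader the game submits and swap in a pre-tuned version if it is "known".
# The table contents and replacement text are invented placeholders.

HAND_TUNED = {
    # sha1 of the submitted shader -> replacement shipped inside the driver
    "3f2c0000000000000000000000000000deadbeef": "ps_2_0  ; hand-scheduled, partial precision ...",
}

def select_shader(submitted_source):
    key = hashlib.sha1(submitted_source.encode()).hexdigest()
    # a shader from a title on the "important" list gets the optimized version;
    # everything else is handled as-is
    return HAND_TUNED.get(key, submitted_source)

print(select_shader("ps_2_0  ; whatever the game actually sent"))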

  • by Anonymous Coward
    I couldn't get past 3D Center's not-so-subtle editorializing.

    From the article:
    Die CineFX Pipeline[!] (emphasis added)

    Er, oh wait, it's in German as well...
  • GeforceFX (Score:5, Interesting)

    by BigFootApe ( 264256 ) on Friday September 12, 2003 @02:17AM (#6940324)
    This article seems to reiterate what everyone has been saying (Carmack, Valve, everyone). The GeforceFX architecture can only be made competitive for 3D engines using modern shaders with herculean effort. And that's just to be competitive, not dominantly superior.

    Honestly, I thought nVidia learned their lesson with the NV1 - don't make weird hardware.

    Now, what has to be making GeforceFX owners worried is Gabe Newell's warning that the new Detonator drivers might be making illegitimate 'optimizations' and, furthermore, covering them up by rendering high quality screen captures.
    • This article seems to reiterate what everyone has been saying (Carmack, Valve, everyone). The GeforceFX architecture can only be made competitive for 3d engines using modern shaders with herculean effort. This is to be competitive, not dominantly superior.

      Now wait a minute. The GeForceFX is essentially faster than anything out there, except for the newest Radeon cards. That makes it the second fastest 3D hardware solution for PCs. And it is certainly faster than past nVidia cards, card which were alrea
      • Considering that no one has even remotely pushed the limits of the GeForce 3

        If all you play is Q3, that's true.

        If you play some of the newer games, a GF3 isn't adequate. If you want to play the newest DX9 games then a GF3 is completely inadequate (for the full experience). Go look at the framerates coming out of HL2 -- AnandTech has a good article this morning.

        The FX is only "slow" in the minds of fanboys who live for incremental performance increases without regard to power consumption or expense

        Ri
        • If you play some of the newer games, a GF3 isn't adequate. If you want to play the newest DX9 games then a GF3 is completely inadequate (for the full experience). Go look at the framerates coming out of HL2

          You can't talk about performance of games that haven't been released yet, like HL2. That's a total fanboy realm.

          My point is that we're essentially talking about a handful of games here, and these are not games that are particularly well optimized. If 3D game XYZ was targeted for the Xbox, it would ro
          • Just an FYI, but there are some DirectX9 games out right now. Tomb Raider: Angel of Darkness is one of them.. and it looks really nice on a PS2.0-capable card (GeForceFX/Radeon9600-9800) -- Does wicked fire/water/fog effects, as well as "heat haze" coming off the fire. Water reflections, and depth of field (things farther away are blurry, things up close are crystal clear.)

            Really nice to look at. If you only have a GeForce4 (like me) you can still get some of the effects, just no depth of field or advanced
          • You can't talk about performance of games that haven't been released yet, like HL2

            Fine. We'll talk about Tomb Raider [beyond3d.com] then. Sorry, but when your top of the line card has half (or less) of the performance of the competitor's card -- with the difference being between playable framerate and unplayable frame rate -- then your card is indeed slow.

            My point is that we're essentially talking about a handful of games here, and these are not games that are particularly well optimized.

            A handful of games, yes. But
  • by dzym ( 544085 ) on Friday September 12, 2003 @08:38AM (#6941790) Homepage Journal
    GamersDepot had a short one-question, one-answer session with Gabe Newell, and then with John Carmack himself, on Nvidia shader performance: clicky [gamersdepot.com]!

    The proof is in the pudding.

  • The headline under the first diagram is "Die CineFX Pipeline"
  • DX9 aside, how does the Nvidia card compare to the ATI card when used in a purely OpenGL-based game?

"The following is not for the weak of heart or Fundamentalists." -- Dave Barry

Working...