GeForce FX Architecture Explained

Brian writes "3DCenter has published one of the most in-depth articles on the internals of a 3D graphics chip (the NV30/GeForce FX in this case) that I've ever seen. The author bases his analysis on a patent NVIDIA filed last year, and he has turned up some very interesting revelations about the GeForce FX that go a long way toward explaining why its performance is so different from that of the recent Radeons. Apparently, optimal shader code for the NV30 is substantially different from what the standard DX9 HLSL compiler generates. A new compiler may help to some extent, but other performance issues will likely need to be resolved by NVIDIA in the driver itself."
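A rough illustration of what that means in practice: the NV30 is widely reported to run much faster when shaders use 16-bit (half) precision, which reduces register pressure, while the standard DX9 HLSL compiler emits full 32-bit (float) arithmetic by default. Below is a minimal, hypothetical ps_2_0 pixel shader sketch (not taken from the article) showing the two styles side by side:

    // Hypothetical example: per-pixel diffuse lighting.
    // Standard HLSL style: all arithmetic in full-precision (32-bit) float registers.
    float4 DiffusePS(float3 normal : TEXCOORD0,
                     float3 lightDir : TEXCOORD1) : COLOR
    {
        float ndotl = saturate(dot(normalize(normal), lightDir));
        return float4(ndotl, ndotl, ndotl, 1.0);
    }

    // NV30-friendly style: 'half' maps to the chip's 16-bit registers,
    // roughly halving register pressure; color math rarely needs full precision.
    half4 DiffusePS_NV30(half3 normal : TEXCOORD0,
                         half3 lightDir : TEXCOORD1) : COLOR
    {
        half ndotl = saturate(dot(normalize(normal), lightDir));
        return half4(ndotl, ndotl, ndotl, 1.0);
    }

Whether a smarter compiler can make such substitutions automatically (and safely) is exactly the open question the article raises.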
  • by Anonymous Coward on Friday September 12, 2003 @12:23AM (#6939689)
    Nothing beats my 9500-to-9700 card. It's a simple driver hack. Now my lowly $130 budget card can whoop any GeForce FX garbage. Plus the overclockability once it's a 9700. It just doesn't get any sweeter.
  • by mmp ( 121767 ) on Friday September 12, 2003 @01:04AM (#6939877) Homepage
    Sure, there are classes devoted entirely to the architecture of graphics hardware here and there. The slides for each of these two graphics-hardware classes are excellent.

    Owens @ UC Davis [ucdavis.edu]

    Akeley and Hanrahan @ Stanford [stanford.edu]

  • Anand tells the tale (Score:2, Informative)

    by Bob-o-Matic! ( 620698 ) <(robert.peters) (at) (gmail.com)> on Friday September 12, 2003 @01:07AM (#6939895) Homepage
    Anandtech's article clearly shows that ATI's DX9 totally pwnz0rz nVidia's. And it probably will at least until the NV40 is released.

    ATI 9x owners rejoice, indeed! Even the budget 9200 smokes the 5600 Ultra!

  • Linux Drivers (Score:4, Informative)

    by maizena ( 640458 ) on Friday September 12, 2003 @01:13AM (#6939922)
    In the Windows (argh) world I really couldn't care less about which card to use.
    ATI or NVIDIA, it's just a matter of taste and/or faith.

    But in the Linux world NVIDIA still rules.
    It's not that NVIDIA's cards are better, but at least they have a decent Linux driver.

    The bottom line is: "If you use Linux, the best choice is still an NVIDIA card!"
  • by Anonymous Coward on Friday September 12, 2003 @01:52AM (#6940067)
    Wow, I didn't expect my Anonymous Coward post to get modded up. It really warms my heart that the Slashdot crowd listened to me for once. I might have to sign up for a real account now.

    As for you haters out there: it has nothing to do with the memory speeds; memory can be overclocked independently of the core. And no, my Infineon 3.3 does not overclock too much. As for the hack itself, it involves opening up all 8 pipelines, as opposed to the 4 enabled by default on the 9500. The core can be overclocked through the roof :) Check out ocfaq.com/softmod for more info.
  • by Anonymous Coward on Friday September 12, 2003 @01:58AM (#6940091)
    Well, you can turn a GeForce into an NVIDIA Quadro and gain additional OpenGL functionality you generally won't need :-) Speed will stay at the same level, though.
  • Re:Say what (Score:4, Informative)

    by afidel ( 530433 ) on Friday September 12, 2003 @03:01AM (#6940283)
    Basically it comes down to this: MS partnered with Nvidia for DX8 and the Xbox; Nvidia asked MS to use some KY, so MS chose ATI for DX9 and Xbox 2.

    P.S.
    If you don't get this: MS was losing money on the Xbox for a long time (some analysts say they still are). To minimize those losses, they asked Nvidia to take a hit on the contract terms of the Xbox hardware agreement. Nvidia, being a relatively small company, said no thanks, and that effectively ended their relationship for now.
  • Re:Say what (Score:2, Informative)

    by Anonymous Coward on Friday September 12, 2003 @05:20AM (#6940685)
    Um... Nvidia didn't buy out 3Dlabs. Creative bought out 3Dlabs. I think you mean Nvidia bought out 3dfx. 3dfx made Voodoo and Glide.

    GO VOODOO and GLIDE.

    Creative was supposed to help 3Dlabs pump out consumer-level cards, yet I haven't seen any at the retail stores.
  • Re:Say what (Score:1, Informative)

    by Anonymous Coward on Friday September 12, 2003 @06:28AM (#6940854)
    The 3Dlabs/3dfx mix-up got corrected... But the reason DX didn't get adopted (while Glide was around) was that every version before DX5 utterly, completely sucked. It was useless. Glide was the easy (and only) choice when Voodoo was technically superior (not just an API matter) and few if any IHVs had half-decent OpenGL drivers.

    DX5 was mostly okay to develop for, DX6 offered some cool features (bump mapping, texture compression), and DX7 finally caught up with OpenGL 1.3 features (if not ease of programming).
  • by dzym ( 544085 ) on Friday September 12, 2003 @09:38AM (#6941790) Homepage Journal
    GamersDepot ran a short question-and-answer session with Gabe Newell, then with John Carmack himself, on Nvidia shader performance: clicky [gamersdepot.com]!

    The proof is in the pudding.

"If I do not want others to quote me, I do not speak." -- Phil Wayne

Working...