Graphics Technology

Nvidia's DX11 GF100 Graphics Processor Detailed

J. Dzhugashvili writes "While it's played up the general-purpose computing prowess of its next-gen GPU architecture, Nvidia has talked little about Fermi's graphics capabilities — to the extent that some accuse Nvidia of turning its back on PC gaming. Not so, says The Tech Report in a detailed architectural overview of the GF100, the first Fermi-based consumer graphics processor. Alongside a wealth of technical information, the article includes enlightening estimates and direct comparisons with AMD's Radeon HD 5870. The GF100 will be up to twice as fast as the GeForce GTX 285, the author reckons, but the gap with the Radeon HD 5870 should be 'a bit more slender.' Still, Nvidia may have the fastest consumer GPU ever on its hands — and far from forsaking games, Fermi has been built as a graphics processor first and foremost."
  • by Ant P. ( 974313 ) on Tuesday November 24, 2009 @04:07PM (#30217890)

    There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

    • Re: (Score:2, Interesting)

      by roguetrick ( 1147853 )

      I'm more worried about the state of PC gaming. It's been on a long slide lately, and I'm starting to wonder whether this high-end hardware is worth it.

      • Re: (Score:3, Informative)

        No, it isn't. It almost never has been. If you needed a 'high end' graphics card to play a majority of PC games reasonably, they wouldn't be 'high end' anymore... they would be standard.
        • Re: (Score:2, Insightful)

          by bloodhawk ( 813939 )
          While it definitely isn't worth it now, only 4 or 5 years ago you had to stay close to the cutting edge if you wanted to play games at full resolution as they were released. Now even a mid-range card is adequate for the most system-taxing games. Graphics cards have outpaced gaming. I just bought a new 5870, but I had been sitting on a card that was two generations old before that and was still able to play most games at full res; the only real reason for the 5870 was it is a
          • by SecondaryOak ( 1342441 ) on Tuesday November 24, 2009 @07:35PM (#30220616)

            It's true that a few years ago you had to stay close to the cutting edge and now you don't, but I'm pretty sure it's not because graphics cards have outpaced games; it's because game developers slowed their pace to keep performance good on consoles.

            I'm sure game developers could easily overwhelm graphics cards if they wanted to, but that wouldn't just shut out PCs without high-end cards; it would shut out all the consoles too. I have to say that as a PC-only gamer, I find the situation very positive. I like not having to upgrade constantly.

      • Re: (Score:3, Insightful)

        by timeOday ( 582209 )
        Which would be a good reason for NVidia to focus on science and media applications rather than games after all.
      • by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:23PM (#30218106) Homepage

        I'm gonna have to disagree with you there.

        Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

        Good luck.

        • Re: (Score:3, Insightful)

          I'm gonna have to disagree with you there.

          Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

          Does this mean that we're hitting a software complexity wall?

          It's now the devs' turn to play catch-up... I hope nobody cheats by adding idle loops (looks around menacingly).

          • by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:52PM (#30218462) Homepage

            As a poster earlier in the thread stated, a big part of it is games that need to work on both consoles and the PC. As an example, considering the 360 has a video card roughly equivalent to a 6600GT, there is only so far they can push ports. Hell, even now, 3-4 years into the current gen, there are STILL framerate problems with a lot of games...games that can now run at an absurdly high FPS on a decent gaming PC.

            • Re: (Score:3, Interesting)

              by poetmatt ( 793785 )

              Companies don't want games on both console and PC, because there is a lot less control on the PC. So they shove console requirements onto the PC and you end up with horrible ports like Borderlands and MW2. Then nobody wants the PC version and they go "oh, nobody bought the PC version," even though the real reason is they fucked their own community, and that gives them cover to stop making games for PC.

              It's a really shortsighted strategy, but it's basically an attempt at creating a walled garden all over again.

              • Yes, it pisses me off that DX9 is the development target precisely because that's the feature level of the two most powerful consoles. It's funny you mention Borderlands, because the game is very fun to play, but the graphics are annoying.

                1. They took the same graphics as the 360 version and just added a whole bunch of pitch-black dynamic shadows for the PC (I think they don't realize that you can assign DYNAMIC RANGE for your shadow intensity). This makes the game way too dark, and since their gamma con

          • Does this mean that we're hitting a software complexity wall?

            From the perspective of a game programmer, I'd posit that it's not as much a software complexity wall as it is a content creation wall. Creating a fully-realized world in a modern videogame is amazingly time consuming. It's now all about how to reduce the dozens of developer-years required to build the environment (world geometry, props, models, effects, etc) and gameplay content (events, missions, etc). One of the problems has been that with each new generation, we not only have to re-build all our cont

          • by afidel ( 530433 )
            More like an art wall: the share of development cost that goes to art resources in modern games is vastly higher than it was back in, say, the '90s. Budgets for AAA titles have ballooned to nearly Hollywood levels.
        • Re: (Score:3, Insightful)

          Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate

          Pull the other one. It has got bells on it.

          Define "full resolution".

          If I have a very old 1280x1024 monitor, sure.
          If I have a new 1920x1200 monitor, not so much.
          If I have a dual 2560x1600 monitor setup, not in this life time.

          Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti-aliasing? (Rough pixel math for these resolutions is sketched just below.)

          But let's have a
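          To put rough numbers on why the choice of resolution matters, here is a minimal sketch (an editor's illustration in Python, not part of the original comment) of the pixel counts involved; fill-rate and memory-bandwidth demands scale roughly with the number of pixels drawn per frame.

            # Rough pixel-count arithmetic for the setups mentioned above.
            resolutions = {
                "1280x1024 (old 5:4 LCD)": 1280 * 1024,
                "1920x1200 (24-inch widescreen)": 1920 * 1200,
                "dual 2560x1600 (two 30-inch panels)": 2 * 2560 * 1600,
            }

            baseline = resolutions["1280x1024 (old 5:4 LCD)"]
            for name, pixels in resolutions.items():
                print(f"{name}: {pixels / 1e6:.2f} megapixels "
                      f"({pixels / baseline:.1f}x the 1280x1024 baseline)")

          That works out to about 1.3, 2.3, and 8.2 megapixels respectively, i.e. the dual-screen setup pushes roughly six times as many pixels as the old monitor.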

          • Re: (Score:3, Informative)

            by Anonymous Coward

            They have admitted those 2 games were programmed by monkeys.

            If you compare a 4850 from then to a 4850 today with the game fully patched and the monkey shit removed, you'd see an increase in frame rates. Or compare it to the sequel, which had even more monkey shit removed, and there would be a further increase in frame rates.

            Besides, two games that received crap reviews from everyone except the "oh so pretty" crowd do not represent the market.

          • If you're going to use any "random" game as an example, why would you purposely choose a game that was not very well optimized? Crysis ran pretty poorly on top-of-the-line hardware at the time too (in comparison to other games that looked just as good and ran better). Age of Conan unfortunately suffered from a lot of the same issues. It wasn't as bad as Crysis, but still pretty bad in that regard.
          • by Pojut ( 1027544 )

            I define full resolution as the max resolution of the average big monitor these days...which, unless you have some 27-30 inch monstrosity, caps out at 1920x1200. In your example, they have 16xAA enabled, which makes a MASSIVE difference...which is something I have addressed in my other posts in this thread. That being said, congrats...you're right. I am totally wrong. There actually is a game out there that a then-$200 card couldn't play full bore. Sue me.

            By the way, I appreciate you insulting me,

          • Re: (Score:2, Insightful)

            by Spatial ( 1235392 )
            First you cherry-pick two very rare resolutions, and then you choose two games that are renowned for their exceptionally high system requirements. Pretty intellectually dishonest of you.

            Edge cases don't make good refutations of general statements. Besides, he's not totally correct but he isn't far from the truth either. The HD4850 can run most games at fairly high settings, at the highest resolutions most people have available.

            (According to the Steam stats, 1920x1200 comprises less than 6% of users
          • I'm running a 4850 I picked up open-box at Best Buy for $130; best video card I ever bought. I have a $600 BFG 8800 GTX sitting on the shelf, dead. For the money, the 4850 could not be beat at the time of its release. P.S. BFG sucks; EVERY card I have owned from them has failed: 6800 GT, 7800, 7900, 8800 GTX.
          • Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti aliasing?

            Real gamers play with minimum details on an old Athlon XP. /inb4flamewar

            I have two nitpicks.

            1) If you tweak Crysis, it performs much better.
            2) If you add an SSD for Crysis, it performs much better. In some cases, the minimum FPS doubles. 16fps isn't very playable, but a solid 30fps with vsync definitely is. UE3 gets around this problem (HDD latency) by streaming in textures, so rather than getting a lower framerate, sometimes you'll just be staring at gaudy soup textures for a second.

            P.S. I play L4D2 on an 8800

        • Actually, I don't even mean it from a technical standpoint. I just feel like the influx of console-tailored games, designed to run on local hosts for multiplayer and designed to prevent modification, is really screwing with things. Of course, I have to admit that my view is skewed in that I'm mainly looking at blockbuster games and not some of the real gems that are PC-centric.

        • by jandrese ( 485 )
          The problem is that most of the new AAA games are console ports now, because consoles are where the money is. But none of the current generation consoles can hold a candle to even a 2-year-old PC, so pretty much any halfway competent PC can play all modern games that aren't Crysis. You have to work to get a machine that can't support games (like getting one with Intel graphics).

          Worse, the games that aren't just console ports are small indy developer efforts with simple graphics that rarely need more th
      • Re: (Score:3, Interesting)

        by alen ( 225700 )

        since WoW controls 50% of all pc game revenues, the market as it was a few years ago is over. it's not even fun building a PC anymore since everything is integrated on the motherboard except for a decent graphics card.

        i'm personally tired of chasing the latest graphics card every year to play a game. i'll probably buy a PS3 soon and a Mac next year just because its lack of wires makes the wife happy

        • it's not even fun building a PC anymore since everything is integrated on the motherboard except for a decent graphics card.

          And the RAM. And sound card if you want to get it off the mobo. And the hdd/optical drive(s)...

          Building a PC can be really fun, still. Getting a decent graphics card for cheap is still possible, too, and you don't have to chase the latest graphics card. You don't have to play games on the Ultra High setting, either...

          • I think the point is not that building a PC *can* be fun, but rather that it usually isn't anymore. I.e., the time-plus-cost to reward ratio is off!

            Building a computer even 10 years ago was a lot different than it is today. Even minute amounts of overclocking could make a huge difference, and small differences in RAM timings were noticeable. Getting a Cyrix, Intel, or AMD CPU gave very different performance profiles. Individual steppings were sought out by overclockers. Certain soundcards could greatly lighten CPU load,

      • by dave562 ( 969951 )

        I finally made the switch to the console with COD/MW2. I have a PS3 hooked up to a 37-inch Samsung LCD. My desktop PC is a simple Core2Duo (2.6 GHz) with an old GeForce 6800 256MB. I couldn't stomach the cost of upgrading the hardware on the desktop and having to deal with hackers. In all honesty, it's the hackers that really drove me away. It was probably 2/3rds hackers, 1/3rd knowing that I'd get flawless framerate and top-notch graphics on the console. I've been playing LAN/online FPS games since Qua

        • by Khyber ( 864651 )

          Umm, the PS3 has native keyboard and mouse support. I plugged in an old wireless Compaq keyboard/mouse combo and it worked flawlessly. No need to buy adapters. If the game devs didn't put in keyboard support for the PS3, that's their screwup.

      • by Dan667 ( 564390 )
        I disagree; it looks like console gaming has run its course like it did with the Atari 2600. After Microsoft banned all those people, and with all the Xbox and PS3 quality problems, people are beginning to realize consoles are not as "just works" as they thought. I have upgraded my rig once in 4 years and it still runs every game I have bought in that time at a reasonable resolution. That is a lot cheaper than buying every gen of console and all the extra crap that goes with it.
      • Re: (Score:2, Interesting)

        by bonch ( 38532 )

        Thank the pirates for killing PC gaming. Developers actually make money from consoles.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Sure there is, because then some people will wait for this new card rather than buying AMD's card, thus providing Nvidia with revenue and profit.

      • by rrhal ( 88665 ) on Tuesday November 24, 2009 @04:19PM (#30218056)
        By that logic wouldn't those same people then wait for AMD's next offering which will be yet faster? Waiting for the latest and greatest means there will always be something greater in the pipeline to wait for. How long before we saturate the PCI-E bus and need something faster? The current bus structure is about as old as AGP was when it lost favor.
        • How long before we saturate the PCI-E bus and need something faster?

          In a way, it has already been replaced. PCIe 2.0 is the current standard. It's backwards and forwards compatible, and has twice the bandwidth of 1.x. PCIe 3.0 will double that bandwidth again (rough numbers below).

          It'll be quite a long time before it becomes obsolete.
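          To attach rough numbers to the bandwidth claims above, here is a small editor's sketch using the published line rates and encoding overheads (not measurements of any particular card): each PCIe generation roughly doubles per-lane throughput, and gen 3 gets part of its gain from the more efficient 128b/130b encoding.

            # Approximate per-lane, per-direction PCIe throughput from the spec'd line rates.
            generations = [
                ("PCIe 1.x", 2.5e9, 8 / 10),     # 2.5 GT/s per lane, 8b/10b encoding
                ("PCIe 2.0", 5.0e9, 8 / 10),     # 5 GT/s per lane, 8b/10b encoding
                ("PCIe 3.0", 8.0e9, 128 / 130),  # 8 GT/s per lane, 128b/130b encoding
            ]

            for name, transfers_per_sec, efficiency in generations:
                bytes_per_sec = transfers_per_sec * efficiency / 8  # bits -> bytes after overhead
                print(f"{name}: ~{bytes_per_sec / 1e6:.0f} MB/s per lane, "
                      f"~{bytes_per_sec * 16 / 1e9:.1f} GB/s for an x16 slot")

          That gives roughly 4 GB/s, 8 GB/s, and 16 GB/s per direction for an x16 slot across the three generations, which is why saturating the bus with a single GPU takes a while.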

        • ``By that logic wouldn't those same people then wait for AMD's next offering which will be yet faster?''

          Well, some people actually do that. I'm waiting for the budget card that comes out with a fully functional open-source driver available. Until then, my fanless GeForce 6600 is chugging along just fine. I don't even remember what I had before that, but it was something fanless with the open-source r300 driver ... a Radeon 9200 or similar.

          But then, I don't buy games unless they run on Linux, either. Which u

        • by Khyber ( 864651 )

          "How long before we saturate the PCI-E bus and need something faster?"

          Considering Crysis can't fully tax the bandwidth of an AGP 8x slot, probably not for a good long while.

          ATi's 4850 AGP flavor rocks Crysis no problem. At that point, it's the CPU/Memory that's the bottleneck.

    • Re: (Score:3, Insightful)

      There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

      You haven't spent much time with Marketing people, have you?

    • There's a phrase for it: paper launch [wikipedia.org] or paper tiger [wikipedia.org]. Whether this actually gets released is another question. I'd like to see benchmarks, not theoreticals.

  • Feh. (Score:5, Informative)

    by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:17PM (#30218032) Homepage

    The days of needing the biggest, fastest, most expensive card are pretty much over. You can run just about any game out there at max settings at 1920 X 1080 silky smooth with a 5870, which goes for less than $300. Hell, even the 4870 is still almost overkill.

    Unless you plan on maxing out AA and AF while playing on a 30 inch screen, there is no reason to drop $500-$600 on a video card anymore...

    • Re:Feh. (Score:5, Interesting)

      by Knara ( 9377 ) on Tuesday November 24, 2009 @04:21PM (#30218084)
      Almost as if Nvidia were looking at some other market than gamers....
    • Re:Feh. (Score:5, Insightful)

      by Kratisto ( 1080113 ) on Tuesday November 24, 2009 @04:22PM (#30218096)
      I think this is largely because consoles set the pace for hardware upgrades. If you want to develop a multi-platform game, then it's going to need to run on XBox 360 hardware from four years ago. I don't even check recommended requirements anymore: I know that if it has a 360 or PS3 port (or the other way around), I can run it.
      • Re: (Score:3, Interesting)

        by Pojut ( 1027544 )

        This is pretty much the case with me. I plan on doing a full system upgrade this Cyber Monday, but I haven't bought any new hardware for my computer other than a new DVD drive in about 2 years...and I STILL haven't needed to turn down visual details in any games that are released.

        • by DavidTC ( 10147 )

          Yeah. I paid about $150 for my graphics card, a 9600 GT, I have a nice 1680x1050 monitor I'm not going to upgrade any time soon, and at this point I can't imagine what games would require me to buy a new CPU.

          I can run any game whatsoever at full resolution and visual details.

          That's always been the joke...if you buy a middling video card, you're buying the same thing that's in a PS3 or whatever the newest console is, because those were created a year ago.

          • Seriously? I paid $100 for a 9800 GT a while back, and have two 1400x1050 monitors. Your card sounds expensive.

            I agree with you though; barring any hardware failures, I won't be upgrading it for a long time either. Heck, I wouldn't have moved up from the old 8800 GS if it weren't for VDPAU requiring a newer card.

            • by DavidTC ( 10147 )

              I probably got it before you, I think I've had it a year at this point.

              $100 is normally the spot I aim at, but I had some extra cash last time, because the cost of the memory and motherboard suddenly dropped before I bought, so I went about $50 higher than normal.

          • by sznupi ( 719324 )

            Not always. Not when both kinds of platforms weren't homogenized to such a degree...

        • by afidel ( 530433 )
          That's why I'm looking at an HD 5750 with passive cooling when they come down to ~$100: lower power bills and no noise, but it will still play just about anything at 1920x1080.
      • Re: (Score:3, Insightful)

        by sznupi ( 719324 )

        Well, the "problem" is those are not really ports anymore; often practically the same engine.

        Which kinda sucks, coming from both worlds, enjoying both kinds of games - now that Microsoft made targeting both platforms from the start of development "sensible", most games are hybrids; not exploiting the strengths of either platform.

      • by NotBorg ( 829820 )

        I'm sorry. I can't resist. I simply must put a /. spin on this. Let's see...

        MICROSOFT AND SONY ARE HOLDING THE WORLD BACK AGAIN! AHHHHHH!H!H!!!!! They are teh evils! Innovation stiflers!

        (Note to moderators: I expect nothing less than a +5 Insightful. There, I saved you time; you won't have to post that "Undoing moderation" crap.)

    • Less than $300 is still a lot for a graphics card. Some higher end CPUs (Intel Core i7 920) go for around that price, and CPUs are much more important than a graphics card in terms of functionality (although GPUs have become more important recently). If you don't have a CPU, your computer doesn't work at all. If you don't have a discrete graphics card, you can still do a great many things aside from playing games/rendering graphics. I want to be able to run just about any game out there at max settings
      • Re:Feh. (Score:4, Insightful)

        by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:38PM (#30218264) Homepage

        Mostly agreed, however I will take a low-to-mid range CPU if it means I can afford a top of the line GPU...when it comes to gaming, anyway.

        The GPU is a much larger bottleneck in terms of gaming, although the line of importance between the GPU and CPU has been blurring a bit lately.

    • You can run just about any game out there at max settings at 1920 X 1080 silky smooth with a 5870, which goes for less than $300.

      Well, that's until Crysis 2 with Stereo 3D + Multi-Head and Windows 8's compositing + DirectX 13 come out.
      Then it'll again be a year of waiting until the hardware catches up.

      Remember the mantra:
      What hardware giveth, software taketh...

      Also, you assume a discrete GPU.
      nVidia and ATI still have some improving to do before the performance you quote happens on a low-power, miniature *embedded* GPU in a laptop (one that doesn't drain the battery flat after 10 minutes).
      Thus expect future generations with better performance per wa

    • I agree completely, but I think this situation will be a catalyst for the next big step. I think back to when Unreal was released. There was almost no hardware that could run the game smoothly; in a way it was a proof of concept of what gaming could become, but as hardware caught up we saw it give rise to a whole new way to make games. FPS, RTS, RPG, all genres really, have adopted the 3D model, even board games. Now the market is saturated and the pressure is off the hardware vendors to make components

    • You can run just about any game out there at max settings at 1920 X 1080 silky smooth with a 5870, which goes for less than $300.

      The 5870 still seems to cost more than $400, but your point is of course valid. What might become an issue is multi-monitor gaming like ATI's Eyefinity. Running a triple-screen setup demands a bit more. I don't know if multi-monitor will become mainstream, but it's roughly in the same ballpark price-wise as high-end GPUs.

    • by jasonwc ( 939262 )
      You probably meant the 5850, which had an initial MSRP of $260 but is now selling at $300-310 due to supply issues. The 5870 is ATI's flagship card and had a MSRP of $380. It's currently on sale for $400-420.
    • by Khyber ( 864651 )

      "Unless you plan on maxing out AA and AF while playing on a 30 inch screen,"

      Totally unnecessary with the subpixel rendering engines in most LCD TVs nowadays, considering their native resolution is FAR higher than their maximum capable resolution (that's where the subpixel rendering comes into play, for upscaling to three times the number of pixels; a 30-inch TV would have HUGE pixels at 1920x1080 if aspect were followed).

  • by SnarfQuest ( 469614 ) on Tuesday November 24, 2009 @04:36PM (#30218244)

    I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.

    • I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.

      They used his skull with the jawbone of an orangutan. [wikipedia.org]

  • 40nm process... (Score:3, Insightful)

    by Sollord ( 888521 ) on Tuesday November 24, 2009 @04:41PM (#30218302)
    Isn't this going to be built on the same TSMC process as the 5870? The same one that's having yield problems and supply shortages for AMD, and yet the Nvidia chip is even bigger and more complex? I foresee delays.
    • by afidel ( 530433 )
      I really wonder why AMD uses TSMC when GlobalFoundries has mature 45nm 300mm SOI tech.
  • So was it created at Fermilab?!
  • by ZirbMonkey ( 999495 ) on Tuesday November 24, 2009 @05:13PM (#30218736)

    While the article is very interesting in explaining the chip architecture and technical specifications, I can't believe there isn't a single actual gaming benchmark on these chips yet.

    The best they can do is give estimates of how the chips may or may not perform. They estimate that it will be faster at gaming than ATI's already-released 5870.

    By the time Nvidia actually releases their Fermi GPUs, ATI's Cypress will have been actively selling for over 3 months. And there's a funny thing about advancements over time: things keep getting faster (aka Moore's Law). Supposing that chips double in transistor count every year, the new Fermi chips need roughly 20% more transistors than ATI's Cypress if they release 3 months later... just to keep on the same curve (the arithmetic is sketched below).

    And there's still no mention of pricing... but that's expected for a product that doesn't actually run games yet. I don't see a lot of optimism on the gaming front, so I hope for Nvidia's sake that the investment in GPGPU is the branching out they need to trump ATI's sales.
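    A quick editor's sketch of the arithmetic behind that figure (the 12-month doubling period is the commenter's assumption; Moore's original observation was closer to an 18-24-month doubling):

      doubling_period_months = 12  # the comment's assumption
      delay_months = 3

      growth = 2 ** (delay_months / doubling_period_months)
      print(f"Factor needed after a {delay_months}-month delay: {growth:.3f} "
            f"(about {(growth - 1) * 100:.0f}% more transistors)")

    That comes out to about 19%, in line with the 20% quoted above; with a more conventional 24-month doubling period, the same 3-month delay would only "cost" about 9%.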

  • Does nVidia sell any of these top-end GPU chips with a full x86 core on the die with it? A CPU that's maybe as powerful as a single core P3/2.4GHz, tightly coupled for throughput to GPU and VRAM, going out to the PC bus (PCI-e) just for final interface to a video display (or saving to a file, or streaming to a network).

    • x86 is horrible for graphics processing compared to a stream processor. When it comes to performing repetitive calculations on a constant stream of data, a GPU will beat a CPU any day. Yes, that is even with hyperthreading and streaming SIMD. (A small illustration of this kind of streaming kernel follows this sub-thread.)
      • Yes, that is why the GPU is on there. The x86 is there for everything else: OS and application processing, managing devices, etc. A single chip with both CPU and GPU for maximum total performance. An x86 because that's where most SW runs.
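      To illustrate the stream-processing point above, here is a minimal editor's sketch (Python with NumPy standing in for the idea; a real GPU version would be CUDA or OpenCL) of SAXPY, a classic streaming kernel: the same cheap, independent arithmetic applied to every element of a large array, with no loop-carried dependencies, which is exactly the shape of work where thousands of lightweight GPU threads beat a handful of big x86 cores.

        import numpy as np

        def saxpy_scalar(a, x, y):
            # How a single CPU core walks the stream: one element at a time.
            # SIMD widens this to a few elements per instruction, but it is
            # still one instruction stream chewing through the array.
            out = np.empty_like(y)
            for i in range(len(x)):
                out[i] = a * x[i] + y[i]
            return out

        def saxpy_data_parallel(a, x, y):
            # The data-parallel formulation: every element is independent,
            # so a GPU can hand each one (or each small block) to its own thread.
            return a * x + y

        x = np.random.rand(1_000_000).astype(np.float32)
        y = np.random.rand(1_000_000).astype(np.float32)
        assert np.allclose(saxpy_scalar(2.0, x, y), saxpy_data_parallel(2.0, x, y))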

  • Looks like a cool chip. It will be interesting to see how Nvidia does in the marketplace when they don't have rabid enthusiast gamers subsidizing their development efforts every 6 months. Let's face it, who runs out to buy the latest graphics card anymore, when you get the same game on your 360/PS3 with no upgrade? They're mostly positioning this launch as a 'compute' GPU, so they certainly see the writing on the wall. With Fermi and beyond, Nvidia will have to provide tangible real-world profits for some c
