
Nvidia's DX11 GF100 Graphics Processor Detailed

J. Dzhugashvili writes "While it's played up the general-purpose computing prowess of its next-gen GPU architecture, Nvidia has talked little about Fermi's graphics capabilities — to the extent that some accuse Nvidia of turning its back on PC gaming. Not so, says The Tech Report in a detailed architectural overview of the GF100, the first Fermi-based consumer graphics processor. Alongside a wealth of technical information, the article includes enlightening estimates and direct comparisons with AMD's Radeon HD 5870. The GF100 will be up to twice as fast as the GeForce GTX 285, the author reckons, but the gap with the Radeon HD 5870 should be 'a bit more slender.' Still, Nvidia may have the fastest consumer GPU ever on its hands — and far from forsaking games, Fermi has been built as a graphics processor first and foremost."
  • by Ant P. ( 974313 ) on Tuesday November 24, 2009 @04:07PM (#30217890)

    There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

  • by Anonymous Coward on Tuesday November 24, 2009 @04:10PM (#30217920)

    Sure there is, because then some people will wait for this new card rather than buying AMD's card, thus providing Nvidia with revenue and profit.

  • by timeOday ( 582209 ) on Tuesday November 24, 2009 @04:14PM (#30217980)
    Which would be a good reason for NVidia to focus on science and media applications rather than games after all.
  • Re:Feh. (Score:5, Insightful)

    by Kratisto ( 1080113 ) on Tuesday November 24, 2009 @04:22PM (#30218096)
    I think this is largely because consoles set the pace for hardware upgrades. If you want to develop a multi-platform game, then it's going to need to run on XBox 360 hardware from four years ago. I don't even check recommended requirements anymore: I know that if it has a 360 or PS3 port (or the other way around), I can run it.
  • by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:23PM (#30218106) Homepage

    I'm gonna have to disagree with you there.

    Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

    Good luck.

  • Re:Feh. (Score:3, Insightful)

    by sznupi ( 719324 ) on Tuesday November 24, 2009 @04:33PM (#30218210) Homepage

    Well, the "problem" is those are not really ports anymore; often practically the same engine.

    Which kinda sucks, coming from both worlds and enjoying both kinds of games. Now that Microsoft has made targeting both platforms from the start of development "sensible", most games are hybrids that don't exploit the strengths of either platform.

  • Re:Feh. (Score:4, Insightful)

    by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:38PM (#30218264) Homepage

    Mostly agreed, however I will take a low-to-mid-range CPU if it means I can afford a top-of-the-line GPU... when it comes to gaming, anyway.

    The GPU is a much larger bottleneck in terms of gaming, although the line of importance between the GPU and CPU has been blurring a bit lately.

  • 40nm process... (Score:3, Insightful)

    by Sollord ( 888521 ) on Tuesday November 24, 2009 @04:41PM (#30218302)
    Isn't this going to be built on the same TSMC process as the 5870? The same one that's having yield problems and supply shortages for AMD, and yet the Nvidia chip is an even bigger and more complex chip? I foresee delays.
  • by quercus.aeternam ( 1174283 ) on Tuesday November 24, 2009 @04:48PM (#30218394) Homepage

    I'm gonna have to disagree with you there.

    Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.

    Does this mean that we're hitting a software complexity wall?

    It's now the devs' turn to play catch-up... I hope nobody cheats by adding idle loops (looks around menacingly).

  • by Pojut ( 1027544 ) on Tuesday November 24, 2009 @04:52PM (#30218462) Homepage

    As a poster earlier in the thread stated, a big part of it is games that need to work on both consoles and the PC. As an example, considering the 360 has a video card roughly equivalent to a 6600GT, there is only so far they can push ports. Hell, even now, 3-4 years into the current gen, there are STILL framerate problems with a lot of games... games that can now run at an absurdly high FPS on a decent gaming PC.

  • by MartinSchou ( 1360093 ) on Tuesday November 24, 2009 @05:05PM (#30218634)

    Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate

    Pull the other one. It has got bells on it.

    Define "full resolution".

    If I have a very old 1280x1024 monitor, sure.
    If I have a new 1920x1200 monitor, not so much.
    If I have a dual 2560x1600 monitor setup, not in this life time.

    Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti-aliasing?

    But let's look at something a bit more realistic and take "any game", in this case Crysis.

    From [H]ard|OCP's review of the 4850 from June 25th, 2008 [hardocp.com]:

    Highest Playable Resolution:
    1600x1200
    No AA | 16x AF

    Minimum FPS: 16
    Maximum FPS: 42
    Average FPS: 28.5

    Considering that the Radeon 4870 and Geforce GTX 260 have their highest playable at 1920x1200, I'd say you're flat out wrong in your claim.

    Now, you may claim that Crysis doesn't count as it's not "ANY game on the market", so let's use Age of Conan instead [hardocp.com]:
    Whoops, that one seems to hit its limit at 1600x1200.

    That was my rather convoluted way of saying "you're an idiot".

  • by Talderas ( 1212466 ) on Tuesday November 24, 2009 @05:12PM (#30218716)

    Which is why Metal Gear Solid 4 still looks better than games that are coming out now that are both PS3 and XBox 360.....

  • by ZirbMonkey ( 999495 ) on Tuesday November 24, 2009 @05:13PM (#30218736)

    While the article is very interesting in explaining the chip architecture and technical specifications, I can't believe there isn't a single actual gaming benchmark on these chips yet.

    The best they can do is give an estimated calculation of how the chips may or may not actually perform. They estimate that it will be faster at gaming than ATI's already-released 5870.

    By the time Nvidia actually releases their Fermi GPUs, ATI's Cypress will have been actively selling for over 3 months. And there's a funny thing about advancements over time: things keep getting faster (aka Moore's Law). If chips are supposed to double in transistor count every year, the new Fermi chips need about 20% more transistors than Cypress if they release 3 months later... just to keep on the same curve (rough math in the sketch below).

    And there's still no mention of pricing... but that's expected for a product that doesn't actually run games yet. I don't see a lot of optimism on the gaming front, so I hope for Nvidia's sake that the investment in GPGPU is the branching out they need to trump ATI's sales.
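
    A minimal sketch of the arithmetic behind that 20% figure (plain Python, purely illustrative; the 12-month doubling period and the 3-month launch gap are the poster's assumptions, not figures from the article):

        # Rough check of the "20% more transistors" estimate above, assuming
        # transistor budgets double every 12 months and Fermi lands about
        # 3 months after Cypress (both are the poster's assumptions).
        DOUBLING_PERIOD_MONTHS = 12
        launch_gap_months = 3
        scaling = 2 ** (launch_gap_months / DOUBLING_PERIOD_MONTHS)
        print(f"Growth over {launch_gap_months} months: {(scaling - 1) * 100:.0f}%")  # ~19%

    In other words, 2^(3/12) is roughly 1.19, so a chip arriving a quarter later needs about 19-20% more transistors just to stay on the same curve.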

  • by DoofusOfDeath ( 636671 ) on Tuesday November 24, 2009 @05:22PM (#30218906)

    There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.

    You haven't spent much time with Marketing people, have you?

  • by Pojut ( 1027544 ) on Tuesday November 24, 2009 @05:42PM (#30219146) Homepage

    I don't think the problem is the 360, I think the problem is your fanboyism. Multi-platform games look more or less the same between the 360 and the PS3.

    Trust me, I know. I have both systems.

  • by bloodhawk ( 813939 ) on Tuesday November 24, 2009 @05:53PM (#30219312)
    While it definitely isn't worth it now, it was only 4 or 5 years ago that you had to stay close to the cutting edge if you wanted to play games at full resolution as they were released. Now, though, even a middle-of-the-range card is adequate for even the most system-taxing games. Graphics cards have outpaced gaming. I just bought a new 5870, but I had been sitting on a card that was 2 generations old before that and was still able to play most games at full res; the only real reason for the 5870 is that it's going into a new machine and should hold me in good stead for a few years.
  • by Spatial ( 1235392 ) on Tuesday November 24, 2009 @06:06PM (#30219476)
    First you cherry-pick two very rare resolutions, and then you choose two games that are renowned for their exceptionally high system requirements. Pretty intellectually dishonest of you.

    Edge cases don't make good refutations of general statements. Besides, he's not totally correct but he isn't far from the truth either. The HD4850 can run most games at fairly high settings, at the highest resolutions most people have available.

    (According to the Steam stats, 1920x1200 comprises less than 6% of users' displays, and 2560x1600 is in an "other" category of less than 4%. 1280x1024 is the most common, and that or lower comprises 65%.)
  • by Anonymous Coward on Tuesday November 24, 2009 @06:31PM (#30219788)

    We can tell who is a console gamer here. If you actually do some research and find out why PC gamers are upset about MW2, maybe you'll understand why. Dedicated servers are there so everybody has fair play. Why would I want some 'yuck' hosting a match on his/her crappy 756k DL/256k UL connection where I have 200-300 ms ping (sometimes up to 500 ms, which is unplayable) and the host has none?

    Entitlement? You guys just don't understand what that word means, since you're used to getting everything shoved down your throat. The game costs $10 more than the previous game, which, by the way, is just Activision being greedy since they don't pay licensing fees on PC. On top of that, the PC version loses support for large matches, console commands like changing your FoV, and my personal favorite, "...custom options like mouse control, in game chat, and graphic settings" (look it up, developer notes).

    PC less relevant, hm? So the vast majority of competitive gaming is irrelevant now? And if you actually know how to use a PC rather than just being able to turn it on, the PC is by far not a "pain in the ass." Better than being locked in by M$, Sony, or hell, even Nintendo. I would much rather be able to control what goes in my machine rather than have somebody else controlling not only my hardware but my software too. Sure it's more expensive, but after PC gaming I have a hard time going back to console. So it's really people blaming PC users for their own downfall that is killing PC gaming. You guys lie back and let it all happen, rather than being proactive about something.

    Back to the original point: meh, you don't really need the latest and greatest to run any current games. I run Borderlands with a 9800 GT 512 MB at full settings and it plays great. I run Dragon Age: Origins at nearly max settings and there is only really a little stutter in gameplay, but I'm thinking that has to do with my CPU since it's always running at 100%. I'm using a stock AMD Athlon X2 5200+ (sorry, don't remember the core off the top of my head), not overclocked (yet). Both are brand-new games and yet they run beautifully on my older rig, which is getting up to 2 3/4 years old now. The only thing I've upgraded is the video card, plus a sound card added about a year after it was built to take some load off the CPU.

  • by 4D6963 ( 933028 ) on Tuesday November 24, 2009 @07:44PM (#30220714)

    Here's how it really works: no one (on PC) buys single-player games, they only buy multiplayer games, because, I don't know if you've tried lately, but if you want to be a pirate there are very few games on which you'll be able to play multiplayer; if you're lucky you'll get access to a few cracked servers.

    So PC gamers buy multiplayer games; they HAVE to. MW2 shipped with a multiplayer system that fell VERY short of people's expectations for a multiplayer game, so they treated it like a single-player game, pirated it to play its 6 hours of gameplay, and went back to waiting for Bad Company 2.

  • by non0score ( 890022 ) on Wednesday November 25, 2009 @04:16AM (#30223678)

    The 360 and PS3 are practically identical.

    The PS3 and 360 PPC elements are identical, yes. But the rest aren't. The SPUs are vastly different from the PPEs, ranging from the ISA, to the memory architecture, to the instruction latencies, to the register file size/width, to the local memory latencies, to the... oh boy, they're vastly different on so many levels. I also don't know how those TFLOP numbers came about, because they're totally wrong.

    Comparing GTAIV on the 360 vs the PS3, the 360 looks like it's running in 16-bit color depth, shadows are absolutely horrible, and the draw distance isn't even on par with the PS3.

    Using GTA to compare the graphics hardware and concluding that PS3 is better? I just hope you don't mention that to the devs, because they'll laugh their ass off about how wrong that comment is.

    the PS3 has 256MB of GDDR3 for their GPU, and the 256MB of XDR DESTROYS the 512MB of GDDR3 that the 360 uses for system memory (For one GDDR3 isn't meant to be used as main system memory, XDR is.)

    On the XDR front, I don't know how it destroys the GDDR3. Both are pieces of memory and they're just there to support reads and writes. As long as they have the bandwidth, size, and low latency, that's all that really matters to devs (obviously, devs shouldn't have to worry about signal integrity and what not here).

    PS3 stomps the 360. The 360 is by far inferior, it's locked down, and it burns itself out more often than not.

    And the slim isn't locked down? But true, the original PS3 doesn't burn itself out more than the original 360.
