Nvidia's DX11 GF100 Graphics Processor Detailed
J. Dzhugashvili writes "While it's played up the general-purpose computing prowess of its next-gen GPU architecture, Nvidia has talked little about Fermi's graphics capabilities — to the extent that some accuse Nvidia of turning its back on PC gaming. Not so, says The Tech Report in a detailed architectural overview of the GF100, the first Fermi-based consumer graphics processor. Alongside a wealth of technical information, the article includes enlightening estimates and direct comparisons with AMD's Radeon HD 5870. The GF100 will be up to twice as fast as the GeForce GTX 285, the author reckons, but the gap with the Radeon HD 5870 should be 'a bit more slender.' Still, Nvidia may have the fastest consumer GPU ever on its hands — and far from forsaking games, Fermi has been built as a graphics processor first and foremost."
When's it coming out? (Score:5, Insightful)
There's no point bragging about being faster than last month's graphics card if your own is still a quarter of a year from being an actual product.
Re: (Score:2, Interesting)
I'm more worried about the state of PC gaming. It's been on a long slide recently, and I'm starting to wonder whether this high-end hardware is worth it.
Re: (Score:3, Informative)
Re: (Score:2, Insightful)
Re:When's it coming out? (Score:4, Interesting)
It's true that a few years ago you had to stay close to the cutting edge and now you don't; but I'm pretty sure it's not because graphics cards have outpaced games, but because game developers slowed their pace to keep performance good on consoles.
I'm sure game developers could easily overwhelm graphics cards if they wanted to, but that would lock out not only PCs without high-end cards but also all the consoles. I have to say that as a PC-only gamer, I find the situation very positive. I like not having to upgrade constantly.
Re: (Score:2)
Re: (Score:2, Insightful)
We can tell who is a console gamer here. If you actually do some research and find out why PC gamers are upset about MW2, maybe you'll understand why. Dedicated servers are there so everybody has fair play. Why would I want some 'yuck' hosting a match on his/her crappy 756k DL/256k UL connection where I have 200-300 ms ping (sometimes up to 500 ms which is unplayable) and the host has none?
Entitlement? You guys just don't understand what that word means since you're used to getting everything shoved down yo
Re: (Score:3, Interesting)
Re: (Score:2)
If Activision sees that no one's buying the game, and that it's not even considered a suitable target of piracy, they'll either ditch the PC or work to make it a better experience next time. If they see a lot of grog-swillers with peg-legs, Activision will be able to play the victim and blame the failure of the game on the greed of others, instead of their own ineptitude.
Yes, but here's the thing: pretending that demand doesn't exist to justify cutting the supply doesn't make the demand go away, and Acti
Re: (Score:3, Insightful)
Here's how it really works: no one (on PC) buys single-player games, they only buy multiplayer games. I don't know if you've tried lately, but if you want to be a pirate there are very few games on which you'll be able to play multiplayer; if you're lucky you'll get access to a few cracked servers.
So PC gamers buy multiplayer games, they HAVE to. MW2 shipped with a multiplayer system that fell VERY short of people's expectations for a multiplayer game, so they treated it like a single player g
Re: (Score:3, Interesting)
However, dedicated servers ARE relevant. If there is no dedicated server, the functionality of your game can be reduced or disabled at a moment's notice. I can still play Quake or UT (1, I never bothered with the others because they didn't play as well IMHO) multiplayer because no matter wh
Re: (Score:3, Insightful)
Re:When's it coming out? (Score:4, Insightful)
I'm gonna have to disagree with you there.
Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.
Good luck.
Re: (Score:3, Insightful)
I'm gonna have to disagree with you there.
Look at the 4850. When it was brand new, it cost $199, and it could run ANY game on the market at full resolution and detail with a smooth, sustained framerate. Flash back to the year 2000. Try to find me a $200 card back then that could do the same. Hell, I challenge you to do the same thing just 5 years ago, back in 2004.
Does this mean that we're hitting a software complexity wall?
It's now the devs' turn to play catch-up... I hope nobody cheats by adding idle loops (looks around menacingly).
Re:When's it coming out? (Score:5, Insightful)
As a poster earlier in the thread stated, a big part of it is games that need to work on both consoles and PC. As an example, considering the 360 has a video card roughly equivalent to a 6600GT, there is only so far they can push ports. Hell, even now, 3-4 years into the current gen, there are STILL framerate problems with a lot of games... games that can now run at an absurdly high FPS on a decent gaming PC.
Re: (Score:3, Interesting)
Companies don't want games on both console and PC. The reason is there is a lot less control on PC. So they shove console requirements onto the PC and you end up with horrible ports like Borderlands and MW2. Thus nobody wants the PC version, and they go "oh, nobody bought the PC version" even though the reason is they fucked their own community, so that they don't have to keep making games for PC.
It's a really shortsighted strategy, but it's basically an attempt at creating a walled garden all over again.
Re: (Score:2)
Yes, it pisses me off that DX9 is the development target precisely because that's the feature level of the two most powerful consoles. It's funny you mention Borderlands, because the game is very fun to play, but the graphics are annoying.
1. They took the same graphics as the 360 version and just added a whole bunch of pitch-black dynamic shadows for the PC (I think they don't realize that you can assign DYNAMIC RANGE to your shadow intensity). This makes the game way too dark, and since their gamma con
Re: (Score:2)
Re: (Score:3, Insightful)
I don't think the problem is the 360, I think the problem is your fanboyism. Multi-platform games look more or less the same between the 360 and the PS3.
Trust me, I know. I have both systems.
Re: (Score:3, Informative)
Trust me, I hack both systems.
The 360 and PS3 are practically identical. Both use IBM PowerPC-based main cores and a bunch of side processing units. The 360's total performance capability is HALF that of the PS3 (the 360 does 1 TFLOP, the PS3 can do 2 TFLOPS); the PS3 also has a superior graphics hardware set. Comparing GTAIV on the 360 vs the PS3, the 360 looks like it's running in 16-bit color depth, shadows are absolutely horrible, and the draw distance isn't even on par with the PS3.
Sorry, speaking from an 'inside'
Re: (Score:2)
More powerful, absolutely. More reliable, absolutely. However, I have found that I enjoyed more of the exclusives for the 360 this generation than those on the PS3. Interestingly, the opposite was true for me in the previous generation...in my opinion, the PS2 trounced the Xbox as far as exclusives were concerned.
As far as being used for general media, the PS3 wins this generation hands down...on that I will readily agree.
Re: (Score:3, Informative)
The 10MB of "VRAM" you refer to on the Xbox 360 is actually called eDRAM, and is more similar to the cache memory found in CPUs than to video memory. It's often not even used, but reportedly can reduce the hit from anti-aliasing to nearly nothing. The main difference between the 360 and the PS3 in terms of memory is that the 360 GPU serves as the memory controller. Since this is the case, developers can use up to about 480MB as either system memory or graphics memory, as they decide, instead of being limited
Re:When's it coming out? (Score:4, Insightful)
The 360 and PS3 are practically identical.
The PS3 and 360 PPC elements are identical, yes. But the rest aren't. The SPUs are vastly different to the PPEs ranging from the ISA, to the memory architecture, to the instruction latencies, to the register file size/width, to the local memory latencies, to the...oh boy, they're vastly different on so many levels. I also don't know how those TFLOP numbers came about, because they're totally wrong.
Comparing GTAIV on the 360 vs the PS3, the 360 looks like it's running in 16-bit color depth, shadows are absolutely horrible, and the draw distance isn't even on par with the PS3.
Using GTA to compare the graphics hardware and concluding that PS3 is better? I just hope you don't mention that to the devs, because they'll laugh their ass off about how wrong that comment is.
the PS3 has 256MB of GDDR3 for its GPU, and the 256MB of XDR DESTROYS the 512MB of GDDR3 that the 360 uses for system memory (for one, GDDR3 isn't meant to be used as main system memory; XDR is).
On the XDR front, I don't know how it destroys the GDDR3. Both are pieces of memory and they're just there to support reads and writes. As long as they have the bandwidth, size, and low latency, that's all that really matters to devs (obviously, devs shouldn't have to worry about signal integrity and what not here).
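For what it's worth, peak theoretical bandwidth is just bus width times effective data rate. A rough Python sketch, using the commonly cited (approximate) spec-sheet figures purely for illustration:

```python
# Peak theoretical memory bandwidth = bus width (bytes) x effective data rate.
def peak_gb_per_s(bus_width_bits, effective_mt_per_s):
    return bus_width_bits / 8 * effective_mt_per_s * 1e6 / 1e9

# Approximate, commonly cited numbers (illustrative, not gospel):
print(peak_gb_per_s(64, 3200))   # ~25.6 GB/s : PS3 XDR main memory
print(peak_gb_per_s(128, 1400))  # ~22.4 GB/s : PS3 GDDR3 (RSX) and 360 GDDR3
```

By that measure the pools are in the same ballpark, which is the point above: what matters is having enough bandwidth and low latency, not the name of the memory technology.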
PS3 stomps the 360. The 360 is by far inferior, it's locked down, and it burns itself out more often than not.
And the slim isn't locked down? But true, the original PS3 doesn't burn itself out more than the original 360.
Re:When's it coming out? (Score:5, Informative)
When the differences are minute to the point where you have to pause a Gametrailers video and lean in close to your monitor, they may as well be the same...you aren't going to see that during actual gameplay, ESPECIALLY not in a frantic shooter like MW2.
That being said, there is one consistent difference between the 360 and the PS3 in terms of image quality: the 360 tends to be a little washed out, and the PS3 tends to be a little dark. Thank goodness for auto-switching color profiles based on the input selected.
Re: (Score:2)
I guess you haven't compared games like Dragon Age [blorge.com], where the PS3 is noticeably superior and is even receiving higher review scores. Judging by the unwarranted hostility in your reaction to an innocent comment, it seems that you're the one suffering from fanboyism.
Oh of course, how could I not have seen it before? I have at least one console or handheld from each of the major manufacturers to release one in the past thirty years. Atari, Magnavox, Nintendo, Sega, Sony, Microsoft... yup, you are absolutely right, I am a fanboy.
I'm a video game fanboy. I don't care about the brand name on the front of the machine, I care that it plays games.
Re: (Score:2)
Does this mean that we're hitting a software complexity wall?
From the perspective of a game programmer, I'd posit that it's not so much a software complexity wall as a content creation wall. Creating a fully realized world in a modern videogame is amazingly time consuming. It's now all about how to reduce the dozens of developer-years required to build the environment (world geometry, props, models, effects, etc.) and gameplay content (events, missions, etc.). One of the problems has been that with each new generation, we not only have to re-build all our cont
Re: (Score:2)
Re: (Score:3, Insightful)
Which is why Metal Gear Solid 4 still looks better than games that are coming out now on both PS3 and Xbox 360...
Re: (Score:3, Insightful)
Pull the other one. It has got bells on it.
Define "full resolution".
If I have a very old 1280x1024 monitor, sure.
If I have a new 1920x1200 monitor, not so much.
If I have a dual 2560x1600 monitor setup, not in this life time.
Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti aliasing?
But let's have a
Re: (Score:3, Informative)
They have admitted those 2 games were programmed by monkeys.
If you compare a 4850 from then to a 4850 today, with the game fully patched and the monkey shit removed, you'd see an increase in frame rates. Or compare it to the sequel, which had even more monkey shit removed, and there would be a further increase in frame rates.
Besides, those 2 games, which received crap reviews except from the "oh so pretty" crowd, do not represent the market.
You Need A Better Example (Score:2)
Re: (Score:2)
I define full resolution as the max resolution of the average big monitor these days... which, unless you have some 27-30 inch monstrosity, caps out at 1920x1200. In your example, they have 16xAA enabled, which makes a MASSIVE difference... which is something I have addressed in my other posts in this thread. That being said, congrats... you're right. I am totally wrong. There actually is a game out there that a then-$200 card couldn't play full bore. Sue me.
By the way, I appreciate you insulting me,
Re: (Score:2, Insightful)
Edge cases don't make good refutations of general statements. Besides, he's not totally correct but he isn't far from the truth either. The HD4850 can run most games at fairly high settings, at the highest resolutions most people have available.
(According to the Steam stats, 1920x1200 comprises less than 6% of users
Re: (Score:2)
Re: (Score:2)
Also, define "full detail". Is that at medium? High? Maximum? What level of anisotropic filtering? Anti aliasing?
Real gamers play with minimum details on an old Athlon XP. /inb4flamewar
I have two nitpicks.
1) If you tweak Crysis, it performs much better.
2) If you add an SSD for Crysis, it performs much better. In some cases, the minimum FPS doubles. 16 fps isn't very playable, but a solid 30 fps with vsync definitely is. UE3 gets around this problem (HDD latency) by streaming in textures, so rather than getting a lower framerate, sometimes you'll just be staring at gaudy soup textures for a second (rough sketch of the idea below).
P.S. I play L4D2 on an 8800
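A toy Python sketch of that streaming idea (names are made up for illustration; this is not UE3's actual API): render with a low-res placeholder immediately and swap in the full texture once the slow disk read finishes, instead of stalling the frame.

```python
import threading

class StreamedTexture:
    """Toy illustration of texture streaming: keep a low-res mip resident,
    kick off the slow disk read in the background, and swap in the
    full-resolution data when it arrives."""

    def __init__(self, path, placeholder_mip):
        self.current = placeholder_mip  # always available, never stalls a frame
        threading.Thread(target=self._load, args=(path,), daemon=True).start()

    def _load(self, path):
        with open(path, "rb") as f:
            data = f.read()             # slow on an HDD, much faster on an SSD
        self.current = data             # future frames pick up the full texture

    def bind(self):
        return self.current             # whatever is ready right now
```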
Re: (Score:2)
Actually, I don't even mean it from a technical standpoint. I just feel like the influx of console-tailored games, designed to run on local hosts for multiplayer and designed to prevent modification, is really screwing with things. Of course, I have to say that my view is skewed in that I'm mainly looking at blockbuster games and not some of the real gems that are PC-centric.
Re: (Score:2)
Worse, the games that aren't just console ports are small indy developer efforts with simple graphics that rarely need more th
Re: (Score:2)
What's the model? It sounds interesting.
Re: (Score:2)
He's counting the red, green and blue dots as pixels.
Re: (Score:2)
Actually, the physical pixel count on my 32" LCD runs almost that high. Each "pixel" under a microscope at 1080p is actually 9 groupings of three primary color sub-pixels.
So, instead of being 1080p, I could see a potential firmware hack allowing even higher resolutions. Disable subpixel rendering on the screen and just do raw control of each pixel. Obviously the hardware capability is already present in the LCD.
Re: (Score:3, Interesting)
Since WoW controls 50% of all PC game revenues, the market as it was a few years ago is over. It's not even fun building a PC anymore, since everything is integrated on the motherboard except for a decent graphics card.
I'm personally tired of chasing the latest graphics card every year to play a game. I'll probably buy a PS3 soon and a Mac next year, just because its lack of wires makes the wife happy.
Re: (Score:2)
It's not even fun building a PC anymore, since everything is integrated on the motherboard except for a decent graphics card.
And the RAM. And sound card if you want to get it off the mobo. And the hdd/optical drive(s)...
Building a PC can be really fun, still. Getting a decent graphics card for cheap is still possible, too, and you don't have to chase the latest graphics card. You don't have to play games on the Ultra High setting, either...
Re: (Score:2)
I think the point is not that building a PC *can* be fun, but rather that it usually isn't anymore. I.e., the time+cost to reward ratio is off!
Building a computer even 10 years ago was a lot different than it is today. Even minute amounts of overclocking could make a huge difference, small differences in ram timings were noticeable. Getting a Cyrix, an Intel or AMD cpu gave very different performance profiles. Individual steppings were sought out by overclockers. Certain soundcards could greatly lighten CPU load,
Re: (Score:2)
Not sure why you posted anonymous--seems a fairly common viewpoint--but I'll respond anyway.
I would absolutely not argue that there's no difference between a super-high-end custom-built computer and a ~$400 Dell. I would say that for the vast majority of users there is effectively no difference. Heck, even for me: I'm running a 2-year-old Dell (Q6600), all I modified was popping in a GeForce 8800 GT, and it runs most games I play just fine. So for your average gamer out there, is a (say) $150 video card reall
Re: (Score:2)
That's not at all the point. The point is this:
When you had something in the range of a 133 MHz chip (say 66x2) and you could overclock it to 150 (75x2), that led to an immediate and noticeable improvement in everyday usage. Faster boot time, faster Windows, faster games.
With your i7, you might get slightly higher game performance, benchmark a bit faster, etc, but I would bet you would be hard pressed to tell a difference in everyday usage. Sure in your case (and in mine) where the overclock is "free" -- go for it, n
Re: (Score:2)
Thousands of dollars? The most expensive iMac, the one with the 27" screen, is $2000. The average model (with 4GB of memory, btw) is $1200.
The nice thing about apple "exclusives" (iphoto, iweb, etc) is that they make stuff very accessible to non-power users.
Perhaps there will even be a time in your life when you will be more interested in fewer wires than in a few MHz, or perhaps than in a case with a window on the side and glowing wires?
Re: (Score:2)
I finally made the switch to the console with COD/MW2. I have a PS3 hooked up to a 37inch Samsung LCD. My desktop PC is a simple Core2Duo (2.6ghz) with an old GeForce 6800 256MB. I couldn't stomach the cost of upgrading the hardware on the desktop and having to deal with hackers. In all honesty, it's the hackers that really drove me away. It was probably 2/3rds hackers, 1/3rd knowing that I'd get flawless framerate and top notch graphics on the console. I've been playing LAN/online FPS games since Qua
Re: (Score:2)
Umm, the PS3 has native keyboard and mouse support. I plugged in an old wireless Compaq keyboard/mouse combo and it worked flawlessly. No need to buy adapters. If the game devs didn't put in keyboard support for the PS3, that's their screwup.
Re: (Score:2)
I haven't had any frame rate problems (jitters, etc). The only lag I've noticed has been network related. Compared to my desktop, the PS3 has top notch graphics. Given that developers are saying that they haven't maxed out the potential of the system yet, I think it's fair to say that the graphics subsystem is pretty top notch. Of course this is all subjective and we're arguing over semantics at this point, so what's the point?
Re: (Score:2)
Re: (Score:2, Interesting)
Thank the pirates for killing PC gaming. Developers actually make money from consoles.
Re: (Score:2, Insightful)
Sure there is, because then some people will wait for this new card rather than buying AMD's card, thus providing Nvidia with revenue and profit.
Re:When's it coming out? (Score:4, Interesting)
Re: (Score:2)
How long before we saturate the PCI-E bus and need something faster?
In a way, it has already been replaced. PCIe v2 is the current standard. It's backwards and forwards compatible, and has twice the bandwidth of v1. V3 will double that bandwidth again.
It'll be quite a long time before it becomes obsolete.
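For a rough sense of the numbers, a Python sketch (line rates and encodings are the published per-lane spec values; real-world throughput is lower once protocol overhead is counted):

```python
# Per-direction bandwidth of a PCI Express x16 slot by generation.
GENS = {
    "PCIe 1.x": (2.5, 8 / 10),     # 2.5 GT/s per lane, 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),     # 5.0 GT/s per lane, 8b/10b encoding
    "PCIe 3.0": (8.0, 128 / 130),  # 8.0 GT/s per lane, 128b/130b encoding
}

for gen, (gt_per_s, efficiency) in GENS.items():
    lane_gbit = gt_per_s * efficiency   # usable Gbit/s per lane
    x16_gbyte = lane_gbit * 16 / 8      # GB/s for a x16 slot
    print(f"{gen}: ~{x16_gbyte:.1f} GB/s per direction")
# prints roughly 4 GB/s, 8 GB/s, and 15.8 GB/s respectively
```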
Re: (Score:2)
``By that logic wouldn't those same people then wait for AMD's next offering which will be yet faster?''
Well, some people actually do that. I'm waiting for the budget card that comes out with a fully functional open-source driver available. Until then, my fanless GeForce 6600 is chugging along just fine. I don't even remember what I had before that, but it was something fanless with the open-source r300 driver ... a Radeon 9200 or similar.
But then, I don't buy games unless they run on Linux, either. Which u
Re: (Score:2)
"How long before we saturate the PCI-E bus and need something faster?"
Considering Crysis can't fully tax the bandwidth of an AGP 8x slot, probably not for a good long while.
ATi's 4850 AGP flavor rocks Crysis no problem. At that point, it's the CPU/Memory that's the bottleneck.
Re: (Score:3, Insightful)
You haven't spent much time with Marketing people, have you?
Re: (Score:2)
There's a phrase for it: paper launch [wikipedia.org] or paper tiger [wikipedia.org]. Whether this actually gets released is one thing. I'd like to see benchmarks, not theoreticals.
Feh. (Score:5, Informative)
The days of needing the biggest, fastest, most expensive card are pretty much over. You can run just about any game out there at max settings at 1920x1080, silky smooth, with a 5870, which goes for less than $300. Hell, even the 4870 is still almost overkill.
Unless you plan on maxing out AA and AF while playing on a 30 inch screen, there is no reason to drop $500-$600 on a video card anymore...
Re:Feh. (Score:5, Interesting)
Re: (Score:2)
That's the wrong way to do it. You're talking fancy sync'd headsets if you do it that way. Power and signal tether you to a position, and weight puts unnecessary strain on your neck.
The proper way to do stereoscopic 3D with an LCD is to sacrifice half your pixels to perpendicular orientation and use linear polarized lenses like the movie theaters do. I mean, jeez, LCD screens are already polarized. All it would take is an extra layer.
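A minimal sketch of that pixel-sacrificing idea (Python/NumPy, assuming a simple row-interleaved panel where alternating rows carry opposite polarization; the function name is made up):

```python
import numpy as np

def row_interleave_stereo(left, right):
    """Combine two eye images for a passive (polarized) 3D panel:
    even rows show the left eye, odd rows the right eye, so each
    eye effectively sees half the vertical resolution."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[0::2] = left[0::2]    # even panel rows -> left-eye image
    out[1::2] = right[1::2]   # odd panel rows  -> right-eye image
    return out

# On a 1080p panel, each eye ends up with only 1920x540 effective lines.
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.full((1080, 1920, 3), 255, dtype=np.uint8)
frame = row_interleave_stereo(left, right)
```

The cheap passive glasses only have to match the per-row polarization, which is why no sync hardware or powered headset is needed.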
Re:Feh. (Score:5, Insightful)
Re: (Score:3, Interesting)
This is pretty much the case with me. I plan on doing a full system upgrade this Cyber Monday, but I haven't bought any new hardware for my computer other than a new DVD drive in about 2 years...and I STILL haven't needed to turn down visual details in any games that are released.
Re: (Score:2)
Yeah. I paid about 150 for my graphics card, a 9600 GT, I have a nice 1680x1050 monitor I'm not going to upgrade any time soon, and at this point I can't imagine what games would require me to buy a new CPU.
I can run any game whatsoever at full resolution and visual details.
That's always been the joke...if you buy a middling video card, you're buying the same thing that's in a PS3 or whatever the newest console is, because those were created a year ago.
Re: (Score:2)
Seriously? I paid $100 for a 9800 GT a while back, and have two 1400x1050 monitors. Your card sounds expensive.
I agree with you though, aside any hardware failures, I won't be upgrading it for a long time either. Heck, I wouldn't have moved up from the old 8800 GS if it weren't for VDPAU requiring a newer card.
Re: (Score:2)
I probably got it before you; I think I've had it a year at this point.
$100 is normally the spot I aim at, but I had some extra cash last time, because the cost of the memory and motherboard suddenly dropped before I bought, so I went about $50 higher than normal.
Re: (Score:2)
Not always. Not when both kinds of platforms weren't homogenized to such a degree...
Re: (Score:2)
Re: (Score:3, Insightful)
Well, the "problem" is those are not really ports anymore; often practically the same engine.
Which kinda sucks, coming from both worlds, enjoying both kinds of games - now that Microsoft made targeting both platforms from the start of development "sensible", most games are hybrids; not exploiting the strengths of either platform.
Re: (Score:2)
I'm sorry. I can't resist. I simply must put a /. spin on this. Lets see...
MICROSOFT AND SONY ARE HOLDING THE WORLD BACK AGAIN! AHHHHHH!H!H!!!!! They are teh evils! Innovation stiflers!
(Note to moderators: I expect nothing less than a +5 Insightful. There I saved you time you won't have to post that "Undoing moderation" crap.)
Re: (Score:2)
Except that PC gaming still has the hold on MMO and RTS games.
Though once again Square-Enix is going to be trying to market an MMO to consoles with FF14.
Re: (Score:2)
Re: (Score:2)
This is what will kill PC gaming, the fact that every game will have only the maximum potential of what a console can do hardware wise.
In case you hadn't noticed, the consoles are already years old. If that were true, wouldn't you expect it to have happened already?
Yet PC hardware continues to advance, PC games still scale up far beyond the capabilities of their console brethren, and the market hasn't died. [steampowered.com]
Re:Feh. Only GFX matters to you? (Score:2)
Sure, it might "kill" PC gaming if all that matters for "true PC gamers" is bling...
Though I wonder how that correlates with the fact that most PCs sold have integrated GFX. And that most popular PC games are Solitaire, Minesweeper, Peggle, flash games, etc.
Re: (Score:2)
Sorry, but I don't follow your logic. How do PC ports that don't exploit the full potential of superior PC hardware, yet still marginally exceed what consoles do in quality, kill PC gaming? As long as people have PCs they'll want games on them, and who cares if the hardware isn't pushed to its limits? Actually, it democratises PC gaming by making practically any PC you might have fit for playing games like an Xbox?
Re: (Score:2)
You have a point there, but the PC port was particularly bad, mostly on the graphics. It had uncommonly high requirements to not even reach the quality of the Xbox 360 version. As in, my PC beats the Xbox 360 to the curb, yet I'd still end up driving on invisible roads or in the land of the blur. Not to mention the weird looking shadows. And I was running at the lowest resolutions.
Re: (Score:2)
Re:Feh. (Score:4, Insightful)
Mostly agreed, however I will take a low-to-mid range CPU if it means I can afford a top of the line GPU...when it comes to gaming, anyway.
The GPU is a much larger bottleneck in terms of gaming, although the line of importance between the GPU and CPU has been blurring a bit lately.
Well, wait some time. (Score:2)
You can run just about any game out there at max settings at 1920 X 1080 silky smooth with a 5870, which goes for less than $300.
Well, that's until Crysis 2 with stereo 3D + multi-head, Windows 8's compositing, and DirectX 13 come out.
Then it'll again be a year of waiting until the hardware catches up.
Remember the mantra:
What hardware giveth, software taketh...
Also, you assume a discrete GPU.
nVidia and ATI still have some improvements to make before the performance you quote happens on a low-power miniature *embedded* GPU in a laptop (one that doesn't drain the battery flat after 10 minutes).
Thus expect future generations with better performance per wa
Re: (Score:2)
Uhmm...aren't current LCD monitors pretty much locked to 60 fps anyway?
Stereo compatible LCD monitors (Score:2)
aren't current LCD monitors pretty much locked to 60 fps anyway?
No.
- There are 3D-stereo-grade monitors which are able to work at higher refresh rates, so that the left-right alternation doesn't get noticeable. Usually the HDMI bandwidth is the limiting factor, hence newer standards such as HDMI 1.4.
- Auto-stereo LCDs exist (they don't alternate between left and right; they display both at the same time and rely on some hardware property, say a lenticular filter, to separate the images).
And they've become cheap.
Also, besides LCDs:
- Modern projectors internally function at much higher r
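A back-of-the-envelope check on why the link bandwidth becomes the limit for frame-sequential stereo (Python sketch; the numbers ignore blanking intervals, so real requirements are somewhat higher):

```python
def video_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking and audio."""
    return width * height * fps * bits_per_pixel / 1e9

print(video_gbps(1920, 1080, 60))   # ~3.0 Gbps : plain 1080p60
print(video_gbps(1920, 1080, 120))  # ~6.0 Gbps : 60 Hz per eye, alternating
# Single-link HDMI through 1.3/1.4 tops out around 10.2 Gbps of TMDS
# (roughly 8.16 Gbps of actual video data), so high-rate stereo starts
# pressing against the link limit; hence the push for newer standards.
```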
Re: (Score:2)
I agree completely, but I think that this situation will be a catalyst for the next big step. I think back to when Unreal was released: there was almost no hardware that could run the game smoothly; in a way it was a proof of concept of what gaming could become. But as hardware caught up, we saw it give rise to a whole new way to make games. FPS, RTS, RPG, all genres really, have adopted the 3D model, even board games. Now the market is saturated and the pressure is off the hardware vendors to make components
Eyefinity (Score:2)
The 5870 still seems to cost more than $400, but your point is of course valid. What might become an issue is multi-monitor gaming like ATI's Eyefinity. Running a triple-screen setup demands a bit more. I don't know if multi-monitor will become mainstream, but it's roughly in the same ballpark price-wise as high-end GPUs.
Re: (Score:2)
augh! yes indeed, I meant $400. You can also get the 4870 for cheap even when it was new, and for super cheap now that it has some age on it...an extremely capable card that will likely last at least another generation or two of video cards.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814129113 [newegg.com]
$170, awesome stuff
Re: (Score:2)
Re: (Score:2)
"Unless you plan on maxing out AA and AF while playing on a 30 inch screen,"
Totally unnecessary with the subpixel rendering engines in most LCD TVs nowadays, considering their native resolution is FAR higher than their maximum capable input resolution (that's where the subpixel rendering comes into play, for upscaling to three times the number of pixels; a 30-inch TV would have HUGE pixels at 1920x1080 if aspect were followed).
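For the "huge pixels" point, the arithmetic is easy to check (Python sketch, assuming a 16:9 panel):

```python
import math

def pixel_pitch_mm(diagonal_inches, h_px, v_px):
    """Physical size of one pixel on a panel of the given diagonal."""
    ppi = math.hypot(h_px, v_px) / diagonal_inches  # pixels per inch
    return 25.4 / ppi                               # millimetres per pixel

print(pixel_pitch_mm(30, 1920, 1080))  # ~0.35 mm pitch (~73 ppi) on a 30" TV
print(pixel_pitch_mm(24, 1920, 1200))  # ~0.27 mm (~94 ppi) on a 24" monitor
```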
Re: (Score:2)
And you don't have to.
You can spend just $60 on a Radeon HD 4670. This affordable gem is more powerful than the GPU of the PS3, and should give you similar gaming performance.
Want a little better performance? You can spend just $100 on a nice 9800GT. That's twice as powerful as the GPU in the PS3, and will support rendering resolutions the PS3 can only dream of.
Want a little better performance? Then step-up to a GTS 250 or 4850 at $110-120! That's the beauty of PC gaming: you can buy as little or as mu
Fermi-based? (Score:3, Funny)
I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.
Re: (Score:2)
I assume they mean the scientist Enrico Fermi. So, did they dig him up, or is this one of those Jesus fingerbone type of thing, where there are more fingerbones than there are chickens? Did they use the whole Fermi, or are there only specific pieces of him that work? Whatever the case, there must be a limited number of cards that can be built, since there is a finite amount of Fermi.
They used his skull with the jawbone of an orangutan. [wikipedia.org]
40nm process... (Score:3, Insightful)
Re: (Score:3)
Alongside a wealth of technical information... (Score:2)
Enough to write a Free driver?
Only Question I have (Score:2)
Still no "real" benchmarks? (Score:3, Insightful)
While the article is very interesting in explaining the chip architecture and technical specifications, I can't believe there isn't a single actual gaming benchmark for these chips yet.
The best they can do is give an estimated calculation of what the chips may or may not actually live up to. They estimate that it will be faster at gaming than ATI's already-released 5870.
By the time Nvidia actually releases their Fermi GPUs, ATI's Cypress will have been actively selling for over 3 months. And there's a funny thing about advancements over time: things keep getting faster (aka Moore's Law). Supposing that chips are supposed to double in transistor count every year, the new Fermi chips need to have about 20% more transistors than ATI's Cypress if they release 3 months later... just to keep on the same curve (quick sanity check below).
And there's still no mention of pricing... but that's to be expected for a product that doesn't actually run games yet. I don't see a lot of optimism on the gaming front, so I hope for Nvidia's sake that the investment in GPGPU is the branch-out they need to trump ATI's sales.
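Quick sanity check of that figure (Python): doubling every 12 months means a part shipping three months later sits 2^(3/12) further along the curve.

```python
growth = 2 ** (3 / 12)                    # doubling per year, 3 months later
print(f"{(growth - 1) * 100:.0f}% more")  # ~19%, i.e. roughly the 20% above
```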
With x86 on the Die? (Score:2)
Does nVidia sell any of these top-end GPU chips with a full x86 core on the die? A CPU maybe as powerful as a single-core P3/2.4GHz, tightly coupled for throughput to the GPU and VRAM, going out to the PC bus (PCI-e) just for the final interface to a video display (or saving to a file, or streaming to a network).
Re: (Score:2)
Re: (Score:2)
Yes, that is why the GPU is on there. The x86 is there for everything else: OS and application processing, managing devices, etc. A single chip with both CPU and GPU for maximum total performance. An x86 because that's where most SW runs.
Re: (Score:2)
Intel's planning on releasing x86 CPUs with GPUs on the die. Why can't nVidia? Besides, there are plenty of Pentiums out there doing all graphics in the CPU (and NSP, etc), without melting.
Re: (Score:2)
But do they reach the performance of top of the line GPUs?
Welcome to the real world, Nvidia (Score:2)
Looks like a cool chip. It will be interesting to see how Nvidia does in the marketplace when they don't have rabid enthusiast gamers subsidizing their development efforts every 6 months. Let's face it, who runs out to buy the latest graphics card anymore, when you get the same game on your 360/PS3 with no upgrade? They're mostly positioning this launch as a 'compute' GPU, so they certainly see the writing on the wall. With Fermi and beyond, Nvidia will have to provide tangible real-world profits for some c