NVIDIA Previews GF100 Features and Architecture 101
MojoKid writes "NVIDIA has decided to disclose more information about its next-generation GF100 GPU architecture today. Also known as Fermi, the GF100 GPU features 512 CUDA cores, 16 geometry units, 4 raster units, 64 texture units, 48 ROPs, and a 384-bit GDDR5 memory interface. If you're keeping count, the older GT200 features 240 CUDA cores, 32 ROPs, and 80 texture units, but the geometry and raster units, as they are implemented in GF100, are not present in the GT200 GPU. The GT200 does feature a wider 512-bit memory interface, but the need for such a wide interface is largely negated in GF100 because it uses GDDR5 memory, which effectively offers double the bandwidth of GDDR3, clock for clock. Reportedly, the GF100 will also offer 8x the peak double-precision compute performance of its predecessor, 10x faster context switching, and new anti-aliasing modes."
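The "double the bandwidth, clock for clock" claim in the summary comes down to simple arithmetic: GDDR3 is double-pumped (2 transfers per clock) while GDDR5 effectively moves 4 transfers per clock. A quick sketch (the 1100 MHz base clock is an arbitrary illustrative value, not a published spec for either card):

```python
def bandwidth_gbps(bus_bits, clock_mhz, transfers_per_clock):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return (bus_bits / 8) * (clock_mhz * 1e6) * transfers_per_clock / 1e9

# GDDR3 is double-pumped (2 transfers/clock); GDDR5 effectively moves 4 per
# clock, hence "double the bandwidth, clock for clock".
gt200 = bandwidth_gbps(512, 1100, 2)   # 512-bit GDDR3, hypothetical clock
gf100 = bandwidth_gbps(384, 1100, 4)   # 384-bit GDDR5, same hypothetical clock
```

At the same base clock, the 384-bit GDDR5 interface actually delivers 1.5x the peak bandwidth of the 512-bit GDDR3 one, which is why the narrower bus is no loss.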
Wow, that article is terribly written... (Score:2, Informative)
Anandtech (Score:5, Informative)
Anandtech also has an article up about the GF100. They generally have very well written, in-depth articles: http://www.anandtech.com/video/showdoc.aspx?i=3721 [anandtech.com]
Re:Wait... (Score:5, Informative)
280W power drain, 550mm^2 chip size => no thanks, I'll pass.
http://www.semiaccurate.com/2010/01/17/nvidia-gf100-takes-280w-and-unmanufacturable [semiaccurate.com]
Re:Can someone who is more knowledgeable tell me.. (Score:1, Informative)
A wide memory bus is expensive in terms of card real estate (wider bus = more traces), which increases cost. It also increases the amount of logic in the GPU and requires more memory chips for the same total amount of memory.
Re:"The GPU will also be execute C++ code." (Score:3, Informative)
From the article: "The GPU will also be execute C++ code."
They integrate a C++ interpreter (or JIT compiler) into their graphics chip?
That's a misinterpretation of NVIDIA's CUDA marketing material: what's actually new is better C++ support in NVCC, the CUDA compiler. Code is still compiled ahead of time; there's no interpreter on the chip.
Re:Wait... (Score:4, Informative)
I think he's talking about dissipation of such a large amount of power in such a small package size.
The die, at 550mm^2, is nearly a square inch in area, and 280W is a tremendous amount of power to dissipate through it.
Cooling these things is going to be an issue for sure.
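The heat-flux worry can be put in rough numbers. Note that 280W is the reported board power, not die power alone (memory chips and VRMs take a share), so this is an upper bound:

```python
# Rough power-density estimate for the reported GF100 figures. 280W is the
# reported board power, not die power alone, so treat this as an upper bound
# on the die's heat flux.
die_area_mm2 = 550
board_power_w = 280

w_per_cm2 = board_power_w / (die_area_mm2 / 100.0)  # 100 mm^2 = 1 cm^2
```

That works out to roughly 51 W/cm^2; for comparison, an electric stove burner runs on the order of 10 W/cm^2, which is why heatsink and fan design on these cards is no afterthought.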
Re:Wait... (Score:3, Informative)
Costs more (Score:4, Informative)
The wider your memory bus, the greater the cost, because the bus is implemented as multiple parallel memory controllers. So you want the smallest bus that gets the job done. Faster memory also gains you nothing if the GPU can't keep it busy: memory bandwidth and GPU speed are tightly intertwined. If the memory is slower than the GPU needs, it bottlenecks the GPU; if it's faster, you gain nothing while increasing cost. The goal is to match bandwidth to exactly what the GPU can make full use of.
Apparently, 256-bit GDDR5 is enough.
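The cost trade-off the parent describes can be sketched with rough numbers. The 64-bit controller partitions and 32-bit-per-chip interface widths below are typical for GPUs of this era, but treat them as illustrative assumptions rather than specs:

```python
CONTROLLER_BITS = 64   # assumed width of one memory-controller partition
CHIP_BITS = 32         # assumed interface width of a single GDDR5 chip

def bus_cost(bus_bits):
    """Rough proxy for bus cost: controller partitions on the die plus
    memory chips (and their traces) on the PCB."""
    return {"controllers": bus_bits // CONTROLLER_BITS,
            "chips": bus_bits // CHIP_BITS}
```

By these rough numbers, a 512-bit bus needs twice the die-side controllers and twice the PCB-side chips and traces of a 256-bit bus, which is exactly the cost the parent is talking about.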
Re:What's with the terrible naming (Score:3, Informative)
GF100 is the name of the chip. The cards will be called the GT300 series.
Re:Wait... (Score:3, Informative)
I would wait for a GF100 or 5870 refresh first. AMD is rumored to be working on a 28nm refresh that should be available by mid-year, and GlobalFoundries has been showing off wafers that have been fabbed on a 28 nm process [overclock.net]. I would imagine that NVIDIA is planning a 28nm refresh of GF100 not long after. Smaller GPU = less power = smaller PCB, so the cards will be shorter.
Re:Should AMD sue them too? (Score:3, Informative)
Buying a GT200 card now is pointless, as it is a well-known fact that NVIDIA abandons support for the previous GPU generation when they release a new one.
Such bullshit. For example, the latest GeForce 4 drivers date to Nov 2006, which was when the GeForce 8 series came out, four years after the initial GeForce 4 cards. Even the GeForce 6 has Win7 drivers that came out barely two months ago, and that's five generations back from the current 200 series.
Re:Someone please tell me (Score:3, Informative)
a) 5670 or GT240 if you can find one cheap enough. However, depending on how British pounds convert, the true budget card is a GT 220 or a 4670.
b) 5770 or GTX260 216 core
c) Radeon 5870 or 5970 if you can afford it.
Re:Someone please tell me (Score:1, Informative)
I'm running a single Radeon 4850 and have no problem with it whatsoever.
A friend of mine is running two GeForce 260 cards in SLI mode, which makes his system operate at roughly the same temperature as the surface of the sun.
We both play the same modern first person shooter games. If you bring up the numbers, he might get 80fps compared to my 65fps. However I honestly cannot notice any difference.
The real difference is that he spent over $400 (approx. £650) compared to my $125 (approx. £230) and has to open the windows to his room in the middle of winter to cool it down.
What do you mean by rescue? (Score:4, Informative)
Graphical hardware power is a problem on consoles, not PCs. Despite their much-touted power, the PS3 and Xbox 360 cannot do FSAA at 1080p. Most developers have resorted to software solutions (hacks, for all intents and purposes) to get rid of jaggedness.
Most games made for consoles will run the same, if not better, on a low-end PC, provided the port isn't done badly (and Xbox-to-PC porting is pretty hard to screw up these days). The problem with PC gaming is that the platform is not utilised to its fullest extent: most games are console ports, or PC games bought up at about 60% completion and then consolised.
PC graphics at 1280x1024 and upwards tend to look pretty good. Compare that to the Xbox 360 (720p) or PS3 (1080p), which still look pretty bad at those resolutions. Check out screenshots of Fallout 3 or Far Cry 2: the PC version always looks better, no matter the resolution. According to the latest Steam survey, 1280x1024 is still the most popular resolution, with 1680x1050 second.
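The FSAA point above can be put in back-of-the-envelope numbers, counting only shaded samples and ignoring bandwidth and memory-footprint effects:

```python
def samples(width, height, aa=1):
    """Samples the GPU must render and resolve: pixels times the AA multiplier."""
    return width * height * aa

p720_noaa = samples(1280, 720)        # a typical console render target
p1080_4x  = samples(1920, 1080, 4)    # 1080p with 4x multisampling
ratio = p1080_4x / p720_noaa
```

1080p with 4x AA means roughly nine times the samples of plain 720p, which is why consoles fall back on cheaper post-process "hacks" instead of true multisampling.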
If you have the power, why not use it?
Don't get me wrong, however: progress and new ideas are a good thing, but the PC gaming market is far from in trouble.