NVIDIA Makes First 4GB Graphics Card 292
Frogger writes to tell us NVIDIA has released what they are calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster sure packs a punch, although with a $3,500 price tag it certainly should. Big spenders can rejoice at a new shiny, and the rest of us can be happy with the inevitable price drops on the more reasonable models.
Just what I always wanted! (Score:5, Insightful)
A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.
Re:Just what I always wanted! (Score:5, Insightful)
That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.
Re: (Score:2)
There's [gizmodo.com] a FUCKLOAD [wikipedia.org] of problems that are [nvidia.com]. I'll never buy anything with an Nvidia card in it again.
Especially since (Score:2)
You can still run all your 32-bit programs with no problems. 64-bit Windows has an extremely good compatibility layer (WoW64) that lets 32-bit software run in the 64-bit OS without issues. We've done extensive testing and use of it at work. So even supposing you did need a big card and a 32-bit app, that would work.
Of course if you are doing anything that would need 4GB of video RAM, good chance you need a good deal more than 4GB of system RAM. After all, you probably aren't keeping the data only on the card, nev
Re: (Score:2)
The sort of person still running a 32 bit OS is not from the same set as those who might spend $3k on the latest and greatest hardware. You don't matter to them.
Re: (Score:2)
The days of the graphics card mapping all its memory into PCI address space at once are over, and have been for some time. IIRC, modern cards use a movable window of 256MB or so for access to graphics card RAM from the rest of the system.
How did this retarded comment get upmodded? (Score:4, Interesting)
Really, people. If you're going to buy such an expensive professional card, you're going to go with a professional-grade operating system, which will of course be 64-bit.
Re: (Score:3, Insightful)
32-bit is dead. It should have been dead 4 years ago...
Any serious computer enthusiast or professional running a 32bit os on today's hardware should be ashamed. They're holding the industry back.
Re: (Score:2)
Actually, it's all the applications (such as Adobe Flash) that don't work on 64-bit that are holding the industry back.
It's much easier for them to recompile than it is for us to work without certain software...
Re: (Score:2)
32-bit apps work fine on 64-bit Windows unless the application specifically checks for it and doesn't work on purpose.
The most common problem I have is my Windows XP x64 being misidentified as Server 2003 because it shares the same kernel version (NT 5.2), and even though the "I am a server OS" flag is OFF the software still refuses to run because you apparently have a server OS. Vista x64 is obviously not affected by this, as it has the same kernel version as 32-bit Vista, so an app that works on 32-bit Vi
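As an aside, a minimal Win32 sketch of the distinction being described here (purely illustrative, assuming the classic GetVersionEx API of that era): XP x64 and Server 2003 both report kernel version NT 5.2, and it is the separate product-type flag that tells workstation from server, which is exactly the flag sloppy version checks ignore.

// Host-only Win32 sketch; assumes the classic GetVersionEx API that was
// current at the time (it has since been deprecated).
#include <windows.h>
#include <stdio.h>

int main(void)
{
    OSVERSIONINFOEX vi;
    ZeroMemory(&vi, sizeof(vi));
    vi.dwOSVersionInfoSize = sizeof(vi);
    GetVersionEx((OSVERSIONINFO *)&vi);

    // XP x64 and Server 2003 both report NT 5.2...
    printf("NT %lu.%lu\n", vi.dwMajorVersion, vi.dwMinorVersion);

    // ...but the product type tells workstation from server, which is
    // the flag sloppy installers ignore.
    if (vi.wProductType == VER_NT_WORKSTATION)
        printf("Workstation OS (e.g. XP x64)\n");
    else
        printf("Server OS (e.g. Server 2003)\n");
    return 0;
}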
Re: (Score:2)
A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.
A few things wrong with this statement:
Re: (Score:2)
Re:Just what I always wanted! (Score:4, Funny)
Re: (Score:2)
I do as well. And *both* of us can't be nobody.
You both ain't somebody either.
Re: (Score:3)
64-bit Vista, 64-bit Kubuntu on the gaming computer. Theoretically, all the Macs should also be 64-bit. 32-bit is dead to me :)
History repeats... (Score:2, Informative)
I am reminded of old 3DFx advertisements just before they went belly-up.
Re: (Score:2)
what a revolution (Score:5, Funny)
Does this mean we can finally run Crysis now?
Re:what a revolution (Score:5, Funny)
Oh come on, you can run Crysis with half that number of cores and only 2 gig of video RAM. This card is obviously being built because of the impending release of Duke Nukem Forever.
Re: (Score:3, Funny)
Others say that the developers have in fact already done this several times.
Re: (Score:2)
Nope, but you can run Vista.
Re: (Score:2, Funny)
Almost.
cool (Score:2)
I've always wanted to watch WALL-E as it is being rendered in real time.
Re: (Score:2, Interesting)
Not gonna happen, RenderMan is CPU-only.
no it's not (Score:5, Funny)
... "the most powerful video card in history", it's "the most powerful videocard yet".
[/pet peeve]
Re:no it's not (Score:4, Funny)
I dunno, those Germans made quite a powerful video card back in the 1940s.
It certainly had more power than those steam-powered video cards the French made in WWI.
Re: (Score:2)
Wrong closing tag.
You mean [/pedant].
Re: (Score:2)
Maybe you should call the Guinness Book of World Records and tell them that all their records are incorrect. Or you could, you know, stop being such a pedant.
Re: (Score:2)
"the most powerful video card in history", it's "the most powerful videocard yet".
FACT: The Rebel Alliance used significantly more powerful videocards to render the Death Star in Star Wars Episode IV: A New Hope.
FACT: This event occurred a long time ago in a galaxy far, far away.
[/pet peeve]
Re: (Score:2)
But given the far far away bit, is it in the Earth's past light cone?
Re: (Score:2)
Everyone else takes the definition of "history" in this context as:
* the aggregate of past events
You seem to be taking it as:
* the continuum of events occurring in succession leading from the past to the present and even into the future
Both are valid, though the first is far more common.
Claiming that the statement is wrong when it is correct according to the more common definition of the word is a bit of a stretch.
Not for home users (Score:3, Informative)
Re: (Score:2)
you're all confused (Score:5, Insightful)
I don't believe anyone claimed this was a gaming card.
This is a scientific number cruncher. Its use is in visual computer modeling for anything from weather models to physics models.
How about folding@home? This does it faster than any computer on the block.
All of you kids making jokes about Crysis are missing the point. This might run games, but it's a science processor first.
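To make the "science processor" point concrete, here is a minimal CUDA sketch of the kind of data-parallel number crunching those 240 cores are meant for. The kernel and variable names are hypothetical, it assumes only the standard CUDA toolkit, and it would run on any CUDA-capable card, not just this one.

#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

// Hypothetical example: y = a*x + y (SAXPY), a classic building block of
// scientific codes. Each parallel core chews through many elements; the
// grid/block split below is just a reasonable default.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;                         // one million elements
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    float *dx = 0, *dy = 0;
    cudaMalloc(&dx, n * sizeof(float));
    cudaMalloc(&dy, n * sizeof(float));
    cudaMemcpy(dx, hx.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    const int block = 256;
    saxpy<<<(n + block - 1) / block, block>>>(n, 3.0f, dx, dy);

    cudaMemcpy(hy.data(), dy, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);                  // expect 5.0

    cudaFree(dx);
    cudaFree(dy);
    return 0;
}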
Re: (Score:2, Informative)
folding@home.
My 3GHz C2D gives me 1920 points every 30/33 hours. My GeForce 8800GT gives me 480 points every 2.5 hours. That works out to roughly 60 points per hour on the CPU versus about 190 per hour on the GPU.
Actually not mainly a science processor (Score:2)
It's mostly a professional visualization card. nVidia has three different brands for the same basic hardware:
GeForce: This is the consumer line of cards. They are intended for desktop and gaming use. They generally feature the least RAM, and no special outputs (just dual DVI and the like).
Quadro: Professional visualization cards. Same core as the GeForce, just a different market. The primary difference is they are certified with pro apps, and you pay extra for that. Along those lines, they have drivers opti
Re: (Score:2)
No it's not, it's a graphics card. Most people buying them will be doing it for realtime rendering. Something like volume rendering can use boatloads of memory. Certainly we were using 1Gbyte of texture memory in 2000 on SGI machines.
If you want a 'science processor' you should be thinking about a Tesla http://www.nvidia.co.uk/object/tesla_c1060_uk.html [nvidia.co.uk].
Re: (Score:2)
And the difference between a Tesla card and a Quadro card is a DVI port...
Re: (Score:2, Informative)
Tesla C1060 = $1,700
QuadroFX 5800 = $3,500
You're right about the difference pretty much being the DVI port, but it is a pretty expensive DVI port. Compute professionals didn't want the GeForce series because of the lack of support, and they didn't want the Quadro because it was too expensive, so the Tesla was NVIDIA's middle ground.
Re: (Score:2)
When you stop to think about that extra $1,800, it doesn't sound that unreasonable. Most of that premium goes to cover the software costs - particularly all the QA needed to make sure that all the $20k+ software packages still work.
That's Awesome. (Score:2, Insightful)
That's the great thing about video cards. Even a card that's two generations old is a terrific card, and they're fantastically cheap.
Headlines (Score:2)
Old news (Score:4, Informative)
Re: (Score:2)
Four gig?????? My first mainframe had 4 meg of RAM.
Does it run Linux? (Score:2)
No, really, could it?
Re: (Score:2)
That's more or less how the Sonnet G3 processor upgrade for the Power Mac 7200 worked. They used the motherboard as an I/O coprocessor and did all the heavy lifting on what's basically a complete G3 on a PCI card.
Stupid. (Score:2)
All this amounts to is a proof of concept for the technology. To quote Simon from Mad TV: "LOOK WHAT I CAN DO!"
To those that think this has any application whatsoever, let me say a few things.
#1) Can you just think of what the driver for this thing might be? Ludicrous.
#2) It would likely require specialized programming even to function, none of which would be supported anywhere, and all of which would likely have to be custom.
#3) For those who think this has scientific applications, guess again. You should get t
Re: (Score:2)
This isn't a gaming card
Re:Power != memory (Score:5, Informative)
Yes, AMD's Stream [amd.com] technology. I don't think it is used as much as CUDA in practice.
Re: (Score:2, Interesting)
Basically, nvidia's behavior is generating a lot of hate in the coder community...
Re:Power != memory (Score:5, Interesting)
Coder hate like that brought on by the shitty, bug-filled drivers ATI has a long history with?
I think ATI/AMD is on the right path now, but they have a long history of being on the wrong one, while NVIDIA has always been more towards the middle (not completely right, but not too badly wrong). It'll take some time before I jump on the ATI bandwagon as completely as you obviously have.
Re: (Score:2)
Re:Power != memory (Score:5, Insightful)
Do you realize that for computers 12+ years is several GENERATIONS?
I had always been using ATI for Windows boxes and laptops, since my main concern was almost always video performance and TV-out capability, and I could not even get a video overlay to work over TV-out with nVidia cards for years.
Of course, when I had problems with linux drivers I built nVidia (I admit, even intel) linux boxes. But that is a thing of the past, I am back to ATI for linux, they are good and even getting better with each release.
Anyway, long-term loyalty is pretty silly. I bought my K6 233 at the same price my friend bought his MMX 166; in retrospect we all know how those two compare. I kept on buying Athlons when others were paying more for their crap P4s (they weren't called crap back when they were the best intel had to offer). But hey, I am now buying Core 2 for non-low-end systems, until AMD can come up with something better.
Fanboyism gets you bad deals at least half of the time. You buy hardware, you don't marry it. OK, I know this is Slashdot and that last statement might generate some debate, but you get the point.
Re: (Score:3, Informative)
I have switched sides twice during that time, have had bad cards from both manufacturers during the last 5 years and will continue to now buy based on individual product reviews.
The landscape for most hi-tech products seems to change so quickly now, and suppliers / manufacturers change at such short notice that it is no longer possible to rely upon a vendor's name as a sign of quality.
In the worst cases, even the same product with the same part number is a different product with different performance charac
Re: (Score:2)
However, there is hope. The Vista failure is biting Microsoft in the ass, so hopefully something similar will happen to nVidia and give us some OSS drivers. If everything else fails, there is st
Re: (Score:2)
The only reason nvidia has better market share is that ATI was crappier than nvidia.
They only recently got somewhere with their 4xxx series. However, I _still_ see more complaints about their drivers, even the recent ones, compared to nvidia's.
Re: (Score:3, Informative)
You are technically correct.
Now, the next question is this: is the class of problems caused by the existence of a monopoly restricted to situations where a market actor meets the strict definition of a monopoly that you gave?
The answer is no, and anti-trust law in the United States recognizes that. Therefore, you can be convicted of "abusing monopoly power" without technically being a monopoly. Since strict monopolies basically never occur in nature without government interference (and even then you could
Re:Power != memory (Score:5, Funny)
I like how statistics are so meaningless we're not even putting the numbers in anymore.
Re: (Score:2)
I ACCIDENTALLY THE WHOLE COMMENT!!!
*insert lowercase letters here to defeat lameness filter that doesn't know how to take a joke.
Re:Power != memory (Score:4, Funny)
This is a new age of statistics. Instead of putting numbers up there that could be misinterpreted, the author has chosen to take the politically correct route and allow the user the decision that best pleases them.
This way you get your message out, and the person on the other side is happy with their decision. It's a win-win!
Re: (Score:2)
In that case I decide the number to be 0.0001
Re: (Score:3, Funny)
Still a .01% increase over the competition.
Re:Power != memory (Score:5, Insightful)
meaning you can code directly the hardware
Guess what CUDA and Stream have been designed for? Yes: for programming the hardware. What you suggest is pure insanity. NEVER EVER touch hardware directly from a userland app. And once you start writing a kernel module, you end up with something like CUDA/Stream anyway.
I am a coder, and quite frankly I couldn't care less about nvidia drivers being closed source. They are MUCH better than the ATI ones, especially in the OpenGL department. nvidia whipped up a beta GL 3.0 driver within a month of the GL3 specs being released. ATI? Nope. A new standardized feature X gets added to the registry; nvidia supports it pretty quickly, while ATI adds it months, even years later. nvidia drivers are also pretty robust; I can bombard them with faulty OpenGL code and they remain standing. With ATI's fglrx, even CORRECT code can cause malfunctions.
THESE are the things I care about. Not the license.
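To illustrate the point about never touching the hardware directly: from userland, even finding out what the GPU is goes through the driver and the runtime API. A minimal sketch using only standard CUDA runtime calls (nothing here is specific to this particular card):

#include <cuda_runtime.h>
#include <cstdio>

// The runtime (backed by the kernel-mode driver) is the sanctioned path to
// the hardware: enumerate devices and query their properties instead of
// poking registers or mapped memory yourself.
int main()
{
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable device found.\n");
        return 1;
    }

    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("Device %d: %s\n", dev, prop.name);
        printf("  Global memory:      %lu MB\n",
               (unsigned long)(prop.totalGlobalMem >> 20));
        printf("  Multiprocessors:    %d\n", prop.multiProcessorCount);
        printf("  Compute capability: %d.%d\n", prop.major, prop.minor);
    }
    return 0;
}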
Re: (Score:2)
with those shitty drivers I'm banging in games (Score:2)
And I don't even have a 4870; I still use a 3870.
Re:Power != memory (Score:4, Informative)
Re: (Score:3, Interesting)
Re:Power != memory (Score:5, Informative)
Excuse me, but this is total bullshit. Oldest trick in the book: if you are behind in technology, pop out a card with huge RAM and try to get some sales.
Let's face it: nvidia has fallen behind ati in the chip race. Put as many 4870s in a setup as you like to match the power of any monolithic nvidia card, and they always kick the living daylights out of that nvidia card in terms of cost/performance per unit of processing power.
In case the $3,500 price tag didn't tip you off, this isn't a gaming/enthusiast card. This is a Quadro - a professional card for high-end 3D rendering. Stuff like generating film-grade 3D or insane CAD stuff. Actually, due to the design of the card, it'd be pretty horrible at playing games.
This thing is aimed at high-end scientific calculation and professional-grade rendering.
ATI may, or may not, have something comparable. ATI may even have something better. I don't know, I don't follow the GPU industry very closely. But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.
Re:Power != memory (Score:4, Funny)
In case the $3,500 price tag didn't tip you off, this isn't a gaming/enthusiast card. This is a Quadro - a professional card for high-end 3D rendering. Stuff like generating film-grade 3D or insane CAD stuff.
C'mon, we are all grown-ups here. You can say it clearly:
It's for high-detailed 3D virtual porn.
Re: (Score:2)
Re: (Score:2)
You don't want to see most of those "actors" in that great of detail.
No, not real human actors. Close up, a human actor's ugly bits are actually... well... ugly. That's why it's necessary to create virtual pron stars: they have ugly bits that are aesthetically pleasing.
Re: (Score:2)
Re: (Score:3, Insightful)
Look at video games: they keep trying to add flaws and blemishes everywhere to make things look real.
In X years they won't be able to compete with perfect skin from virtual actors. So why bother?
Given the porn market has people going for strange stuff, I'm sure there would be a fair number who would actually prefer their porn stars to have a tiny bit of hair stubble, slight blemishes etc.
For the "perfect" stuff, they'll probably still have jobs providing original moti
Re: (Score:2)
As opposed to the detail of the real thing a few inches away?
Trust me, if ultra-detailed closeups of a woman's naughty bits were that much of a turn-off, real-world sex wouldn't be as popular as it is. Apparently, people are managing to cope...
Re: (Score:2)
Re: (Score:2)
Spotlights and mirrors - the best way to have sex!
Re: (Score:2)
Standard room lighting normally (unless she wants the lights off), but even under normal lighting you're still going to see plenty of "detail". I'm perfectly capable of accepting that right close up every girl doesn't have utterly perfect skin or might have some razor burn.
Re:Power != memory (Score:4, Insightful)
But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.
Umm, the GeForce GTX 280, a gamer card released last summer, also has 240 "cores" (as Nvidia counts them; actually stream processors).
This workstation card, as you might expect, is essentially the same thing as the consumer card, just tweaked towards the professional market (more RAM, different drivers). It's nothing especially innovative.
Re: (Score:2)
Yes and no.
It is a Quadro, and the extra memory is indeed handy for "professional" rendering, but don't get too far ahead of yourself. The rest of the card is nearly identical to the gaming versions. I suspect this is a GTX 260 with minor changes... mostly in the firmware.
Re: (Score:2)
Re: (Score:3, Insightful)
excuse me but this is total bullshit. oldest trick in the book. if you are behind in technology, pop out a card with huge ram and try to get some sales.
Are you some kind of idiot?
With 4GB of graphics memory and 240 CUDA-programmable parallel cores
That alone should be a plain indicator that this ISN'T a consumer-level card, nor is it even remotely close to being targeted as such by nvidia.
no, but you might be (Score:2)
The card is exactly the same as the Geforce GTX 280 except with quadruple the VRAM. Like all of the Quadros, they're the same as the normal lineup with more RAM and different drivers.
Yes and no. It is a Quadro, and the extra memory is indeed handy for "professional" rendering, but don't get too far ahead of yourself. The rest of the card is nearly identical to the gaming versions. I suspect this is a GTX 260 with minor changes... mostly in the firmware.
My regards to the fool who modded you insightful.
Re: (Score:2)
Let's face it: nvidia has fallen behind ati in the chip race. Put as many 4870s in a setup as you like to match the power of any monolithic nvidia card
But can you get decent drivers for it?
Re: (Score:2)
Re:Power != memory (Score:5, Interesting)
There is no upper limit on the amount of memory required for tasks like volume visualisation, where you have a nice big 3D cube of data in 16-bit format. A cube 1024 voxels in each dimension with a single channel of 16-bit data (2 bytes) is going to be 2 Gigabytes. You will need at least two such cubes to do any sort of image processing work.
Even a digital movie can be considered to be a cube if you consider time as the 3rd dimension.
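For anyone who wants the arithmetic spelled out, here is a rough sketch using the 1024-voxel cube from the comment above, plus a driver query to check whether a given card could even hold two of them in VRAM; it assumes only the standard CUDA runtime and is purely illustrative.

#include <cuda_runtime.h>
#include <cstdio>

int main()
{
    // One 1024^3 volume of single-channel 16-bit voxels:
    // 1024 * 1024 * 1024 * 2 bytes = 2 GiB.
    const size_t dim = 1024;
    const size_t bytesPerVoxel = 2;
    const size_t oneVolume = dim * dim * dim * bytesPerVoxel;
    const size_t twoVolumes = 2 * oneVolume;   // source + destination for processing

    printf("One volume:  %lu MiB\n", (unsigned long)(oneVolume >> 20));   // 2048 MiB
    printf("Two volumes: %lu MiB\n", (unsigned long)(twoVolumes >> 20));  // 4096 MiB

    // Ask the driver how much VRAM is actually free before trying an upload.
    size_t freeBytes = 0, totalBytes = 0;
    if (cudaMemGetInfo(&freeBytes, &totalBytes) == cudaSuccess) {
        printf("Card reports %lu MiB free of %lu MiB total\n",
               (unsigned long)(freeBytes >> 20), (unsigned long)(totalBytes >> 20));
        if (freeBytes < twoVolumes)
            printf("Not enough VRAM to keep two volumes resident.\n");
    }
    return 0;
}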
Rather than having cards with a fixed amount of VRAM, why can't manufacturers just put a bunch of memory sockets on the card and allow users to add memory when they want?
Re: (Score:2)
They used to do that. You can find sockets on ancient cards, like an S3 ViRGE.
The problem, as I understand it, is that current cards push memory as far as it will go, and a socket would impose a limit on that. Besides, it's hard to put a heatsink on the RAM then.
Re: (Score:2)
Rather than having cards with a fixed amount of VRAM, why can't manufacturers just put a bunch of memory sockets on the card and allow users to add memory when they want?
They used to, at least on some of the better ones (way back when "Windows accelerators" that drew rectangles faster were hot stuff).
You need to realize, though, that graphics cards are on the absolute bleeding edge of memory technology. It's not electrically feasible to deliver the sort of bandwidth and latency a modern GPU requires (which is literally orders of magnitude beyond what your CPU gets) while at the same time allowing expandability (especially if you want to make it a standard socket).
Video
Re:Power != memory (Score:4, Informative)
Two reasons:
One is simply that the cards use memory that isn't normally available. They don't use normal DDR RAM; they use special RAM for graphics cards, called GDDR. It is similar to, but not the same as, system memory. Thus you can't just go out and buy sticks of RAM for it. They'd have to be made specially for the cards (and each generation of card uses different RAM), and thus would be expensive.
The bigger one is that the RAM is really pushed to the limit. You start to run into all sorts of shit you never thought about. The electrical properties of the connection are highly important, and there is a difference between what you get soldered onto traces and what you get in a socket.
It's a nice thought, but not practical these days. Graphics cards are heavily dependent on high RAM bandwidth, and you get that by really pushing the envelope. That means new RAM technologies all the time and chips being pushed to the max.
Re: (Score:2)
Because the memory commonly available isn't usually as fast as the memory on those cards, and is often a different "flavor" that allows faster graphics-style access than standard memory?
There's a reason GDDR3 [wikipedia.org] and DDR3 [wikipedia.org] are different specs, not to mention that GDDR5 is out on ATI 4870 cards right now while you can still barely buy standard DDR3 DIMMs.
Re: (Score:2)
Maybe they meant power consumption! :-D
Re: (Score:2)
Re: (Score:2, Informative)
Re: (Score:2)
Yea. This card is made for Cad functioning and not shading or any kind of gaming. The card is more for workshops.
Excuse me? Nvidia says [nvidia.com]
The reference standard for Shader Model 4.0 and next generation operating systems
Enabling breakthrough ultra-realistic, real-time visualization applications.
Available only on the Quadro FX 3700
Flip through the OpenGL specifications [opengl.org] sometime. Ask yourself whether there are any features of OpenGL which are not useful for games.
Examples that have been used to sell Quadro cards in the past include:
wireframe antialiasing (as opposed to full-screen antialiasing)
multiple clip planes
two-sided lighting
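For reference, everything in that list is plain fixed-function OpenGL; a minimal sketch (assuming GLUT for window setup, with nothing Quadro-specific in the API itself) looks like this, and it will run on a GeForce just as well. The driver certification and the extra silicon are what the Quadro price actually buys.

// Host-only OpenGL sketch; assumes GLUT/freeglut is installed.
#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSolidTeapot(0.5);   // drawn clipped, two-side lit, as antialiased wireframe
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("Quadro-marketed features, plain OpenGL");

    // Multiple clip planes: slice the model with a user-defined plane.
    const GLdouble plane0[4] = { 0.0, 1.0, 0.0, 0.0 };   // keep y >= 0
    glClipPlane(GL_CLIP_PLANE0, plane0);
    glEnable(GL_CLIP_PLANE0);

    // Two-sided lighting: light the back faces exposed by the clip plane.
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);

    // Wireframe with antialiased lines (as opposed to full-screen AA).
    glPolygonMode(GL_FRONT_AND_BACK, GL_LINE);
    glEnable(GL_LINE_SMOOTH);
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}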
Come on! Do you play games in wireframe mode?
Yes, you might be able to fool the computer into thinking your GeForce is a Quadro, but, iirc, the quadro devotes a bit more silicon to t
PLEASE (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Have you played No More Heroes for the Wii? It suffers from frame rate slowdown issues...
Re:misread the subject (Score:5, Funny)
I read that as 4*MB* video card.
I fucking hate the beginning of work weeks.
Working hard, I see.
Re: (Score:2)
If someone is playing games with one of those cards in their system, it's because they're killing time in their lunch break before going back to do some seriously complex computational work. That's not a gaming card.
Re: (Score:2, Flamebait)
Because the die problems are with the previous generation of notebook cards, so how is it in any way related?
Also, it's lame to post as an AC pretending to be someone else because you were modded troll. Real lame.