NVIDIA Makes First 4GB Graphics Card
Frogger writes to tell us NVIDIA has released what it is calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster sure packs a punch, although with a $3,500 price tag it certainly should. Big spenders can rejoice at a shiny new toy, and the rest of us can be happy with the inevitable price drop on the more reasonable models.
Power != memory (Score:1, Interesting)
Let's face it: NVIDIA has fallen behind ATI in the chip race. You can put any number of 4870s in a setup to match the power of any monolithic NVIDIA card, and they kick the living daylights out of that NVIDIA card in cost per unit of processing power.
Re:Power != memory (Score:1, Interesting)
Re:cool (Score:2, Interesting)
Not gonna happen, RenderMan is CPU-only.
Re:Power != memory (Score:2, Interesting)
Basically, NVIDIA's behavior is generating a lot of hate in the coder community...
How did this retarded comment get upmodded? (Score:4, Interesting)
Really, people. If you're going to buy such an expensive professional card, you're going to go with a professional-grade operating system, which will of course be 64-bit.
Re:Power != memory (Score:5, Interesting)
Coder hate like that brought on by the shitty, bug-filled drivers ATI has a long history of shipping?
I think ATI/AMD is on the right path now, but they have a long history of being on the wrong one, while NVIDIA has always stayed closer to the middle (not completely right, but not too badly wrong). It'll take some time before I jump on the ATI bandwagon as completely as you obviously have.
Re:Power != memory (Score:5, Interesting)
There is no upper limit on the amount of memory required for tasks like volume visualisation, where you have a nice big 3D cube of data in 16-bit format. A cube 1024 voxels on each side with a single channel of 16-bit data (2 bytes per voxel) is going to be 2 gigabytes, and you will need at least two such cubes to do any sort of image processing work.
Even a digital movie can be considered to be a cube if you consider time as the 3rd dimension.
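The parent's arithmetic can be sketched in a few lines (the function name is just illustrative, not from any real volume-rendering library):

```python
def volume_bytes(dim, channels=1, bytes_per_sample=2):
    """Memory needed for a cubic volume: dim^3 voxels,
    each with `channels` samples of `bytes_per_sample` bytes."""
    return dim ** 3 * channels * bytes_per_sample

# A 1024^3 cube with one 16-bit channel, as in the comment above:
size = volume_bytes(1024)
print(size / 2**30)  # 2.0 GiB

# Two working copies for image processing already fill a 4GB card:
print(2 * size / 2**30)  # 4.0 GiB
```

A movie treated as a cube (width x height x frames) works out the same way: just swap one spatial dimension for the frame count.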
Rather than having cards with a fixed amount of VRAM, why can't manufacturers just put a bunch of memory sockets on the card and allow users to add memory when they want?
Re:Power != memory (Score:3, Interesting)