NVIDIA Makes First 4GB Graphics Card

Frogger writes to tell us that NVIDIA has released what it is calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster packs a punch, although at $3,500 it certainly should. Big spenders can rejoice at a shiny new toy, and the rest of us can look forward to the inevitable price drop on more reasonable models.
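For readers unfamiliar with CUDA, those 240 cores are exposed through a data-parallel programming model: you write one small kernel function and the hardware applies it across many elements at once. A minimal sketch of the idea in plain Python (numpy stands in for the GPU here; the kernel and array size are illustrative, not from the article):

```python
import numpy as np

# A "kernel" in the CUDA sense: one scalar operation, conceptually
# executed once per element by each of the many parallel cores.
def saxpy_kernel(a, x, y):
    return a * x + y  # single-precision a*x + y, a GPU staple

# On a GPU, the cores would each chew through a slice of these elements;
# numpy's vectorized form expresses the same data-parallel intent.
n = 1_000_000
x = np.arange(n, dtype=np.float32)
y = np.ones(n, dtype=np.float32)
result = saxpy_kernel(np.float32(2.0), x, y)  # all n elements "at once"
```

The point of the model is that the same kernel scales transparently from a handful of cores to hundreds, which is why the core count is the headline number.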
This discussion has been archived. No new comments can be posted.

NVIDIA Makes First 4GB Graphics Card

Comments Filter:
  • History repeats... (Score:2, Informative)

    by B5_geek ( 638928 ) on Monday November 10, 2008 @11:13AM (#25704143)

    I am reminded of old 3DFx advertisements just before they went belly-up.

  • Re:Power != memory (Score:5, Informative)

    by Neon Spiral Injector ( 21234 ) on Monday November 10, 2008 @11:21AM (#25704295)

    Yes, AMD's Stream [amd.com] technology. I don't think it is used as much as CUDA in practice.

  • by Anonymous Coward on Monday November 10, 2008 @11:28AM (#25704421)

    Um, the card in question is in the Quadro graphics-workstation line. It's not for gaming; it's for making money doing CGI animation, 3D modeling, etc. It likely won't have 32-bit drivers for XP, only 64-bit drivers for Microsoft and Linux operating systems.

  • Re:Power != memory (Score:4, Informative)

    by rogermcdodger ( 1113203 ) on Monday November 10, 2008 @11:28AM (#25704425)
    Or maybe there are companies that need high-end cards with 4GB of RAM. This isn't some trick to get consumers to pay more for a low-end card. This is now Nvidia's highest-end workstation card.
  • by Enleth ( 947766 ) <enleth@enleth.com> on Monday November 10, 2008 @11:30AM (#25704451) Homepage

    Are we going to shell out $3,500 for a card that will fail [theinquirer.net] after half a year, or did they correct the problem already?

  • Not for home users (Score:3, Informative)

    by Bieeanda ( 961632 ) on Monday November 10, 2008 @11:30AM (#25704477)
    If the price tag didn't tip you off, this is one of Nvidia's Quadro line. They're not enthusiast boards; they're for intensive rendering work: film-grade CG or simulations. The technology may eventually trickle down to consumer-level hardware, especially if Carmack's supposition that real-time raytracing is the next big step proves out, but for now this is like comparing a webcam to a real-time frame grabber.
  • Re:Power != memory (Score:5, Informative)

    by Ephemeriis ( 315124 ) on Monday November 10, 2008 @11:40AM (#25704697)

    Excuse me, but this is total bullshit. Oldest trick in the book: if you are behind in technology, pop out a card with huge RAM and try to get some sales.

    Let's face it, Nvidia has fallen behind ATI in the chip race. You can stack up as many 4870s as you like against any monolithic Nvidia card, and they always kick the living daylights out of it in cost/performance per unit of processing power.

    In case the $3,500 price tag didn't tip you off, this isn't a gaming/enthusiast card. This is a Quadro - a professional card for high-end 3D rendering. Stuff like generating film-grade 3D or insane CAD stuff. Actually, due to the design of the card, it'd be pretty horrible at playing games.

    This thing is aimed at high-end scientific calculation and professional-grade rendering.

    ATI may, or may not, have something comparable. ATI may even have something better. I don't know, I don't follow the GPU industry very closely. But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.

  • Old news (Score:4, Informative)

    by freddy_dreddy ( 1321567 ) on Monday November 10, 2008 @11:56AM (#25705033)
    These [nvidia.com] were being sold in the first half of August for $10,500, containing two of those cards. Only three months late.
  • by sa1lnr ( 669048 ) on Monday November 10, 2008 @12:20PM (#25705487)

    folding@home.

    My 3GHz C2D gives me 1920 points every 30/33 hours. My Geforce 8800GT gives me 480 points every 2.5 hours.
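The parent's numbers are worth working out. A back-of-the-envelope comparison (taking the midpoint of the 30-33 hour range) shows how lopsided CPU vs. GPU folding already was:

```python
# folding@home throughput from the parent post.
cpu_points, cpu_hours = 1920, 31.5   # 3GHz C2D: 1920 pts per 30-33 h (midpoint)
gpu_points, gpu_hours = 480, 2.5     # GeForce 8800GT: 480 pts per 2.5 h

cpu_rate = cpu_points / cpu_hours    # roughly 61 points/hour
gpu_rate = gpu_points / gpu_hours    # 192 points/hour
speedup = gpu_rate / cpu_rate        # the GPU folds about 3x faster
```

And that is a mid-range 2008 consumer GPU against a fast dual-core CPU; a 240-core card would widen the gap further.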

  • Re:Power != memory (Score:2, Informative)

    by ccool ( 628215 ) on Monday November 10, 2008 @12:20PM (#25705503)
    You're absolutely right, but it would be amazing with any CUDA apps right now. Hell, you could probably use it to encode your H.264 movies more than 18x faster! See http://www.nvidia.com/object/cuda_home.html [nvidia.com]
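Taken at face value, that claimed 18x speedup turns a long encode into a coffee-break job. A quick sanity check, using a hypothetical two-hour CPU encode (the encode time here is made up for illustration; 18x is the parent's figure):

```python
# Hypothetical encode times under the claimed 18x CUDA speedup.
cpu_encode_minutes = 120                       # say, a 2-hour CPU-only encode
claimed_speedup = 18                           # figure cited in the parent post
gpu_encode_minutes = cpu_encode_minutes / claimed_speedup  # under 7 minutes
```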
  • Re:Power != memory (Score:4, Informative)

    by Sycraft-fu ( 314770 ) on Monday November 10, 2008 @12:57PM (#25706219)

    Two reasons:

    One is simply that the cards use memory that isn't normally available. They don't use ordinary DDR RAM; they use special RAM designed for graphics cards, called GDDR. It is similar to, but not the same as, system memory. Thus you can't just go out and buy sticks of RAM for it. Modules would have to be made specially for the cards (and each generation of card uses different RAM), and thus would be expensive.

    The bigger one is that the RAM is really pushed to the limit. You start to run into all sorts of shit you never thought about. The electrical properties of the connection are highly important, and there is a real difference between what you get with chips soldered onto traces and chips in a socket.

    It's a nice thought, but not practical these days. Graphics cards are heavily dependent on high RAM bandwidth and you get that by really pushing the envelope. That means new RAM technologies all the time and the chips being pushed to the max.
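The bandwidth point can be made concrete with some rough arithmetic. The figures below are illustrative of a high-end card of the era (a 512-bit bus at 1600 MT/s effective), not official specs for this card:

```python
# Rough GDDR bandwidth arithmetic; bus width and data rate are
# illustrative of 2008-era high-end cards, not official specs.
bus_width_bits = 512                   # memory bus width
data_rate_mtps = 1600                  # effective transfers/sec, in millions

bytes_per_transfer = bus_width_bits / 8                      # 64 bytes per transfer
bandwidth_gbps = bytes_per_transfer * data_rate_mtps / 1000  # in GB/s
```

That works out to around 100 GB/s, several times what dual-channel socketed DDR2 of the time could deliver, which is why soldered, generation-specific GDDR wins out over upgradeable sockets.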

  • by cheier ( 790875 ) on Monday November 10, 2008 @01:33PM (#25706983)
    And about $1,800.

    Tesla C1060 = $1,700
    QuadroFX 5800 = $3,500

    You're right that the difference is pretty much the DVI port, but it is a pretty expensive DVI port. Compute professionals didn't want the GeForce series because of its lack of support, and they didn't want the Quadro because it was too expensive, so the Tesla was NVIDIA's middle ground.
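The arithmetic behind the parent's figure, using the list prices quoted above:

```python
# List prices from the parent post.
tesla_c1060 = 1700      # compute-only card, no display output
quadro_fx5800 = 3500    # same silicon plus display output and Quadro drivers

premium = quadro_fx5800 - tesla_c1060   # the cost of that "DVI port": $1,800
```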
  • Re:Power != memory (Score:3, Informative)

    by 0xygen ( 595606 ) on Monday November 10, 2008 @03:52PM (#25709719)

    I have switched sides twice during that time, have had bad cards from both manufacturers in the last five years, and will now buy based on individual product reviews.

    The landscape for most high-tech products seems to change so quickly now, and suppliers/manufacturers change at such short notice, that it is no longer possible to rely on a vendor's name as a sign of quality.

    In the worst cases, even the same product with the same part number is a different product with different performance characteristics a few months down the line. This happens a LOT in the USB flash drive market.

  • Re:Power != memory (Score:3, Informative)

    by Chandon Seldon ( 43083 ) on Monday November 10, 2008 @05:15PM (#25711191) Homepage

    You are technically correct.

    Now, the next question is this: is the class of problems caused by the existence of a monopoly restricted to situations where a market actor meets the strict definition of a monopoly that you gave?

    The answer is no, and anti-trust law in the United States recognizes that. Therefore, you can be convicted of "abusing monopoly power" without technically being a monopoly. Since strict monopolies basically never occur in nature without government interference (and even then you could argue about black market suppliers), it is convenient to use the term imprecisely to refer to any market participant that has significantly more market power in relation to a single product or service than any other participant.

    The general (economic and social) problem is market power, not the number of suppliers. Any oligopoly will warp the market in their favor and cause the same type of problem that a theoretical abusive monopolist would.

  • Re:what a revolution (Score:2, Informative)

    by collinstocks ( 1295204 ) on Monday November 10, 2008 @07:30PM (#25713249) Journal

    No, it's from The Hitchhiker's Guide to the Galaxy.

    It goes something like this:

    Some people speculate that if both the ultimate question and the ultimate answer were known in the same universe, the universe would cease to exist and be replaced with something more complicated.
    Others say that this has in fact already happened several times.

UNIX was not designed to stop you from doing stupid things, because that would also stop you from doing clever things. -- Doug Gwyn
