NVIDIA Makes First 4GB Graphics Card

Frogger writes to tell us that NVIDIA has released what it is calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster packs a serious punch, although with a $3,500 price tag it certainly should. Big spenders can rejoice at a shiny new toy, and the rest of us can look forward to the inevitable price drop on the more reasonable models.
This discussion has been archived. No new comments can be posted.

  • by Jaysyn ( 203771 ) on Monday November 10, 2008 @11:11AM (#25704119) Homepage Journal

    A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.
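    The arithmetic behind that complaint is worth spelling out: a 32-bit process sees at most 2^32 bytes of address space, and the card's memory has to fit into that same space alongside system RAM. A back-of-the-envelope sketch (the 2 GiB system-RAM figure is just an illustration, not from the article):

    ```python
    # Rough arithmetic behind the XP32 complaint. A 32-bit process sees
    # at most 2**32 bytes of virtual address space, and the card's VRAM
    # has to be mapped into that same space alongside system RAM.
    GiB = 2**30
    address_space = 2**32       # total 32-bit address space: 4 GiB
    vram = 4 * GiB              # this card's graphics memory
    system_ram = 2 * GiB        # a modest machine of the era (illustrative)

    # The card's memory alone already fills the entire address space,
    # before a single byte of system RAM is mapped:
    print(vram >= address_space)                 # True
    print((vram + system_ram) - address_space)   # 2 GiB that can't fit
    ```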

  • by IanCal ( 1243022 ) on Monday November 10, 2008 @11:30AM (#25704467)
    If you're doing scientific computing that requires about 4 GB of RAM and needs the processing power of current-gen graphics cards, then you should be able to figure out how to migrate from 32-bit XP to a 64-bit OS.

    That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.

  • Re:no it's not (Score:1, Insightful)

    by Anonymous Coward on Monday November 10, 2008 @11:32AM (#25704505)

    Looks at his history books and current events... Yep, most powerful card in history (so far)

  • by DragonTHC ( 208439 ) <Dragon AT gamerslastwill DOT com> on Monday November 10, 2008 @11:32AM (#25704509) Homepage Journal

    I don't believe anyone claimed this was a gaming card.

    This is a scientific number cruncher, built for visual computing and modeling: anything from weather simulations to physics simulations.

    How about Folding@home? This card runs it faster than any computer on the block.

    All of you kids making jokes about Crysis are missing the point. This might run games, but it's a science processor first.

  • That's Awesome. (Score:2, Insightful)

    by Petersko ( 564140 ) on Monday November 10, 2008 @11:38AM (#25704661)
    In two years I'll be able to pick it up for $149.

    That's the great thing about video cards. Even a card that's two generations old is a terrific card, and they're fantastically cheap.
  • Re:Power != memory (Score:3, Insightful)

    by pak9rabid ( 1011935 ) on Monday November 10, 2008 @12:07PM (#25705221)

    excuse me but this is total bullshit. oldest trick in the book. if you are behind in technology, pop out a card with huge ram and try to get some sales.

    Are you some kind of idiot?

    With 4GB of graphics memory and 240 CUDA-programmable parallel cores

    That alone should be a plain indicator that this ISN'T a consumer-level card, nor is it even remotely close to being targeted as such by nvidia.

  • by Jackie_Chan_Fan ( 730745 ) on Monday November 10, 2008 @12:11PM (#25705297)

    32-bit is dead. It should have been dead four years ago...

    Any serious computer enthusiast or professional running a 32-bit OS on today's hardware should be ashamed; they're holding the industry back.

  • Re:Power != memory (Score:5, Insightful)

    by ardor ( 673957 ) on Monday November 10, 2008 @12:23PM (#25705559)

    meaning you can code directly the hardware

    Guess what CUDA and Stream were designed for? Yes: for programming the hardware. What you suggest is pure insanity. NEVER EVER touch hardware directly from a userland app. And once you start writing a kernel module, you end up with something like CUDA/Stream anyway.

    I am a coder, and quite frankly I couldn't care less about nvidia drivers being closed source. They are MUCH better than ATI's, especially in the OpenGL department. nvidia whipped up a beta GL 3.0 driver less than a month after the GL3 specs were released. ATI? Nope. A new standardized feature X gets added to the registry: nvidia supports it pretty quickly; ATI adds it months, even years, later. nvidia's drivers are also pretty robust; I can bombard them with faulty OpenGL code and they remain standing. With ATI's fglrx, even CORRECT code can cause malfunctions.

    THESE are the things I care about. Not the license.
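    For anyone who hasn't touched CUDA, the model the parent is defending boils down to: write a tiny per-element kernel, and the runtime fans it out across the card's parallel cores. A pure-Python sketch of that idea using saxpy, the classic example (the sequential launch loop stands in for what the GPU does in parallel; none of these names are the actual CUDA API):

    ```python
    # A toy sketch of the CUDA data-parallel model: one kernel function,
    # invoked once per element. Illustrative only, not the real CUDA API.
    def saxpy_kernel(i, a, x, y, out):
        # In CUDA, `i` would come from blockIdx * blockDim + threadIdx.
        out[i] = a * x[i] + y[i]

    def launch(kernel, n, *args):
        # The GPU runs these n invocations in parallel, one per "thread";
        # a plain loop stands in for that here.
        for i in range(n):
            kernel(i, *args)

    x = [1.0, 2.0, 3.0]
    y = [10.0, 20.0, 30.0]
    out = [0.0] * 3
    launch(saxpy_kernel, 3, 2.0, x, y, out)
    print(out)  # [12.0, 24.0, 36.0]
    ```

    The point of the abstraction is exactly what the parent says: you express the per-element computation, and the driver and runtime handle the hardware, so userland code never touches it directly.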

  • Re:Power != memory (Score:4, Insightful)

    by GleeBot ( 1301227 ) on Monday November 10, 2008 @12:24PM (#25705581)

    But claiming that they're just slapping a bunch of RAM on a card to drum up sales is just plain wrong. Hell, the blurb here on Slashdot even mentions the fact that it has 240 cores.

    Umm, the GeForce GTX 280, a gamer card released last summer, also has 240 "cores" (as Nvidia counts them; actually stream processors).

    This workstation card, as you might expect, is essentially the same thing as the consumer card, just tweaked towards the professional market (more RAM, different drivers). It's nothing especially innovative.

  • Re:Power != memory (Score:3, Insightful)

    by TheLink ( 130905 ) on Monday November 10, 2008 @02:38PM (#25708227) Journal
    But doesn't that give them more realism?

    Look at the video games, they keep trying to add flaws and blemishes everywhere to make it look real.

    In X years they won't be able to compete with perfect skin from virtual actors. So why bother?

    Given that the porn market has people going for strange stuff, I'm sure a fair number would actually prefer their porn stars to have a tiny bit of stubble, slight blemishes, etc.

    For the "perfect" stuff, they'll probably still have jobs providing original motion-capture material.

    You can have a virtual actor sit still and look pretty, but I think it'll be a while before a computer can figure out how to make it "move sexy"; even "move humanly" seems hard, which is why they often just fall back on motion capture.

    I believe humans have age-old instincts for rapidly distinguishing "moving healthily" from "moving not so healthily", and so on. Maybe in a decade or so the research will be done and a product actually made. Even then, I think there might still be jobs just for voice-overs; voices are important.

    And not least, brand names are important.

    Look at Hollywood movies and you can see some actors who just look pretty and are good candidates for replacement by virtual actors...

    Up-and-coming actors of that sort who aren't already established brand names are the ones who should worry.
  • Re:Power != memory (Score:5, Insightful)

    by Ecuador ( 740021 ) on Monday November 10, 2008 @03:15PM (#25708989) Homepage

    Do you realize that for computers, 12+ years is several GENERATIONS?
    I had always used ATI for Windows boxes and laptops, since my main concern was almost always video performance and TV-out capability, and for years I could not even get video overlay to work over TV-out with nVidia cards.
    Of course, when I had problems with Linux drivers I built nVidia (I admit, even Intel) Linux boxes. But that is a thing of the past; I am back to ATI for Linux, and they are good and getting better with each release.
    Anyway, long-term loyalty is pretty silly. I bought my K6 233 at the same price my friend paid for his MMX 166; in retrospect we all know how those two compare. I kept buying Athlons while others paid more for their crap P4s (they weren't called crap back when they were the best Intel had to offer). But hey, I am now buying Core 2 for non-low-end systems, until AMD can come up with something better.
    Fanboyism gets you a bad deal at least half the time. You buy hardware, you don't marry it. OK, I know this is Slashdot and that last statement might generate some debate, but you get the point.
