Nvidia 480-Core Graphics Card Approaches 2 Teraflops

An anonymous reader writes "At CES, Nvidia has announced a graphics card with 480 cores that can crank performance up to nearly 2 teraflops. The company's GTX 295 graphics card has two GPUs with 240 cores each that can execute graphics and other computing tasks like video processing. The card delivers 1.788 teraflops of performance, which Nvidia claims makes it the fastest single graphics card on the market."
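For context, the quoted 1.788 teraflops matches the usual theoretical-peak arithmetic for GPUs of this era: core count times shader clock times FLOPs per core per clock. A minimal back-of-the-envelope sketch in Python, assuming (beyond what the summary states) a 1242 MHz shader clock and 3 single-precision FLOPs per core per clock:

```python
# Back-of-the-envelope peak-FLOPS estimate for a dual-GPU card like the GTX 295.
# Assumptions beyond the summary: 1242 MHz shader clock and 3 single-precision
# FLOPs per core per clock (one multiply-add plus one multiply).
cores_per_gpu = 240
gpus = 2
shader_clock_hz = 1242e6
flops_per_core_per_clock = 3

peak_tflops = cores_per_gpu * gpus * shader_clock_hz * flops_per_core_per_clock / 1e12
print(f"{peak_tflops:.3f} TFLOPS")  # ~1.788, matching the quoted figure
```

Real-world throughput is of course lower; this is only the theoretical marketing-peak number.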
This discussion has been archived. No new comments can be posted.

  • by unity100 ( 970058 ) on Friday January 09, 2009 @04:55PM (#26391601) Homepage Journal
    It's not a problem to implement 52342525113 cores. The problem is doing it within the cost, size, and power draw that an acceptably priced gamer PC case can accommodate.

    So far, Nvidia is failing in that respect.
  • Great... (Score:5, Insightful)

    by pwolf ( 1016201 ) on Friday January 09, 2009 @05:01PM (#26391679)
    That's just great and all, but when can I get a video card that doesn't take up half my case and melt down after 6 months of use? Not to mention one that doesn't cost an arm and a leg.
  • by pieisgood ( 841871 ) on Friday January 09, 2009 @05:10PM (#26391795) Journal
    Can you really make a game that looks as good as Crysis? Seriously, do you have any idea what went into making it? Something tells me you have no idea whatsoever.
  • Re:480 core? (Score:3, Insightful)

    by Chabo ( 880571 ) on Friday January 09, 2009 @05:18PM (#26391925) Homepage Journal
    Maybe because GPGPU is coming soon, and the GPU makers want people to think of them as individual cores? So... partly marketing, I guess.
  • Re:Yes (Score:3, Insightful)

    by jgtg32a ( 1173373 ) on Friday January 09, 2009 @05:20PM (#26391941)
    A video card test needs to show a consumer the capabilities of the card, so they can decide if the card is for them. If what you said were true, then they would only do one test at 1920x1600 and be done with it. The lowest resolution I've ever seen in a review was 1920x1080, and not everyone has a monitor that runs that high.
  • Re:Yes (Score:3, Insightful)

    by Surt ( 22457 ) on Friday January 09, 2009 @05:37PM (#26392147) Homepage Journal

    1920x1200 is the preferred resolution because it is the native resolution of most 24" panels. If you don't play at native resolution, you get to experience glorious scaling artifacts. Glorious, glorious scaling artifacts.

  • by BloodyIron ( 939359 ) on Friday January 09, 2009 @05:43PM (#26392207)

    I like how you think you're a big enough person to post a racist comment, yet not big enough to log in.

    Grow up, and welcome to the real world (READ: anyone can be president).

  • by Clueless Moron ( 548336 ) on Friday January 09, 2009 @05:46PM (#26392257)

    If you really want to go back to the source, "giga" is Greek and uses a "j" sound. [wiktionary.org]

    Consider the word "gigantic". It has the same root, "giga". Some people pronounce it with a hard "g", some with a soft "g".

    The language is a mess.

  • by bertok ( 226922 ) on Friday January 09, 2009 @05:49PM (#26392323)

    Well, I do know what goes into a game like Crysis, being a 3D game programmer and all. Those programmers were very, very good, believe me. Some of the stuff they pulled off is just amazing.

    The reason Crysis is slow is the artistic direction. Outdoor environments full of plants and shadows with a huge viewing distance are very hard to implement in a 3D engine. I mean really fucking hard. Making a game like that playable at all means choosing between two scraggly trees on a flat green carpet that pretends to be grass, or an enormous amount of research into optimization techniques that are very hard and time-consuming to implement. The Crysis engine is pretty much the state of the art in optimization. And these guys managed to squeeze in fantastic shader effects on top of that, depth of field, and even some basic radiosity shadowing for the characters! That's just insane.

    Most reviewers and players with the right hardware thought the game looked amazing, way better than its peers at the time, or even now. I thought the effects (especially in the spaceship) looked better than most Sci-Fi movies, which is a stunning achievement for a 3D game running on a $500 video card. I upgraded my PC just to play the game, and I thought it was worth it. Lots of people did too:

    http://www.penny-arcade.com/comic/2007/10/15/ [penny-arcade.com]

    Take your head out of your ass and stop belittling other people's achievements until you have some of your own to compare it to, OK?

  • Comment removed (Score:3, Insightful)

    by account_deleted ( 4530225 ) on Friday January 09, 2009 @06:03PM (#26392537)
    Comment removed based on user account deletion
  • by MadnessASAP ( 1052274 ) <madnessasap@gmail.com> on Friday January 09, 2009 @06:09PM (#26392625)

    Hey now, this man speaks the truth, albeit with a poor choice of words. To use an alternative but equally popular automotive analogy: I may attach a PCIe connector to my car, but that does not mean my car is suitable for operation inside a standard computer case, much less plugged into an actual PCIe slot.

  • Re:Yes (Score:3, Insightful)

    by Runefox ( 905204 ) on Friday January 09, 2009 @06:33PM (#26392915)

    Almost makes me pine for the days of the CRT... well, maybe not exactly. I don't want to imagine how heavy a 24" or larger CRT would be, but I'd love for another technology that isn't locked to a single native resolution to break through the never-ending sea of fixed-pixel devices. For now, I just run my LCD in the scaled "maintain aspect" mode on my Radeon and enjoy the black borders at non-native resolutions. Better than the blurry stretch effect I'd get otherwise!

  • Re:*sigh* (Score:1, Insightful)

    by Anonymous Coward on Friday January 09, 2009 @07:18PM (#26393485)
    Do you ask your girlfriend for permission to do everything and justify to her every step you take? Good luck with the girls in your life.
  • Re:Yes (Score:2, Insightful)

    by Pentium100 ( 1240090 ) on Friday January 09, 2009 @08:20PM (#26394169)

    And this is the main reason I use a 21" CRT. Sure, it weighs 35 kg, but I only had to carry it to my room once. And I can enjoy every resolution up to 1920x1440@85Hz (if I want, I can set it to 2048x1536, but only at 75Hz, so I don't).

    What I would really like is a widescreen CRT such as the Sony GDM-FW900, but they are not available locally, and the shipping would cost too much if I were to buy one on eBay.

  • by J.R. Random ( 801334 ) on Friday January 09, 2009 @08:45PM (#26394361)
    Until NVIDIA starts supporting the development of open source drivers, I'm sticking with ATI, no matter how many Blazing Cores of Might NVIDIA might fit onto their chips. While ATI's closed-source drivers have their fair share of bugs, and it will be some time before there are good open source 3D drivers for their more recent cards, at least that development has started, and ATI has been aiding it, not hobbling it.
  • Re:Yes (Score:3, Insightful)

    by billcopc ( 196330 ) <vrillco@yahoo.com> on Friday January 09, 2009 @09:00PM (#26394493) Homepage

    Right, but I don't know very many people with 1920x1200 displays. I have one, and my 18-month-old GPU can run Crysis and any other game just fine at that rez, but practically everyone else I know is still at 1280x1024 or 1680x1050.

    Realistically, reviewers should find the resolution and settings at which a game is playable, meaning a 25-30 fps average for most games. Sure, it's fun to know that Crysis will get 8 fps at 2560x1600 with 16x AA+AF, but if that's what the reviewers think even hardcore gamers expect of their machines, they need to stop drinking the Kool-Aid.

    What's more, I'm not at all averse to running a game at less than native resolution on my bigass LCD. The scaling in today's displays is far better than it used to be (for most models). If dropping the rez down one step makes a game more enjoyable, then so be it; it's not the end of the world!

  • by Khyber ( 864651 ) <techkitsune@gmail.com> on Friday January 09, 2009 @11:36PM (#26395459) Homepage Journal

    I'll do one better.

    Case - bullshit $20 wonderjob from a pawn shop.
    PSU - 700W Rocketfish for 70 bucks.
    Mobo/CPU combo - PC Chips board with a dual-core AMD Athlon 64 X2 5200+ - 60 bucks.
    RAM - 4GB of cheapo RAM - 20 bucks from Craigslist.
    GPU - 512MB 9800GTX+ - 175 from Pricewatch.
    Hard drive - 80GB 7200RPM WD - free from Craigslist, complete with porn!
    Optical drive - DAEMON Tools, though the one I found for my machine was 10 bucks.
    OS license - XP Pro - 100.

    455 bucks, and it runs Crysis at 1920x1080 on high settings. I get very few framerate issues; in fact, I only saw them during the battleship invasion part of the game.

  • by Casandro ( 751346 ) on Saturday January 10, 2009 @02:54AM (#26396313)

    I mean seriously, as long as they don't publish the hardware specifications so you can write your own software for it, it's pretty much useless. The only thing you can do with it is play games. And even then you have to fear every little software update, as it might trigger some bug in the binary-only drivers the manufacturer provides.

  • by Douglas Goodall ( 992917 ) on Saturday January 10, 2009 @04:11AM (#26396605) Homepage
    Learning how to put these CUDA cores to work for more than games is a great new opportunity, because each new NVIDIA card has more of these resources. Unfortunately, this seems to be rocket science: just because engineers can build these boards doesn't mean the software community is ready or able to design software that benefits from this architecture. When they do, things will get very interesting.

    Hardware people went multicore because it was getting harder to make single-core processors faster. Software people were told they would have to write a different kind of software to stay competitive, and this area will be very important in the future. Actually, it is right now. I noticed Dell is pushing 2.5 GHz quad-core machines with six gigs of memory at Costco. I don't know how much contemporary software can properly utilize those cores, but time will tell. As programming languages get built-in support for multi-core programming, things will improve. I noticed there is some nice support in Python.
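As a concrete illustration of the parent's point about language-level multi-core support, here is a minimal sketch using Python's standard multiprocessing module to spread a CPU-bound function across all available cores; the work function and inputs are made up for the example:

```python
from multiprocessing import Pool, cpu_count

def busy_work(n):
    # Hypothetical CPU-bound task: sum of squares up to n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [200_000, 400_000, 600_000, 800_000]
    # One worker process per core; map() splits the inputs across workers,
    # so each chunk can run on its own core rather than all on one.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(busy_work, inputs)
    print(results)
```

Processes rather than threads are used here because, in CPython, only separate processes let CPU-bound work actually run on multiple cores in parallel.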
  • by Targon ( 17348 ) on Saturday January 10, 2009 @10:15AM (#26397993)

    Part of the software design process is deciding how to break the main application up into different components. With multi-threading, the design needs to figure out what can be handled in a different thread, and whether having a separate thread for that function is worth the code administration needed to tie everything back together.

    Remember, it is fairly easy to create another thread and have it do what you want. The difficulty is in tying the different threads together so the application works as expected.
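To make the "tying threads together" point concrete, here is a minimal sketch (the work items are hypothetical) using Python's threading and queue modules: worker threads pull jobs from a shared queue, push answers onto another, and join() is the step that stitches everything back into the main flow of the application:

```python
import threading
import queue

def worker(tasks, results):
    # Each thread drains the shared task queue and reports its answers.
    while True:
        try:
            n = tasks.get_nowait()
        except queue.Empty:
            break
        results.put(n * n)  # hypothetical unit of work

tasks, results = queue.Queue(), queue.Queue()
for n in range(100):
    tasks.put(n)

threads = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # the "tying together": the main thread waits for every worker

print(sum(results.get() for _ in range(results.qsize())))
```

In CPython the global interpreter lock means this pattern mainly helps with I/O-bound work; for CPU-bound work spread across cores, the multiprocessing sketch shown earlier in the thread is the usual route.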
