World's First 2GB Graphics Card Is Here

An anonymous reader writes "TUL Corporation's PCS HD4850 is the world's first graphics card to offer 2GB of on-board video memory. The card is based on the RV770 core, with 800 stream processors and 2GB of high-speed GDDR3 memory." That's more memory than I've had in any computer prior to this year — for a video card.
  • Re:Bottlenecks? (Score:5, Interesting)

    by eebra82 ( 907996 ) on Tuesday July 15, 2008 @11:56AM (#24197447) Homepage

    "The article mentions that too little video memory can be a bottleneck. But wouldn't squeezing 2 gigs of memory on a graphics card simply move the limiting bottlenecks elsewhere?"

    I understand your question, but the whole point is that sometimes a game is sluggish only because there is not enough memory, and not remotely because of core performance. Today's games, and even more so upcoming ones, can use these extreme amounts of memory, which ultimately means larger textures and more variety.

    But to answer your question: there's always going to be at least one bottleneck; adding more memory just raises the bar a bit. Not that today's games are going to run much faster with this, but upcoming titles will.

  • by Spatial ( 1235392 ) on Tuesday July 15, 2008 @12:06PM (#24197633)
    We're really beginning to feel it now. With this card, you're limited to around 1,750MB of RAM on a 32-bit Windows system: 4GB of address space, minus the 2GB on the card, minus all the other memory-mapped devices, which amount to about 250MB on my computer.
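
    A quick back-of-the-envelope of that arithmetic (just a sketch; it assumes the card's full 2GB is mapped into the 32-bit address space, and the 250MB figure is from my machine):

        # 32-bit address-space budget with a 2GB card
        ADDRESS_SPACE_MB = 4 * 1024  # total 32-bit address space: 4096 MB
        VRAM_MAPPED_MB   = 2 * 1024  # the card's memory, mapped into that space
        OTHER_MMIO_MB    = 250       # other memory-mapped devices (my machine)

        usable_mb = ADDRESS_SPACE_MB - VRAM_MAPPED_MB - OTHER_MMIO_MB
        print(f"Usable system RAM: ~{usable_mb} MB")  # ~1798 MB, in the ballpark of the 1,750MB above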

    In summary, I for one welcome our new 64-bit overlords...
  • Re:Huh (Score:2, Interesting)

    by erudified ( 958273 ) <alex@erudified.com> on Tuesday July 15, 2008 @12:22PM (#24197947) Homepage

    I tend to agree with the other poster who mentioned Counter-Strike.

    I'll take it a step further, though, and say this: I believe game development by mom & pop shops is about to enter a golden age.

    High-quality open source engines like Cube 2 (as well as many others) and a greater emphasis on procedural content generation (I give it a year or two before high-quality open source libraries for this are available) will enable small developers to take advantage of these (somewhat insane!) hardware capabilities. You don't need ridiculous poly counts to have great gameplay; the Wii has proved that beyond any doubt. The open source world is very well equipped to provide small developers with huge sets of textures and models under licenses (e.g., Creative Commons) that will enable awesome things we can't even imagine yet. I believe we will end up with more open gaming platforms as a result of these developments.

    In short, no offense, and maybe I'm just an optimist, but I think you're 100% wrong ;)

  • Why do you do this? (Score:0, Interesting)

    by Anonymous Coward on Tuesday July 15, 2008 @12:25PM (#24198045)

    I saw this [slashdot.org] posted elsewhere and took the time to read through it. I think the evidence presented there is incontrovertible.

    Honest question, why do you do all that? Don't you have better uses for your time? A family? Hobbies?

  • Re:you have no idea (Score:5, Interesting)

    by Jasonjk74 ( 1104789 ) on Tuesday July 15, 2008 @12:26PM (#24198057)

    "'i dont like fpses. but then again, that kind of graphics, makes some fpses worth playing.' And that right there sums up the problem with the gaming industry. Game producers don't even need to worry about whether their game is any good, simply because some people will play it just because it's shiny (unity100, I'm looking right at you)."

    That's one of the easiest ways to be modded +5 Insightful on /.: just complain about games with good graphics not having any creativity. What about the games with bad graphics and bad gameplay? The two are not mutually exclusive. Games are a visual medium; they are supposed to look good.

  • by mikael ( 484 ) on Tuesday July 15, 2008 @12:33PM (#24198203)

    The human eye has about 100 million rods and cones. You would need a 100-megapixel framebuffer (around 10,000 by 10,000 pixels) to achieve this.
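
    For scale, a rough sketch of what a framebuffer that size would cost in memory, assuming 32 bits per pixel:

        import math

        PHOTORECEPTORS = 100_000_000        # approximate rods + cones in one eye
        side = math.isqrt(PHOTORECEPTORS)   # square framebuffer side: 10,000 px

        framebuffer_mib = PHOTORECEPTORS * 4 / 2**20  # 4 bytes per pixel at 32bpp
        print(f"{side:,} x {side:,} @ 32bpp = {framebuffer_mib:.0f} MiB")  # ~381 MiB

    Even that would fit into this card's 2GB about five times over; filling the rest with scene data is the hard part.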

    There was an article in The Independent about virtual reality a long time ago. In it, one of the researchers defined photorealistic quality as 80 million textured triangles per second.

  • by Anonymous Coward on Tuesday July 15, 2008 @12:44PM (#24198433)

    Use it for swap:

    http://www.hackszine.com/blog/archive/2008/06/use_video_ram_as_swap_in_linux.html

  • by xouumalperxe ( 815707 ) on Tuesday July 15, 2008 @01:14PM (#24198975)
    A framebuffer for a 2560x1600 display at 32 bits per pixel (the highest resolution you're likely to find on a monitor that's even remotely reasonable for home use) would take up around 15 MiB. Make it triple-buffered at 64 bpp (for what, exactly, I don't know, but it's a worst-case scenario) and you're still only at around 90 MiB. Sure, 90 megs is a big chunk of a 512 MiB card, but I seriously doubt it's going to have much impact on a 1 GiB card. It *is*, however, going to hurt -- a lot -- insofar as raw processing power is concerned. To fully use a 2 GiB card, you're either using massively large textures or some never-before-seen technology, like fully loading map meshes into VRAM and using your card's geometry transform capabilities to do funky stuff with them. In those terms, I guess I'll buy one of these when Will Wright teams up with John Carmack. :)
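
    Checking those numbers (a minimal Python sketch, using the resolutions and depths above):

        def framebuffer_mib(width, height, bpp, buffers=1):
            """Framebuffer size in MiB for a given resolution, depth and buffer count."""
            return width * height * (bpp // 8) * buffers / 2**20

        print(framebuffer_mib(2560, 1600, 32))             # ~15.6 MiB, single buffer
        print(framebuffer_mib(2560, 1600, 64, buffers=3))  # ~93.8 MiB, the worst case above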
  • by Hurricane78 ( 562437 ) <deleted&slashdot,org> on Tuesday July 15, 2008 @01:22PM (#24199147)

    Just that the resolution of the framebuffer and the textures are two entirely different things.
    The framebuffer, even at 2048 x 1600 x 48 bit, uses a ridiculous 18.75 MB per frame... out of 2GB? That's nothing.
    The rest of the memory gets used for textures, vertex data, normals, and so on. You have to have color, normal, bump map, and specular reflection information just for one texture, then mip maps of everything. For large textures you can never have enough graphics memory, as long as the chip can render them. Main RAM is useless for this -- just try an onboard graphics chip with memory sharing. Huge PITA.
    Shaders are not even worth mentioning in terms of graphics memory; code is usually the tiniest part.
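
    To put rough numbers on that texture budget (a sketch assuming uncompressed 32-bit maps; real engines use texture compression, which shrinks this considerably):

        def material_mib(size, maps=4, bytes_per_texel=4):
            """VRAM for one material: color, normal, bump and specular maps,
            each with a full mip chain (mips add about a third)."""
            base = size * size * bytes_per_texel * maps
            return base * 4 / 3 / 2**20

        per_material = material_mib(2048)
        print(f"{per_material:.0f} MiB per 2048x2048 material")   # ~85 MiB
        print(f"{2048 / per_material:.0f} materials fill a 2GB card")  # ~24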

    Main RAM, on the other hand, mainly holds the world data, sound files, textures that are preloaded but not yet used (think GTA), and other game data, like model data used for game calculations.

    And: Yes, IAIGD (I am a game developer).

  • Screw Gaming... (Score:3, Interesting)

    by Penguinisto ( 415985 ) on Tuesday July 15, 2008 @02:04PM (#24199921) Journal

    ...at least in this context. Now, OTOH, 3D/CG render engines that support OpenGL rendering can do a whole hell of a lot with a beefy GPU and 2GB of RAM.

    Normally, compared to software (CPU) raytracing, OpenGL rendering is pretty crappy on video cards with low resources (shadows are jaggy, etc.)... but with enough RAM and a high-end GPU, quality and speed could approach (if not surpass) the old-school "click 'render', then go have lunch" routine that most CG artists deal with nowadays.

    /P

  • Re:And maybe.... (Score:5, Interesting)

    by Slime-dogg ( 120473 ) on Tuesday July 15, 2008 @06:24PM (#24204483) Journal
    What happens when you're using dual monitors?
  • Re:you have no idea (Score:1, Interesting)

    by Anonymous Coward on Tuesday July 15, 2008 @07:07PM (#24205207)

    From my experience, game companies try to make original games with strong enough stories and content that sequels can be created. So yes, they do want you to play that title for years and years. Graphics card companies continue to push the envelope of what can be done with $500. Most of the work in creating games is making the game playable on today's hardware. Crysis is a good example of a game that really pushes today's hardware in full quality mode. You can scale it back to get better fps, or you can get a better graphics card/computer. Crysis is made in a way that will let it stay on the shelf and still provide outstanding graphics for several years. By the time Crysis is playable on your standard office computer, the next great thing will be out to replace it with even more realism and bling.

    As for old games, if enough people still play, most companies will still support them. I still play StarCraft online, and Blizzard still updates it. StarCraft has pretty good graphics, though, for a 10-year-old game. Need more minerals.

  • by Moraelin ( 679338 ) on Wednesday July 16, 2008 @07:43AM (#24210481) Journal

    Except for the fact that this is what EQ1 did. Back when EQ1 launched, they *required* a 3D card. That was pure INSANITY according to the conventional wisdom at the time, because few single-player games required you to have a 3D card; most games had a software renderer, too, that looked like crap. This was back when 3Dfx was the top dog and you had to install Glide because DirectX wasn't quite up to snuff.

    EQ1 was also the far better game at the time, simply because the competition was even worse.

    Since you mention UO, it was still a fucked-up, unbalanced, small, simplistic gank-fest. The dynamic duo of self-centered narcissists, Lord British and his trusty sidekick Raph Koster (who'd later do the same with SWG), were still telling players what they should like, instead of even trying to notice what players actually wanted. Untested patches were issued that broke more than they fixed, and some had to be rolled back because they were a catastrophe. The fact that Lord British diverted UO's bug-fixing budget to make Ultima 9 didn't help either.

    And that's the short version. One could fill a tome with what was wrong with UO, and what got worse. It was only after EQ and AC ate their lunch big time that Origin even started considering fixing the game.

    If we're talking about looks and angular breasts, a lot of us actually thought that UO's 2D graphics looked _better_ than the hideous 3D mess of EQ or AC. But UO just didn't give us what we wanted. So EQ won.

    Don't mistake players for the circle-jerk clique of online reviewers. Reviewers seem to get outright orgasms over "OMG, it's shiny" or, back then, "OMG, it's 3D". The average player cared a lot more about gameplay. EQ may bore you to tears by today's standards, but back then it was the best by a wide margin. Or rather: the competition was even worse. If you will, EQ won by being the one-eyed man in the land of the blind.

    And it seems to me like EQ2 is the result of just that kind of mistake. Sony got caught in the same mistaken belief that the servile "OMG, it's shiny" gang of reviewers actually represent the average gamer. And produced a game whose only merit was "OMG, it's shiny." And lost.

    "It just turns out that Warcraft was a stronger brand and ate Sony's lunch. Oops."

    Brand only gets you so far. Star Wars was a bigger brand name than EQ and Warcraft _combined_, and SWG still ended up merely a niche game. The Sims had sold more copies than all the Warcraft games _and_ EverQuest _combined_, and The Sims Online outright flopped. Etc.

    Basically, a crap game with a good franchise still flops.

    And if we're talking about EQ vs Warcraft, I actually remember it the other way around. Sony was _the_ big name in MMOs, and everyone expected EQ2 to be teh uber-game that would sweep everyone off their feet. Blizzard was just another unproven "me too." People wanted a Starcraft 2 or Diablo 3 from them, not an MMO. The reaction to Blizzard's announcement that they were making an MMO was _disappointment_, not "yay, I'm preordering it because it's Warcraft." The average Warcraft player was an RTS player, and looked forward to an MMO about as much as to a root canal.

    So, no, Sony was the bigger name there, and it lost anyway.

    "WoW had high detailed textures, but relatively low polygon counts for the models."

    "High detailed" is relative. By comparison to EQ2, which is the comparison I was making, WoW is a lot lower-res. Or at least, EQ2 needed 512 MB for max details, while WoW ran decently on a 128 MB card. If that's not due to textures, well, I'm curious what it was.

  • by floodo1 ( 246910 ) <floodo1&garfias,org> on Wednesday July 16, 2008 @12:15PM (#24214503) Journal
    I wouldn't call this representative of the market as a whole. Valve sells ZERO graphics-intensive games, so one could conclude that their survey is likely skewed away from high-end cards. One could imagine that if Crytek ran the same survey, it would be heavily skewed in the other direction, towards high-end cards. Your point is still valid, in that most people DON'T buy high-end cards and stay in the sub-$200 market (at best), or just stick with what their computer came with.
