World's First 2GB Graphics Card Is Here 400
An anonymous reader writes "TUL Corporation's PCS HD4850 is the world's first graphics card to offer 2GB of on-board video memory. The card is based on the RV770 core, with 800 stream processors and 2GB of high-speed GDDR3 memory." That's more memory than I've had in any computer prior to this year — for a video card.
Re:Bottlenecks? (Score:5, Interesting)
The article mentions that too little video memory can be a bottleneck. But wouldn't squeezing 2 gigs of memory onto a graphics card simply move the bottleneck elsewhere?
I understand your question, but the whole point is that sometimes a game is sluggish simply because there isn't enough video memory, not because of core performance. Today's games, and even more so upcoming ones, can use these extreme amounts of memory, which ultimately means larger textures and more variety.
But to answer your question: there will always be at least one bottleneck, but by adding more memory, they've at least raised the bar a bit. Not that today's games will run much faster with this, but upcoming titles will.
32-bit address space limitations (Score:5, Interesting)
In summary, I for one welcome our new 64-bit overlords...
Re:Huh (Score:2, Interesting)
I tend to agree with the other poster who mentioned Counterstrike.
I'll take it a step further, though, and say this: I believe game development by mom & pop shops is about to enter a golden age.
High-quality open source engines like Cube 2 (as well as many others) and a greater emphasis on procedural content generation (I give it a year or two before high-quality open source libraries for this are available) will enable small developers to take advantage of these (somewhat insane!) hardware capabilities. You don't need ridiculous poly counts to have great gameplay; the Wii has proved that beyond any doubt. The open source world is very well equipped to provide small developers with huge sets of textures and models under licenses (e.g., Creative Commons) that will enable awesome things we can't even imagine yet. I believe we will end up with more open gaming platforms as a result of these developments.
In short, no offense, and maybe I'm just an optimist, but I think you're 100% wrong ;)
Why do you do this? (Score:0, Interesting)
I saw this [slashdot.org] posted elsewhere and took the time to read through it. I think the evidence presented there is incontrovertible.
Honest question, why do you do all that? Don't you have better uses for your time? A family? Hobbies?
Re:you have no idea (Score:5, Interesting)
"i dont like fpses. but then again, that kind of graphics, makes some fpses worth playing." And that right there sums up the problem with the gaming industry: game producers don't even need to worry about whether their game is any good, because some people will play it just because it's shiny (unity100, I'm looking right at you).
That's one of the easiest ways to be modded +5 insightful on /., just complain about games with good graphics not having any creativity. What about the games with bad graphics and bad gameplay? The two are not mutually exclusive.
Games are a visual medium, they are supposed to look good.
Re:Better definition than real life. (Score:3, Interesting)
The human eye has about 100 million rods and cones, so you would need a 100-megapixel framebuffer (around 10,000 by 10,000 pixels) to match that.
There was an article in the Independent newspaper about Virtual Reality a long time ago. In the article, one of the researchers stated that photorealistic quality was defined as 80 million textured triangles/second.
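A quick back-of-the-envelope check of the retina math above (assuming 48-bit color, i.e. 6 bytes per pixel, as used elsewhere in this thread):

```python
import math

PHOTORECEPTORS = 100_000_000      # ~100 million rods and cones in the human eye
BYTES_PER_PIXEL = 6               # 48-bit color: 16 bits per RGB channel

# A square framebuffer with one pixel per photoreceptor
side = math.isqrt(PHOTORECEPTORS)                       # 10,000 x 10,000
frame_mb = PHOTORECEPTORS * BYTES_PER_PIXEL / 2**20     # MB for a single frame

print(f"{side} x {side} pixels -> {frame_mb:.0f} MB per frame")
```

At roughly 572 MB per frame, even double-buffering an eye-resolution display would still fit comfortably on a 2GB card.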
Re:2GB of memory for a videocard, eh? (Score:1, Interesting)
use it for swap..
http://www.hackszine.com/blog/archive/2008/06/use_video_ram_as_swap_in_linux.html
Re:2GB of memory for a videocard, eh? (Score:4, Interesting)
Re:2GB of memory for a videocard, eh? (Score:5, Interesting)
Just that the resolution of the framebuffer and the resolution of the textures are two entirely different things.
The framebuffer, even at 2048 x 1600 with 48-bit color, uses a mere 18.75 MB per frame... out of 2GB? That's nothing.
The rest of the memory gets used for textures, vertex data, normals, and so on... so you have to have color, normal, bump map, and specular reflection information, just for one texture. Then a mip map of everything. For large textures you can never have enough graphics memory as long as the chip can render the textures. Main RAM is useless for this. Just try an onboard graphics chip with memory sharing. Huge PITA.
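To put rough numbers on that budget (the 2048 x 2048 map size, 4 bytes per texel, and four maps per material below are illustrative assumptions, not figures from the parent post):

```python
# Framebuffer at 2048 x 1600 with 48-bit color (6 bytes per pixel)
framebuffer = 2048 * 1600 * 6                 # = 19,660,800 bytes (18.75 MB)

# One material: color + normal + bump + specular maps at 2048 x 2048,
# 4 bytes per texel; a full mipmap chain adds roughly 1/3 on top.
texels = 2048 * 2048
material = 4 * texels * 4 * 4 // 3            # four maps, mipmapped -> ~85 MB

# How many such materials fit on a 2 GB card after the framebuffer?
budget = 2 * 2**30 - framebuffer
print(f"framebuffer: {framebuffer / 2**20:.2f} MB")
print(f"one material: {material / 2**20:.2f} MB")
print(f"materials that fit: {budget // material}")
```

That works out to only about two dozen fully mipmapped materials at that size, which is why even 2GB disappears quickly once artists start using unique high-resolution textures everywhere.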
Shaders are not even worth mentioning in terms of graphics memory. Code is usually the tiniest part.
Main RAM, on the other hand, mainly holds the world data, sound files, textures that are preloaded but not yet in use (think GTA), and other game data, like the model data used for game calculations.
And: Yes, IAIGD (I am a game developer).
Screw Gaming... (Score:3, Interesting)
...at least in this context. Now OTOH, 3D/CG render engines that do OpenGL rendering can do a whole hell of a lot with a beefy GPU and 2GB of RAM.
Normally, compared to software (CPU) raytracing, OpenGL rendering is pretty crappy on video cards with low resources (shadows are jaggy, etc.)... but with enough RAM and a high-end GPU, quality and speed could approach (if not surpass) the old-school "click 'render', then go have lunch" routine that most CG artists deal with nowadays.
Re:And maybe.... (Score:5, Interesting)
Re:you have no idea (Score:1, Interesting)
In my experience, game companies try to make original games with stories and content strong enough that sequels can be created. So yes, they do want you to play that title for years and years. Graphics card companies continue to push the envelope of what can be done for $500. Most of the work in creating games is making the game playable on today's hardware. Crysis is a good example of a game that really pushes today's hardware at full quality. You can scale it back for better fps, or you can get a better graphics card/computer. Crysis is made in a way that will let it stay on the shelf and still provide outstanding graphics for several years. By the time Crysis is playable on your standard office computer, the next great thing will be out to replace it with even more realism and bling.
As for old games, if enough people still play, most companies will still support them. I still play Starcraft online, and Blizzard still updates it. Starcraft has pretty good graphics, though, for a 10-year-old game. Need more minerals.
I'm still not convinced (Score:3, Interesting)
EQ1 was also the far better game at the time. Simply because the competition was even worse.
Since you mention UO, it was still a fucked-up, unbalanced, small, simplistic gank-fest. The dynamic duo of self-centered narcissists, Lord British and his trusty sidekick Raph Koster (who'd later do the same with SWG), were still telling players what they should like instead of even trying to notice what players actually wanted. Untested patches were issued that broke more than they fixed, and some had to be rolled back because they were a catastrophe. The fact that Lord British diverted UO's bug-fixing budget to make Ultima 9 didn't help either.
And that's the short version. One could fill a tome with what was wrong with UO, and what got worse. It was only after EQ and AC ate their lunch big time, that Origin even started considering fixing their game.
If we're talking about looks and angular breasts, a lot of us actually thought that the 2D graphics of UO actually looked _better_ than the hideous 3D mess of EQ or AC. But UO just didn't give us what we wanted. So EQ won.
Don't mistake players for the circle-jerk clique of online reviewers. Reviewers seem to get outright orgasms over "OMG, it's shiny" or, back then, "OMG, it's 3D". The average player cared a lot more about gameplay. EQ may bore you to tears by today's standards, but back then it was the best by a wide margin. Or rather: the competition was even worse. If you will, EQ won by being the one-eyed man in the land of the blind.
And it seems to me like EQ2 is the result of just that kind of mistake. Sony got caught in the same mistaken belief that the servile "OMG, it's shiny" gang of reviewers actually represent the average gamer. And produced a game whose only merit was "OMG, it's shiny." And lost.
Brand only gets you so far. Star Wars was a bigger brand name than EQ and Warcraft _combined_, and Galaxies still ended up merely a niche game. The Sims had sold more copies than all the Warcraft games _and_ Everquest _combined_, and its MMO outright flopped. Etc.
Basically a crap game with a good franchise, still flops.
And if we're talking about EQ vs. Warcraft, I actually remember it the other way around. Sony was _the_ big name in MMOs, and everyone expected EQ2 to be the uber-game that would sweep everyone off their feet. Blizzard was just another unproven "me too." People wanted a Starcraft 2 or a Diablo 3 from them, not an MMO. The reaction to Blizzard's announcement that they were making an MMO was _disappointment_, not "yay, I'm preordering it because it's Warcraft." The average Warcraft player was an RTS player, and was about as excited about an MMO as about a root canal.
So, no, Sony was the bigger name there, and it lost anyway.
High detail is relative. Compared to EQ2, which is what I was comparing it to, WoW is a lot lower-res. Or at least EQ2 needed 512 MB for max details, while WoW ran decently on a 128 MB card. If that's not due to textures, I'm curious what it was.
Re:Somehow, I'm not that sure (Score:1, Interesting)