World's First 2GB Graphics Card Is Here
An anonymous reader writes "TUL Corporation's PCS HD4850 is the world's first graphics card to offer 2GB of on-board video memory. The card is based on the RV770 core, with 800 stream processors and 2GB of GDDR3 high-speed memory." That's more memory than I've had in any computer prior to this year — for a video card.
2GB of memory for a videocard, eh? (Score:5, Funny)
Great for the pointless eye-candy first-person shooters. For everything else, there's MasterCard.
you have no idea (Score:2, Insightful)
Re:you have no idea (Score:4, Insightful)
And that right there sums up the problem with the gaming industry. Game producers don't even need to worry about whether their game is any good simply because some people will play it just because it's shiny (unity100, I'm looking right at you).
Re:you have no idea (Score:5, Interesting)
i dont like fpses. but then again, that kind of graphics, makes some fpses worth playing. And that right there sums up the problem with the gaming industry. Game producers don't even need to worry about whether their game is any good simply because some people will play it just because it's shiny (unity100, I'm looking right at you).
That's one of the easiest ways to be modded +5 insightful on /., just complain about games with good graphics not having any creativity. What about the games with bad graphics and bad gameplay? The two are not mutually exclusive.
Games are a visual medium, they are supposed to look good.
Re:you have no idea (Score:5, Insightful)
How about games with good gameplay and bad graphics? Those exist too, and they are better than games with good graphics and bad gameplay.
Re:you have no idea (Score:5, Funny)
How about games with good gameplay and bad graphics? Those exist too, and they are better than games with good graphics and bad gameplay.
Here's a novel concept: developers should strive for both graphics and content! That's just crazytalk...
Re:you have no idea (Score:5, Insightful)
Strangely, this isn't the case with music: everyone declares the current medium to be shite almost straight away. CDs? Shite. Vinyl? Shite. Tape? Definitely shite.
Re: (Score:3)
I don't think the poster was complaining about that exactly, it is a pretty natural progression. He was complaining about last gen always being lambasted so much, when in fact it's still passable.
I agree with him - I buy some of my movies as blu-ray and some as DVD, depending on whether they're just a drama, comedy, action movie, documentary, etc. Eventually I'll be buying everything in HD as the prices come down, but at the moment I consider DVD very adequate for most movies. Especially compared to the Sta
Re:you have no idea (Score:4, Insightful)
The problem with that philosophy is that it drives the costs of making games *way* up, eventually creating a market where only big companies like EA are able to compete and anything that's not a sequel is considered 'a risky investment', utterly crushing the chances of independent developers going mainstream.
Re:you have no idea (Score:5, Insightful)
To game company CEOs, this translates as: "Customer Culture20 occasionally buys games with bad gameplay, but good graphics. We need more of these games for him to buy because we make more profit when he buys multiple crap games and plays them as little as possible."
They don't want me to play the games for years and years. They want me to get bored and buy the next shiny thing.
Re:you have no idea (Score:4, Insightful)
I disagree.
Paintings and photos are a visual medium. Even movies have sound too. Games have sound AND interaction.
I play games to interact, not to see pretty things. If I want pretty, I can watch Revenge Of The Sith, or Lord Of The Rings, Or Naked Women.
Games don't compete even vaguely with Hollywood in terms of graphics. They will always be many years behind due to being real time.
But hey, feel free to prioritize graphics, it means that reasonable video cards for the rest of us become dirt cheap :D. Late-adopter FTW.
Re: (Score:3, Insightful)
I'd agree that games are supposed to look good, but the problem is: how do you define "good graphics"? Personally I define it as "it preserves a distinct artistic style throughout the entire presentation", but many people seem to define it as "how many polygons does it use for the main character".
For example, just to pick two old games, which one do you think has the better graphics: Castlevania 3 for the NES or Syphon Filter for the PSX? Me, I'd take the former, since as much as I enjoyed the latter, it's gra
Re: (Score:2)
hahahahahhaa (Score:2)
now i only play occasionally to have a change. and to be honest, that eye candy helps a great deal with getting into the game's atmosphere.
Re: (Score:2)
Unfortunately it's rather difficult to create games like Diablo, Diablo 2 and World of Warcraft every year, and get the players interested.
Re: (Score:3, Insightful)
WoW had fantastically good gameplay in areas that you overlook - it's very easy to learn if you've never played an MMO before, and it's very easy for a casual player to get addicted to (until the single-player content runs out).
This is why WoW has 10 million subscribers: Blizzard took the same repetitive gameplay as every other MMO and made it accessible to the casual gamer. The game is much simpler in gameplay, and vastly better in usability, than its competitors.
Somehow, I'm not that sure (Score:5, Insightful)
While _some_ people do buy based on screenshots, the blanket generalization is little more than wishful thinking on the part of the publisher. You know, right next to, "people don't mind it if it's released buggy and patched later" and "people don't talk to each other, they only take their information from our marketing department."
The most visible fly in the ointment is WoW. It has the least detailed graphics of any MMO since, I dunno, 2003 or so. Yes, it actually has fewer polygons and lower-detail textures than some games _older_ than it. Shader effects, bump-mapping, and any kind of shiny stuff are almost non-existent. (Ok, ok, they added weather later.) It also sold like hot cakes.
EQ2 was launched at roughly the same time as WoW, and tried to have _much_ higher-resolution graphics and a metric buttload of shader effects. You can't even have a freaking piece of armour modelled as just a texture; it has to have a shader that makes it look 3D. It required a 512 MB card just to play with all those details... at a time when such a card didn't even exist. I think it never managed more than 1/50 the number of players WoW had, and it went slowly downhill from there.
Interestingly enough, more people complained about EQ2's "sterile graphics" than about WoW's cartoonish ones. (See what Penny Arcade had to say about EQ2's graphics back then, for example.) Turns out that just using insane texture resolutions and polygon counts isn't a substitute for talent, you know?
City Of Heroes had a _major_ graphics upgrade in Issue 6 (which coincided with launching the City Of Villains standalone expansion pack), and the new Villain zones _quadruple_ even that number of polygons on screen. But let's concentrate on the COH side alone, because that was almost the same old game as before, only with a ton of graphical upgrades. Funnily, it didn't produce much of a jump in the number of players, and certainly no lasting effect. Anyway, the game peaked at 175,000 players in the USA alone soon after launch, and went gradually downhill from there. The last number I heard, from last year, was 145,000 across all territories combined, including both COV and COH players.
Basically high-res, shiny graphics don't seem to do all that much. Sure, it helps if you're not butt ugly. But if you look at the number of subscribers, the effect of insane graphics just isn't there. EQ2 vs WoW, the better game won, not the one requiring a new graphics card. Or COH pre-I6 and post-I6, just doesn't show the players rushing in because of the graphics.
Or in the offline game arena, The Sims was launched as a mostly 2D game with 2D sprites (ok, it used primitive low-polycount 3D graphics for the characters), in an age of shiny 3D games. It outsold not only any of those shiny 3D FPS games from the same year, it outsold them all combined.
And I'll further guess that Crysis and all those other games presented as "proof" that graphics sell... they probably had some other merits too. A lot fewer people would have bought them if their _only_ merit were the graphics. Games with good, shiny graphics have flopped before.
Re:Somehow, I'm not that sure (Score:5, Informative)
This is in support of your argument. Every quarter or so I do the Valve hardware survey that logs our gaming systems' specs so that they can get a handle on what paying customers are using. The top 15 right now are...
NVIDIA GeForce 8800 166,588 9.37 %
NVIDIA GeForce 7600 101,218 5.70 %
NVIDIA GeForce 8600 95,619 5.38 %
NVIDIA GeForce 6600 79,478 4.47 %
NVIDIA GeForce FX 5200 64,704 3.64 %
NVIDIA GeForce 7300 59,544 3.35 %
ATI Radeon 9600 54,727 3.08 %
ATI Radeon 9200 45,585 2.57 %
NVIDIA GeForce 7900 44,134 2.48 %
NVIDIA GeForce 6200 42,834 2.41 %
ATI Radeon X1950 41,533 2.34 %
NVIDIA GeForce 6800 40,839 2.30 %
NVIDIA GeForce4 MX 38,990 2.19 %
NVIDIA GeForce 7800 36,192 2.04 %
ATI Radeon X800 35,449 1.99 %
About 1/3 of the top 15 cards are what the "Oooo Shiny Crysis Crowd" would call obsolete, and frankly the presence of a DX7 card even raises my eyebrow. This is the target audience for a powerful graphics card, but if Valve wants to sell to their customer base they can look at this and think, "Gee, maybe we should make a game that doesn't require a fuckton of curiously high-bandwidth LMNOPRAM, and maybe make a fun game that at least scales down well."
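The "about 1/3" figure can be checked against the survey table above. Note that which cards the "Crysis Crowd" would call obsolete is a judgment call; the grouping below (pre-DX9 and budget parts) is an assumption, not Valve's classification:

```python
# Survey shares from the table above (percent of surveyed systems).
survey = {
    "GeForce 8800": 9.37, "GeForce 7600": 5.70, "GeForce 8600": 5.38,
    "GeForce 6600": 4.47, "GeForce FX 5200": 3.64, "GeForce 7300": 3.35,
    "Radeon 9600": 3.08, "Radeon 9200": 2.57, "GeForce 7900": 2.48,
    "GeForce 6200": 2.41, "Radeon X1950": 2.34, "GeForce 6800": 2.30,
    "GeForce4 MX": 2.19, "GeForce 7800": 2.04, "Radeon X800": 1.99,
}

# Assumed "obsolete" grouping: pre-DX9 chips plus the slowest DX9 budget parts.
obsolete = {"GeForce FX 5200", "Radeon 9600", "Radeon 9200",
            "GeForce 6200", "GeForce4 MX"}

share = sum(survey[card] for card in obsolete)
print(len(obsolete), "of", len(survey))  # 5 of 15, i.e. about 1/3
print(round(share, 2))                   # 13.89 percent of surveyed systems
```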
Re: (Score:3, Informative)
He never said that. He said about 1/3 of the top 15 cards are what the "Crysis Crowd" would call obsolete -- in other words, not "shiny Crysis material." He never said any of the cards in that list were "Crysis material."
I'm still not convinced (Score:3, Interesting)
EQ1 was also the far better game at the time. Simply because the competition was even worse.
Since you m
Re:2GB of memory for a videocard, eh? (Score:5, Insightful)
Re:2GB of memory for a videocard, eh? (Score:5, Informative)
Re: (Score:2, Informative)
But you are right, this is MEMORY: more models/animations = more memory requirements = bigger maps. So yes, this is needed (eventually); add to that the fact that physics is now being implemented on GPUs (I have only briefly touched the code, which doe
Re:2GB of memory for a videocard, eh? (Score:5, Insightful)
The maps tend to be stored in main system memory. The graphics tend to be stored in graphics memory. You indeed need extra memory capacity, processor speed, and memory bandwidth for some of the post processing. However, resolution is not post-processing. Higher resolution means more pixels. More pixels means more RGB values in memory. More pixels also means more things to post-process. A higher polygon count and more textures can use more video memory, too.
Re:2GB of memory for a videocard, eh? (Score:4, Interesting)
Re:2GB of memory for a videocard, eh? (Score:4, Funny)
Create your very own mindless zombie alien hovering ball of goo that shoots acid thing, and unleash it against the unsuspecting online community! Design your very own oversized ultra new tech nuclear powered futuristic double barreled organic biorocket launcher weapon and fight against hordes of the deadliest community-created horror creatures. Battle with your aim against others' wit in the first ever MMOFPS in history: S I M Q U A K E
Hm, I was joking but that sounds like something I would play! DAMN YOU Slashdot for giving this idea to a guy who is not the CEO of a game developer house!
Re:2GB of memory for a videocard, eh? (Score:5, Interesting)
Just that the resolution of the framebuffer and the textures are two entirely different things.
The framebuffer, even at 2048 x 1600 x 48 bit, uses a mere 18.75 MB per frame... out of 2GB? That's nothing.
The rest of the memory gets used for textures, vertex data, normals, and so on... so you have to have color, normal, bump map, and specular reflection information, just for one texture. Then a mip map of everything. For large textures you can never have enough graphics memory as long as the chip can render the textures. Main RAM is useless for this. Just try an onboard graphics chip with memory sharing. Huge PITA.
Shaders are not even worth mentioning in terms of graphics memory. Code is usually the tiniest part.
Main RAM, on the other hand, mainly holds the world data, sound files, textures that are preloaded but not yet used (think GTA), and other game data like model data used for game calculations.
And: Yes, IAIGD (I am a game developer).
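The memory arithmetic in this comment checks out; a quick sketch (Python; the framebuffer figures are the commenter's, while the per-map texel size and map count are assumptions for illustration):

```python
def framebuffer_mb(width, height, bits_per_pixel):
    """Size of one frame buffer in MiB."""
    return width * height * bits_per_pixel / 8 / 2**20

# The comment's example: 2048 x 1600 at 48-bit colour.
print(framebuffer_mb(2048, 1600, 48))  # 18.75 -- tiny next to 2 GB

def mipmapped_mb(width, height, bytes_per_texel):
    """A texture plus its full mip chain, in MiB.

    Each mip level is a quarter of the previous one, so the chain
    adds roughly a third on top of the base texture size.
    """
    total, w, h = 0, width, height
    while w >= 1 and h >= 1:
        total += w * h * bytes_per_texel
        w, h = w // 2, h // 2
    return total / 2**20

# Assumed layout: one surface with colour, normal, bump, and specular
# maps, each a mipmapped 2048x2048 texture at 4 bytes per texel.
print(round(4 * mipmapped_mb(2048, 2048, 4), 1))  # 85.3 MiB per surface
```

At roughly 85 MiB per fully mapped surface, it is easy to see how textures, not the framebuffer, eat a 2 GB card.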
Re: (Score:2)
I don't know what AF is though.
Anisotropic Filtering?
Re:2GB of memory for a videocard, eh? (Score:5, Funny)
Re:2GB of memory for a videocard, eh? (Score:4, Informative)
but I believe AA (anti-aliasing) is post-processing applied to a scene
There are a number of ways to do anti-aliasing, but IIRC the common way is to oversample: that is, generate the output at a higher resolution than will be displayed and then downsample it.
If you have a 2560x1920 monitor and oversample by 4 times in each direction, you would be rendering at 10240x7680. That would mean you would need over 300 megs just for the output buffer. I'm not sure current cards could handle that at a reasonable framerate anyway, though.
Afaict the big thing putting pressure on graphics memory is texture detail: if you double the horizontal and vertical resolution of your texture, you quadruple the memory required to store it. Ideally you want enough memory on your graphics card to store all the textures the game uses. Texture detail is also something the game developer can fairly easily let the user alter: just design the textures at the highest resolution and allow those with weaker hardware to select downsampled versions.
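Both figures in this comment can be sanity-checked with a short sketch (Python; the 4-bytes-per-pixel colour depth is an assumption, not stated in the comment):

```python
def buffer_mib(width, height, bytes_per_pixel=4):
    """Colour buffer size in MiB, assuming 4 bytes per pixel."""
    return width * height * bytes_per_pixel / 2**20

# 2560x1920 oversampled by 4x in each direction, as in the comment:
print(buffer_mib(2560 * 4, 1920 * 4))  # 300.0 MiB (~315 million bytes,
                                       # i.e. "over 300 megs")

# Doubling a texture's resolution in both directions quadruples it:
assert buffer_mib(2048, 2048) == 4 * buffer_mib(1024, 1024)
```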
And also.... (Score:4, Insightful)
where the memory requirements are much higher (3D modelers and the like).
Also medical imagery (especially volumes, like MRI and CT).
And GPGPU (using Brook+) to perform complex calculation on huge datasets.
And maybe.... (Score:3, Funny)
...and maybe also playing Crysis 2.
With all settings put to "low".
And with Aero disabled.
Re: (Score:2)
No, Crysis 2 requires 2.5 gigs. If you have less than that, you can only run at 400 x 300 with all the textures replaced by greyscale. The enemies still don't know how to climb stairs properly, though.
Re:And maybe.... (Score:5, Informative)
And with Aero disabled.
Actually disabling Aero manually will not result in a performance increase. When an application enters full-screen mode, DWM essentially shuts down since there are no windows to manage.
But of course this will get modded down because people here don't want to believe that Vista doesn't suck as much as they think it does.
Re:And maybe.... (Score:5, Interesting)
Re: (Score:2)
Actually, it's pointless for FPS style games. They'll never use even a GB of that memory effectively because the games are designed around people with 512MB at the high end. The only reason I see to buy this card is maybe there are drivers optimized for professional work where the memory requirements are much higher (3D modelers and the like).
I always thought that FPS was the genre that was really pushing this gotta have a massive video card thing. I couldn't even tell you what my video card is other than n
Re:2GB of memory for a videocard, eh? (Score:5, Informative)
Actually, it's pointless for FPS style games. They'll never use even a GB of that memory effectively because the games are designed around people with 512MB at the high end.
They're only doing this because GDDR3 is much cheaper than the GDDR5 on the 4870. A 2GB 4850 with GDDR3 is cheaper than a 1GB 4870 with GDDR5. Me, I can't see the value of getting a card with more than 1GB, even for future games.
The only reason I see to buy this card is maybe there are drivers optimized for professional work where the memory requirements are much higher (3D modelers and the like).
There won't be. This card is marketed as a 4850, not a FireGL, which means it won't be all that useful for professionals. Without the drivers to accelerate professional applications, the extra memory is largely useless.
Re:2GB of memory for a videocard, eh? (Score:5, Insightful)
"Me, I can't see the value of getting a card with more than 1GB, even for future games."
Neither can I! Just like I can't see computers ever needing more than 640K of memory.
Re: (Score:3, Informative)
I'm sure he meant from *this generation* but it's what I instantly thought of as well.
That's exactly what I meant. You can attach all the memory you want to a video card, true, but there is a limit to how much you can conceivably use.
When you use more memory, you use more memory bandwidth; this is an indisputable fact. If you double the resolution of textures in a scene, the texture memory bandwidth you need to render that scene doubles. If you double your resolution, the amount of framebuffer writes dou
Re:2GB of memory for a videocard, eh? (Score:4, Funny)
Re:2GB of memory for a videocard, eh? (Score:4, Insightful)
It's an e-penis thing. Surely you walked by a 256MB Radeon 9200 in a Best Buy at some point. The chip on that card could hardly make use of 32MB, but I'll be damned if they won't add useless memory if it helps part a fool with his money.
Re: (Score:2)
The FireGL is intended for workstations. This is a consumer card. TFS didn't mention that, but it's a distinction worth making.
Bottlenecks? (Score:5, Insightful)
Re:Bottlenecks? (Score:5, Insightful)
But wouldn't squeezing 2 gigs of memory on a graphics card simply move the limiting bottlenecks elsewhere?
Well, sure. No matter how good your gaming rig is there's always going to be a bottleneck. And if it's an older game that runs 200 fps at full detail, then the bottleneck is the game itself capping maximum poly/texture counts (ie. the detail itself).
But the whole point of having and maintaining l33t gamer systems is to continually shift that bottleneck somewhere else which is also farther up the scale of performance so you keep getting a better gaming experience with each iteration.
Re:Bottlenecks? (Score:5, Interesting)
The article mentions that too little video memory can be a bottleneck. But wouldn't squeezing 2 gigs of memory on a graphics card simply move the limiting bottlenecks elsewhere?
I understand your question, but the whole point is that sometimes a game is sluggish only because there is not enough memory, and not remotely because of core performance. Today's games, and even more so future ones, will utilize these extreme amounts of memory, which ultimately means richer textures and more variety.
But to answer your question: there's always going to be at least one bottleneck; by adding more memory, though, they've at least raised the bar a bit. Not that today's games are going to run much faster with this, but upcoming titles will.
Re: (Score:2)
Not that today's games are going to run much faster with this, but upcoming titles will.
I'm not entirely sold on that point. I'd imagine that developers already try to throw as much rendering information into the card's memory as-is, allowing the "overflow" to be stored in system memory. But I would imagine this process is done transparently by the driver anyway (AFAIK). Removing this "overflow" could improve performance for most current games just as much as for newer titles coming to market.
Re: (Score:3, Insightful)
Re: (Score:2)
Re: (Score:2)
There is always a bottleneck, somewhere.
If you want to call it that. Otherwise, I call it the weakest link in a chain, which seems more appropriate: "bottleneck" implies a substantial slowdown at a single point along the way, whereas a weak link indicates something that could be improved but is otherwise functional.
At some point, all the graphic eye candy and having 50K FPS refresh at 8000 x 6000 is pointless. Unless you're playing in a holodeck, that is.
Re:Bottlenecks? (Score:5, Insightful)
Yes, it could, unless you're running a 64-bit OS and processor. Most computers, which are 32-bit, have a total of 4 GB of addressable memory space, which includes video memory, sound card memory (if you actually still use one) and system RAM. Therefore, if you put in a 2GB video card, you can't use more than 2 GB of system RAM.
The 4GB address limit is probably the best argument for why we should see more progression to 64-bit computing, but there isn't yet enough demand in the market to force the issue for at least a few more years.
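The carve-out this comment describes can be sketched numerically. The reservation sizes below are illustrative assumptions, not any real chipset's memory map:

```python
GiB = 2**30
MiB = 2**20
address_space = 4 * GiB  # everything a 32-bit CPU can address

# Illustrative reservations; real layouts vary by chipset and BIOS.
reservations = {
    "video RAM aperture": 2 * GiB,          # a fully mapped 2 GB card
    "other PCI devices / firmware": 256 * MiB,
}

usable_ram = address_space - sum(reservations.values())
print(usable_ram / GiB)  # 1.75 -- installed RAM beyond this is unaddressable
```

In practice, 32-bit systems typically map only a smaller video aperture (often 256 MB) rather than the card's full memory, which is why 4 GB machines commonly report around 3.25 GB usable.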
Re: (Score:2)
Most computers, which are 32-bit, have a total of 4 GB of addressable memory space, which includes video memory, sound card memory (if you actually still use one) and system RAM. Therefore, if you put in a 2GB video card, you can't use more than 2 GB of system RAM.
Why would these devices' memories be mapped directly into system RAM?
Re: (Score:2)
Re: (Score:3, Informative)
Imagine a city with a limited road budget. The industrial areas (devices) have priority over res
Re: (Score:2)
It's just the nature of the architecture. It's why my 32-bit dell laptop only has 3.5GB of addressable RAM where my MBP has its full 4GB, and why systems with beefy graphics subsystems and 4GB often only show 3.25GB. This card is well past the point where it's counterproductive on a 32-bit system for exactly that reason.
Re: (Score:2)
Couldn't you make that same argument for EVERY component? If your CPU is the biggest bottleneck in your computer, and you replace it with a shiny new one, then something else (your 1 gig of RAM, what have you) will become the biggest bottleneck.
Re: (Score:2)
Re: (Score:2)
Not really. The graphics card has 800 stream processors all running off cached texture memory to cached framebuffer memory (deferred rendering). Instead of simply fetching pixel data directly from texture memory and writing it directly out to framebuffer memory, the graphics card will maintain a texture cache (the current textures being used) and a framebuffer cache (the current area of the framebuffer being rendered). Then, when there are no more pixels to be written, the framebuffer cache is written back
Grammar Nazis, man your stations! (Score:3, Funny)
> from the way-too-much-overkill dept.
AKA, the recursive tautology dept.
That's cool but... (Score:3, Informative)
The R700 has dual GPUs on a single board, competes very well with nVidia, and here's the really cool part: It has nearly TWO BILLION transistors.
Re:That's cool but... (Score:5, Funny)
Re: (Score:2)
What with the jet engine fan cooler ATI cards come with, they're going to be cool alright. Now.. quiet? No. But when someone asks you why your computer just got really loud you can tell them "That, my friend, is the sound of power."
Finally (Score:4, Funny)
I can finally do the Explode open/close window Compiz effect on my 10 MP display!
Huh (Score:5, Insightful)
I'm still rockin 512 megs and doing fine - main system I mean. Integrated graphics.
The only reason this kind of thing bothers me a bit is that I imagine it's pushing videogames further and further into the world of 1,000-employee, NASA-sized engineering projects, rather than charming little projects that, say, the husband-and-wife team behind Sierra could do on their own and still be competitive.
This kind of reliance on jet-powered hardware pretty much ensures that gaming is going to be all megacorporations working from market research.
Re: (Score:2)
Game development will always have room for the little guy, as long as he is making fun games.
Mod parent +1, Insightful (Score:2)
And when you've done that, feel free to mod me up too :)
Re: (Score:2)
There's nothing that a small group of programmers could do back then that they can't do now. It might be a little harder to stand out amongst the crowd, particularly if the crowd that you're most worried about is the big gaming magazines/websites. But you can still throw together a good game if you've got the time and the talent.
The tools do seem to lag behind the hardware potential a good bit, but they continue to improve and even individuals who dabble in this sort of stuff as a hobby can have access to s
Re: (Score:2, Interesting)
I tend to agree with the other poster who mentioned Counterstrike.
I'll take it a step further, though, and say this: I believe game development by mom & pop shops is about to enter a golden age.
High quality open source engines like Cube 2 (as well as many others) and a greater emphasis on procedural content generation (I give it a year or two before high quality open source libraries for this are available) will enable small developers to take advantage of these (somewhat insane!) hardware capabilities.
Re: (Score:2)
The only reason this kind of thing bothers me a bit is that I imagine it's pushing videogames further and further into the world of 1,000-employee, NASA-sized engineering projects, rather than charming little projects that, say, the husband-and-wife team behind Sierra could do on their own and still be competitive.
Um, come on, games like Jewel Quest will always be more profitable and easier to create than Final Fantasy or, heck, the next Mario game. Look at Tetris and solitaire games as the other big examples.
Impressive! (Score:5, Funny)
"FIRST" 2GB card? Err... (Score:5, Informative)
UNIX did it a decade ago (Score:2, Informative)
Re: (Score:2)
Market need? (Score:5, Insightful)
Is there any market "Need" for this, to be able to play your games better, or is this simply filling the "uber-leet-most-money-I-can-spend" market?
Moving the bar (Score:3, Insightful)
and I for one am glad to see products like this all the time. While I may not buy them, they do move the bar further, which usually brings the lower-range items down from the stratosphere in pricing.
I remember people harping about cards with 32MB, or 64, or "oh god, no one will ever need 256".
Look at how much more resolution today's and tomorrow's displays are bringing us, then turn around and realize how much memory it takes to address all of that.
Re: (Score:2)
Clearly yes. There are a lot of people out there who are very willing to spend large amounts of cold hard cash to outdo the Joneses, even when their "one-up" doesn't make any sense at all. This product is intended to supply the e-penis market rather than any concrete technical need.
Not a soothsayer (Score:2)
I am not a soothsayer here to predict the death of PC gaming (once again). The PC is still a wonderful platform for development, flexibility, versatility, and complexity when compared to the consoles. Games will always keep coming for the PC, and not just MMOs, but all manner of wonderful things.
Re: (Score:2)
Wolfenstein (Score:5, Funny)
32-bit address space limitations (Score:5, Interesting)
In summary, I for one welcome our new 64-bit overlords...
Re: (Score:3, Informative)
Graphics card memory won't normally be addressable with regular CPU opcodes, will it? You have to manually pipe data across the PCI/AGP/PCIe buses to get it to the card. It certainly doesn't sit in process address space.
Re: (Score:3, Informative)
The framebuffer is typically memory-mapped. While it's possible to program a video card just through indirect DMA and the GPU's command processor, most systems need to map the framebuffer for part of startup, and generally there's no reason to unmap it.
Re: (Score:2, Informative)
err, Vista Ultimate is capped at -128- gigs of RAM, thank you very much. And even if it were capped at 8, that would be enough for a setup with 4 gigs plus this card.
Re: (Score:2, Informative)
You remember incorrectly.
http://msdn.microsoft.com/en-us/library/aa366778.aspx#physical_memory_limits_windows_vista [microsoft.com]
Home Basic is capped at 8GB, Home Premium at 16GB, and Ultimate and Business at 128GB
Pointless (Score:2, Insightful)
Useless. (Score:2)
Useless.
As the Tech Report [techreport.com] benchmarked some time ago, more than 512MB on any non-workstation graphics card at this point in time (and probably for some time to come) is simply useless.
This is just made to hook those who don't know any better... "OMG 2GB RAM TEH IS FASTUR!!!!"
Re: (Score:2)
In some games, if you have a 1680x1050 monitor (fairly standard for 20-22 inch 16:10 LCDs) and you crank texture quality and antialiasing up the wazoo, your card's memory will be the bottleneck. It isn't -too- common, but I've been bitten a few times. 1 gig and up is overkill unless you have one heck of a monitor setup, however (for now).
In 10 years (Score:2)
Re:Wow.. (Score:5, Insightful)
I bet that this thing would have enough power to run all the AERO stuff in vista.
Jesus, my ex's laptop runs Vista with Aero effects turned up to maximum with no problems and a crappy NVidia mobile GPU. This stupid "Aero eats your resources" meme needs to die.
By all means whale on MS, but at least do it for the right reasons.
Re: (Score:3, Insightful)
Some of us actually miss Clippy.
To add to your list: Internet Explorer (for lack of security and disregard for standards), OOXML (design, corruption of the standardisation process, non-implementation), abuse of office furniture (notably chairs), abuse of monopoly (at least according to the EU), overpricing (settled for a billion dollars), ... Pretty sure this list of right reasons can go on for a while.
Re:Wow.. (Score:4, Informative)
Re: (Score:2)
I was expecting to see the 4870 as well. That was a bit of a shock. Unfortunately there are actually people who are not as knowledgeable who will think that more RAM on the video card immediately equates to better performance.
Whenever I tell someone I got a new video card, the first question out of their mouth is usually "how much RAM does it have?", as if that's all that matters. There are still a lot of people out there who don't bother thinking about all the other variables and just focus on that one sta
Re: (Score:2)
Go look at Powercolor.com. There's a 4870 version with GDDR5 as well. The Slashdot poster just chose to post about the strange lower end uber memory part.
Re: (Score:2)
There's a guy at our workplace who is working on advanced lighting models (BRDF, BTF). He grades graphics cards by the amount of texture memory and the number of texture units available, divided by the cost of the card.
Re: (Score:2)
that's not a limitation of computer hardware, but of the game designers' effort and time.
Re: (Score:3, Interesting)
The human eye has about 100 million rods and cones. You need a 100-megapixel framebuffer (around 10,000 by 10,000 pixels) to achieve this.
There was an article in the Independent newspaper about Virtual Reality a long time ago. In the article, one of the researchers stated that photorealistic quality was defined as 80 million textured triangles/second.
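The photoreceptor figure quoted above is easy to sanity-check; the 4-bytes-per-pixel colour depth in the second step is an assumption, not from the comment:

```python
rods_and_cones = 100e6            # ~100 million photoreceptors
side = rods_and_cones ** 0.5
print(round(side))                # 10000 -- hence ~10,000 x 10,000 pixels

# At an assumed 4 bytes per pixel, one such frame would need:
print(round(rods_and_cones * 4 / 2**20))  # 381 MiB for a single frame
```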
Re: (Score:2)
We must be over 80 million by now. A Voodoo card could do a few million. Looks like that metric isn't used anymore, but a GeForce 2 can do 40 million.
Re: (Score:2)
Define "fool". I can look at my 10MP photos and be more than aware that I'm not looking at the actual landscape. It's not due to a lack of detail; probably mostly a function of the display medium. The same holds true for movies at the theatre - there's no lack of detail, but you can still tell that you're not looking at an actual scene (and disregard bad acting). 24FPS is way too low, even 60FPS probably won't cut it - not due to choppy playback, but because real life doesn't have a framerate (or it's a
Re: (Score:3, Insightful)
Whether a game is playable or not is irrelevant to this particular debate - if you want games to look better, or better-looking games to run faster, then you need more power.
I can't believe I have to actually explain that.
Screw Gaming... (Score:3, Interesting)
...at least in this context. Now OTOH, 3D/CG render engines that have OpenGL-rendering can do a whole hell of a lot with a beefy GPU and 2GB of RAM.
Normally, compared to software (CPU) raytracing, OGL rendering is pretty crappy on vidcards with low resources (shadows are jaggy, etc)... but with enough RAM and a high-end GPU, quality and speed could approach (if not surpass) the old-school "click 'render' then go have lunch" routine that most CG artists deal with nowadays.
Re: (Score:2)
You can be assured of one thing... when processors get to 2GB of L2 cache, whatever Windows version is out then will need more.