World's First 2GB Graphics Card Is Here

An anonymous reader writes "TUL Corporation's PCS HD4850 is the world's first graphics card to offer 2GB of on-board video memory. The card is based on the RV770 core, with 800 stream processors and 2GB of GDDR3 high-speed memory." That's more memory than I've had in any computer prior to this year — for a video card.
  • by Anonymous Coward on Tuesday July 15, 2008 @10:48AM (#24197275)

    Great for the pointless eye-candy first-person shooters. For everything else, there's MasterCard.

    • you have no idea (Score:2, Insightful)

      by unity100 ( 970058 )
      The 'eye candy' in that 'pointless eye-candy first-person shooter' term of yours becomes SO real that it boggles your mind. I don't like FPSes, but then again, that kind of graphics makes some FPSes worth playing.
      • by Anonymous Coward on Tuesday July 15, 2008 @10:56AM (#24197437)
        I don't like FPSes, but then again, that kind of graphics makes some FPSes worth playing.

        And that right there sums up the problem with the gaming industry. Game producers don't even need to worry about whether their game is any good, because some people will play it just because it's shiny (unity100, I'm looking right at you).
        • Re:you have no idea (Score:5, Interesting)

          by Jasonjk74 ( 1104789 ) on Tuesday July 15, 2008 @11:26AM (#24198057)

          I don't like FPSes, but then again, that kind of graphics makes some FPSes worth playing. And that right there sums up the problem with the gaming industry. Game producers don't even need to worry about whether their game is any good, because some people will play it just because it's shiny (unity100, I'm looking right at you).

          That's one of the easiest ways to be modded +5 Insightful on /.: just complain about games with good graphics not having any creativity. What about the games with bad graphics and bad gameplay? The two are not mutually exclusive. Games are a visual medium; they are supposed to look good.

          • by Torvaun ( 1040898 ) on Tuesday July 15, 2008 @11:37AM (#24198293)

            How about games with good gameplay and bad graphics? Those exist too, and they are better than games with good graphics and bad gameplay.

            • by Jasonjk74 ( 1104789 ) on Tuesday July 15, 2008 @11:41AM (#24198367)

              How about games with good gameplay and bad graphics? Those exist too, and they are better than games with good graphics and bad gameplay.

              Here's a novel concept: developers should strive for both graphics and content! That's just crazytalk...

              • by Oktober Sunset ( 838224 ) <sdpage103@yaho o . c o .uk> on Tuesday July 15, 2008 @06:37PM (#24205585)
                Why is it that anything other than the newest and most awesome graphics is considered bad? Four years ago, people were tossing off over HL2 and Far Cry, which could easily be played on a 256MB Radeon 9600; now, just four years later, those graphics are considered bad. Not just lower detail, not just not as good: graphics must go straight from best to bad as soon as something slightly better comes along. It's the same with video. Everyone declared DVD awesome quality when it replaced VHS; now everyone denounces it as totally shit. Not just less good, but shit, and we can't bear to watch it with all its nasty lo-def blurriness. Of course, four years ago it was amazing and crisp and awesome; now it's utter SHITE, watching it makes us all want to puke, and of course everyone claims they thought it was shit all along and hated it.

                Strangely, this isn't the case with music, where everyone declares the current medium to be shite almost straight away. CDs? Shite. Vinyl? Shite. Tape? Definitely shite.
              • by Draek ( 916851 ) on Tuesday July 15, 2008 @10:31PM (#24207897)

                The problem with that philosophy is that it drives the costs of making games *way* up, eventually creating a market where only big companies like EA are able to compete and anything that's not a sequel is considered 'a risky investment', utterly crushing the chances of independent developers ever going mainstream.

            • by Culture20 ( 968837 ) on Tuesday July 15, 2008 @01:39PM (#24200553)
              I still play games with good gameplay and bad graphics. I toss aside games with bad gameplay but good graphics.
              To game company CEOs, this translates as: "Customer Culture20 occasionally buys games with bad gameplay, but good graphics. We need more of these games for him to buy because we make more profit when he buys multiple crap games and plays them as little as possible."
              They don't want me to play the games for years and years. They want me to get bored and buy the next shiny thing.
          • by cliffski ( 65094 ) on Tuesday July 15, 2008 @04:10PM (#24203305) Homepage

            I disagree.
            Paintings and photos are a visual medium. Movies have sound too. Games have sound AND interaction.
            I play games to interact, not to see pretty things. If I want pretty, I can watch Revenge Of The Sith, or Lord Of The Rings, Or Naked Women.
            Games don't compete even vaguely with Hollywood in terms of graphics. They will always be many years behind due to being real time.

            But hey, feel free to prioritize graphics, it means that reasonable video cards for the rest of us become dirt cheap :D. Late-adopter FTW.

          • Re: (Score:3, Insightful)

            by Draek ( 916851 )

            I'd agree that games are supposed to look good, but the problem is: how do you define "good graphics"? Personally, I define it as "it preserves a distinct artistic style throughout the entire presentation", but many people seem to define it as "how many polygons does it use for the main character".

            For example, just to pick two old games, which one do you think has the better graphics, Castlevania 3 for the NES or Syphon Filter for the PSX? Me, I'd take the former, since as much as I enjoyed the latter, it's gra

        • Oh pal, I'm one of the grumpiest 33-year-old gamers you can find on the planet. I've been playing games since 1982, starting with the ZX Spectrum, and I've played it all, had it all, and gotten bored of it all, including all the prominent RTSes, turn-based strategies, RPGs, and MMOGs.

          Now I only play occasionally, for a change. And to be honest, that eye candy helps a great deal with sinking into the game's atmosphere.
        • Uh no. Only those interested in the shiny graphics will buy/download the game, but the real money lies elsewhere, meaning the game has to be good. Just about none of the big hits have superb graphics; they have superb gameplay.

          Unfortunately it's rather difficult to create games like Diablo, Diablo 2 and World of Warcraft every year, and get the players interested.
        • by Moraelin ( 679338 ) on Tuesday July 15, 2008 @12:07PM (#24198847) Journal

          While _some_ people do buy based on screenshots, the blanket generalization is little more than wishful thinking on the part of the publisher. You know, right next to, "people don't mind it if it's released buggy and patched later" and "people don't talk to each other, they only take their information from our marketing department."

          The most visible fly in the ointment is WoW. It has the least detailed graphics of any MMO since, I dunno, 2003 or so. Yes, it actually has fewer polygons and lower-detail textures than some games _older_ than it. Shader effects, bump-mapping, and any kind of shiny stuff are almost non-existent. (Ok, ok, they added weather later.) It also sold like hot cakes.

          EQ2 was launched at roughly the same time as WoW, and tried to have _much_ higher-resolution graphics and a metric buttload of shader effects. You can't even have a freaking armour modelled as just a texture; it has to have a shader that makes it look 3D. It required a 512 MB card just to play with all the details on... at a time when such a card didn't even exist. I don't think it ever managed more than 1/50 the number of players WoW had, and it went slowly downhill from there.

          Interestingly enough, more people complained about EQ2's "sterile graphics" than about WoW's cartoonish ones. (See what Penny Arcade had to say about EQ2's graphics back then, for example.) Turns out that just using insane texture resolutions and polygon counts isn't a substitute for talent, you know?

          City Of Heroes had a _major_ graphics upgrade in Issue 6 (which coincided with the launch of the City Of Villains standalone expansion pack), and the new Villain zones _quadrupled_ even that polygon count on screen. But let's concentrate on the COH side alone, because that was almost the same old game as before, only with a ton of graphical upgrades. Funnily enough, it didn't produce much of a jump in the number of players, and certainly no lasting effect. Anyway, the game peaked at 175,000 players in the USA alone soon after launch, and went gradually downhill from there. The last number I heard, from last year, was 145,000 in all territories combined, including both COV and COH players.

          Basically, high-res, shiny graphics don't seem to do all that much. Sure, it helps if you're not butt ugly. But if you look at the number of subscribers, the effect of insane graphics just isn't there. EQ2 vs WoW: the better game won, not the one requiring a new graphics card. Or compare COH pre-I6 and post-I6: the players just don't rush in because of the graphics.

          Or in the offline game arena, The Sims was launched as a mostly 2D game with 2D sprites (ok, it used primitive low-polycount 3D graphics for the characters), in an age of shiny 3D games. It not only outsold every one of the shiny 3D FPS games from the same year, it outsold them all combined.

          And I'll further guess that Crysis and all those other games presented as "proof" that graphics sell... they probably had some other merits too. A lot fewer people would have bought them if their _only_ merit were the graphics. Games with good, shiny graphics have flopped before.

          • by Molochi ( 555357 ) on Tuesday July 15, 2008 @01:21PM (#24200193)

            This is in support of your argument. Every quarter or so I do the Valve hardware survey that logs our gaming systems' specs so that they can get a handle on what paying customers are using. The top 15 right now are...

            Card                     Users     Share
            NVIDIA GeForce 8800      166,588   9.37 %
            NVIDIA GeForce 7600      101,218   5.70 %
            NVIDIA GeForce 8600       95,619   5.38 %
            NVIDIA GeForce 6600       79,478   4.47 %
            NVIDIA GeForce FX 5200    64,704   3.64 %
            NVIDIA GeForce 7300       59,544   3.35 %
            ATI Radeon 9600           54,727   3.08 %
            ATI Radeon 9200           45,585   2.57 %
            NVIDIA GeForce 7900       44,134   2.48 %
            NVIDIA GeForce 6200       42,834   2.41 %
            ATI Radeon X1950          41,533   2.34 %
            NVIDIA GeForce 6800       40,839   2.30 %
            NVIDIA GeForce4 MX        38,990   2.19 %
            NVIDIA GeForce 7800       36,192   2.04 %
            ATI Radeon X800           35,449   1.99 %

            About 1/3 of the top 15 cards are what the "Oooo Shiny Crysis Crowd" would call obsolete, and frankly the presence of a DX7 card even raises my eyebrow. This is the target audience for a powerful graphics card, but if Valve wants to sell to their customer base they can look at this and think, "Gee, maybe we should make a game that doesn't require a fuckton of curiously high-bandwidth LMNOPRAM, and maybe make a fun game that at least scales down well."

    • by jandrese ( 485 ) <kensama@vt.edu> on Tuesday July 15, 2008 @11:02AM (#24197559) Homepage Journal
      Actually, it's pointless for FPS style games. They'll never use even a GB of that memory effectively because the games are designed around people with 512MB at the high end. The only reason I see to buy this card is maybe there are drivers optimized for professional work where the memory requirements are much higher (3D modelers and the like).
      • by TheEmrys ( 1315483 ) on Tuesday July 15, 2008 @11:13AM (#24197765)
        Depends on the resolution. If you are playing at 2560x1920, with AA and AF at high levels, and texture details set high, you can eat up quite a lot of memory.
        • Re: (Score:2, Informative)

          by VeNoM0619 ( 1058216 )
          IANA3DGFXE (I Am Not A 3D Graphics Expert), but I believe AA (anti-aliasing) is post-processing applied to a scene, as is the 2560x1920 resolution, seeing how it smooths the edges of objects or paints extra objects into the extra pixel space. I don't know what AF is, though.

          But you are right, this is MEMORY: more models/animations = more memory requirements = bigger maps. So yes, this is needed (eventually). Add to that the fact that physics is now being implemented on GPUs (I have only briefly touched the code, which doe
          • by mr_mischief ( 456295 ) on Tuesday July 15, 2008 @11:54AM (#24198621) Journal

            The maps tend to be stored in main system memory. The graphics tend to be stored in graphics memory. You indeed need extra memory capacity, processor speed, and memory bandwidth for some of the post processing. However, resolution is not post-processing. Higher resolution means more pixels. More pixels means more RGB values in memory. More pixels also means more things to post-process. A higher polygon count and more textures can use more video memory, too.

            • by xouumalperxe ( 815707 ) on Tuesday July 15, 2008 @12:14PM (#24198975)
              A framebuffer for a 2560x1600, 32 bits per pixel display (the highest resolution you're likely to find on a monitor that's even remotely reasonable for home use) would take up around 15 MiB. Make it triple-buffered at 64 bpp (for what, exactly, I don't know, but it's a worst-case scenario) and you're still only at 90 megs. Sure, 90 megs is a big chunk of a 512 MiB card, but I seriously doubt it's going to have much impact on a 1 GiB card. It *is*, however, going to hurt -- a lot -- insofar as raw processing power is concerned. To fully use a 2 GiB card, you're either using massively large textures, or some never-before-seen technology, like fully loading map meshes into VRAM and using your card's geometry-transform capabilities to do funky stuff with them. In those terms, I guess I'll buy one of these when Will Wright teams up with John Carmack. :)
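
              A minimal Python sketch to check that arithmetic; the 2560x1600 resolution, 64 bpp, and triple buffering are the figures from the comment above, and the rest is plain arithmetic:

                  def framebuffer_mib(width, height, bits_per_pixel, buffers=1):
                      # raw framebuffer size: one sample per pixel, `buffers` copies
                      return width * height * (bits_per_pixel / 8) * buffers / 2**20

                  print(framebuffer_mib(2560, 1600, 32))             # ~15.6 MiB, single buffer
                  print(framebuffer_mib(2560, 1600, 64, buffers=3))  # ~93.8 MiB, the worst case above
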
              • by Godji ( 957148 ) on Tuesday July 15, 2008 @02:43PM (#24201765) Homepage
                when Will Wright teams up with John Carmack

                Create your very own mindless zombie alien hovering ball of goo that shoots acid thing, and unleash it against the unsuspecting online community! Design your very own oversized ultra new tech nuclear powered futuristic double barreled organic biorocket launcher weapon and fight against hordes of the deadliest community-created horror creatures. Battle with your aim against others' wit in the first ever MMOFPS in history: S I M Q U A K E

                Hm, I was joking but that sounds like something I would play! DAMN YOU Slashdot for giving this idea to a guy who is not the CEO of a game developer house!
            • by Hurricane78 ( 562437 ) <deleted&slashdot,org> on Tuesday July 15, 2008 @12:22PM (#24199147)

              Just that the resolution of the framebuffer and the resolution of the textures are two entirely different things.
              The framebuffer, even at 2048 x 1600 x 48-bit, uses a ridiculous 18.75 MB per frame... out of 2GB? That's nothing.
              The rest of the memory gets used for textures, vertex data, normals, and so on. You have to have color, normal, bump-map, and specular-reflection information just for one texture, then a mipmap of everything. For large textures you can never have enough graphics memory, as long as the chip can render them. Main RAM is useless for this; just try an onboard graphics chip with memory sharing. Huge PITA.
              Shaders are not even worth mentioning in terms of graphics memory. Code is usually the tiniest part.

              Main RAM, on the other hand, holds mainly the world data, sound files, textures that are preloaded but not yet used (think GTA), and other game data, like model data used for game calculations.

              And: Yes, IAIGD (I am a game developer).
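
              A rough Python sketch of that per-texture cost; the color/normal/bump/specular breakdown comes from the comment above, while the 4-bytes-per-texel figure, the ~1/3 mipmap overhead, and the 2048x2048 size are illustrative assumptions:

                  def texture_mib(width, height, bytes_per_texel=4, mipmapped=True):
                      # a full mipmap chain adds about 1/3 on top of the base level
                      base = width * height * bytes_per_texel
                      return base * (4 / 3 if mipmapped else 1) / 2**20

                  # one material = color + normal + bump + specular maps, per the comment above
                  per_material_mib = 4 * texture_mib(2048, 2048)
                  print(per_material_mib)          # ~85 MiB for one 2048x2048 material
                  print(2048 // per_material_mib)  # only ~24 such materials fill a 2 GB card
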

          • by init100 ( 915886 )

            I don't know what AF is, though.

            Anisotropic Filtering?

          • but I believe AA (anti-aliasing) is post-processing applied to a scene
            There are a number of ways to do anti-aliasing, but IIRC the common way is to oversample, that is, render at a higher resolution than will be displayed and then downsample it.

            If you have a 2560x1920 monitor and oversample by 4 times in each direction, you would be rendering at 10240x7680. That would mean over 300 megs just for the output buffer. I'm not sure current cards could handle that at a reasonable framerate anyway, though.

            AFAICT the big thing putting pressure on graphics memory is texture detail: if you double the horizontal and vertical resolution of a texture, you quadruple the memory required to store it. Ideally you want enough memory on your graphics card to hold all the textures the game uses. Texture detail is also something the game developer can fairly easily let the user alter: just design the textures at the highest resolution and allow those with weaker hardware to select downsampled versions.
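
            A minimal Python check of both figures in this comment, assuming 4 bytes per pixel for the output buffer:

                def supersampled_buffer_mb(width, height, factor, bytes_per_pixel=4):
                    # render-target size when oversampling `factor`x in each direction
                    return width * factor * height * factor * bytes_per_pixel / 10**6

                print(supersampled_buffer_mb(2560, 1920, 4))  # ~314.6 MB: "over 300 megs"

                # and the texture point: doubling each dimension quadruples storage
                print(supersampled_buffer_mb(2560, 1920, 2)
                      / supersampled_buffer_mb(2560, 1920, 1))  # 4.0
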

      • And also.... (Score:4, Insightful)

        by DrYak ( 748999 ) on Tuesday July 15, 2008 @11:13AM (#24197775) Homepage

        where the memory requirements are much higher (3D modelers and the like).

        Also medical imaging (especially volumes, like MRI and CT).
        And GPGPU (using Brook+) to perform complex calculations on huge datasets.

      • by kabocox ( 199019 )

        Actually, it's pointless for FPS style games. They'll never use even a GB of that memory effectively because the games are designed around people with 512MB at the high end. The only reason I see to buy this card is maybe there are drivers optimized for professional work where the memory requirements are much higher (3D modelers and the like).

        I always thought the FPS was the genre that was really pushing this gotta-have-a-massive-video-card thing. I couldn't even tell you what my video card is, other than n

      • by default luser ( 529332 ) on Tuesday July 15, 2008 @12:01PM (#24198751) Journal

        Actually, it's pointless for FPS style games. They'll never use even a GB of that memory effectively because the games are designed around people with 512MB at the high end.

        They're only doing this because GDDR3 is much cheaper than the GDDR5 on the 4870. A 2GB 4850 with GDDR3 is cheaper than a 1GB 4870 with GDDR5. Me, I can't see the value of getting a card with more than 1GB, even for future games.

        The only reason I see to buy this card is maybe there are drivers optimized for professional work where the memory requirements are much higher (3D modelers and the like).

        There won't be. This card is marketed as a 4850, not a FireGL, which means it won't be all that useful for professionals. Without the drivers to accelerate professional applications, the extra memory is largely useless.

        • by RedShoeRider ( 658314 ) on Tuesday July 15, 2008 @02:32PM (#24201569)

          "Me, I can't see the value of getting a card with more than 1GB, even for future games."

          Neither can I! Just like I can't see computers ever needing more than 640K of memory.

    • by the_humeister ( 922869 ) on Tuesday July 15, 2008 @11:42AM (#24198393)
      You could always use it as a RAM drive [linuxnews.pl]!
  • Bottlenecks? (Score:5, Insightful)

    by Squapper ( 787068 ) on Tuesday July 15, 2008 @10:49AM (#24197293)
    The article mentions that too little video memory can be a bottleneck. But wouldn't squeezing 2 gigs of memory onto a graphics card simply move the limiting bottleneck elsewhere?
    • Re:Bottlenecks? (Score:5, Insightful)

      by Applekid ( 993327 ) on Tuesday July 15, 2008 @10:56AM (#24197443)

      But wouldn't squeezing 2 gigs of memory onto a graphics card simply move the limiting bottleneck elsewhere?

      Well, sure. No matter how good your gaming rig is, there's always going to be a bottleneck. And if it's an older game that runs at 200 fps at full detail, then the bottleneck is the game itself capping maximum poly/texture counts (i.e., the detail itself).

      But the whole point of having and maintaining a l33t gamer system is to continually shift that bottleneck somewhere else, farther up the performance scale, so you keep getting a better gaming experience with each iteration.

    • Re:Bottlenecks? (Score:5, Interesting)

      by eebra82 ( 907996 ) on Tuesday July 15, 2008 @10:56AM (#24197447) Homepage

      The article mentions that too little video memory can be a bottleneck. But wouldn't squeezing 2 gigs of memory onto a graphics card simply move the limiting bottleneck elsewhere?

      I understand your question, but the whole point is that sometimes a game is sluggish only because there is not enough memory, and not even remotely because of core performance. Today's and tomorrow's games will increasingly utilize these extreme amounts of memory, which ultimately means larger textures and more variety.

      But to answer your question: there's always going to be at least one bottleneck, but by adding more memory they've at least raised the bar a bit. Not that today's games are going to run much faster with this, but upcoming titles will.

      • Not that today's games are going to run much faster with this, but upcoming titles will.

        I'm not entirely sold on that point. I'd imagine developers already try to throw as much rendering information into the card's memory as it will hold, allowing the "overflow" to be stored in system memory, and I'd imagine the driver handles that process transparently anyway (AFAIK). Eliminating that "overflow" could improve performance for most current games just as much as for any newer titles that come to market.

    • Re: (Score:3, Insightful)

      A chain is only as strong as its weakest link. You beef up the weakest link. The chain still has a weakest link, but the overall strength is raised.
    • by qw0ntum ( 831414 )
      Maybe, but you've eliminated this one... There's always going to be a limiting factor somewhere, you've just reduced the effect that too little memory will have.
    • There is always a bottleneck, somewhere.

      If you want to call it that. Otherwise, I call it the weakest link in a chain, which seems more appropriate, because a bottleneck implies a substantial slowdown at a single point along the way, whereas a weak link indicates something that could be improved but is otherwise functional.

      At some point, all the graphic eye candy and having 50K FPS refresh at 8000 x 6000 is pointless. Unless you're playing in a holodeck, that is.

    • Re:Bottlenecks? (Score:5, Insightful)

      by El Gigante de Justic ( 994299 ) on Tuesday July 15, 2008 @11:03AM (#24197579)

      Yes, it could, unless you're running a 64-bit OS and processor. Most computers, which are 32-bit, have a total of 4 GB of addressable memory space, which includes video memory, sound card memory (if you actually still use one), and system RAM. Therefore, if you put in a 2GB video card, you can't make use of more than 2 GB of system RAM.

      The 4GB address limit is probably the best argument for why we should see more progression to 64-bit computing, but there isn't enough demand in the market yet to force the issue for at least a few more years.

      • by mmkkbb ( 816035 )

        Most computers, which are 32-bit, have a total of 4 GB of addressable memory space, which includes video memory, sound card memory (if you actually still use one), and system RAM. Therefore, if you put in a 2GB video card, you can't make use of more than 2 GB of system RAM.

        Why would these devices' memories be mapped directly into system RAM?

        • I think that's just how things are set up. Kind of like how video memory used to be addressed at $A000:0000 on DOS machines: that 64KB block of address space couldn't be used for anything but video access on systems with a VGA card.
        • Re: (Score:3, Informative)

          by Geekner ( 1080577 )
          They are not directly mapped to RAM; this is simply a limitation of 32-bit computing. All devices need addressable memory space, including video RAM, and the total 32-bit limit (4GB) is split among these devices. When a driver accesses a device, it performs a read or write to that device's memory address and the device responds. When the OS runs a process, it is copied into system RAM using the same kind of address.

          Imagine a city with a limited road budget. The industrial areas (devices) have priority over res
        • by Firehed ( 942385 )

          It's just the nature of the architecture. It's why my 32-bit Dell laptop only has 3.5GB of addressable RAM while my MBP has its full 4GB, and why systems with beefy graphics subsystems and 4GB of RAM often only show 3.25GB. This card is well into counterproductive territory on a 32-bit system, for exactly that reason.

    • Couldn't you make that same argument for EVERY component? If your CPU is the biggest bottleneck in your computer, and you replace it with a shiny new one, then something else (your 1 gig of RAM, what have you) will become the biggest bottleneck.

    • by qoncept ( 599709 )
      Sure, but isn't that how performance is improved? Through removing bottlenecks one at a time?
    • by mikael ( 484 )

      Not really. The graphics card has 800 stream processors all running from cached texture memory to cached framebuffer memory (deferred rendering). Instead of simply fetching pixel data directly from texture memory and writing it directly out to framebuffer memory, the graphics card will maintain a texture cache (the current textures being used) and a framebuffer cache (the current area of the framebuffer being rendered). Then, when there are no more pixels to be written, the framebuffer cache is written back

  • by Anonymous Coward on Tuesday July 15, 2008 @10:50AM (#24197317)

    > from the way-too-much-overkill dept.

    AKA, the recursive tautology dept.

  • That's cool but... (Score:3, Informative)

    by The Living Fractal ( 162153 ) <banantarrNO@SPAMhotmail.com> on Tuesday July 15, 2008 @10:56AM (#24197433) Homepage

    The R700 has dual GPUs on a single board, competes very well with nVidia, and here's the really cool part: It has nearly TWO BILLION transistors.

  • Finally (Score:4, Funny)

    by Rinisari ( 521266 ) on Tuesday July 15, 2008 @10:56AM (#24197445) Homepage Journal

    I can finally do the Explode open/close window Compiz effect on my 10 MP display!

  • Huh (Score:5, Insightful)

    by colmore ( 56499 ) on Tuesday July 15, 2008 @10:56AM (#24197451) Journal

    I'm still rockin' 512 megs and doing fine - main system, I mean. Integrated graphics.

    The only reason this kind of thing bothers me a bit is that I imagine it's pushing videogames further and further into the world of 1,000-employee, NASA-sized engineering projects, rather than charming little projects that, say, the husband and wife who founded Sierra could do on their own and still be competitive.

    This kind of reliance on jet-powered hardware all but ensures that games are going to be made by megacorporations working from market research.

    • Game development will always have room for the little guy, as long as he is making fun games.

    • There's nothing that a small group of programmers could do back then that they can't do now. It might be a little harder to stand out amongst the crowd, particularly if the crowd that you're most worried about is the big gaming magazines/websites. But you can still throw together a good game if you've got the time and the talent.

      The tools do seem to lag behind the hardware potential a good bit, but they continue to improve and even individuals who dabble in this sort of stuff as a hobby can have access to s

    • Re: (Score:2, Interesting)

      by erudified ( 958273 )

      I tend to agree with the other poster who mentioned Counterstrike.

      I'll take it a step further, though, and say this: I believe game development by mom & pop shops is about to enter a golden age.

      High quality open source engines like Cube 2 (as well as many others) and a greater emphasis on procedural content generation (I give it a year or two before high quality open source libraries for this are available) will enable small developers to take advantage of these (somewhat insane!) hardware capabilities.

    • by kabocox ( 199019 )

      The only reason this kind of thing bothers me a bit is that I imagine it's pushing videogames further and further into the world of 1,000-employee, NASA-sized engineering projects, rather than charming little projects that, say, the husband and wife who founded Sierra could do on their own and still be competitive.

      Um, come on, games like Jewel Quest will always be more profitable and easier to create than the next Final Fantasy or, heck, the next Mario game. Look at Tetris and solitaire games as the other big examples.

  • Impressive! (Score:5, Funny)

    by cashman73 ( 855518 ) on Tuesday July 15, 2008 @10:57AM (#24197459) Journal
    With graphics cards like this, Duke Nukem Forever will be damn good when/if it comes out! :-)
  • by TK2K ( 834353 ) <the.soul.hack@gmail.com> on Tuesday July 15, 2008 @10:57AM (#24197465)
    Workstation cards have been multi-gigabyte for ages! The ATI FireGL V8650, which was released a while ago, has 2GB.
  • Didn't the SGI Indy (or was it the Onyx) have a 2 GB video card? Glancing over the specs, the SGI Onyx4 could have up to 8 GB of graphics memory. Note that these machines are on the order of a decade old.... Granted, not exactly home gear, but still, this isn't the "World's First 2 GB Graphics Card". So in fine tradition... another thing that UNIX had already done 10 years ago. (Hmm... maybe it was closer to 15....)
  • Market need? (Score:5, Insightful)

    by electrosoccertux ( 874415 ) on Tuesday July 15, 2008 @10:58AM (#24197481)

    Is there any market "Need" for this, to be able to play your games better, or is this simply filling the "uber-leet-most-money-I-can-spend" market?

    • Moving the bar (Score:3, Insightful)

      by Shivetya ( 243324 )

      and I for one am glad to see products like this all the time. While I may not buy them, they do move the bar further, which usually brings the lower-range items down from the stratosphere in pricing.

      I clearly remember people harping about cards with 32MB, or 64, or "oh god, no one will ever need 256".

      Look at how much more resolution today's and tomorrow's displays are bringing us, then turn around and realize how much memory it takes to address all of that.

    • Clearly yes. There are a lot of people out there who are very willing to spend large amounts of cold hard cash to outdo the Joneses, even when their "one-up" doesn't make any sense at all. This product is intended to serve the e-penis market rather than any concrete technical need.

  • But in all honesty, the number of games released each year for the PC that *require* a card like this to run at high settings can be counted on one hand. I'm pretty sure that didn't use to be the case.

    I am not a soothsayer here to predict the death of PC gaming (once again). The PC is still a wonderful platform for development, flexibility, versatility, and complexity when compared to the consoles. Games will always keep coming for the PC, and not just MMOs, but all manner of wonderful things

    • by dave420 ( 699308 )
      That's funny. The PC games industry (which is essentially all Windows gaming) is expanding by billions of dollars per year. Just because it's not running on Linux doesn't mean it doesn't exist :)
  • Wolfenstein (Score:5, Funny)

    by Nerdposeur ( 910128 ) on Tuesday July 15, 2008 @11:05AM (#24197611) Journal
    Finally, I will be able to play Wolfenstein 3D in all its beautiful glory!
  • by Spatial ( 1235392 ) on Tuesday July 15, 2008 @11:06AM (#24197633)
    We're really beginning to feel it now. With this card, you're limited to around 1,750MB of RAM on a 32-bit Windows system: 4GB, minus the 2GB on the card, minus all the other mapped stuff, which amounts to about 250MB on my computer.

    In summary, I for one welcome our new 64-bit overlords...
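
    A minimal Python check of that arithmetic; the 250MB figure for other mapped devices comes from the comment and varies per machine:

        ADDRESS_SPACE_MIB = 4 * 1024  # total 32-bit address space: 4 GiB
        VIDEO_RAM_MIB     = 2 * 1024  # the 2GB card under discussion
        OTHER_MMIO_MIB    = 250       # other mapped devices (figure from the comment)

        # 1798 MiB remain for system RAM, i.e. the "around 1,750MB" above
        print(ADDRESS_SPACE_MIB - VIDEO_RAM_MIB - OTHER_MMIO_MIB)
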
    • Re: (Score:3, Informative)

      by bucky0 ( 229117 )

      Graphics card memory won't normally be addressable with regular CPU opcodes, will it? You have to manually pipe data across the PCI/AGP/PCIe buses to get it to the card. It certainly doesn't sit in process address space.

      • Re: (Score:3, Informative)

        The framebuffer is typically memory-mapped. While it's possible to program a video card purely through indirect DMA and the GPU's command processor, most systems need to map the framebuffer during part of startup, and generally there's no reason to unmap it.
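
        As a rough illustration of a memory-mapped framebuffer, here is a minimal Python sketch against the Linux fbdev interface. The fixed 1024x768, 32 bpp geometry is an assumption (real code would query it with the FBIOGET_VSCREENINFO ioctl), and it needs a readable/writable /dev/fb0:

            import mmap
            import os

            # open and map the first framebuffer device
            fb = os.open("/dev/fb0", os.O_RDWR)
            size = 1024 * 768 * 4            # assumed geometry: 1024x768 at 32 bpp
            buf = mmap.mmap(fb, size)        # VRAM now reads and writes like ordinary RAM
            buf[0:4] = b"\xff\x00\x00\x00"   # top-left pixel, blue in the common BGRX layout
            buf.close()
            os.close(fb)
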

  • Pointless (Score:2, Insightful)

    by Teejaykay ( 1107049 )
    What's the point of 2GB of GDDR3, or even 1GB, in that price segment? Even a 512MB HD 4850 is good enough for the people most likely to buy it (aka people with no fancy, high-resolution widescreen TFT monitors) -- it's certainly good enough to play stuff at 1280x[whatnot]. (Yes, hello, it is I.) In that range, with this card, I'd wager the bottlenecks will just be elsewhere: the CPU, the RAM, heck, maybe the GPU's memory bandwidth. Even if the GPU were the source of the bottleneck, just get a HD 4870 tha
  • Useless.

    As the Tech Report [techreport.com] benchmarks showed some time ago, more than 512MB on any non-workstation graphics card is, at this point in time (and probably for some time to come), simply useless.

    This is just made to prey on those who don't know any better... "OMG 2GB RAM TEH IS FASTUR!!!!"

    • by Shados ( 741919 )

      In some games, if you have a 1680x1050 monitor (fairly standard for 20-22 inch 16:10 LCDs) and you crank texture quality and antialiasing up the wazoo, your card's memory will be the bottleneck. It isn't -too- common, but I've been bitten a few times. 1 gig and up is overkill unless you have one heck of a monitor setup, however (for now).

  • This will be a minimum requirement to run Microsoft Word.
