Graphics | AMD | Technology

AMD's New Flagship HD 6990 Tested

Posted by CmdrTaco
from the pixels-rot-the-brain dept.
I.M.O.G. writes "Today AMD officially introduces their newest flagship GPU, the Radeon HD 6990. Targeted to counter Nvidia's current-generation flagship, the GTX 580, it is AMD's follow-up to their previous-generation dual-GPU, single-PCB design, the Radeon HD 5970, and represents the strongest entry AMD will release in their 6000-series graphics lineup. Initial testing and overclocking results are being published at first-tier review sites now. As eloquently stated by Tweaktown's Anthony Garreffa, the 6990 'punches all other GPUs in the nuts.'"

  • and your wallet too!

    $700. ouch.

    Hey, if you've got the money to play, lucky you. I'm envious.

    • It's about exactly twice as much as the budget I've set aside for building my next PC...
      • It's about exactly twice as much as the budget I've set aside for building my next PC...

        Last computers I built were budgeted at $500 a piece... Whole new systems - motherboard, CPU, RAM, HDD, power supply, LCD monitor, keyboard, mouse, all of it. Came in just slightly over after I was done with shipping and handling and whatnot...

        We're still using those computers, too. And we do a good amount of gaming. Obviously I can't crank all the settings up as far as they'll go... But I have yet to see a game that didn't play just fine on this machine.

    • by Anonymous Coward

      I've got the money and am in the market for a new graphics card, but I just don't see the point in a card like this. 450W draw is fucking retarded and currently there just aren't games out there that will legitimately make use of a card like this. And in a year or two when the games actually exist, you'll be able to buy a card that can keep up with this one for significantly less money and probably a lower TDP.

      • I admit I haven't kept up to date on the latest PC games. I was a WoWaholic for a long time, and I've been spending most of my gaming time with my PS3 recently. my GTX 470 has had a nice break the past few months. That said, in just about every generation of games and hardware, there's at least one game that still runs like crap even on the best current hardware when you push all the settings to ultra and run it on a 30inch monitor, be it mass effect, crysis, or what have you. Don't we have one of those thi

        • Doesn't have to be a new game. Older games have been having major issues with new cards, mostly because of DirectX8 incompatibilities. My pet game, FFXI, was utterly broken on GeForce 400s until the latest driver release in January, and is still sub-par on many ATI cards. Nothing like 5 FPS on an 8 year old game on a $200 card to make you angry!
        • by AnonGCB (1398517)

          That's called a 'Bad port' and game makers are /SLOWLY/ changing to actually focus on PC development.

          I remember when GTAIV came out for PC and it crippled everything by being the worst optimized game I've ever seen.

            • pretty sure crysis was ported -to- consoles, not from them, although I could be wrong. Mass effect was released on the xbox first, but since the 360 is running directx 9 I believe, it can't have been that much of a port job.

            as a side point, PC ports of console games often get a bad rep for performance compared to their console brethren. The PC version usually has a lot more eye candy in my experience. Aside from the convenience of using a mouse, I can't play the 360 version of mass effect nor the ps3 version

      • by hairyfeet (841228)
        If you are in the market and want cheap hotness head over to Tigerdirect as they have a refurb HD4850 for just $60! My GF just ordered one for me "to support my inner geek" as she put it and the reviews on the card are awesome. When that card came out it was $200 and it is still listed as an enthusiast card, with 800 stream processors and a 256 bit wide memory path. The only thing is you will either need a PSU with the PCIe connection, or have two molex free for an adapter. But for the price you really can't
    • and your wallet too!

      $700. ouch.

      Hey, if you've got the money to play, lucky you. I'm envious.

      Yep.

      I was always kind of amazed at these prices... I'd build an entire computer for $700, and then somebody would come along and tell me how they had two of these $700 video cards in their machine.

      I mean... If you've got the money, go for it. But I just can't see justifying $1400 in video cards alone. Especially when we're talking about the consumer-grade gaming cards. Are a few more frames per second in Crysis really worth $1400 to you?

      • true that.

        Had I that kind of money to waste on a computer, I'm pretty sure I'd be much better off in the long run spending that money on something like a single upper mainstream (say, gtx 570) card, a 30" monitor, and two solid state drives in raid 0, but that's just me.

        Come to think of it, anybody who spent $1400 on two of these probably has the 30" monitor and the two SSDs as well.

      • by chill (34294)

        Much more fun to interrupt their proxy-penis waving with a few well placed headshots.

        "Dual $700 cards, huh? How come you still suck?" *BOOM* headshot

    • Maybe it's more because I'm not a serious gamer, but I've been looking at a bank of these cards for GPU computing. More computing power, less space, faster transfer across cards rather than network, and less overhead costs associated with buying more systems. We bought the Radeon HD 5970 & GeForce GTX 580 two weeks ago for comparisons. This week will tell us which platform is better for our needs since we need to move from theory to reality.

      For home use, I'd keep using the GeForce 8800 GTX, but I'
  • 375+ watts. That's more than my whole computer. Oddly enough I have plenty of headroom in my power supply and it only requires a single slot so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.

    • Re: (Score:2, Informative)

      by Anonymous Coward

      I was ATI-only from 2000-2009. Thought the same thing about their drivers. Then I went Nvidia again because of my dislike and... nope, no difference.

      If you're planning on Linux though then yeah, Nvidia is obviously much better. ATI-on-Linux will make you want to hang yourself with a sock.

      • by billcopc (196330)

        Yeah, NVidia's driver quality has taken a nose-dive in recent years. At least ATI is no better or worse than they've ever been.

      • by Beelzebud (1361137) on Tuesday March 08, 2011 @01:48PM (#35420868)
        Since AMD took over ATI, their drivers have massively improved, even in Linux.
        • by hedwards (940851)

          And they're releasing source as their legal department signs off on it. So at least in theory they should be better integrated at some point in the future.

        • by Hatta (162192)

          This has not been my experience. I bought an HD4350 for an HTPC build. Open source drivers flicker every 10-20 seconds, that's unwatchable. The closed source drivers are another story entirely. They installed fine, and judging from the Xorg logs they were working. Except that *nothing* was displayed on the screen. No errors, no warnings, X correctly determined which displays were connected. Yet all I got was empty blackness.

          The card is completely useless in Linux. I can't see myself ever buying an A

          • I'm not a fan of ATI's drivers in general (I use the open source r600g at the moment and it's working fine for me), but obviously "empty blackness" is not the norm - not even for the closed source drivers.

            Another possibly related thing: "Empty blackness" would be what you get with X nowadays before any window manager or similar is started (did you try clicking somewhere if you were using twm?). I can see why they removed the pretty horrible patterned background that was the default earlier, but there are
            • by Hatta (162192)

              Here [archlinux.org] is the cry for help I issued on the Arch forums. I didn't really expect any help, since the Catalyst drivers were exiled to AUR. For good reason it seems.

              Anyways, if you see any clues I missed, or if you know of a better place to ask for help, please let me know.

          • I didn't say they were perfect now, just that they were massively improved, which they are. I'm using an HD5770 and the drivers from the past two months have been awesome. They even fixed the tearing issues with the Tear Free feature in the control panel.
        • by drinkypoo (153816)

          Since AMD took over ATI, their drivers have massively improved, even in Linux.

          This has not been my experience. For instance, they have abandoned R690 chipset already. No fully working graphics drivers newer than Vista. I think my next CPU will be intel. My current video card is from nVidia and everything I hoped it would be. I've owned several Radeons and all were total nightmares. I've never, repeat NEVER had a problem with an nVidia card that wasn't solved with a driver update. That's just because I'm lucky with hardware I guess, but the ATI problems weren't hardware ones.

          When my R

      • I disagree. I was running a pair of 4850's in crossfire for almost 2 years. There was a bug that would make the mouse cursor icon go all corrupted [tinyurl.com] that they never bothered to fix despite knowing it was there. I switched to a pair of nvidia 460 GTX's in SLI and haven't had a problem since.
      • by MarkRose (820682)

        I recently bought a new motherboard, and the Nvidia drivers wouldn't work in Linux. Not at all. It would practically lock the system with interrupts (it would pin one CPU core dealing with them). I even tried a completely different Nvidia card, working in another system. Same thing.

        I then bought an AMD card. And it just worked.

        AMD/ATI is worth looking at now.

    • 375+ watts. That's more than my whole computer. Oddly enough I have plenty of headroom in my power supply and it only requires a single slot so if I felt the need to punch myself in the nuts by loading drivers written by ATI onto my computer, I could slap it right in there.

      Holy hell. I've only got a 500w PSU in my box... I don't think I could even run one of those.

  • by Stele (9443) on Tuesday March 08, 2011 @01:08PM (#35420354) Homepage

    I don't know - the card is certainly fast, but when all you can do to beat your competition's single-GPU card is to stick two of your slower GPUs on it, it just feels hollow to me. All Nvidia has to do is come back with a $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?

    • by div_2n (525075)

      You say that as if it's a trivial thing to do.

    • by click2005 (921437) *

      All Nvidia has to do is come back with a $800 card with two 580s on it to decimate AMD's nuts in return. Is this *really* all that amazing?

      That's the 590. It's out in a week or two.

      It makes me laugh that most sites reviewed it on a single screen system, most at 1080p. Most of the current top-end cards can easily do modern games at maximum detail even on 30" screens. These kinds of cards are only really worth it for multi-monitor gaming. The problem is 3 x 30" screens start to fill that 2GB of video memory quite quickly.

      I hope the 7990 has better memory use. Use HyperTransport or some kind of NUMA setup and let the GPUs access all the memory.

      • by Khyber (864651)

        The memory on a GPU card is typically MUCH faster than the system memory.

      • Why does screen size matter? Or did you actually find something that big that wasn't still 1080p?
        • I wouldn't buy a monitor bigger than 24" that only supported 1080p.

          TV sure.

          Computer monitor no.

          • by gstoddart (321705)

            I wouldn't buy a monitor bigger than 24" that only supported 1080p.

            I can only guess at what something like that would cost and where you'd buy it.

            I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.

            • by Anubis350 (772791)

              I wouldn't buy a monitor bigger than 24" that only supported 1080p.

              I can only guess at what something like that would cost and where you'd buy it.

              I've never seen such a beast. That's not to say they don't exist, but it seems a fairly exotic thing.

              Exotic? Really?

              On the consumer/normal workstation end of things off the top of my head you've got the Dell U2711, IPS, res. 2560x1440 [dell.com] (list 1k, but frequently on sale for ~$700) plus Apple's *only* display, in the same price range with essentially the same panel (glossy though, and LED backlight).

              On the true high end Eizo, NEC, and others make even better displays. Not to mention that with slightly lower DPI you can get the same 2560x1440 resolution on nearly every 30" computer monitor made in the la

              • by hedwards (940851)

                You're frequently better off getting a second screen than going larger than 24". Hell even with a 20" screen it's likely better to get a second one. Unfortunately, most systems don't seem to handle multiple monitors very well. Meaning that if I'm playing a game on my primary screen, the OS doesn't know to put a screen saver on the other and restrict the mouse to the game screen. I haven't seen any WMs that handle that well, regardless of OS. And don't get me started with times when the screens aren't the sa

                • by Raenex (947668)

                  You're frequently better off getting a second screen than going larger than 24". Hell even with a 20" screen it's likely better to get a second one.

                  Better in what sense? I've never liked working with multiple monitors, but I really like working with my 24" widescreen. No annoying gap in the middle, fewer wires, more desktop space, and no interface issues like you described.

            • I've got an i-inc (rebranded Hanns-G) 28" 1920x1200 on my desk that cost me $250 from compusa/tiger direct?

              I really do prefer the 16x10 ratio for computer monitors. The thing about 1080p is that the vertical resolution on it is really about the same as the 19" crt I had on my desk 10 years ago, it's just wider.

              I will agree that getting to 2560x1600 does seem to take a big paycheck though.

            • I found six with "Recommended Resolution" of 2560 x 1600 from $1-3k.

              http://www.newegg.com/Product/ProductList.aspx?Submit=Property&Subcategory=20&PropertyCodeValue=1099%3A25153 [newegg.com]
              • by gstoddart (321705)

                I found six with "Recommended Resolution" of 2560 x 1600 from $1-3k

                Don't know about you, but I consider a "$1-3k" monitor to be exotic and pricey.

                Sure, they sound absolutely awesome, but you're talking about more money than I'd be willing to spend on a computer.

                Definitely 'niche' market kinda stuff.

        • by Tr3vin (1220548)
          He is going off the sort of standard resolutions for monitors of that size. Once you get past 24", most decent monitors increase in resolution past 1920x1200 or 1920x1080. A good 30" monitor is normally 2560x1600. This of course assumes you are buying a computer monitor and not a TV.
        • I assume he was referring to the 30" lcd's that run at 2560x1600 resolution... which are awesome for the record

          • by gstoddart (321705)

            I assume he was referring to the 30" lcd's that run at 2560x1600 resolution... which are awesome for the record

            *drools on keyboard*

            Wow! Seriously, wow! How much does something like that cost? This seems like you're way beyond a gaming rig here -- and, if you're really talking about running 2 or 3 of these for a gaming machine (like some people are), well, then I strongly suspect you don't really care what your video card(s) cost. You've already spent a small fortune on monitors.

            • Yeah, I can't imagine having 3 of these... 1 takes up a good amount of desk real estate as it is, I got my Dell 3008 refurbished for about $1200 I think a year and a half ago with full warranty, etc.

            • by hedwards (940851)

              Except that monitors tend to last a lot longer than video cards do. Even during the 90s, the monitors would typically outlast several generations worth of videocards. It was a bit less one sided when the LCDs first came out, but at this point there's little point for most people to upgrade again if they buy a quality monitor.

      • by gstoddart (321705)

        It makes me laugh that most sites reviewed it on a single screen system, most at 1080p. Most of the current top-end cards can easily do modern games at maximum detail even on 30" screens.

        If they're both at 1080p, then the size of the screen doesn't matter, does it? It doesn't take more memory if the pixels are bigger but the same in number.

        Or, are you talking about running at resolutions higher than 1920x1080? I didn't think you could easily get monitors at much higher resolution.

        • 3 screens would be (3 * 1920) * 1080. So technically running at a much higher resolution.
          • by (H)elix1 (231155) *

            This is a handy chart for figuring out the number of pixels. When you start getting into the larger 16:10 monitors, you really need a lot of horsepower. Add in three large monitors...

            2400 x 600  = 1,440,000 pixels | Triple 4:3
            1680 x 1050 = 1,764,000 pixels | Single 16:10
            1600 x 1200 = 1,920,000 pixels | Single 4:3
            1920 x 1080 = 2,073,600 pixels | Single 16:9
            1920 x 1200 = 2,304,000 pixels | Single 16:10
            3072 x 768  = 2,359,296 pixels | Triple 4:3
            3840 x 720  = 2,764,800 pixels | Triple 16:9
            3840 x 800  = 3,
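            To reproduce the arithmetic, here is a minimal sketch in Python. The layouts and pixel counts are the ones from the chart above, with triple setups assumed to be three identical panels side by side; the triple-30" row is an extra entry added because that setup keeps coming up in this thread:

            # Pixel counts for single- and triple-monitor layouts.
            # Triple setups assume three identical panels side by side (width * 3).
            layouts = {
                "Triple 4:3 (3 x 800x600)": (800 * 3, 600),
                "Single 16:10 (1680x1050)": (1680, 1050),
                "Single 4:3 (1600x1200)": (1600, 1200),
                "Single 16:9 (1920x1080)": (1920, 1080),
                "Single 16:10 (1920x1200)": (1920, 1200),
                "Triple 4:3 (3 x 1024x768)": (1024 * 3, 768),
                "Triple 16:9 (3 x 1280x720)": (1280 * 3, 720),
                "Triple 30-inch (3 x 2560x1600)": (2560 * 3, 1600),
            }

            for name, (w, h) in sorted(layouts.items(), key=lambda kv: kv[1][0] * kv[1][1]):
                print(f"{name:32s} {w} x {h} = {w * h:>10,} pixels")

            Three 30" panels work out to about 12.3 million pixels, roughly six times a single 1080p screen, which is why frame buffer size and fill rate start to matter at those resolutions.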

        • by Sique (173459)

          I have one at 1920x1200 which was on sale for 159 €. A 24" at 1920x1200 currently sells for around 225 € here.
          It's not easy, though, to get a monitor with a decent ratio (4:3 or 5:4). Ironically it's cheaper to buy a 24" at 1920x1200 than a 20" at 1600x1200, even though the latter has about the same dpi and fewer pixels.

    • by Nemyst (1383049)

      AMD's offerings usually have lower power consumption and heat generation. While I'm sure nVidia could come up with something, they'd probably have a hard time using the 580 as a basis, because it runs so hot already. I mean, the 6970 consumes a whole ~140W less than the 580 (!), yet they still had to notch it down so it fit in the standard and add that clever switch. AMD's current offerings are just far more power efficient than nVidia's, which means they'd need to underclock their dual-GPU card more than A

    • by billcopc (196330)

      The big problem with dual 580s is peak power draw would be around 750w, just for the GPUs. They would have to make certain sacrifices to fit any reasonable power envelope.

      If these GPUs keep sucking more and more power, they will have to start seriously considering making them external. You'll have your PC, a GPU box beside it with its own kilowatt power supply, and just an interface board and cable between the two. There is simply no sense in cramming more heat and power into the PC chassis, just to play

    • by hairyfeet (841228)

      Actually it is a damned smart thing to do and here is why: If you look up the talks that AMD released right after they bought ATI one of the things they stressed is how they would change development by instead of sinking huge amounts of R&D into the "ePeen card" and then having to figure out how to selectively cripple it to fit the lower markets they would instead focus on the mid market chip where the vast majority of sales are and then simply add memory, a bigger pipe, a second GPU, etc to ramp UP to the e

  • why? (Score:3, Insightful)

    by Charliemopps (1157495) on Tuesday March 08, 2011 @01:14PM (#35420420)
    My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?
    • Exactly! Mod parent up. Right now the video game market is being driven largely by the consoles that have video cards from ten years ago. There's really not much to max out an ePenis card like this.
      • by TeknoHog (164938)
        IMHO, the only sensible use for these monster GPUs is parallel computing (OpenCL etc). For many problems they are the best bang per buck, as well as per power consumption. It seems that the HD5000 series maintains the lead in this sense; for example the HD6990 has fewer stream units than the HD5970, and the extra texture units are not generally used in computing.
        • Your opinion needs a little more humble in it. The only use you can see and the "only sensible use" have absolutely nothing in common. You just aren't that important.

        • That's exactly the use for my GTX460 and a Radeon 6850 that I have in my BOINC computer. This machine alone does several times the computations that I used to do on 3 desktops and one laptop before.
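          For anyone curious what "using the GPU for computing" looks like in code, here is a minimal OpenCL sketch. It assumes the pyopencl bindings and a working OpenCL runtime, and nothing in it is specific to any particular card; real BOINC-style clients ship their own tuned kernels, so treat this only as an illustration of the programming model.

          import numpy as np
          import pyopencl as cl

          # List every OpenCL device the runtime can see (CPUs, GPUs, ...).
          for platform in cl.get_platforms():
              for dev in platform.get_devices():
                  print(dev.name, dev.max_compute_units, "compute units,",
                        dev.global_mem_size // (1024 ** 2), "MB")

          # Minimal GPGPU example: element-wise addition of two large vectors.
          ctx = cl.create_some_context()   # picks a device (asks if there are several)
          queue = cl.CommandQueue(ctx)

          a = np.random.rand(10_000_000).astype(np.float32)
          b = np.random.rand(10_000_000).astype(np.float32)

          mf = cl.mem_flags
          a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
          b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
          out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

          prg = cl.Program(ctx, """
          __kernel void vadd(__global const float *a,
                             __global const float *b,
                             __global float *out) {
              int i = get_global_id(0);
              out[i] = a[i] + b[i];
          }
          """).build()

          prg.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

          out = np.empty_like(a)
          cl.enqueue_copy(queue, out, out_buf)
          assert np.allclose(out, a + b)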

      • Exactly! Mod parent up. Right now the video game market is being driven largely by the consoles that have video cards from ten years ago. There's really not much to max out an ePenis card like this.

        It isn't about some specific need per se; overclocking and tuning is a hobby, an expensive and not always such a smart hobby, but nevertheless there's some even worse hobbies in the world. It just happens to be fun to see how far you can push your PC, how much more you can squeeze. Is it useful? No. Does anyone need such power for anything? Not really, at least home users don't. And games simply have trouble taking advantage of it all even as it is. It STILL is fun.

        That said I personally would not buy such h

        • It isn't about some specific need per se; overclocking and tuning is a hobby, an expensive and not always such a smart hobby, but nevertheless there's some even worse hobbies in the world.

          As an example from another of my hobbies, the price of this card would get you halfway towards a top-of-the-line set of headphones... not counting of course the top-of-the-line amp to go with it... which put together are WAY cheaper than an equivalent speaker setup... which in turn is WAY cheaper than an offshore boat... which is of course way cheaper than manipulating the world financial markets for shits and giggles.

          That last hobby scares me.

          • by gstoddart (321705)

            As an example from another of my hobbies, the price of this card would get you halfway towards a top-of-the-line set of headphones... not counting of course the top-of-the-line amp to go with it... which put together are WAY cheaper than an equivalent speaker setup

            And, of course, the rest of us are convinced you're daft to spend that much money on a set of headphones.

            You may actually be able to hear the difference, or at least believe you can. To most of us, it seems like you're spending several times more

            • I can hear the difference between my old Sennheiser HD650 headphones and my new Beyerdynamic DT880s. I can hear the difference between my wife's HD580 headphones and my HD650s. I don't see any reason to suspect that I wouldn't be able to hear a difference between the HD650s / DT880s and a set of HD800s or Tesla T1s.

              That said, you are correct that there are diminishing returns for your money, but that's true of any hobby.

              Say you've got an old 1991 5.0 mustang. completely stock, it's going to probably run a h

              • by gstoddart (321705)

                I don't see any reason to suspect that I wouldn't be able to hear a difference between the HD650s / DT880s and a set of HD800s or Tesla T1s.

                I believe intellectually someone might be able to distinguish from one of those alphabet soup things which might impress me if I knew (or cared) what it was.

                However, as a practical matter, I just find it unlikely that everybody who claims to have such golden ears actually does. It's just hard not to believe that there's a bunch of people who have shelled out crazy amou

                • Oh I certainly don't have golden ears. That's why I'm confused that people can't hear the difference between quality headphones and not-so-quality.

                  I'm honest and right up front:
                  I can't hear the difference between LAME 160kbps and FLAC.
                  I can't hear the difference between audio cables (I really want to, but I can't)
                  I can't hear the difference between 24-bit audio and 16-bit audio

                  I can easily hear the difference between different headphones. I've got 3 sets of headphones right now, Sennheiser HD650s, Beyerdynami

      • Well I would imagine you are not running your $150 card at 5760 x 1200 (across three 24" monitors) with 4x AA and 16x AF now, are you?

        There IS a market for this performance, and granted it may not include you, but some people are more than able to bring cards with these specs to their knees.

        As for console ports, granted there are quite a few, but I seriously doubt my GeForce 3 Ti500 (2001) could have run any of today's games.

        • And exactly how many titles support this please? I'm going to guess very few. Frankly, I wish it was the other way around, but for now it appears that this is just bragging rights.
          • by asto21 (1797450)
            You don't seem to get the point? If there is even ONE title that supports such a configuration and if I want to play that title in such a fashion, I would need a graphics card like this. Yes?
            • I think you and I disagree on the definition of the word "need".

              IMHO, it's not worth it, or a "need" since it's far, far outside my normal usage, or most of the people I'm familiar with.

              Unfortunately, this is a consumer electronics component, so they need more than a very few fanatic people with 3 monitor setups to sell these cards. However, in recent years the number of titles that support these extremes has grown less, shrinking the pool of people who could potentially be interested. We've g
      • There's a difference between being able to play a game and running the game on ultra high settings. My laptop can run any game on the market right now, but I wouldn't say it would be the most pleasant experience and it certainly wouldn't be at anything more than medium to medium-low settings. Some people like the new shiny that PC games offer. While I (and apparently you) don't think it's worth the extra money just to be able to run the latest Crisis expansion across three monitors with the graphics up t

        • I've got a $200 video card that appears to run everything on the ultra settings, including the original Crysis. That being said, even the reviewers are forced to run the same 5-6 titles again and again because there are so few titles that really stress video cards anymore. So why pay $500-1500 for less than a half dozen titles?
          • I've got a $200 video card that appears to run everything on the ultra settings, including the original Crysis. That being said, even the reviewers are forced to run the same 5-6 titles again and again because there are so few titles that really stress video cards anymore. So why pay $500-1500 for less than a half dozen titles?

            You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250. Those cards being the AMD Radeon HD 6950 [newegg.com] and nVidia GTX 560 Ti [newegg.com].

            • You don't have to pay $500-1500... the low end cards in this generation sell for as low as $250.

              I was referring to the mythical three monitors + video card setup. As I stated earlier, I've got a $200 video card that I'm very happy with.

    • Triple display + 3D. You need > 120fps with at least 3 times the resolution that your monitor has. And if you consider the cost of such a setup, $700 is a reasonable proportion of the cost. Not saying it's a good use of money, just saying there are systems that can use this power.
    • My $150 card I bought a year ago can play every game on the market right now. Why do I need a $700 card?

      Hell, the $150 card I bought about two years ago still works fine.

      Obviously I can't crank all the settings up as high as they'll go... But I have yet to run into a game that doesn't run well.

      Just finished playing through Dead Space 2 - it ran fine and looked great.

      • by Anonymous Coward

        {{cite}}

        I want to know what $150 card from two years ago plays your current games at max settings.

    • by Kjella (173770)

      Wait, I think I heard the exact same thing yesterday in the Intel Extreme CPU comments. Why? Because you can. This is luxury, like drinking a $70 wine over a $15 wine; nobody needs to do it, but it's a way to spoil yourself. It's not necessary to be able to crank the quality settings all the way up to enjoy a game, but if you can afford it, it's that little extra.

    • by Anonymous Coward

      Because mine does it with higher quality settings and with more than enough FPS to not have any slowdowns during high intensity moments, while yours doesn't.

    • by Atzanteol (99067)

      You don't. It also won't help you with the kids on your lawn, understanding "rock and roll" or with your shouting at a cloud.

      p.s. I read your post in Grandpa Simpson's voice.

    • Why do I need a $700 card?

      For the same reason that they're selling it: It'll arm you for your next dick-wagging contest.

      OTOH, as the price drops the mid-range cards will get cheaper too, so I can't complain...

  • "the 6990 'punches all other GPUs in the nuts." ...and steals your wallet at the same time. Aside from the epeen factor, realistically which currently available games require such a hardware. AFAIK, all the currently released games (e.g. Bulletstorm) run comfortably on the Nvidia 260, 280 cards at the highest settings (1920x1080 resolution) So a simple question, why bother...
  • by wisebabo (638845) on Tuesday March 08, 2011 @02:34PM (#35421472) Journal

    Whatever happened to VR? (Virtual Reality) A decade or two ago, it seemed to be (short of direct neural interfaces) where user interfaces were heading. I even remember going to a Disney mini-theme park where they had some true VR rides (you wore a tracking headset) so that you could ride Aladdin's carpet.

    Back then it seemed as if the main thing keeping this technology back was the room-sized SGI supercomputer required to render a reasonable scene in real time. I remember a presentation by the CEO of SGI saying that all they needed to get to was 60M triangles/sec, then VR would achieve mass appeal. (Then again, he also dismissed delivering video from computers by saying computers wouldn't become video "jukeboxes" so maybe he wasn't so good at predicting the future.) Anyway, I don't know the latest specs but I'm sure a modern video card could blow away one of those old SGI "Reality Engines".

    So why aren't we all wearing goggles (and wearing spandex) and looking like the characters in "The Lawnmower Man"? Is it because micro-displays never got good enough? Or something else?

    • Two major problems in order.

      1. Socially unacceptable. Not a technical issue, but a social/psychological one. It's hard to interact with friends in a home where everyone decides to blind themselves from reality. Ironic, I know.

      2. The HUD visors or helmets were (still are?) exceedingly expensive due to the tiny LCDs spec'd at SVGA and XGA resolutions. Proper marketing and economies of scale could resolve this.

        The HUD visors or helmets were (still are?) exceedingly expensive due to the tiny LCDs spec'd at SVGA and XGA resolutions.

        Apple ships millions of phones with 3.5", 326 ppi screens that iSuppli estimates to cost $28.50 each. Maybe they underestimated, so let's say a pair would cost $80, which is still in the price range of a cool video game add-on like the Kinect.

    • by 19061969 (939279)

      Quoth: "So why aren't we all wearing goggles (and wearing spandex) and looking like the characters in "The Lawnmower Man"? Is it because micro-displays never got good enough? Or something else?"

      Because Apple haven't released a product with it causing all competitors to shit themselves?
