AMD Demos DirectX 11-Capable ATI Graphics Card

An anonymous reader writes "Today at a press conference in Taiwan, AMD demonstrated the world's first GPU capable of DirectX 11. The demonstration shows the major improvements DirectX 11 brings over DirectX 10, and previews what AMD has in store: a DirectX 11-capable ATI graphics card coming out before the end of 2009. AMD showed three primary features of DirectX 11: a tessellator, which allows for less blocky, more fluid, and more realistic detail; compute shaders, which allow for less restricted programming; and finally, how DX11 is better designed to take advantage of multiple CPU cores."
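
For context, the "compute shaders" piece boils down to a new dispatch path on the Direct3D 11 device context. A minimal host-side sketch (an outline only: device creation, shader compilation against a cs_5_0 target, and buffer setup are omitted, and everything other than the D3D11 calls themselves is a placeholder):

```cpp
// Host-side outline of a DirectX 11 compute-shader dispatch. Sketch only:
// the shader and UAV are assumed to have been created elsewhere.
#include <d3d11.h>

void run_compute_pass(ID3D11DeviceContext* ctx,
                      ID3D11ComputeShader* shader,
                      ID3D11UnorderedAccessView* output) {
    // Bind the compute shader and a writable output buffer (UAV).
    ctx->CSSetShader(shader, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &output, nullptr);

    // Launch 64x1x1 thread groups; the shader's own [numthreads]
    // attribute decides how many threads run per group.
    ctx->Dispatch(64, 1, 1);

    // Unbind the UAV so the graphics pipeline can read the result.
    ID3D11UnorderedAccessView* nullUAV = nullptr;
    ctx->CSSetUnorderedAccessViews(0, 1, &nullUAV, nullptr);
}
```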
This discussion has been archived. No new comments can be posted.


  • Direct X11? (Score:5, Funny)

    by TeknoHog ( 164938 ) on Wednesday June 03, 2009 @02:53PM (#28200107) Homepage Journal
    Or is Microsoft finally catching up with the unix world?
  • by gubers33 ( 1302099 ) on Wednesday June 03, 2009 @02:54PM (#28200127)
    That is the real question. The PS3, for example, has amazing computing speed and a great graphics card, but game programmers have yet to utilize the system to its full potential. I'll be curious to see if the same occurs here.
    • I think not, for some of the same reasons as the PS3: porting. If I use DirectX 9, it's relatively easy to port a game to the Xbox, and it will run on XP too. Same thing for the PS3: if the developers don't do all kinds of fancy stuff and the game doesn't sell well as a PS3 exclusive, they can just change it up, add in a few extras, and release it as a 360 game. I don't know why any game developer, unless they were either MS-owned or heavily invested in by MS, would choose to use an incompatible DirectX version.
    • by TikiTDO ( 759782 )
      I remember seeing some articles where senior Sony execs essentially said the PS3 was made to be complex, so that is not really a good comparison. Microsoft has been pretty good about making DirectX easy to use; I imagine this release will continue the trend.
      • by ifrag ( 984323 )
        I remember an interview with John Carmack where he said that developers would have to "sweat blood" to code for the PS3. He had somewhat more favorable things to say about developing on the Xbox. Now personally I don't really care - I have both systems, and this is not fan-boy ranting or an endorsement of either platform - but Sony really missed the mark by thinking programmers were going to be able to make the base engines use all those cores effectively. Yes, for graphics, going full out parallel has clearly
    • Unlike console gaming, very few graphically intensive PC games are designed to work at a specific quality with a specific frame rate for a specific consumer card. Rather, they're designed to be able to harness power from cards that don't exist at the time of development, and make good use of the features they know of at the time of development.

      On a console, the system you design on is the system your users play on and direct optimization for the platform is both necessary and worthwhile. In a PC gaming sc

    • by nobodylocalhost ( 1343981 ) on Wednesday June 03, 2009 @03:30PM (#28200649)

      I think tessellation will be controllable on the driver side; in that case, you won't need to write specialized code in order to take advantage of it.
      From what I understand, it is basically point-based curve matching using differential calculus - a fundamental change in the way models are rendered. So even for existing games, you would just need to turn on tessellation processing in your graphics card driver, and you should be able to take advantage of it, since it only changes the rendering method; the models themselves and other parameters should remain the same.
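
      The rough idea, as a toy CPU-side sketch (my assumption about the behavior, not AMD's actual implementation - in DX11 proper, tessellation runs on the GPU in new pipeline stages):

      ```cpp
      // Toy sketch of tessellation: a coarse control polygon is refined
      // into a smooth curve. Real DX11 tessellation runs on the GPU.
      #include <cstdio>

      struct Vec2 { float x, y; };

      // Evaluate a quadratic Bezier segment at parameter t (de Casteljau).
      Vec2 bezier(Vec2 a, Vec2 b, Vec2 c, float t) {
          float u = 1.0f - t;
          return { u*u*a.x + 2*u*t*b.x + t*t*c.x,
                   u*u*a.y + 2*u*t*b.y + t*t*c.y };
      }

      int main() {
          // Three control points stand in for one "blocky" model edge.
          Vec2 p0{0, 0}, p1{1, 2}, p2{2, 0};
          const int level = 8;  // tessellation factor: higher = smoother
          for (int i = 0; i <= level; ++i) {
              Vec2 p = bezier(p0, p1, p2, float(i) / level);
              std::printf("(%.3f, %.3f)\n", p.x, p.y);
          }
      }
      ```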

      • Re: (Score:2, Informative)

        by Handlarn ( 911194 )

        That would produce some bad looking results, as the driver wouldn't know the difference between a model that is intentionally polygonal and one that is not.

        • by Creepy ( 93888 )

          Not necessarily - you could define an object using some form of CSG and have the tessellation tool create the actual geometry (which is pretty much what CAD software does today). It won't be as simple as the grandparent suggests, though - it will require some sort of primitive input, unless it is designed to work with some sort of level-of-detail scheme (an area in which I do know Microsoft has patents).

          A very simple instance of CSG is a sphere, which can be defined with a point and a radius. You could then do so
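
          To illustrate the sphere example (a hypothetical sketch; a real tessellator would also emit triangle indices and derive the stack/slice counts from a level-of-detail metric):

          ```cpp
          // Hypothetical sketch: expanding a CSG-style sphere (center + radius)
          // into triangle-mesh vertices, the way a tessellator might. Duplicate
          // pole vertices and index generation are glossed over.
          #include <cmath>
          #include <cstdio>
          #include <vector>

          struct Vec3 { float x, y, z; };

          std::vector<Vec3> tessellate_sphere(Vec3 c, float r, int stacks, int slices) {
              std::vector<Vec3> verts;
              const float PI = 3.14159265f;
              for (int i = 0; i <= stacks; ++i) {
                  float phi = PI * i / stacks;            // latitude, 0..pi
                  for (int j = 0; j < slices; ++j) {
                      float theta = 2 * PI * j / slices;  // longitude, 0..2pi
                      verts.push_back({ c.x + r * std::sin(phi) * std::cos(theta),
                                        c.y + r * std::cos(phi),
                                        c.z + r * std::sin(phi) * std::sin(theta) });
                  }
              }
              return verts;  // raising stacks/slices is the level-of-detail knob
          }

          int main() {
              std::vector<Vec3> v = tessellate_sphere({0, 0, 0}, 1.0f, 8, 16);
              std::printf("%zu vertices\n", v.size());
          }
          ```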

    • You can utilise the (agreed) good performance of the PS3, as witnessed by the number of eggheads who have hacked them to serve as cheap supercomputing clusters (see /. posts ad nauseam).

      I'd personally rephrase your comment more along the lines of "is it financially viable?"
      Of course, the PS3 is a notorious horror to code for, but the other factor - market share - should be up there too.
      Naturally, the two are related.

      AMD & DirectX 11 - sounds like a similar Pyrrhic victory...

      Now if only OpenGL etc. had t

  • Oblig (Score:1, Funny)

    by delta419 ( 1227406 )
    But will it run Vista?
  • Linux drivers? (Score:2, Insightful)

    by anjilslaire ( 968692 )
    Yes, but will it run with all of the bells and whistles (sans DirectX, of course) on Linux? Will they have solid drivers available on release?
    • When has ATI ever had solid Linux drivers?
      • Re:Linux drivers? (Score:4, Insightful)

        by Darkness404 ( 1287218 ) on Wednesday June 03, 2009 @03:07PM (#28200317)
        When has ATI had solid anything drivers? Even the Windows drivers cause BSoDs for no apparent reason.
        • You're not kidding. What's truly amazing is that it's *utterly* random.

          Sometimes I'm running WOW, playing a DVD, and have 5-6 browser windows open and my computer's solid as a rock for days. Sometimes, I've got nothing but a single browser window, and bam-- "Vista has detected your graphics driver has crashed."

          The good news is that Vista can recover from it 9 times out of 10. Even without crashing WOW, which is pretty impressive.

        • I've heard this a lot, but I can't say it matches my experiences.
          3 years on a laptop with Win2000 and a very old ATi chip (7xxx series): no video-caused crashes.
          2 years on a laptop with WinXP and an old-ish ATi chip (9xxx series): no video-caused crashes.
          6 months on a Vista laptop with (now outdated) Mobility 200M: no video-related crashes, EVEN WITH BETA DRIVERS (for comparison, nVidia's Vista drivers weren't what I would call release quality until more than 6 months after Vista's release. ATi was there ov

          When has ATI had solid anything drivers? Even the Windows drivers cause BSoDs for no apparent reason.

          Very true. The only positive thing I have to say about ATI drivers is that they are a shitload better than Nvidia drivers. But that is like being smarter than the brain-damaged kid at the back of the bus.

          • I've had very few ATI driver crashes myself. I have ONE game that, every third patch or so, I can depend on to lock up the drivers if I do things just right (a very particular sequence, easily avoided), and even then Vista resets them after a bit.
            Nvidia, on the other hand - I've failed to get hardware that works long enough to comment on the drivers since they started making chipsets for something more advanced than plain old PCI.
            4 different systems and 5 different cards (6xxx and
          • Sadly, I have to concur. Things have gotten pretty bad in both camps these days. Particularly when it comes to older title support.

        • Their drivers are fine. That's the first thing AMD fixed after acquiring ATI.

          On the contrary,

          When has Nvidia had solid anything drivers. Even the Windows drivers cause BSoDs for no apparent reason.

          See how easy that was?

        • by dave420 ( 699308 )
          Not on Vista, fortunately. They would, but Vista can intercept the failed driver and restart it. It can even do that mid-game, which is quite impressive (to me at least :)). It's true, however, that their drivers are awfully shitty, from my experience anyway.
      • by moon3 ( 1530265 )
        That might also be the Linux community's fault. Linux never provided a solid driver development kit. Look at Microsoft and their Windows WDK; I can only praise Microsoft for providing such superior tools and resources to driver developers. Support developers, give them the best tools, and they will follow.
        • Re: (Score:3, Insightful)

          That might also be the Linux community's fault. Linux never provided a solid driver development kit.

          Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.

          • Re:Linux drivers? (Score:5, Informative)

            by Kjella ( 173770 ) on Wednesday June 03, 2009 @03:53PM (#28201041) Homepage

            Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.

            nVidia basically overrode the lower third or so of X11 (it's a big function pointer table) and wrote their own implementation; ATI did the same, except with less success. AMD/Intel are now trying to build a proper open source stack: the graphics execution manager (GEM) for memory management, kernel mode setting (KMS) for flicker-free boots, a low-level state tracking framework called Gallium3D to expose modern shaders, a better direct rendering interface (DRI2), redirected direct rendering (RDR), and various other improvements - but you're talking about things only 1-2 years old. nVidia has succeeded, yes, but for most intents and purposes they wrote the whole thing themselves. There's a reason it's a sore point for open source fanatics: it's not merely a blob addon, it basically ripped out a whole chunk of open source, said "not good enough", and replaced it with their own blob.

            • ATI did the same except with less success.

              Which was entirely my point. That ATI is still to this day unable to release a half-decent driver for Linux is their own fault since nVIDIA was clearly able to do so.

            • Re: (Score:3, Interesting)

              There's a reason it's a sore point for open source fanatics: it's not merely a blob addon, it basically ripped out a whole chunk of open source, said "not good enough", and replaced it with their own blob.

              No one gives a shit if Nvidia said the open source part was "not good enough" - we give a shit because what Nvidia replaced it with is broken and can't be fixed. I wasted over $600 on two top-end Nvidia cards due to their supposed "great Linux support", only to have them fail to work with my high-end monitor because of an extremely simple TMDS configuration bug in their driver.

              When I jumped through all the hoops of their ultimately bullshit support on a freakin webforum and gave them all the debug output they

      • by Ultra64 ( 318705 )
        The driver quality has improved noticeably since they were purchased by AMD.
        Also, the open source drivers are progressing nicely. http://xorg.freedesktop.org/wiki/RadeonFeature [freedesktop.org]
  • So what - another update to DirectX and another batch of video cards that support it. Or partially support some of the features, or support 100% of the key features but not some others, or some variation on that. Blah blah blah.
    • by dave420 ( 699308 )
      DirectX 10 got away from that. For a card to be DirectX 10 compatible, it has to support the full feature set, which ensures different cards produce the same output.
  • Realistic 3D CGI porn. Of course.

    • 3D CGI Porn (Score:3, Insightful)

      by Petersko ( 564140 )
      "Realistic 3D CGI porn. Of course."

      I guess that's for people who find it's just too creepy to have actual porn actresses in their downloaded mpg's... watching them... laughing at them... judging them...

      With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.

      WOO.... hoo?
      • No, it's for people who are all like "yea baby, oh, touch yourself, yea more of that, NO NO DON'T LICK THE TITTY! who told you that licking your own titty is sexy? It's not, so stop that. God if I were the director, I would have slapped you for that. Now look what you did, you killed my boner."

        With computer CGI porn, no actress will lick her own titties ever again.

        • by Kjella ( 173770 )

          Hmm yeah, in addition to unrealistic depictions of what having sex is like and standards of beauty, let's top that off with some mindreading ability too. I wonder if any of that could make it hard connecting with real girls that need to be actually pleasured, look average instead of bombshell and have a mind of their own. No wonder realdolls sell, if they could make them semi-intelligent sex robots too they should be just the thing...

          • My girlfriend may not be able to read my mind, but she does know what I'm thinking when I tell her what I'm thinking. Porn lacks that ability. :-P

            Whoops, this is slashdot! Replace "girlfriend" with "Fleshlight" or something.

          • by adolf ( 21054 )
            It's far simpler than that: It can even have a textmode interface:

            Welcome to Pornmaker 9000. Please enter a number from 1 to 9 for each question:

            1. Guys?
            2. Girls?
            3. Other?
            4. All of the above?
            5. Scat?
            6. Tongue?
            7. Latex?
            8. Blonde?
            9. Redhead?
            10. Romantic?
            11. Anal?
            12. Fisting?
            13. Anal fisting?
            14. Animals?
            15. Furries?
            16. Tits?
            17. POV?
            18. Multicam?
            19. Enema?
            20. Shish?
            21. Gag?
            22. Tattoos?
            23. Shower?
            24. Golden shower?
            25. Old?
            26. Young?
            27. Fat?
            28. Siblings?
            29. Twins?
            30. Squirt?

            [bzzt.] [qrrrr.] [wrrrrrrrrgh.]

            Tha

      • by geekoid ( 135745 )

        And they do what you want without having to pay them.
        Well, anything someone on a screen could do and say.

      • With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.

        Say what you will about it being impersonal. CGI is the only way I could afford to complete my masterpiece: Lord of the Cock Rings: The Battle of Purple Helm's Deep Penetration.

    • by sharkey ( 16670 )
      You kids with your 'realism', and '3D' pr0n. Why, in my day, all we had was ASCII pr0n, running at 5 seconds per frame. And we liked it!
      • 5 seconds per frame? My first modem was 2400 baud-- 240 characters a second, or about 0.12 fps on a standard 80*24 terminal.
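
        The arithmetic checks out if you assume 10 bits per character (8 data bits plus start/stop):

        ```cpp
        // Checking the parent's numbers: 2400 baud at 10 bits per character.
        #include <cstdio>

        int main() {
            double chars_per_sec = 2400.0 / 10.0;  // 240 characters/second
            double chars_per_frame = 80.0 * 24.0;  // 1920 characters per screen
            std::printf("%.3f fps\n", chars_per_sec / chars_per_frame);  // ~0.125
        }
        ```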

  • So... (Score:2, Insightful)

    by eexaa ( 1252378 )
    ...so they are shipping real drivers with ATI cards? Great!

    (In fact, I hope that they finally do something about this. I've been forced to avoid any ATI hardware for over 5 years now, just because of driver incompatibilities. It's just sad.)
  • by ADRA ( 37398 ) on Wednesday June 03, 2009 @03:24PM (#28200531)

    DirectX 11 isn't in XP, hence nobody cares. You'll have the, what, 30% market segment with Vista, and maybe 10% of those are regular gamers who will be using this.

    This will just encourage the further brokenness that Windows is turning the PC gaming platform into. Good job!

    PS: Before everyone jumps in to say that everyone will jump to Win7, I think you're mistaken. The only way Microsoft will kill XP for most existing users would be to introduce a critical bug that they choose not to fix. I played with Win7 for a few days and can safely say that it doesn't add anything I've ever wanted to use for which a trivial Google search wouldn't find an as-good or better alternative. And maybe it's just me, but pretty much every single UI 'enhancement' since circa Win2k has been a step backwards in terms of -my- productivity.

    It's lucky that I'm Linux-competent, since Fedora/Gnome makes practically everything I need easy and uncluttered. If the barrier to entry were a little lower, I could see mass-exodus potential as XP users take an honest look at what they -really- want to update to.

    • Dude, that wasn't a PS. That was practically a P on its own merits.

      Not an FP, but a P none the less. A troll P, at that.

      Troll's pee!

      (Sorry ... long day.)

    • I'm pretty sure that it's not just you who feels this way about Windows 7, but that's mainly because this site is full of people who hate Microsoft for the sake of hating Microsoft. Amongst the general population Windows 7 is gonna own. Personal opinion, to be sure, but historically I'm pretty good at judging hype on its own merits.

    • The only reason XP has hung on so long is that it took so long for Vista to come out. Everyone was running XP and has continued to do so because Vista was expensive, it had problems until SP1 came out, and ancient XP machines couldn't run it. No one skipped from Win2k to Vista. Tons of people will be skipping from XP to 7. Just wait and see. New machines will have 7. Old machines will be replaced with machines running 7. Gamers running XP will switch to 7.
    • by geekoid ( 135745 )

      Yes, the bug is called "Not having DirectX 11"

      There will never be a mass exodus to Linux as long as corporations keep ties to an OS and developers write code to take advantage of specific items in that OS.

      If applications were self contained like they should be, then the OS wouldn't matter nearly as much.

    • Your frame of mind seems to be pretty common around here: "nothing I can see" in Vista/Win7. I might even have agreed with that statement not all that long ago. Yet if you follow some of the tech notes related to Vista/Win7 - i.e., the actual underpinnings of the system, not just the candy-coated UI - there's quite a lot of interesting stuff going on under the hood. Mark Russinovich's blogs since he was hired at Microsoft have been particularly enlightening - even more so than the stuff he used to post frequently on sy
  • by Anonymous Coward

    From the article:

    Lastly, DX11 is better designed to take advantage of multiple CPU cores. This should allow developers to offload some of the work on to the processors that are typically there not doing as much work, freeing up the GPU to do the more important processing and rendering.

    Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that newer GPUs are allowed to offload their computation back to the CPU again.
    This is further evidence that the CPU/GPU divide is being eliminated, and that there will likely be no such distinction among processors in the near future.

    • "Now it appears that that newer GPUs are allowed to offload their computation back to the CPU again."

      This is more an artifact of programmers not knowing how to utilize the extra cores of modern processors than it is really "offloading"; it's more like taking full advantage of the CPU. GPU cards and drivers have always shared the load between CPU and GPU; the fact of the matter is that with many-core CPUs, many programmers haven't learned to utilize them effectively.

      Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that newer GPUs are allowed to offload their computation back to the CPU again.

      No, that is not what's going on. The GPUs have become so fast at doing their jobs that the CPU can't feed them fast enough. That's where the new features in DX11 will help. It will make it possible to efficiently use multiple threads to feed the GPUs. This has been an issue in DX10 and earlier.
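
      Concretely, DX11 adds deferred contexts for this: worker threads record draw commands into command lists, and the single immediate context replays them. A minimal sketch (the scene-recording function is a placeholder and error handling is omitted; link against d3d11.lib):

      ```cpp
      // Sketch of DX11 multithreaded rendering with deferred contexts.
      // Worker threads record commands; only the immediate context touches
      // the GPU. record_scene_chunk is a placeholder for real draw calls.
      #include <d3d11.h>
      #include <thread>
      #include <vector>

      void record_scene_chunk(ID3D11DeviceContext* ctx, ID3D11CommandList** out) {
          // ... set state and issue draw calls on this deferred context ...
          ctx->FinishCommandList(FALSE, out);  // package work into a command list
      }

      int main() {
          ID3D11Device* dev = nullptr;
          ID3D11DeviceContext* immediate = nullptr;
          if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                       0, nullptr, 0, D3D11_SDK_VERSION,
                                       &dev, nullptr, &immediate)))
              return 1;

          const int workers = 4;
          std::vector<ID3D11DeviceContext*> deferred(workers);
          std::vector<ID3D11CommandList*> lists(workers);
          std::vector<std::thread> threads;

          // Each worker records into its own deferred context in parallel.
          for (int i = 0; i < workers; ++i) {
              dev->CreateDeferredContext(0, &deferred[i]);
              threads.emplace_back(record_scene_chunk, deferred[i], &lists[i]);
          }
          for (auto& t : threads) t.join();

          // Replay all recorded command lists on the immediate context.
          for (int i = 0; i < workers; ++i) {
              immediate->ExecuteCommandList(lists[i], FALSE);
              lists[i]->Release();
              deferred[i]->Release();
          }
          immediate->Release();
          dev->Release();
      }
      ```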

  • by nycguy ( 892403 ) on Wednesday June 03, 2009 @03:48PM (#28200955)
    Will Duke Nukem Forever [3drealms.com] wait to take advantage of DirectX 11?
    • No.

    • Re: (Score:3, Funny)

      by ichigo 2.0 ( 900288 )
      I thought 3drealms going bankrupt would end the DNF meme. Guess I was wrong.
      • Well yeah. Its developers going bankrupt had no effect on the chances of DNF being released. It's equally likely that it will be finished in 2011 as it was to be finished in 2008. So, you know, anything (equal to nothing) could happen!

  • I can now play my favorite game of all time with decent performance: 3DMark

  • Come on, this is the depth of comprehension that the author has about what tessellation is?

    One of the technologies in DirectX 11 is something called tessellator.

    Tessellator allows for more smoother, less blocky, and more organic looking objects in games. Anti-aliasing shouldn't be confused with this, as AA does a descent job at smoothing out sharp edges but tessellator actually makes it look more fluid and frankly much more realistic. Tessellator makes things look more "rounded" instead of chunky and blocky. Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.

    Tech Fragments is an appropriate name for the site, I guess, seeing as they can't even get the tense of the word right.

  • From TFA (in reference to the tesselator in DirectX 11):

    Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.

    Yeah, I'm sure turning on tessellation won't cause any performance hit at all.

    Tech Fragments has the most sensationalist writers ever.

  • will it run Lin^H^H^HWindows XP? :P
  • This one goes to 11.

  • I, for one, am happy to see Nvidia on the run. I've seen what they will try to do ($649 for a GTX 280 card based on aging GDDR3 memory technology) when they think that they rule the roost. Go ATI!
  • Excuse me, but didn't DirectX 10.1 also provide for a tessellator?

    And isn't this the reason there never was an Nvidia 10.1 card, while ATI ran it just fine?
