DX10 - How Far Have We Come?

MojoKid writes "When DirectX 10 was first introduced to the market by graphics manufacturers and subsequently supported by Windows Vista, it was generally understood that adoption by game developers was going to be more of a slow migration than a quick flip of a switch. That said, nearly a year later, the question is how far have we come? An article at the HotHardware site showcases many of the most popular DX10-capable game engines, like Bioshock, World In Conflict, Call of Juarez, Lost Planet, and Company of Heroes, and features current image quality comparisons versus DX9 modes with each. The article also details performance levels across many of the more popular graphics cards, from both the mid-range and high-end." PC Perspective has a similar look at DX10 performance.
  • DX9 looks better? (Score:5, Insightful)

    by Chemisor ( 97276 ) on Thursday October 04, 2007 @04:10PM (#20858303)
    Am I the only one who finds the DX9 version of the pictures more appealing? With the exception of the Bioshock fog examples (which had sharp boundaries in DX9), they just look more "natural" to me.
    • Re: (Score:2, Informative)

      The fog looks bad because it was not designed around DX9.
      I can't remember seeing visuals look as bad as those did, and even where glitches occur the action happens so fast it's not noticeable.

      (one exception: in Half-Life 2, the frosted glass doors had a glitch near the edges of the screen; nothing major, but it ruined the effect)
      • The real joke (Score:5, Insightful)

        by Moryath ( 553296 ) on Thursday October 04, 2007 @05:12PM (#20859225)
        The real joke is that neither DX9 nor DX10 is inherently "better", any more than the original Glide API was inherently "better" than DirectX or OpenGL. Hardware has changed constantly to give "better" responses to this call or that call, but inevitably you have to write a driver that converts the OpenGL or DX9/DX10 or whatever into something your card understands.

        In the really old days, you had people actually coding for the card on hand. This is why there are a gazillion different releases of Mechwarrior 2, each of which varies greatly in image quality and features - each had to be hand-tuned to the card.

        If Bioshock had been intended for DX9, it would probably look the same as that DX10 shot on DX9. They'd have figured out what they needed to do, perhaps coded a few "If ATI, do this; if NVidia, do this; if Intel Extreme, fail with 'your video card is too crappy to play this game'" decisions for specific hardware, and that would have been that. Since it was backported (and MS would have thrown a fit had there been "no difference"), they had to do a more slapdash job of it.
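
        (To make the idea concrete, here is a minimal C++ sketch of that per-vendor branching. The PCI vendor IDs are real; the render paths and the fallback are invented for illustration, not taken from any shipping engine.)

            #include <cstdint>
            #include <cstdio>

            // PCI vendor IDs are real; the render paths are purely illustrative
            // of the per-vendor hand-tuning described above.
            enum class RenderPath { NvidiaTuned, AtiTuned, Unsupported };

            RenderPath pickRenderPath(std::uint16_t pciVendorId) {
                switch (pciVendorId) {
                    case 0x10DE: return RenderPath::NvidiaTuned; // NVIDIA
                    case 0x1002: return RenderPath::AtiTuned;    // ATI/AMD
                    default:     return RenderPath::Unsupported; // "too crappy"
                }
            }

            int main() {
                std::printf("path = %d\n",
                            static_cast<int>(pickRenderPath(0x10DE)));
            }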

        Then again, if not for the emphasis on ridiculous graphics, think about how many games would be able to use their processing power for some seriously wicked AI. Even Bioshock only has half-decent AI that can be twigged to and predicted fairly easily - you know that a wrench guy is going to rush you, you know that the spider splicers will hang from the ceiling and lob stuff all day till you burn or freeze them, you know where the Houdinis are going to land long before the animation starts, merely because you can figure out what the AI tree says for them to do in what radius... it's sad.

        Hell, you can predict the precise spot on the health bar where they'll run for the health station, and if you're smart you trapped that thing half an hour ago. Now you get to watch as four of them all kill themselves on the same damn one, never paying attention to the 3 dead bodies on the floor that obviously just gassed themselves using a booby-trapped station.

        But never mind. I know the reason they want graphics over AI - the same fucking morons who could never defeat a decently programmed AI (hell, they have trouble getting through Halo on NORMAL) drool over thinking that they can see the shoelaces on Madden's boot.
        • Re:The real joke (Score:5, Insightful)

          by Mikachu ( 972457 ) <burke...jeremiahj@@@gmail...com> on Thursday October 04, 2007 @05:30PM (#20859445) Homepage
          Hold on for a second here. The graphics are usually very GPU-intensive, but the CPU is generally not overworked by them at all. If they wanted to write good AI, they could do so without sacrificing any graphics quality.
          • Re: (Score:2, Interesting)

            by Anonymous Coward
            From what I can tell, the primary limiting factor in game AI isn't even hardware related; it's designer manhours. A good AI is one which makes use of a lot of different behaviors and has good rules for applying them. It's not really difficult for the computer to handle that, but it does take a lot of time for the designers to plan and implement all of the behaviors and rules. And if they really want to trick players into thinking the game is intelligent, they can incorporate scripted behavior in certain sit
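
            (A minimal C++ sketch of that "behaviors plus rules for applying them" structure, as a first-match rule table; every field, threshold, and behavior name here is invented for illustration.)

                #include <cstdio>
                #include <functional>
                #include <string>
                #include <vector>

                // Hypothetical world state; fields and thresholds are made up.
                struct Situation {
                    float healthFraction;
                    float distanceToPlayer;
                    bool  playerVisible;
                };

                struct Rule {
                    std::function<bool(const Situation&)> applies;
                    std::string behavior;
                };

                // Many behaviors plus rules for applying them: first match wins.
                std::string selectBehavior(const std::vector<Rule>& rules,
                                           const Situation& s) {
                    for (const auto& r : rules)
                        if (r.applies(s)) return r.behavior;
                    return "idle";
                }

                int main() {
                    std::vector<Rule> rules = {
                        {[](const Situation& s){ return s.healthFraction < 0.2f; },
                         "flee to health station"},
                        {[](const Situation& s){ return s.playerVisible &&
                                                        s.distanceToPlayer < 5.f; },
                         "melee rush"},
                        {[](const Situation& s){ return s.playerVisible; },
                         "take cover and shoot"},
                    };
                    Situation s{0.9f, 3.f, true};
                    std::printf("%s\n", selectBehavior(rules, s).c_str()); // melee rush
                }
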
            • Exactly.

              The computer resources are irrelevant. Graphics and AI both take man-hours to make, so this is a cost/management issue, rather than a technical one.

              You have a certain budget. Do you hire more graphics artists and graphics programmers, or AI designers and AI programmers?
        • Re: ai (Score:5, Insightful)

          by Anonymous Coward on Thursday October 04, 2007 @06:26PM (#20860121)
          As someone who writes AI for text-based games, let me clear up some misconceptions.

          First, the goal of "AI" isn't always to be as smart as possible. Often, the goal is to make something believable and/or of the appropriate difficulty level. It's possible that Bioshock missed the mark there, but I haven't played Bioshock yet, so I don't know.

          I can write "AI" that will kick your ass every time, even without cheating. (Mobs have the advantage of being on home turf, and they outnumber you.) But that's not fun for the player, so I don't do it. Instead, I'll write something with a pattern you have to figure out. Once you learn one of the ways to beat it, the mob will be easy for you, and it's time to move on to the next area. Very few mobs get the full "try to survive at all cost" treatment, and even fewer are programmed to actually learn from your behavior.
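
          (A C++ sketch of such a learnable pattern: a mob cycling through fixed phases, vulnerable only in one of them. The phase names and durations are invented.)

              #include <cstdio>

              // A deliberately learnable pattern: the mob cycles through fixed
              // phases and is only vulnerable during Recover. Names and
              // durations are invented.
              enum class Phase { Telegraph, Attack, Recover };

              struct Mob {
                  Phase phase = Phase::Telegraph;
                  int   ticksInPhase = 0;

                  void tick() {
                      static const int duration[] = {2, 1, 3}; // ticks per phase
                      if (++ticksInPhase >= duration[static_cast<int>(phase)]) {
                          ticksInPhase = 0;
                          phase = static_cast<Phase>(
                              (static_cast<int>(phase) + 1) % 3);
                      }
                  }
                  bool vulnerable() const { return phase == Phase::Recover; }
              };

              int main() {
                  Mob m;
                  for (int t = 0; t < 12; ++t) {
                      std::printf("t=%2d phase=%d vulnerable=%d\n",
                                  t, static_cast<int>(m.phase), m.vulnerable());
                      m.tick();
                  }
              }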

          You're describing the classic "I wish this mob would keep getting harder" remorse, but think about it: would it really make sense for those mobs to learn from your new tactics? Are they supposed to be smart, or are they just supposed to be an obstacle?

          As for your dead bodies example: would you really prefer an infinite standoff as the mobs decide it's not worth getting killed, so they go hide somewhere with their own traps and wait for you to attack? Right... so get over it. If games were realistic, you would really die on level 1.
          • Re: ai (Score:5, Insightful)

            by Chemisor ( 97276 ) on Thursday October 04, 2007 @06:49PM (#20860425)
            > I can write "AI" that will kick your ass every time, even without cheating.
            > (Mobs have the advantage of being on home turf, and they outnumber you.)

            You are assuming that the mob would just sit there and wait for the player, like it usually does in pretty much every game. In reality, a "level" would not necessarily know that Gordon Freeman is on his way. Neither will they have the patience to sit in their assigned ambush places, waiting for him all day long. A better AI would actually "live" in the environment where it is placed, so that it would react to the player instead of waiting for him. It would also be fun to watch. In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.
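
            (A C++ sketch of that "living in the environment" idea: an NPC that follows its own routine and only switches to engaging the player once it actually perceives him. Ranges, timings, and activities are all invented.)

                #include <cmath>
                #include <cstdio>

                // An NPC that "lives" in the level rather than waiting in ambush:
                // it follows its own routine and only reacts when it actually
                // perceives the player.
                struct Vec2 { float x, y; };

                float dist(Vec2 a, Vec2 b) { return std::hypot(a.x - b.x, a.y - b.y); }

                struct Npc {
                    Vec2 pos;
                    const char* activity = "wander";
                    int routineClock = 0;

                    void update(Vec2 player, bool playerMakingNoise) {
                        const float sightRange = 8.f, hearingRange = 15.f;
                        float d = dist(pos, player);
                        if (d < sightRange || (playerMakingNoise && d < hearingRange)) {
                            activity = "engage player"; // reaction, not a scripted ambush
                            return;
                        }
                        // Otherwise carry on with its own life.
                        static const char* routine[] = {"feed on headcrab", "wander", "rest"};
                        activity = routine[(routineClock++ / 10) % 3];
                    }
                };

                int main() {
                    Npc bullsquid{{0.f, 0.f}};
                    bullsquid.update({20.f, 0.f}, false); // player far away
                    std::printf("far: %s\n", bullsquid.activity);
                    bullsquid.update({5.f, 0.f}, false);  // player wanders close
                    std::printf("near: %s\n", bullsquid.activity);
                }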

            > would it really make sense for those mobs to learn from your new tactics?
            > Are they supposed to be smart, or are they just supposed to be an obstacle?

            If the AI was smart, you wouldn't need a mob. You would only need a few individuals. It would be like a multiplayer deathmatch, and, judging from the popularity of those, would likely be more fun than the current mob situation.

            > As for your dead bodies example: would you really prefer to have an infinite standoff
            > as the mobs decide it's not worth getting killed, so they go hide somewhere with their
            > own traps and wait for you to attack?

            An infinite standoff will only happen if the game designer makes you kill off the entire mob before setting off some stupid trigger to open some stupid door. Don't program artificial obstacles and the player will be able to ignore the hiding mob and go on, just like in real life.
            • by Knara ( 9377 )

              In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.

              For all the guff that Tabula Rasa is getting, this is one of the things that (to me) made the world seem more dynamic and lived-in. The worlds you play on are active battlefields with reasonably intelligent good and evil mobs that are jockeying for tactical and strategic advantages. The "bad" mobs arrive in dropships in actual squads of various types, and will patrol/hunt through areas for "good" mobs (including the player). A lot of people don't seem to like TR very much, but this was a great idea, to me

              • by MORB ( 793798 )
                I've seen that in the tutorial part of the game, yes.

                However, after that it was back to the beaten-to-death formula: an elevation-map based terrain with the very occasional cave, and mobs spawning at random and standing there, attacking you when you get within some fixed radius of them.

                The only slightly original thing is that instead of just popping into existence out of thin air, they have this (bad-looking) animation where a ship shows up and the guys drop from it.

                Add the also beaten-to-death "go collect
            • Re: (Score:3, Interesting)

              by SCHecklerX ( 229973 )

              You are assuming that the mob would just sit there and wait for the player, like it usually does in pretty much every game. In reality, a "level" would not necessarily know that Gordon Freeman is on his way. Neither will they have the patience to sit in their assigned ambush places, waiting for him all day long. A better AI would actually "live" in the environment where it is placed, so that it would react to the player instead of waiting for him. It would also be fun to watch. In Half-Life I really enjoyed

          • by mcvos ( 645701 )

            First, the goal of "AI" isn't always to be as smart as possible. Often, the goal is to make something believable and/or of the appropriate difficulty level.

            This depends a lot on the kind of game you're talking about. For FPS, you're right, but that's not nearly the most demanding kind of game for AI. So far, no company has been able to write a turn-based strategy game where the AI comes even close to being a challenge for a good player. There, the goal is still to make AI as strong as possible, and wi

        • I don't think that they did an intentionally crappy job of the backport; I suspect that it was more of a "We need it to run under DX9, but this is your deadline for making it happen" deal, where the programmers weren't given the time to write software versions of the effects that DX10 does directly but DX9 doesn't do as readily.
        • Re: (Score:2, Insightful)

          by pyrbrand ( 939860 )

          I don't think you get it. There's a reason people aren't writing assembly any more, and there's also a reason they wrote in assembly instead of 1's and 0's. Yes, it's technically possible to write everything you write in C++ in binary - heck, the compiler and linker pump that out for you - but the point is to be more productive and make fewer errors so that you can get more done in the limited time you have. In that sense, it's way better to have modern programming tools, since you can finish up the graphics

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          If Bioshock had been intended for DX9, it would probably look the same as that DX10 shot on DX9.
          I can tell you for a fact that Bioshock was coded for DX9 and forward ported for DX10 in the last months of development. Features weren't sacrificed to go to DX9, they were simply added when the DX10 rendering path was made. Do check your facts before making disparaging comments...
    • by b100dian ( 771163 ) on Thursday October 04, 2007 @04:18PM (#20858447) Homepage Journal
      If you look at "Company of Heroes - Image quality", the first "grass effects" comparison shows an octagonal wheel.
      I mean, 2007! and we still have octagonal circles!!

      I think that the "realism" isn't worth it. Go out and create DX7 games that are fun :P !! (or OpenGL games that don't require many extensions ;)
      • by WillAffleckUW ( 858324 ) on Thursday October 04, 2007 @04:22PM (#20858515) Homepage Journal
        I think that the "realism" isn't worth it. Go out and create DX7 games that are fun :P !! (or OpenGL games that don't require many extensions ;)

        Oh, come on, everyone will buy the PS3 because it has better graphics than the Wii .... um, hello?
      • by Poromenos1 ( 830658 ) on Thursday October 04, 2007 @04:25PM (#20858583) Homepage
        TF2 is the best looking game I've seen in a while, and it's not realistic. That's a big part of why it's so much fun. I've died numerous times from standing around looking at stuff :(
      • If you look at "Company of Heroes - Image quality", the first "grass effects" comparison shows an octagonal wheel.

        I mean, 2007! and we still have octagonal circles!!

        I don't know what you're talking about but that's one funny line.

      • by bipbop ( 1144919 ) on Thursday October 04, 2007 @04:31PM (#20858685)
        "I mean, 2007! and we still have octogonal circles!!"

        Talk about reinventing the wheel!
      • All wheels are octagonal; you just don't have good enough vision to notice.

      • by Fred_A ( 10934 ) <fred@ f r e dshome.org> on Thursday October 04, 2007 @05:42PM (#20859619) Homepage

        I mean, 2007! and we still have octagonal circles!!
        All circles are octagonal, for large values of eight.

      • You perfectly summed up the point I wanted to make.

        I think that the "realism" isn't worth it. Go out and create DX7 games that are fun

        Back when DX7 was new, people thought it was a stupid waste of time... and that 'fun' games would be better.

        This same thought will be put out here in 10 years when people complain about all of the emphasis on eye-candy in DX22.

        Graphics need to move forward just as much as the rest of the game does. DX10 isn't a problem. In fact, in 3 years it will appear quaint.
      • If you look at "Company of Heroes - Image quality", the first "grass effects" comparison shows an octagonal wheel.

        To be fair, the viewport is VERY zoomed-in for the purpose of that screenshot.

        Keep in mind that this is an RTS, not an FPS. There are rarely any reasons to zoom in that close when you are actually playing. Take a look at the first and the third screenshots above, the "green" and the "brown" ones; that's what you look at in-game. Notice the overall amount of detail. Now, is a perfect wheel on a jeep important? You won't see it... unless you play in a crazy resolution with 1600 pixels vertical. Even then you

    • Re: (Score:3, Insightful)

      by fbjon ( 692006 )
      Yes, a lot of the differences also seemed to be unrelated to DX10/DX9 differences. The soft fog and some lighting effects are really the only features of significance I could see.
      • Re: (Score:3, Insightful)

        by QuietYou ( 629140 )
        When you consider the FPS differences in the tests, it's not a fair comparison of the abilities of each API.

        On one test the DX9 version was running at 110 fps and the DX10 version at 30 fps. The DX10 version damn well better have higher image quality if it takes nearly 4 times as long to render a scene. Push the DX9 version further by throwing more polygons and more complex shaders at it until you reach the performance of the DX10 version, THEN do a comparison. You'll find that there is preciou
        • They did that.
          Well, they reduced the detail of the DX10 version to the level of DX9 (as the other way isn't possible), and that was only possible in one game (the others use different code paths).

          And see, NVIDIA cards are about 10-15% faster doing the same under DX10...
    • No, you're not alone. DX10 just sucks, period. The article concluded thus, and I concur emphatically.
    • In some cases the DX9 versions are better; in others, the DX10 versions are. Overall, the difference is minimal, except for "Call of Juarez", which uses a completely different set of textures and settings, so it's an apples-to-oranges comparison.

      Image quality: about the same, slightly different in both cases.
      Performance: usually twice as good for DX9, in some cases over 5x better.

      I would call neither version "more appealing" in general, albeit I admit that in a couple of cases DX10 had fewer artifacts. Yet, that
    • by suv4x4 ( 956391 ) on Thursday October 04, 2007 @05:09PM (#20859195)
      Am I the only one who finds the DX9 version of the pictures more appealing? With the exception of the Bioshock fog examples (which had sharp boundaries in DX9), they just look more "natural" to me.

      Some did, some didn't.

      You gotta understand that DX10 can do absolutely everything DX9 can, so if the DX10 image looks less natural, it's more of a human flaw than a technological one: it's a new area, and people are only starting to discover what works best, both devs and designers.

      Also, I imagine they fine-tuned the DX9 version more, since the majority of people out there have DX9 cards. DX10 cards are barely out there; developers probably don't even have a good selection of DX10 cards yet to test everything thoroughly.

      The only thing that worries me is that DX10 shows up slower in the benchmarks. DX10 was promised to have better performance than DX9, but don't forget all of the reviewed games use different code paths for DX10, and thus load more effects and use higher-precision buffers/processes in the DX10 versions. So while DX10 may be faster, it's not a fair comparison when DX10 is loaded with twice the texture sizes and effects of the DX9 version.

      We'll need a more objective test written to use the same elements in DX9 and 10 and compare that.
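
      (A C++ sketch of what such a test might measure: average frame time over the same scene pushed through each code path. The render callback is a stand-in; nothing here is a real DirectX call.)

          #include <chrono>
          #include <cstdio>
          #include <functional>

          // Render the *same* scene through each code path and compare average
          // frame time. The callback is a stand-in for a real renderer.
          double avgFrameMs(const std::function<void()>& renderFrame, int frames) {
              auto start = std::chrono::steady_clock::now();
              for (int i = 0; i < frames; ++i) renderFrame();
              std::chrono::duration<double, std::milli> total =
                  std::chrono::steady_clock::now() - start;
              return total.count() / frames;
          }

          int main() {
              auto fakeRender = []{ /* identical scene, one API per build */ };
              std::printf("avg frame: %.3f ms\n", avgFrameMs(fakeRender, 1000));
          }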

      One way or the other, DX10 is the future. Even if the first few generations suck, the new features show lots of promise that will come to fruition in the coming years. DX10 has no choice but to become great. If you don't want to get burned, just don't buy a DX10 card YET; it's the worst moment to do so.

      Wait at least until there's a DX10.1 card out there with a good price and reviews (DX10.1 will come with Vista SP1). I don't expect this before Q3-Q4 2008 (which is great, since Microsoft will have fixed lots of things in Vista by then, and third parties will have better drivers and hardware for Vista).

      • by Shados ( 741919 )
        For the performance part, it's easy to use DX9's past as an example: remember the GeForce FX line of cards? That was totally ridiculous. The same damn thing is happening with DX10: I bet you it's the video card manufacturers that are messing up again.
    • Re:DX9 looks better? (Score:4, Interesting)

      by brkello ( 642429 ) on Friday October 05, 2007 @12:51AM (#20863881)
      Clearly you must not be the only one, since you were modded insightful. But I really don't know what you guys are looking at. In every head-to-head picture the DX10 shot looks far superior. Maybe hatred of DX10 and Vista is causing people to have selective sight or something.
    • According to TFA, DX9 is also capable of soft particles.
      So the most noticeable difference seems intentional.
    • I was thinking more about the comparison with shadows. The DX9 shadow is, IMHO, more natural than the 'crisp' DX10 shadow.
    • Graphics are still not as realistic as they would need to be to suspend disbelief. There are minor differences between DX9 and DX10 games, as shown by the screenshots in the article, but they're not big enough to make me worry that I should run DX10 any time soon.
  • by 666999 ( 999666 )
    Is the project to backport DX10 to XP still active?

    Found it - http://alkyproject.blogspot.com/2007/04/finally-making-use-of-this-blog-i.html [blogspot.com]

    Alky compatibility libraries for Microsoft DirectX 10 enabled games. These libraries allow the use of DirectX 10 games on platforms other than Windows Vista, and increase hardware compatibility even on Vista, by compiling Geometry Shaders down to native machine code for execution where hardware isn't capable of running it.

    Anyone tried this or know if it's still being u

  • Motion (Score:5, Interesting)

    by Eccles ( 932 ) on Thursday October 04, 2007 @04:17PM (#20858419) Journal
    I wonder how many of these differences would be more apparent with some motion and several sequential frames. I know there are texture effects that look OK when the user isn't moving but terrible when he is, although DX9 already has enhancements for that.

    Still, nothing there makes me want to jump out and buy a $600 graphics card. Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.
    • Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.

      I (well, my boss actually) just bought an Apple MacBook Pro. I just wanted to point out that your list doesn't have to mean Vista & DirectX, as the list sounds a lot like my new laptop. A bit off-topic maybe, but it will be interesting to see how Apple compares to DX10 & Vista when OS X 10.5 is out in a month or so.
    • by AK Marc ( 707885 )
      Still, nothing there makes me want to jump out and buy a $600 graphics card.

      The GeForce 8400 for under $50 will do DX10. Not that it's the best, but there are many choices for DX10 under $600, and even decent choices for under $100.

      Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.

      You could build a system with all of that for under $600. It may not be the biggest and baddest, but for under $600 you could have a 64 bit
      • Re: (Score:3, Informative)

        by Phil John ( 576633 )

        That GeForce 8400 only has 16 stream processors (the basis of the unified architecture that makes up current-gen graphics cards). The 8600s suffer a great deal even with double that (32), as seen in their framerate tests (apart from BioShock, most games were almost unplayable at 1280x1024, which has become the "new 1024x768" baseline).

        The minimum card you want for the new crop of DirectX 10 games (to actually get the "eye candy" at anything over 800x600) is the 8800 GTS with 96 stream processors.

        Of course, g

    • Re: (Score:3, Interesting)

      I wonder how many of these differences would be more apparent with some motion and several sequential frames. I know there are texture effects that look OK when the user isn't moving but terrible when he is, although DX9 already has enhancements for that.

      Still, nothing there makes me want to jump out and buy a $600 graphics card. Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.

      Well, the articles missed the most important part of DX10. Gaming/hardware review sites sometimes touch on the issue, but rarely give it as much import as it deserves. It's not 9 vs 10 that's interesting; it's that for the first time in history DX10 output is the same regardless of hardware vendor*. Long term it will pay off in spades for customers, as doctored drivers and "cheats" are no longer part of the equation when trying to evaluate hardware. This is pretty much essential for moving window composti

    • Re: (Score:2, Interesting)

      by n00854180t ( 866096 )
      I think the real problem is with the article. Yes, some of these games have tiny features which "require" DX10, but not a single one of them is a "DX10 game", which is the language used by the article throughout. The real potential of DX10 (or shader model 4 if you prefer, which doesn't require DX10 anyway) is the geometry shader, and *NO* game developer will be using that for the things that matter (i.e., radically gameplay-changing elements) until DX10 hardware is ubiquitous. So to date, there hasn't bee
  • by Anonymous Coward on Thursday October 04, 2007 @04:17PM (#20858425)
    since it is from Micro$oft, DX10 is such a failure that not only are games not going from DX9 to DX10, they are going from DX9 to DX8.
    • Re: (Score:3, Interesting)

      by Mex ( 191941 )
      I know you mean it as a joke, but the sad part is that Team Fortress 2 players are finding that "downgrading" the game's DirectX to 8.1 gives a significant performance increase with negligible visual degradation.
  • by Zombie Ryushu ( 803103 ) on Thursday October 04, 2007 @04:17PM (#20858431)
    DirectX will make just the advancements it needs to keep programmers from going to SDL and OpenGL. That's what it is for. The question is not how far DirectX has come; it's how far SDL and OpenGL have to go.
  • by shawnmchorse ( 442605 ) on Thursday October 04, 2007 @04:17PM (#20858433) Homepage
    Then the answer is going to have to be "not very far". I can't see game developers getting that excited about something supported only on a version of the operating system that people are specifically NOT migrating to in droves.
    • I can't see game developers getting that excited about something supported only on a version of the operating system that people are specifically NOT migrating to in droves.

      In the home market, migration is to the next generation of Windows hardware and software.

      The OEM system bundle.

      The DX10 system with mid-line performance and pricing is still quite new, probably shipping in significant numbers no earlier than June. Not the prime shopping season for a PC.

      That said, in the W3Schools stats, Vista went from 0

    • Re: (Score:3, Interesting)

      Actually, that's not entirely true. I've found the best indicator of what hardware and software people are currently using to be the Steam hardware survey. Vista has been steadily moving up every month. It's up to 7.9% penetration, which is quite good considering how many people are supposedly not adopting it. The interesting fact is that of the 89,000 people that have it running as their OS, only 18,000 actually have a video card installed that is capable of running DX10. That says to me a fairly large percent
  • Wow DX10 (Score:3, Insightful)

    by Dusty00 ( 1106595 ) on Thursday October 04, 2007 @04:19PM (#20858473)
    These numbers validate my suspicion that DX10 was nothing more than a cheap angle to sell Vista. The performance isn't a tremendous improvement, and the resulting graphics aren't enough of an improvement for me to let Vista suck down that much of my hardware.
  • I see enough problems getting them to adopt Vista, period. And it's not just game developers; hardware vendors don't seem to do much better. I have a computer that I built almost exactly two years ago. When I built it, all of the parts used had been released within the previous 6 months, so everything in there is younger than 3 years, at the oldest. As of September, the chipset driver hadn't been updated since Vista was in beta, and the sound driver offered "limited support." All of the games that I tried ran
  • Shadows are wrong! (Score:5, Informative)

    by glpierce ( 731733 ) on Thursday October 04, 2007 @04:37PM (#20858769)
    "shadows in DX10 are crisper and more accurate than in DX9. In the image below, the shadow in DX9 has blurry edges while the same shadow in DX10 has sharp and crisp edges"

    That's great, except for the fact that shadows don't have crisp edges in the real world. Unless an object is illuminated by a point source (which immediately excludes the sun, lamps, flashlights, and pretty much every other light source you're likely to encounter), there will be a penumbra. The DX9 image here: http://www.hothardware.com/articleimages/item1031/big_stateofdx10_wic_shad.jpg [hothardware.com] is more realistic.

    Simple flash example: http://www.goalfinder.com/Downloads/Shadows.swf [goalfinder.com]
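
    (The parent's point is easy to put numbers on. A small C++ sketch under the small-angle approximation; the 0.53-degree solar angular diameter is a standard figure, the rest is illustrative.)

        #include <cmath>
        #include <cstdio>

        // Any source with a finite angular size produces a penumbra whose width
        // grows with the occluder-to-receiver distance (small-angle version).
        const double kPi = 3.14159265358979323846;

        double penumbraWidthMeters(double sourceAngularDiameterDeg,
                                   double occluderToReceiverMeters) {
            double theta = sourceAngularDiameterDeg * kPi / 180.0;
            return occluderToReceiverMeters * std::tan(theta);
        }

        int main() {
            // The sun subtends about 0.53 degrees, so a shadow edge 2 m behind
            // the occluder is already blurred over roughly 2 cm.
            std::printf("penumbra: %.1f mm\n",
                        1000.0 * penumbraWidthMeters(0.53, 2.0));
        }
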
    • by MWoody ( 222806 )
      It's especially amusing given that one of the features commonly touted by modern game engines is "soft shadows," where the shadow is given a false penumbra to approximate the effects of light reflected from a multitude of surfaces. Even if the softer versions were faked, I fail to see how a hard shadow is in any way technically impressive or new.
    • by TheNetAvenger ( 624455 ) on Thursday October 04, 2007 @06:19PM (#20860051)
      That's great, except for the fact that shadows don't have crisp edges in the real world. Unless an object is illuminated by a point source (which immediately excludes the sun, lamps, flashlights, and pretty much every other light source you're likely to encounter), there will be a penumbra. The DX9 image here: http://www.hothardware.com/articleimages/item1031/big_stateofdx10_wic_shad.jpg [hothardware.com] is more realistic.

      Not sure how this got confused by either Bioshock or the reviewers...

      DirectX 10 allows for both 'crisp' and 'soft' shadowing; as some games demonstrate, the DirectX 10 shadows are 'softer' and more realistic.

      The 'difference' with DirectX 10 is that shadows are done on the GPU, whereas in DirectX 9 shadows are done on the CPU. This is the 'main' difference between DX9 and DX10.

      The 'crisp' choice by Bioshock is NOT what DX10 is about; this is a game developer choice. PERIOD.

      I know reviews like this can lead people down wrong paths, but it doesn't hurt to look up this type of information before making fun of a 'fact' that is incorrect in the first place.

      It is strange that any site 'reviewing' DX10 in comparison to DX9 would not even know the basic 'consumer' terminology for the differences, so they would know what they were looking at... Maybe someday we can get a review posted on Slashdot that is actually done by gaming professionals... (gasp)

      Here is a quick list from the MS Consumer Info site on DirectX10, notice the reference to shadows specifically.
      -----------------------
      Summary

      In summary, DirectX10 provides the following benefits to gamers:

      More life-like materials and characters with:
      Animated fur & vegetation
      Softer/sharper shadows
      Richer scenes; complex environments
      Thicker forests, larger armies!
      Dynamic and ever-changing in-game scenarios
      Realistic motion blurring
      Volumetric effects
      Thicker, more realistic smoke/clouds

      Other
      Realistic reflections/refractions on water/cars/glass
      Reduced load on CPU
          -Re-routes bulk of graphics processing to GPU
          -Avoids glitching & system hangs during game play
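
      (For what it's worth, "softer" versus "sharper" shadow-map edges usually come down to how much filtering the developer applies, e.g. percentage-closer filtering. A minimal CPU-side C++ sketch of PCF follows; real engines do this in a pixel shader, and the tiny "shadow map" here is fabricated data.)

          #include <cstdio>

          const int N = 8;

          // 1 = lit, 0 = occluded after the depth comparison; out-of-range
          // samples are treated as lit. The tiny "shadow map" is fabricated.
          int shadowTest(const int map[N][N], int x, int y) {
              if (x < 0 || y < 0 || x >= N || y >= N) return 1;
              return map[y][x];
          }

          // kernel = 0 gives a hard (crisp) edge; larger kernels soften it.
          float pcf(const int map[N][N], int x, int y, int kernel) {
              int lit = 0, samples = 0;
              for (int dy = -kernel; dy <= kernel; ++dy)
                  for (int dx = -kernel; dx <= kernel; ++dx, ++samples)
                      lit += shadowTest(map, x + dx, y + dy);
              return static_cast<float>(lit) / samples;
          }

          int main() {
              int map[N][N] = {};                            // all occluded...
              for (int y = 0; y < N; ++y)
                  for (int x = 4; x < N; ++x) map[y][x] = 1; // ...right half lit
              std::printf("hard: %.2f, soft: %.2f\n",
                          pcf(map, 3, 4, 0), pcf(map, 3, 4, 1)); // 0.00 vs 0.33
          }
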
      • by Taulin ( 569009 )
        I would also just like to add that soft shadows are available in DX9, though at first mostly on NVIDIA cards, and at certain resolutions. I think, as we saw with DX10.1, MS is now forcing card makers to provide all features? This would be good, since I wouldn't have to compare cards just to find out which DX features are not supported.
    • PS...

      For people who think there is 'little' difference between DX10 and DX9 for that 'precious 1-2 fps lost', or that soft shadows are not a part of DX10, just look at this simple HD video that shows the difference. DX9 looks great, but DX10 looks almost real, with far more 'action' going on in the same scene.

      http://www.gametrailers.com/player/19965.html [gametrailers.com]
      • Yes, the DX10 version was very impressive. Most annoying, though, was that it wasn't the same action sequence for DX9 and DX10; e.g., there was no grenade in DX9.
    • Shadows are wrong because in Bioshock, DX10 shadows are "crisper and more detailed" while DX9 shadows are soft.

      AND in Call of Juarez, "DX10 mode offers softer, more natural looking shadows" while DX9 shadows are crisp.

      Which means that both DX9 and DX10 can draw soft and crisp shadows, and the difference is just a stupid marketing gimmick to promote DX10 that game companies don't know how to use.
  • by argent ( 18001 ) <(peter) (at) (slashdot.2006.taronga.com)> on Thursday October 04, 2007 @05:04PM (#20859117) Homepage Journal
    Who cares about cool special effects to fake optical accuracy? Within a few years we'll have real-time ray tracing and everything using rasterized graphics will look so fake.
    • Ray tracing traces from pixel to light source. Unfortunately, global lighting and soft shadows etc. are still issues. Ray tracing is elegant and simple, but as with its predecessor, we still have to approximate "real-ness" and use fake effects everywhere.
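
      (A minimal C++ sketch of the shadow-ray test being described, with a single invented sphere and point light; the binary result is exactly why soft shadows and global illumination need the extra machinery discussed in this thread.)

          #include <cmath>
          #include <cstdio>

          // Trace from the shaded point toward the light and check for a blocker.
          // With a point light the answer is binary (a hard-edged shadow); soft
          // shadows need area lights and many samples. Scene is invented.
          struct Vec3 { double x, y, z; };
          Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
          double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

          // Does the segment from p to the light hit the sphere first?
          bool inShadow(Vec3 p, Vec3 light, Vec3 center, double radius) {
              Vec3 d = light - p;                 // shadow-ray direction
              Vec3 oc = p - center;
              double a = dot(d, d), b = 2 * dot(oc, d);
              double c = dot(oc, oc) - radius * radius;
              double disc = b * b - 4 * a * c;
              if (disc < 0) return false;         // ray misses the sphere
              double t = (-b - std::sqrt(disc)) / (2 * a);
              return t > 1e-6 && t < 1.0;         // hit strictly between p and light
          }

          int main() {
              Vec3 light{0, 10, 0}, blocker{0, 5, 0};
              std::printf("under blocker: %d\n", inShadow({0,0,0}, light, blocker, 1));
              std::printf("off to the side: %d\n", inShadow({5,0,0}, light, blocker, 1));
          }
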
      • by Prune ( 557140 )
        This is ludicrous. Rays can be traced forward from lights as well and cached onto, say, surfaces as in photon mapping, or combined with various other methods to provide physically correct global illumination.

        By the way, UBC > SFU, and Prof. Heidrich is the top graphics researcher in Canada :P
    • I would mod you down, as you are stupid and don't really know what you're talking about.

      Hint: real-time ray tracing will look so much shittier than any rasterized engine of the last 5 years
    • by ardor ( 673957 )
      Yet again RT is seen as the magic wand that will make everything look nice.

      Hint: CGI studios do NOT use RT exclusively. In fact they use *rasterizers*, and resort to RT for stuff that is hard to fake with rasterizing (shadows, translucency, refraction, reflection, ...).

      Why? Because rasterizers are cheaper. Forget about the triangle-throughput benchmarks; they are useless, especially for games. As Carmack said, game developers don't want _more_ triangles, they want _pretty_ triangles, which means that fillr
  • What's this "we" business? DX10 is only available with Vista, and Vista sales are abysmal. And with this being a *nix-oriented site, it's falling on deaf ears.

    The summary states that DirectX 10 was "introduced" by the hardware manufacturers and Windows adopted it. I have always understood it to be the other way around. If it is the hardware makers, then why are they actively supporting two different 3D APIs (DX, OpenGL)? Does this mean that DirectX could be adopted by another OS, say Linux? Only

    • What's this "we" business? DX10 is only available with Vista, and Vista sales are abysmal. And with this being a *nix-oriented site, it's falling on deaf ears.

      Stories posted to the Game section of Slashdot rarely see more than fifty responses.

      The Slashdot Geek isn't really a driving force in PC gaming and anything said here about Microsoft and Vista tends to be tainted by wishful thinking. It isn't retail-boxed Vista that sells to the home market, it is the OEM system bundle.

      You'll find the neon-lit Game

    • Re: (Score:2, Informative)

      by n00854180t ( 866096 )
      Well, technically the hardware makers support shader model 4, which has the main and most promising feature that DirectX 10 supports: geometry shaders. It is a fairly big distinction, but a more accurate way of saying what they actually meant would be: "Shader model 4 was introduced by the hardware manufacturers, and Microsoft supported it in DirectX 10." Using OGL extensions, you don't *need* DX10 or Vista to make use of the geometry shader. Now, granted, there are a number of other changes that are nice in
  • obvious parallel (Score:4, Insightful)

    by fred fleenblat ( 463628 ) on Thursday October 04, 2007 @05:14PM (#20859237) Homepage
    I don't think DirectX 10 will achieve any kind of market acceptance until DirectX 11 is released. Then everyone will bitch about DirectX 11's high-end hardware requirements, DRM lockdowns, and poor performance and they'll start clamoring for the good old days of Direct X 10.
  • A lot of the "improvements" are things that the game is doing differently in the DX9 and DX10 versions. Some of them, like the "litter objects" in one of the games, or gun movement effects in another, have nothing to do with DX10... it's like the game developers simply put more polish in the DX10 versions because they wanted the punters to "get their money's worth".
  • I want my $500 video card to drop 83 fps in performance too.

    http://www.hothardware.com/articles/The_State_of_DirectX_10__Image_Quality__Performance/?page=8 [hothardware.com]

    Seriously, why do people continue to put up with this abuse? Newer/More Expensive should be better in the computing world, no?

    Frankly, I'm glad I use Linux and need not worry about this crap anymore.

  • by aepervius ( 535155 ) on Friday October 05, 2007 @12:03AM (#20863505)
    Quote:

        Are We There Yet?

        The DX10 exclusive effects available in the five games we looked at were usually too subtle to be noticed in the middle of heated gameplay. The only exception is Call of Juarez, which boasts greatly improved graphics in DX10. Unfortunately these image quality improvements can't entirely be attributed to DX10 since the North American version of the game -- the only version that supports DX10 -- had the benefit of a full nine months of extra development time. And much of the image quality improvements in Call of Juarez when using DX10 rendering were due to significantly improved textures rather than better rendering effects.

        Our test results also suggest that currently available DX10 hardware struggles with today's DX10 enhanced gaming titles. While high-end hardware has enough power to grind out enough frames in DX10 to keep them playable, mid-range hardware simply can't afford the performance hit of DX10. With currently available DX10 hardware and games, you have two choices if you want to play games at a decent frame rate; play the game in DX9 and miss out on a handful of DX10 exclusive image quality enhancements, or play the game in DX10 but be forced to lower image quality settings to offset the performance hit. In the end, it's practically the same result either way.

        While the new DX10 image quality enhancements are nice, when we finally pulled our noses off the monitor, sat back and considered the overall gameplay experience, DirectX 10 enhancements just didn't amount to enough of an image quality improvement to justify the associated performance hit.

        However, we aren't saying you should avoid DX10 hardware or wait to upgrade. On the contrary, the current generation of graphics cards from both ATI and NVIDIA offer many tangible improvements over the previous generation, especially in the high-end of the product lines. With the possible exception of some mid-range offerings, which actually perform below last generation's similarly priced cards, the current generation of graphics hardware has a nice leg-up in performance and features that is worth the upgrade. But if your only reason for upgrading is to get hardware support for DX10, then you might want to hold out for as long as possible to see how things play out.
