DX10 - How Far Have We Come?
MojoKid writes "When DirectX 10 was first introduced to the market by graphics manufacturers and subsequently supported by Windows Vista, it was generally understood that adoption by game developers was going to be more of a slow migration than a quick flip of a switch. That said, nearly a year later, the question is how far have we come? An article at the HotHardware site showcases many of the most popular DX10-capable game engines, like Bioshock, World In Conflict, Call of Juarez, Lost Planet, and Company of Heroes, and features current image quality comparisons versus DX9 modes with each. The article also details performance levels across many of the more popular graphics cards, from both the mid-range and high-end." PC Perspective has a similar look at DX10 performance.
DX9 looks better? (Score:5, Insightful)
Re: (Score:2, Informative)
I can't remember seeing visuals look as bad as those did, and even where glitches occur the action happens so fast it's not noticeable.
(one exception: in Half-Life 2, the frosted glass doors had a glitch near the edges of the screen, nothing major but it ruined the effect)
The real joke (Score:5, Insightful)
In the really old days, you had people actually coding for the card on hand. This is why there's a gazillion different releases of Mechwarrior 2, each of which varies greatly in image quality and features - each had to be hand tuned to the card.
If Bioshock had been intended for DX9, it would probably look the same as that DX10 shot on DX9. They'd have figured out what they needed to do, perhaps coded a few "if ATI, do this; if NVidia, do this; if Intel Extreme, fail with 'your video card is too crappy to play this game'" decisions for specific hardware, and that would have been that. Since it was backported (and MS would have thrown a fit to have "no difference"), they had to do a sloppier job of it.
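For what it's worth, that sort of per-vendor branching usually keys off the adapter's PCI vendor ID. A minimal C++ sketch, assuming the ID has already been queried from the driver; the per-vendor tweak functions are hypothetical stand-ins:

#include <cstdint>
#include <cstdio>

// Well-known PCI vendor IDs (standard values).
constexpr std::uint32_t kVendorNvidia = 0x10DE;
constexpr std::uint32_t kVendorAti    = 0x1002;
constexpr std::uint32_t kVendorIntel  = 0x8086;

// Hypothetical per-vendor tweaks a renderer might select.
void useNvidiaShaderPath() { std::puts("NVidia path: full effect set"); }
void useAtiShaderPath()    { std::puts("ATI path: alternate shader variants"); }

void pickCodePath(std::uint32_t vendorId) {
    switch (vendorId) {
        case kVendorNvidia: useNvidiaShaderPath(); break;
        case kVendorAti:    useAtiShaderPath();    break;
        case kVendorIntel:  // fall through to the "too crappy" case
        default:
            std::puts("your video card is too crappy to play this game");
            break;
    }
}

int main() {
    // In a real engine the ID would come from the driver/adapter query.
    pickCodePath(kVendorNvidia);
}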
Then again, if not for the emphasis on ridiculous graphics, think about how many games would be able to use their processing power for some seriously wicked AI. Even Bioshock only has half-decent AI that can be twigged to and predicted fairly easily - you know that a wrench guy is going to rush you, you know that the spider slicers will hang from the ceiling and lob stuff all day till you burn or freeze them, you know where the houdinis are going to land long before the animation starts merely because you can figure out what the AI tree says for them to do in what radius... it's sad.
Hell, you can predict the precise spot on the health bar where they'll run for the health station, and if you're smart you trapped that thing half an hour ago. Now you get to watch as four of them all kill themselves on the same damn one, never paying attention to the 3 dead bodies on the floor that obviously just gassed themselves using a booby-trapped station.
But never mind. I know the reason they want graphics over AI - the same fucking morons that could never defeat a decently programmed AI (hell, they have trouble getting through Halo on NORMAL) drool over thinking that they can see the shoelaces on Madden's boot.
Re:The real joke (Score:5, Insightful)
Re: (Score:2, Interesting)
Re: (Score:2)
The computer resources are irrelevant. Graphics and AI both take man-hours to make, so this is a cost/management issue, rather than a technical one.
You have a certain budget. Do you hire more graphics artists and graphics programmers, or AI designers and AI programmers?
Re: ai (Score:5, Insightful)
First, the goal of "AI" isn't always to be as smart as possible. Often, the goal is to make something believable and/or of the appropriate difficulty level. It's possible that Bioshock missed the mark there, but I haven't played Bioshock yet, so I don't know.
I can write "AI" that will kick your ass every time, even without cheating. (Mobs have the advantage of being on home turf, and they outnumber you.) But that's not fun for the player, so I don't do it. Instead, I'll write something with a pattern you have to figure out. Once you learn one of the ways to beat it, the mob will be easy for you, and it's time to move on to the next area. Very few mobs get the full "try to survive at all cost" treatment, and even fewer are programmed to actually learn from your behavior.
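As a toy illustration of that "pattern you have to figure out" idea, here is a minimal C++ sketch of a mob that cycles a fixed, learnable attack pattern and never adapts; all names and moves are made up:

#include <array>
#include <cstdio>

// A mob that cycles a fixed, learnable pattern instead of adapting to the player.
class PatternMob {
public:
    const char* nextMove() {
        const char* move = kPattern[step_];
        step_ = (step_ + 1) % kPattern.size();  // loops forever; it never learns
        return move;
    }
private:
    static constexpr std::array<const char*, 4> kPattern = {
        "lunge", "swipe", "retreat", "taunt"    // once the player spots this, the fight is solved
    };
    std::size_t step_ = 0;
};

int main() {
    PatternMob mob;
    for (int turn = 0; turn < 8; ++turn)
        std::printf("turn %d: mob does '%s'\n", turn, mob.nextMove());
}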
You're describing the classic "I wish this mob would keep getting harder" remorse, but think about it: would it really make sense for those mobs to learn from your new tactics? Are they supposed to be smart, or are they just supposed to be an obstacle?
As for your dead bodies example: would you really prefer to have an infinite standoff as the mobs decide it's not worth getting killed, so they go hide somewhere with their own traps and wait for you to attack? Right... so get over it. If games were realistic, you would die for real on level 1.
Re: ai (Score:5, Insightful)
> (Mobs have the advantage of being on home turf, and they outnumber you.)
You are assuming that the mob would just sit there and wait for the player, like it usually does in pretty much every game. In reality, a "level" would not necessarily know that Gordon Freeman is on his way. Nor would they have the patience to sit in their assigned ambush places, waiting for him all day long. A better AI would actually "live" in the environment where it is placed, so that it would react to the player instead of waiting for him. It would also be fun to watch. In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.
> would it really make sense for those mobs to learn from your new tactics?
> Are they supposed to be smart, or are they just supposed to be an obstacle?
If the AI was smart, you wouldn't need a mob. You would only need a few individuals. It would be like a multiplayer deathmatch, and, judging from the popularity of those, would likely be more fun than the current mob situation.
> As for your dead bodies example: would you really prefer to have an infinite standoff
> as the mobs decide it's not worth getting killed, so they go hide somewhere with their
> own traps and wait for you to attack?
An infinite standoff will only happen if the game designer makes you kill off the entire mob before setting off some stupid trigger to open some stupid door. Don't program artificial obstacles, and the player will be able to ignore the hiding mob and move on, just like in real life.
Re: (Score:2)
In Half-Life I really enjoyed watching those occasional scenes where monsters are wandering around doing things, like when the bullsquids feed on the headcrabs. I wish there were more things like that, things worth watching.
For all the guff that Tabula Rasa is getting, this was one of the things that (to me) made the world seem more dynamic and lived-in. The worlds you play on are active battlefields with reasonably intelligent good and evil mobs that are jockeying for tactical and strategic advantages. The "bad" mobs arrive in dropships in actual squads of various types, and will patrol/hunt through areas for "good" mobs (including the player). A lot of people don't seem to like TR very much, but this was a great idea, to me.
Re: (Score:2)
However, after that it was back to the beaten-to-death formula: an elevation-map-based terrain with the very occasional cave, and mobs spawning at random and standing there, attacking you when you get within some fixed radius of them.
The only slightly original thing is that instead of just popping into existence out of thin air, they have this (bad-looking) animation where a ship shows up and the guys drop from it.
Add the also beaten-to-death "go collect
Re: (Score:3, Interesting)
Re: (Score:2)
This depends a lot on the kind of game you're talking about. For FPS, you're right, but that's not nearly the most demanding kind of game for AI. So far, no company has been able to write a turn-based strategy game where the AI comes even close to being a challenge for a good player. There, the goal is still to make AI as strong as possible, and wi
Re: (Score:2)
Re: (Score:2, Insightful)
I don't think you get it. There's a reason people aren't writing assembly any more, and there's also a reason they wrote in assembly instead of 1s and 0s. Yes, it's technically possible to write everything you write in C++ in binary (heck, the compiler and linker pump that out for you), but the point is to be more productive and make fewer errors so that you can get more done in the limited time you have. In that sense, it's way better to have modern programming tools since you can finish up the graphics
Re: (Score:2, Insightful)
Re:DX9 looks better? (Score:4, Insightful)
I mean, 2007! And we still have octagonal circles!!
I think that the "realism" isn't worth it. Go out and create DX7 games that are fun
Re:DX9 looks better? or do the consumers vote? (Score:5, Funny)
Oh, come on, everyone will buy the PS3 because it has better graphics than the Wii
Re:DX9 looks better? (Score:4, Insightful)
Re: (Score:2)
I don't know what you're talking about but that's one funny line.
Re:DX9 looks better? (Score:4, Funny)
Talk about reinventing the wheel!
Re: (Score:2)
Re:DX9 looks better? (Score:5, Funny)
Re: (Score:2)
All circles are octagonal, for large values of eight.
Or if you use Excel 2007 to calculate the input data for your rendering...
Re: (Score:2)
...or if you rotate the 8 by 90 degrees.
Re: (Score:2)
Re: (Score:2)
I think that the "realism" isn't worth it. Go out and create DX7 games that are fun
Back when DX7 was new, people thought it was a stupid waste of time...and 'fun' games in DX7 would be better.
This same thought will be put out here in 10 years when people complain about all of the emphasis on eye-candy in DX22.
Graphics need to move forward just as much as the rest of the game does. DX10 isn't a problem. In fact, in 3 years it will appear quaint.
Re: (Score:2)
If you look at "Company of Heroes - Image quality", the first "grass effects" comparison shows an octagonal wheel.
To be fair, the viewport is VERY zoomed-in for the purpose of that screenshot.
Keep in mind that this is an RTS, not an FPS. There are rarely any reasons to zoom in that close when you are actually playing. Take a look at the first and the third screenshot above, the "green" and the "brown" ones; that's what you look at in-game. Notice the overall amount of detail. Now, is a perfect wheel on a jeep important? You won't see it... Unless you play in a crazy resolution with 1600 pixels vertical. Even then you
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
On one test the DX9 version was running at 110fps and the DX10 version was running at 30fps. The DX10 version damn well better have higher image quality if it takes nearly 4 times as long to render a scene. Push the DX9 version further by throwing more polygons and more complex shaders at it until you reach the performance of the DX10 version, THEN do a comparison. You'll find that there is preciou
Re: (Score:2)
Well, they reduced the detail of the DX10 version to the level of DX9 (as the other way isn't possible), and that was only possible in one game (the others use different codepaths).
And see, NVIDIA cards are about 10-15% faster doing the same thing under DX10...
Re: (Score:2)
Re: (Score:2)
Image quality: about the same, slightly different in both cases.
Performance: usually twice as good for DX9, in some cases over 5x better.
I would call neither version "more appealing" in general, although I admit that in a couple of cases DX10 had fewer artifacts. Yet, that
Re:DX9 looks better? (Score:5, Insightful)
Yup. That's why, when I tested the speed of my car vs. a train, I ran the car on the tracks. I was testing the speed of a car vs. a train, and you don't do that by changing two variables.
Re: (Score:2)
Re: (Score:2)
Thus, the best operating system that you would use DX9 on is XP.
It's a simple concept. And in practice, it's obvious why they should have used the much more efficient XP for the performance comparisons.
Re:DX9 looks better? (Score:5, Insightful)
Some did, some didn't.
You gotta understand that DX10 can do absolutely everything DX9 can, so if the DX10 image looks less natural, it's more of a human flaw than a technological one: it's a new area and people are only starting to discover what works best, both devs and designers.
Also, I imagine they fine-tuned the DX9 version more, since the majority of people out there have DX9 cards. DX10 cards are barely out there; they probably don't even have a good selection of DX10 cards yet to test everything thoroughly.
The only thing that worries me is that DX10 shows up slower on the benchmarks. DX10 was promised to have better performance than DX9, but don't forget that all of the reviewed games use different code paths for DX10, and thus load more effects and use higher-precision buffers/processes in the DX10 versions. So while DX10 may be faster, it's not a fair comparison when DX10 is loaded with twice the texture sizes and effects of the DX9 version.
We'll need a more objective test written to use the same elements in DX9 and 10 and compare that.
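A fairer harness would look something like the sketch below: render the exact same scene and change only the API toggle, never the content. This is a conceptual C++ sketch; renderFrame and the Api enum are hypothetical stand-ins for whatever the engine actually exposes:

#include <chrono>
#include <cstdio>

enum class Api { DX9, DX10 };  // hypothetical toggle; a real test would swap only the renderer backend

// Stand-in for rendering one frame of an identical scene. Only the API flag differs;
// the simulated workload is deliberately the same for both.
void renderFrame(Api /*api*/) {
    volatile double sink = 0.0;
    for (int i = 0; i < 200000; ++i)
        sink = sink + i * 0.5;
}

double averageFrameMs(Api api, int frames = 100) {
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < frames; ++i)
        renderFrame(api);
    std::chrono::duration<double, std::milli> elapsed = std::chrono::steady_clock::now() - start;
    return elapsed.count() / frames;
}

int main() {
    std::printf("DX9  average frame time: %.3f ms\n", averageFrameMs(Api::DX9));
    std::printf("DX10 average frame time: %.3f ms\n", averageFrameMs(Api::DX10));
}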
One way or the other, DX10 is the future. Even if the first few generations suck, the new features show lots of promise that will come to fruition in the coming years. DX10 has no choice but to become great. If you don't want to get burned, just don't buy a DX10 card YET; it's the worst moment to do so.
Wait at least until there's a DX 10.1 card out there with a good price and good reviews (DX 10.1 will come with Vista SP1). I don't expect this to be before Q3-4 2008 (which is great since Microsoft will have fixed lots of things in Vista by then, and 3rd parties will have better drivers and hardware for Vista).
Re: (Score:2)
Re:DX9 looks better? (Score:4, Interesting)
and it seems intentionally botched (Score:2)
So the most noticeable difference seems intentional.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2, Insightful)
I'd much rather see game developers expend their man-hours on making PC games creative and better to play (and, in a perfect world, not restrictively ready-to-port-to-console) than focus on making them graphically unique to DX10.
Re: (Score:3, Interesting)
So, you're willing to reward Microsoft for bad behavior?
A surprising number of people I encounter in my work have decided to forgo Vista, no matter what Microsoft does to it. There are some people who have decided not to just bow to the dictates of corporations, who expect us to buy what they offer, to give them profits no matter how poorly they perform.
Just as organized labor had to bring rapacious corporations into line in the second 2/3 of the twe
Re: (Score:2)
Um, would that be now?
I'm with you though. They are selling low-end laptops that are totally ruined by Vista but work perfectly fine with XP, or better yet, Ubuntu. I'm surprised Microsoft would allow this to happen because it makes a decent machine unusably slow, which makes Microsoft look bad (of course, they are bad, but you'd think they wouldn't want you to know).
Anyhow, no Vista for me. It's bad enough I've paid the MS tax twice by virtue of wanting a ne
Re: (Score:2, Insightful)
Backporting DX10 to XP (Score:2, Interesting)
Found it - http://alkyproject.blogspot.com/2007/04/finally-making-use-of-this-blog-i.html [blogspot.com]
Anyone tried this or know if it's still being u
Re: (Score:3, Informative)
http://wiki.winehq.org/FAQ#head-fbaa851e07d7484640cc10b6d0c48abc741260b2 [winehq.org]
from that page
Does Wine support DirectX? Can I install Microsoft's DirectX under Wine? Wine itself provides a DirectX implementation that, although it has a few bugs left, should run fine. Wine supports DirectX 9.0c at this time. Plans for DirectX 10 are underway. If you attempt to install Microsoft's DirectX, you'll run into some
Re:Backporting DX10 to XP (Score:5, Funny)
I believe yes, it is [winehq.org].
Re: (Score:2)
Alky is vapourware, don't hold your breath waiting for it.
As the posters above noted, you can already use (most) Wine dlls on Windows. Currently the Wine d3d10 implementation isn't particularly complete, but that will change with time.
Motion (Score:5, Interesting)
Still, nothing there makes me want to jump out and buy a $600 graphics card. Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.
Re: (Score:2)
I (well, my boss actually) just bought an Apple MacBook Pro. I just wanted to point out that your list doesn't mean Vista & DirectX, as the list sounds a lot like my new laptop. A bit off-topic maybe, but it will be interesting to see how Apple compares to DX10 & Vista when OS X 10.5 is out in a month or so.
Re: (Score:3, Interesting)
Agreed, but I never said that Macs didn't have games. I asked if the games in the review were also on the Mac, and for a general status on Mac gaming. That said, I decided to check for myself.
No to Bioshock. Nothing on World in Conflict. Out of luck on Call of Juarez, no for Lost Planet. Company of Heroes? Nada.
...by whose standards?
Re: (Score:2)
The GeForce 8400 for under $50 will do DX10. Not that it's the best, but there are many choices for DX10 under $600, and even decent choices for under $100.
Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.
You could build a system with all of that for under $600. It may not be the biggest and baddest, but for under $600 you could have a 64 bit
Re: (Score:3, Informative)
That GeForce 8400 only has 16 stream processors (the basis of the unified architecture that makes up current-gen graphics cards). The 8600s suffer a great deal with double that (32), as seen in their framerate tests (apart from BioShock, most games were almost unplayable at 1280x1024 - which has become the "new 1024x768" baseline).
The minimum card you want for the new crop of DirectX 10 games (to actually get the "eye candy" at anything over 800x600) is the 8800 GTS with 96 stream processors.
Of course, g
Re: (Score:3, Interesting)
I wonder how many of these differences would be more apparent with some motion and several sequential frames. I know there are texture effects that look OK when the user isn't moving but terrible when he is, although DX9 already has enhancements for that.
Still, nothing there makes me want to jump out and buy a $600 graphics card. Someday I'll have to move to PCIe, SATA, and multi-core; perhaps that will be the time. If it's with a 64 bit OS, so much the better.
Well, the articles missed the most important part of DX 10. Gaming/hardware review sites sometimes touch on the issue, but rarely give it as much import as it deserves. It's not 9 vs 10 that's interesting, it's that for the first time in history DX 10 output is the same regardless of hardware vendor*. Long term it will pay off in spades for customers as doctored drivers and "cheats" are no longer part of the equation when trying to evaluate hardware. This is pretty much essential for moving window composti
Re: (Score:2, Interesting)
Obviously... (Score:3, Funny)
Re: (Score:3, Interesting)
Just as far as it needs to to displace OpenGL. (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
But does anybody use SDL anymore? (That's a serious question. I haven't coded with SDL since the late '90s.)
Re: (Score:2)
Re: (Score:2, Interesting)
Re:Just as far as it needs to to displace OpenGL. (Score:5, Funny)
Re: (Score:2)
Re:Just as far as it needs to to displace OpenGL. (Score:4, Insightful)
Re: (Score:2)
I use Vista on a daily basis and like it. What am I doing wrong?
That would be the "liking it" part. :-)
Re: (Score:2)
Re: (Score:2)
DX10 still Windows Vista only? (Score:5, Insightful)
Re: (Score:2)
In the home market, migration is to the next generation of Windows hardware and software.
The OEM system bundle.
The DX10 system with mid-line performance and pricing is still quite new, probably shipping in significant numbers no earlier than June. Not the prime shopping season for a PC.
That said, in the W3Schools stats, Vista went from 0
Re: (Score:3, Interesting)
Wow DX10 (Score:3, Insightful)
forget game developers adopting directx 10... (Score:2)
Shadows are wrong! (Score:5, Informative)
That's great, except for the fact that shadows don't have crisp edges in the real world. Unless it's illuminated by a point source (which immediately excludes the sun, lamps, flashlights, and pretty much every other light source you're likely to encounter), there will be a penumbra. The DX9 image here: http://www.hothardware.com/articleimages/item1031/big_stateofdx10_wic_shad.jpg [hothardware.com] is more realistic.
Simple flash example: http://www.goalfinder.com/Downloads/Shadows.swf [goalfinder.com]
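For reference, the usual way games approximate that penumbra is percentage-closer-style filtering: instead of one hard in-shadow test, you average several offset tests so the edge ramps from dark to lit. A CPU-side toy sketch in C++, with the shadow-map lookup faked by a hard-coded occluder edge purely for illustration:

#include <cstdio>

// Toy "shadow map": everything left of x = 5.0 is occluded (hard-coded purely for illustration).
bool inShadow(double x) { return x < 5.0; }

// Hard shadow: one binary test per sample -> a crisp edge.
double hardShadow(double x) { return inShadow(x) ? 0.0 : 1.0; }

// PCF-style soft shadow: average several offset taps -> the edge ramps across a penumbra.
double softShadow(double x, double radius = 1.0, int taps = 9) {
    double lit = 0.0;
    for (int i = 0; i < taps; ++i) {
        double offset = radius * (2.0 * i / (taps - 1) - 1.0);  // taps spread over [-radius, +radius]
        lit += inShadow(x + offset) ? 0.0 : 1.0;
    }
    return lit / taps;
}

int main() {
    for (double x = 3.0; x <= 7.0; x += 0.5)
        std::printf("x = %.1f   hard = %.2f   soft = %.2f\n", x, hardShadow(x), softShadow(x));
}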
Re: (Score:2)
Re:Shadows are wrong! (Score:5, Insightful)
Not sure how this got confused by either BioShock or the reviewers...
DirectX 10 allows for both 'crisp' and 'soft' shadowing; as some games demonstrate, DirectX 10 shadows can be 'softer' and more realistic.
The 'difference' with DirectX 10 is that shadows are done on the GPU, whereas in DirectX 9 shadows are done on the CPU. This is the 'main' difference between DX9 and DX10.
The 'crisp' choice by BioShock is NOT what DX10 is about; this is a game developer choice. PERIOD.
I know reviews like this can lead people down wrong paths, but it doesn't hurt to look up this type of information before making fun of something based on a premise that is incorrect in the first place.
It is strange that any site 'reviewing' DX10 in comparison to DX9 would not even know the basic 'consumer' terminology for the differences, so they would know what they were looking at... Maybe someday we can get a review posted on SlashDot that is actually done by gaming professionals... (gasp)
Here is a quick list from the MS Consumer Info site on DirectX 10; notice the reference to shadows specifically.
-----------------------
Summary

In summary, DirectX10 provides the following benefits to gamers:

- More life-like materials and characters, with:
  - Animated fur & vegetation
  - Softer/sharper shadows
- Richer scenes; complex environments:
  - Thicker forests, larger armies!
  - Dynamic and ever-changing in-game scenarios
- Realistic motion blurring
- Volumetric effects:
  - Thicker, more realistic smoke/clouds
- Other:
  - Realistic reflections/refractions on water/cars/glass
- Reduced load on CPU:
  - Re-routes bulk of graphics processing to GPU
  - Avoids glitching & system hangs during game play
Re: (Score:2)
Re: (Score:2)
No, it means the cards must SUPPORT these GPU operations, unlike previous generations where NVidia or ATI did not have GPU support for many mainstream features. (i.e., making it easier on developers, as when they call for shadows, they don't have to care what card is in the user's machine.)
This is the same as DX10 requiring GPUs to support pre-emptive scheduling being handled by the OS (Vista) and DX10 requiring GPUs to sup
Re: (Score:2)
For people that think there is 'little' difference between DX10 and DX9 for that 'precious 1-2fps lost', or that soft shadows are not a part of DX10, just look at this simple HD video that shows the difference. DX9 looks great, but DX10 looks almost real with far more 'actions' going on in the same scene.
http://www.gametrailers.com/player/19965.html [gametrailers.com]
Re: (Score:2)
Re: (Score:2)
AND in Call of Juarez, "DX10 mode offers softer, more natural looking shadows" while DX9 shadows are crisp.
Which means that both DX9 and DX10 can draw soft and crisp shadows, and the difference is just a stupid marketing gimmick to promote DX10 that game companies don't know how to use.
I'm waiting for OpenRT (Score:5, Interesting)
Re: (Score:2)
Re: (Score:2)
By the way, UBC > SFU, and Prof. Heidrich is the top graphics researcher in Canada.
If I had mod points, (Score:2)
Hint: real-time raytracing will look so much shittier than any rasterized engine of the last 5 years.
Re: (Score:2)
Hint: CGI studios do NOT use RT exclusively. In fact they use *rasterizers*, and resort to RT for stuff that is hard to fake with rasterizing (shadows, translucency, refraction, reflection,
Why? Because rasterizers are cheaper. Forget about the triangle throughput benchmarks; they are useless, especially for games. As Carmack said, game developers don't want _more_ triangles, they want _pretty_ triangles, which means that fillr
How far have WE come? (Score:2, Insightful)
What's this "we" business? DX10 is only available with Vista, and Vista sales are abysmal. And with this being a *nix-oriented site, it's falling on deaf ears.
The summary states that DirectX 10 was "introduced" by the hardware manufacturers and Windows adopted it. I have always understood it to be the other way around. If it is the hardware makers, then why are they actively supporting two different 3D APIs (DX, OpenGL)? Does this mean that DirectX could be adopted by another OS, say Linux? Only
Time for a reality check? (Score:2)
Stories posted to the Game section of Slashdot rarely see more than fifty responses.
The Slashdot Geek isn't really a driving force in PC gaming and anything said here about Microsoft and Vista tends to be tainted by wishful thinking. It isn't retail-boxed Vista that sells to the home market, it is the OEM system bundle.
You'll find the neon-lit Game
Re: (Score:2, Informative)
obvious parallel (Score:4, Insightful)
A lot of the "improvements" are in the games. (Score:2)
Sign me up (Score:2)
http://www.hothardware.com/articles/The_State_of_DirectX_10__Image_Quality__Performance/?page=8 [hothardware.com]
Seriously, why do people continue to put up with this abuse? Newer/More Expensive should be better in the computing world, no?
Frankly, I'm glad I use Linux and need not worry about this crap anymore.
Just jump to the summary. (Score:4, Informative)
Are We There Yet?

The DX10 exclusive effects available in the five games we looked at were usually too subtle to be noticed in the middle of heated gameplay. The only exception is Call of Juarez, which boasts greatly improved graphics in DX10. Unfortunately, these image quality improvements can't entirely be attributed to DX10, since the North American version of the game -- the only version that supports DX10 -- had the benefit of a full nine months of extra development time. And much of the image quality improvement in Call of Juarez when using DX10 rendering was due to significantly improved textures rather than better rendering effects.

Our test results also suggest that currently available DX10 hardware struggles with today's DX10 enhanced gaming titles. While high-end hardware has enough power to grind out enough frames in DX10 to keep them playable, mid-range hardware simply can't afford the performance hit of DX10. With currently available DX10 hardware and games, you have two choices if you want to play games at a decent frame rate: play the game in DX9 and miss out on a handful of DX10 exclusive image quality enhancements, or play the game in DX10 but be forced to lower image quality settings to offset the performance hit. In the end, it's practically the same result either way.

While the new DX10 image quality enhancements are nice, when we finally pulled our noses off the monitor, sat back, and considered the overall gameplay experience, DirectX 10 enhancements just didn't amount to enough of an image quality improvement to justify the associated performance hit. However, we aren't saying you should avoid DX10 hardware or wait to upgrade. On the contrary, the current generation of graphics cards from both ATI and NVIDIA offers many tangible improvements over the previous generation, especially at the high end of the product lines. With the possible exception of some mid-range offerings, which actually perform below last generation's similarly priced cards, the current generation of graphics hardware has a nice leg up in performance and features that is worth the upgrade. But if your only reason for upgrading is to get hardware support for DX10, then you might want to hold out for as long as possible to see how things play out.
Re: (Score:2)
Re: (Score:2, Interesting)
Re: (Score:2)
Re: (Score:2, Funny)
What do you have against Furries?
Re: (Score:2)