A Glimpse Into the 3D Future: DirectX Next Preview
Dave Baumann writes "Beyond3D has put up an article based on Microsoft's game developer presentations given at Meltdown, looking at the future direction of MS's next-generation DirectX - currently titled "DirectX Next" (DX10). With Pixel Shaders 2.0 and 3.0 already part of DirectX 9, this article gives a feel for what to expect from PS/VS 4.0 and the other DirectX features hardware developers will be expected to deliver with the likes of the R500 and NV50."
It would be nice (Score:3, Interesting)
Re:It would be nice (Score:4, Informative)
DX10 will work fine with your new card. DX has always done this. DX9 works fine with DX8 cards like the Radeon 9000/9100 and GeForce4 series.
However the cards do not have support for the new features of DX10 (like PS/VS3/4 etc). The cards can work with the new software, and do, but the hardware just isn't there.
Re:It would be nice (Score:2)
Tell that to the millions who game on consoles. It's all a matter of values.
Re:It would be nice (Score:4, Funny)
This is starting to sound like vegetarians are taking over or something, where what you eat/the software you run determines your worth as a human being.
It really doesn't matter that much in the big scheme of things.
Re:It would be nice (Score:2)
Are you saying that the way you live your life doesn't determine your worth?
Individual choices may be relatively inconsequential some of the time, but what better way is there to determine the value of a person than by summing up the ones they've made over their entire life?
Re:It would be nice (Score:2)
Re:It would be nice (Score:2)
Even if DirectX 8 were, for argument's sake, completely backwards compatible with a 1980s graphics card, that doesn't mean it would suddenly be able to make it do pixel shading or nifty T&L stuff.
Re:It would be nice (Score:2, Insightful)
You mean you don't upgrade your video card once a year?!
DX9, 10 or whatever already is "compatible"! (Score:5, Informative)
The issue is my card doesn't have the vertex shaders and other registers that DX9 takes advantage of, so I won't be fully accelerating new DX9 features. I can run DX9 games just fine even though my card was designed with 8 in mind.
It's not that it isn't backwards compatible; it's that your hardware doesn't support technology that didn't exist when the card was built.
The only way around this would be if your GPU core were software driven and they could update it. Otherwise, to get new DX10 support, you need a DX10 card that was built with the new functionality in mind.
Backwards compatibility has nothing to do with it. It's just like in the days of MMX vs. non-MMX: if you had MMX it ran faster; if you didn't, it still worked, just slower.
Re:DX9, 10 or whatever already is "compatible"! (Score:4, Insightful)
Re:DX9, 10 or whatever already is "compatible"! (Score:3, Informative)
Now, in general you are correct - however, the Deus Ex 2 demo refused to run on my girlfriend's PC because it lacked support for pixel shaders (v1.1, iirc). That machine has the latest DX installed, but only has a GeForce 4 MX. My machine, with a Ti 200, runs the demo fine.
Perhaps it doesn't have
Has nothing to do with DirectX (Score:2)
Deus Ex 2 specifically uses hardware pixel shaders in its engine. If your card doesn't support them, it won't run, because Deus Ex 2 was programmed that way. The GeForce MX cards are not supported.
What people mean by backwards compatibility in DirectX is that DirectX 9 doesn't break compatibility with games that use earlier versions of DirectX. For instance, I'm typing this on a laptop that has a Mobility Radeon 7500, and I have DirectX 9 installed and can still play Black
Re:DX9, 10 or whatever already is "compatible"! (Score:2)
It is, however, an example of a DX9 app that does not degrade gracefully. It isn't DX9's fault, but that won't matter to the guy trying to play it (who, of course, will anyway blame the game, not
Re:DX9, 10 or whatever already is "compatible"! (Score:2)
I have the CD in front of me as I type - it is marked as being copyright 1998.
I know - IHBT, IHL, IWNHAND.
Re:DX9, 10 or whatever already is "compatible"! (Score:2)
Or if it was reconfigurable hardware... like an FPGA
The only problem is that FPGAs at the speeds graphics cards require are not cheap... 1 GHz units typically cost $300+, and those top out at about a million logic units, making for a final end-user cost on the wrong side of $1000, I'm sure...
Re:DX9, 10 or whatever already is "compatible"! (Score:4, Informative)
So it isn't likely DirectX is going to use an MMX implementation of a function when your processor flags don't agree. Other than that, most people aren't doing inline MMX assembly in their games now that DirectX has taken to supporting streaming instructions itself.
Re:It would be nice (Score:2)
DX9 (Score:1)
So it's not going to be called DirectX X??? (Score:5, Insightful)
DirectX Next?
I guess we all know what the Next version of Windows is going to be called!
Re:So it's not going to be called DirectX X??? (Score:2)
Re: So it's not going to be called DirectX X??? (Score:5, Funny)
Re: So it's not going to be called DirectX X??? (Score:3, Funny)
They could get Vin Diesel to write it!
Re:So it's not going to be called DirectX X??? (Score:2)
*wink-and-nudge*
Yeah... (Score:2)
The story.. (Score:2)
No really, I was there.
Re:The story.. (Score:2)
The reviewer was being a dick
Next Version of Windows (Score:2)
As I understand it, the next version of Windows is code-named Donghorn.
-kgj
Does this mean OpenGL is finished ? (Score:2, Interesting)
Re:Does this mean OpenGL is finished ? (Score:5, Informative)
Don't get this personal, I always post like this when someone compares these two
Re:Does this mean OpenGL is finished ? (Score:4, Informative)
Re:Does this mean OpenGL is finished ? (Score:2)
As for DirectX itself, well that's what SDL is for.
Re:Does this mean OpenGL is finished ? (Score:3, Insightful)
The fact that Quake3 is still used for measuring the OpenGL performance of gaming cards says an awful, awful lot about the number of game engines using OpenGL (engines based on Q3 notwithstanding).
Re:Does this mean OpenGL is finished ? (Score:4, Informative)
Re:Does this mean OpenGL is finished ? (Score:2, Insightful)
If you meant OpenGL is dead in the Windows games market, I'd argue that it mostly has been for a while. Yeah, John Carmack uses OpenGL, but most games are implemented in DirectX. It's not like it really even matters, though; the actual rendering code is usually a pretty small portion of a game
Re:Does this mean OpenGL is finished ? (Score:5, Informative)
No it's not. With the approval of the ARB_vertex_buffer_object extension and GLSlang, both APIs expose about the same level of functionality. Render to texture is a mess in OpenGL right now. But there are Super Buffers/pixel_buffer_object extensions in the works. And the Super Buffers extension looks like it will cover most of the functionality that is slated for DirectX Next.
Relevant links:
http://oss.sgi.com/projects/ogl-sample/registry/
http://oss.sgi.com/projects/ogl-sample/registry/ARB/vertex_buffer_object.txt
http://oss.sgi.com/projects/ogl-sample/registry/ARB/shading_language_100.txt
http://www.opengl.org/about/arb/notes/meeting_note_2003-06-10.html
http://developer.nvidia.com/docs/IO/8230/GDC2003_OGL_ARBSuperbuffers.pdf
Note that OpenGL is usually updated once a year at SIGGRAPH. The next version of DirectX is slated for after the release of Longhorn. That'll be 2005 or so.
Please do not perpetuate the myth that OpenGL is "falling behind" Direct3D. That is plain wrong, and a disservice to both the open source community and the graphics development community.
Re:Does this mean OpenGL is finished ? (Score:3, Insightful)
DirectX=Suite, how about GL (Score:2)
However, with that in mind, is there an equivalent suite that incorporates GL? 3D graphics are nice, but I do remember thinking nice things about DirectMusic (situation-themable audio events) and DirectSound. Is there anything we can use to compare to DirectX as a whole?
Being that I'm currently working on learning/developing
Overkill? (Score:2, Interesting)
Re:Overkill? (Score:3, Insightful)
Re:Overkill? (Score:2)
Re:Overkill? (Score:2)
The planning for such things always begins long before the final version is released. Right now they are getting input, looking at ways to do things that developers would want, etc. DX10 is probably at least two years away, and games for it farther than that.
In other news... (Score:5, Funny)
Re: (Score:2)
Re: (Score:2, Interesting)
Slayers (Score:5, Funny)
anyone else jealous (Score:1)
Who cares? (Score:5, Insightful)
I couldn't care less about this functionality being exposed through a proprietary API.
My question is: when will it be available in OpenGL 2.x? :-)
Cross platform is the best way to go with game development...and OpenGL is the only game in town for cross-platform 3D graphics. It is also the official 3D API for Macintosh.
Re:Who cares? (Score:3, Insightful)
Re:Who cares? (Score:5, Interesting)
IMO, yes cross-platform is the way to go. If you use the right engine (Torque for instance), you get it for free, less the occasional support call. ;-)
Look at some of the top games that have been cross-platform:
See any successful games there? ;-) And even Microsoft is smart enough to do it, while trying to lock everyone else into Windows/DirectX. Pretty funny, actually...
So how are they going to feel developing for an OS unproven in game development, with users who are used to getting everything for free? (Don't worry, I love Linux, but I just don't think we should lie to ourselves.)
If they get the port essentially for free, and provide it as an "unsupported" extra, they will get a ton of good press on Usenet, the web and so on, from alpha geeks. Look at the reception Baldur's Gate games get here on Slashdot. That's worth it right there! :-)
Re:Who cares? (Score:2)
You picked a really bad example (Score:2)
There is a real difference between something that was developed as a Windows game and then, because of its success, had effort spent on it to make it work on another platform, and a game that was designed using open standards to run on more than one platform from the beginning.
There's no such thing as a free port (Score:2)
Re:Who cares? (Score:3, Informative)
Er, just what do you think was used for the MacOS and Linux ports?
There are three different scenarios:
Re:Who cares? (Score:2)
If you develop your game from the beginning to be cross platform, you will incur little, if any, extra cost for your port because the cost will be part of the development.
Essentially you pay the time and cost of design (which is small relative to testing and debugging) and the time and cost of testing and debugging; and if the game is developed with portability in mind in the first place, you will see fewer bugs and problems
Re:Who cares? (Score:2)
Is there any sort of OpenGL support for PS2? Maybe PS3 will make the jump...
Re:Who cares? (Score:2)
>> for PS2? Maybe PS3 will make the jump...
Console makers in general, and Sony in particular, benefit hugely from exclusive titles. They have a lot to lose by making it trivial to port off of their console to other systems.
Would the PlayStation be the success it is without Sony's relationship with Square?
Re:Who cares? (Score:2)
Re:Who cares? (Score:2)
Sorry 'bout that... ;-)
Horse, THEN Cart (Score:2, Interesting)
With Pixel Shaders 2.0 and 3.0 already part of DirectX 9, this article gives a feel for what to expect from PS/VS 4.0 and the other DirectX features hardware developers will be expected to deliver with the likes of the R500 and NV50.
Shouldn't hardware vendors develop processing capability, then the software vendors implement the OS support? Or maybe I'm sensitive to the Evil Empire trying to dictate other computing advances through its 'embrace and extend' philosophy.
Compare thi
Re:Horse, THEN Cart (Score:3, Insightful)
Compare this to OpenGL, which is lagging so far behind that only rare titles take it seriously (Doom3 is the one that springs to mind).
Note for example that both NVidia and ATI provide better support for DirectX in their drivers than they do
Re:Horse, THEN Cart (Score:5, Insightful)
Compare this to OpenGL, which is lagging so far behind that only rare titles take it seriously (Doom3 is the one that springs to mind).
I can only see one property of OpenGL that is "lagging behind" DirectX: Whiz-bang features.
Is OpenGL "lagging behind" DirectX in portability? hardware support? scalability?
I would argue that OpenGL as a general-purpose 3D API is more useful than DirectX solely because it is more widespread. The API is implemented (or implementable) on a more diverse selection of hardware and software platforms than DirectX can ever dream of.
As an Intel-Windows-cutting-edge-game-only API, DirectX is the way to go, but for everything else, we have OpenGL.
Re:Horse, THEN Cart (Score:4, Informative)
Wow, you are very uninformed for someone who was rated +5 Insightful.
OpenGL exposes new 3D functionality much faster than DirectX, through the OpenGL extension mechanism. It may not be as convenient as having a "standardized" API (and OpenGL 2.0 will address as much of that issue as it can), but it is still better to be able to use new functionality immediately, rather than waiting for the next DirectX release (or worse yet beta) from Microsoft. NVIDIA's drivers even support all of this under Linux.
As to your "rare titles" comment, see my other post for top games using OpenGL [slashdot.org]. Also reflect on the fact that every id game plus all the games based on id engines (Heretic 1/2, RTCW/ET and many more) all use OpenGL exclusively.
And guess what, when id releases Doom3, I'm pretty sure it'll raise the bar again. Perhaps by then quite a few people will have shader-capable video cards. ;-)
For more correct information about OpenGL, feel free to check out the official OpenGL website [opengl.org].
Re:Horse, THEN Cart (Score:2)
Unfortunately, each hardware vendor has its own set of extensions they implement and support... which isn't incredibly useful when you're writing software that needs to work with hardware from multiple vendors.
Re:Horse, THEN Cart (Score:2)
Where did I say anything about Doom3 (or any other id engine) not using DirectX? Please cite.
The point of this topic, and my posts, relates to 3D graphics. Direct3D is part of DirectX, and as far as I'm aware, any new version of Direct3D has been incorporated in a full DirectX release or beta. Therefore, you are w
Re:Horse, THEN Cart (Score:2, Insightful)
The game developers are the ones who really want new performance features (sure it will make the hardware manufacturers money but the developers are the ones who are really driving it).
The game developers don't ever work directly with the graphics card, only the API. So to them, performance extensions are basically extensions to the API (and they just demand that users get a card that supports the API).
The API for DirectX is of course designed by Microsoft who want people to use it becaus
Re:Horse, THEN Cart (Score:2)
Re:Horse, THEN Cart (Score:2)
CPUs are fairly general-purpose. Graphics chips are anything but. As far as instruction sets go, there are so many instructions available to make a task efficient that it looks like the best they can do is just make the registers wider with new "enhanced" instructions to process more data bits per instruction.
Knowledge, THEN Post (Score:5, Insightful)
Your second bit of anti-Microsoft conjecture is no better than your first. When it comes to Microsoft working with Intel to add extensions to the x86 instruction set, so what if they did? Do you think they wouldn't benefit all x86 operating systems? At the level of the instruction set, how would you design into an x86 CPU instructions which only benefit one x86 OS? Yes, Microsoft has worked with Intel on the instruction set, but mostly vice versa. It is Intel who releases the manuals on "how to write an OS for our CPUs." But no matter how they're working together, that is a good thing, not "the evil empire at work."
Please, learn a little and think a little before you post your knee-jerk anti-MS reaction. There are plenty of legitimate reasons and opportunities to bash Microsoft. The problem I see is a lot of people look like that guy from Can't Hardly Wait who keeps trying to find the right second to start the slow clap.
Re:Knowledge, THEN Post (Score:2)
And please, get a real name. Chapterhouse Dune sucked.
Ooo ooh idea (Score:1)
Really, it gets to the point where people say "I want 300fps *and* I want every pixel to be drawn perfectly."
I'd just go for decent polys/s and a card that doesn't catch on fire! Heck, I play most games in 16bpp [yuck!]
Wouldn't game companies.... (Score:2)
Re:Wouldn't game companies.... (Score:2)
Anyways... earlier it was probably the best way to be: coding the thing again for every platform, since every operation was precious and the systems were so different that you really had to think precisely about what you were coding for. However, as this has turned around in recent years, that's no longer the case, and no matter what system you're developing for you end up coding most (if not all) of the project in higher-level languages (using high-level APIs).
Re:Wouldn't game companies.... (Score:3, Interesting)
But there is more than just a technological cost to bring games to market. There is the marketing, QA, tech support, packaging, distribution, etc, etc, etc, of bringing games to additional markets. And even if it was just money, (most?) game companies are fairly small outfits; the 'distraction' of other markets might be too much for overworke
I want the nitty gritty (Score:2, Insightful)
No, wait, that would be bad marketing. You have to get everyone excited about it first, then when everyones asking for it, the other vendors will want to use it, and *then* the patents come out.
Ah, screw all this microsoft monopoly crap. I prefer free market capitalism. Give me Free Software any day.
Version mania (Score:2, Insightful)
Re:Version mania (Score:5, Insightful)
Disclaimer: I have never looked at or written a piece of code in my life that used DirectX.
However, your comment makes no sense. All games written for one version of DirectX should work in the later versions. Otherwise you'd have games failing left right and centre and people on here bitching about how they can't update DirectX without killing their favourite game.
Hell, I have a couple of DirectX 5 and 7 requiring games and they work just fine under v8 and my recently installed 9.
The only downside to the frequent updates is when you want to take advantage of all the new whizzy things the graphics cards are doing. But I don't think that's a fault of Microsoft, more an indication of the rapid pace of development (since MS merely support the things the graphics card makers tell them their next cards can do).
Re:Version mania (Score:3, Interesting)
(Disclaimer: I have written code for DirectX, but not since DirectX 7.)
Actually, you do get problems like this to a degree. When you want to get a DirectX interface, you have to go through COM+. COM+ requires that library develo
Re:Version mania (Score:3, Interesting)
COM+ was a horrible idea based around building/managing an application by dragging & grouping COM+ components in some funky control panel/application doohickey. COM+ was built on top of COM. Most people like to forget that it ever existed.
The rest of what you say is essentially correct, though not technically precise. All COM interfaces are required to have a unique identifier associated with them (called a GUID), not just interfaces you want to have different versions of. Normally
Re:Version mania (Score:2)
Asmo
Re:Version mania (Score:2)
Re:Version mania (Score:2)
Implementing DirectX using COM is actually a very good way to do it. I won't really go into the details on why, but I think with the questions listed above y
Re:Version mania (Score:2, Interesting)
DirectX is COM-based, so it remains backwards-compatible. COM specifies that new versions of a COM component should support the older interfaces. Besides, the only time I remember a drastic change in DirectX's architecture was the switch from DX7 to DX8, when DirectDraw and Direct3D were merged into DirectX Graphics. Even then, you could still use the older interfaces if you wanted to.
Re:Version mania (Score:3, Informative)
OpenGL generally has features available BEFORE DirectX does, accessible via extensions.
However, once it's available through a vendor extension (NVidia or ATI proprietary), it usually takes a while and some reworking to make your code work when an official extension is supported (ARB, or sometimes EXT).
However your other comments are pretty much right on. You don't change your DX8 game to DX9 just because DX9 just came out, however you probably WILL change
Re:Version mania (Score:2)
Fair enough.
But in my defense, I was using as a starting point the previous poster's allegation that OpenGL is superior because of its lack of mutability. So I suppose my followup had a bit of GIGO to it. Thanks for the further info.
That annoying spellchecker (Score:2, Interesting)
They're failing to address a major bottleneck IMO (Score:2, Funny)
At Nintendo, we have been surprised that no major graphics vendor has really addressed to an adequate degree this probl
Taking this joke and running with it (Score:2)
In a nutshell, we move bandwidth and space-consuming model and texture data from RAM and disc media, where it is time-consuming to load, to super-fast ROM, contained within the GPU itself.
The Intellivision console actually tried storing 3/4 of its textures in ROM, which is why so many of the games for InTV look the same. I originally thought this was true of the NES as well, what with the uppercase Latin font being identical across so many early NES games and with early Game Boy games having a different
Re:They're failing to address a major bottleneck I (Score:3, Informative)
Re:They're failing to address a major bottleneck I (Score:2)
DirectX Next? (Score:2, Funny)
It would certainly get someone's attention...
Cliff's notes for people who don't want to read it (Score:4, Informative)
1. The big change is all memory goes virtual. What this means is that you don't need to load an entire texture to render a subset of its pixels. This is a VERY good thing considering that on most textures you're only using a low mipmap level anyway. Thus, texture memory on the card becomes more like a gigantic L2 or L3 cache that can be used efficiently. Also, you can have massive texture spaces without having things go all slow over AGP. 3Dlabs' Wildcat already does this. This was originally mentioned by Carmack in the 3/27/2000
In addition, geometry is stored virtually as well, as are shaders, which can be loaded into the processor in pages instead of being limited to a small block of instructions that has to fit entirely into the GPU registers. The registers now work more like an L1 cache, and shader programs can be of effectively unlimited size. This means lots of neat special effects will be possible.
2. Higher-order surfaces (curves) are getting mandated. No more N-patches vs. TruForm; it's going to use standard curve systems like Bézier splines.
3. Fur rendering and shadow volumes are going into hardware as part of a new "tessellation processor".
4. You can have multiple instances of meshes. This means you can take one model, run a few vertex programs on it, and store each result separately. Saves a lot of time later.
5. Integer instruction set. This is so you don't have to deal with floating point data when you don't need to. There are some times you want simpler data for use in a shader program and having to pretend everything's a floating point texture isn't convenient.
6. Frame buffer current-pixel-value reads. This has been a developer request for a long time. It's not mandatory in the spec, but it can be used for all sorts of stuff. Basically, the GPU can read the current value in the framebuffer into the pixel pipeline without needing to maintain a second copy. This will both save a lot of memory and allow you to do things such as light accumulation more efficiently.
Re:Cliff's notes for people who don't want to read (Score:2)
This is really sweet. There's any number of times I've wanted to store one shader on the light and one on the mesh and just combine them. It looks like this would allow you to run through once with the light shader and then run through on a second pass with the stored mesh shader on the specific mesh. Unless I'm misinterpreting (
Physics? (Score:2)
What about the networking code? (Score:2)
Configuring a firewall to pass the ridiculous number of ports required [microsoft.com] is a pain in the ass (I actually wrote a script to do it because it's too tedious otherwise) and you still can't have multiple players inside and outside if you're NATted to a single outside address. Well, OK, you can sort of do it if you are will
Re:So what is a shader? (Score:1)
Re:So what is a shader? (Score:4, Informative)
Shader programs let you do cool things like surface features [e.g. skin, roughness, etc...]
What I don't get is why they didn't just make the GPU a generic RISC with, say, 32/32 registers [ALU/FPU] and a set of instructions that fast graphics would require [say, saturated X bpp operations, fast division, etc...]
That way you have a processor you can just upload code to. Also make it a standard, so instead of having "every Joe and their brother's graphics processor specs..." you have something truly conforming...
Tom
Re:So what is a shader? (Score:3, Informative)
Documentation of the OpenGL side is in the OpenGL Extension Registry [sgi.com], look for "shader" and "program".
Re:So what is a shader? (Score:5, Informative)
This was tried in the past with TI's TIGA (Texas Instruments Graphics Architecture), which supported the TMS34010 and TMS34020/34082 graphics coprocessors. This was a really neat architecture which accelerated 2D and basic 3D operations. Unfortunately, the CPU manufacturers (Intel, etc...) would identify the bottlenecks and optimise their CPUs so that the next generation of chips would be faster than a current-generation CPU/GPU. "Local Bus" basically whacked TIGA out of the market. A real shame, since you could write your own extensions which had complete access to GPU memory (maybe this was a bad thing). They even got as far as having a trapezium rendering algorithm (halfway to rendering triangles).
Going back to the present day, look for the extensions like ARB_vertex_program and ARB_fragment_program. According to Microsoft's plans, these will at least have identical instruction sets. I wonder how long it will be before we can completely define an entire graphics pipeline using a single program.
(This would probably require virtual "clip_vertex" and "render_triangle" function calls.)
Re:So what is a shader? (Score:2)
I am a 3D coder guy, and your description is roughly accurate. Allow me to be more accurate. :-)
In 3D graphics, models are composed of vertex sets (all the edges) and fragments (pieces of the model). For performance reasons, you want to keep all the models on the video card, and not re-transmit them
Re:What are the real advantages? (Score:2)
OpenGL = portable, and for some people easier to wrap their heads around than DirectX (the famous battle between Carmack and Microsoft was in the DX *3* era; things have improved vastly since then, and Carmack agrees.)
DX = everything in one package (graphics, sound, networking, input, and so on), and it all works on a Windows PC. Oh, and fewer headaches for the developer in terms of drivers; I remember needing GLSetup.exe just to get the right combo of drivers, OpenGL release and so on.