Graphics Software

A Glimpse Into the 3D Future: DirectX Next Preview

Posted by CmdrTaco
from the virtual-sandwich dept.
Dave Baumann writes "Beyond3D has put up an article based on Microsoft's game developer presentations given at Meltdown, looking at the future directions of MS's next-generation DirectX, currently titled "DirectX Next" (DX10). With Pixel Shaders 2.0 and 3.0 already a part of DirectX 9, this article gives a feel for what to expect from PS/VS 4.0 and other DirectX features hardware developers will be expected to deliver with the likes of the R500 and NV50."
  • It would be nice (Score:3, Interesting)

    by Pingular (670773) on Sunday December 07, 2003 @10:18AM (#7653284)
    If they could somehow program DX10 so it was backwards compatible with current cards (such as the Radeon 9800, etc.), if I'd bought such a card I'd be quite annoyed if there wasn't decent support for it in the future.
    • Re:It would be nice (Score:4, Informative)

      by halo1982 (679554) * on Sunday December 07, 2003 @10:21AM (#7653294) Homepage Journal
      If they could somehow program DX10 so it was backwards compatible with current cards (such as the Radeon 9800, etc.), if I'd bought such a card I'd be quite annoyed if there wasn't decent support for it in the future.

      DX10 will work fine with your new card. DX has always done this. DX9 works fine with DX8 cards like the Radeon 9000/9100 and GeForce4 series.
      However, the cards do not have support for the new features of DX10 (like PS/VS 3.0/4.0, etc). The cards can work with the new software, and do, but the hardware just isn't there.

    • The reason that it'll include new optimized CPU/GPU algorithms is that new graphics cards (i.e. hardware) will have new inbuilt routines / operations for this kinda thing.

      Even if DirectX 8 was, for argument's sake, completely backwards compatible with a 1980s graphics card, that doesn't mean it would be able to suddenly make it do pixel shading or nifty T&L stuff.
    • by jsse (254124)
      if I'd bought such a card I'd be quite annoyed if there wasn't decent support for it in the future.

      You mean you don't upgrade your video card once a year?! :)
    • by cybrthng (22291) on Sunday December 07, 2003 @10:37AM (#7653366) Journal
      DX9 is backwards compatible with even my lowly NV25 and MX cards.

      The issue is my card doesn't have the vertex shaders and other registers that DX9 takes advantage of, so I won't be fully accelerating new DX9 features. I can run DX9 games just fine even though my card was designed with DX8 in mind.

      It's not that it isn't backwards compatible; it's that your hardware doesn't support technology that didn't exist when it was built.

      The only way around this would be if your GPU core were software driven and they could update it. Otherwise, to get new DX10 support you need a DX10 card that was built with the new functionality in mind.

      Backwards compatibility has nothing to do with it. It's just like in the days of MMX vs. non-MMX: if you had MMX it ran faster; if you didn't, it would still work for you... just slower.
      • by Fnkmaster (89084) * on Sunday December 07, 2003 @11:17AM (#7653532)
        That's how it's supposed to be. The problem is that in practice, I've seen cases where the "emulation" of a vertex shader in CPU didn't work properly (a DX8 vertex shader that ran fine on GPU, but had weird problems on CPU). The solution was a line-for-line port to C++ of the vertex shader, and having a separate execution path for non-VS supporting cards. In short, a big pain in the ass to program for.
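The dual-path approach described above boils down to a capability check at startup. Here is a minimal sketch of the idea; the struct and function names are hypothetical stand-ins, not real DirectX API:

```cpp
#include <string>

// Hypothetical caps struct standing in for a D3DCAPS-style query result;
// real code would fill this in from the driver.
struct DeviceCaps {
    int vertexShaderMajor;  // 0 = no hardware VS, 1 = VS 1.x, 2 = VS 2.0, ...
};

// Pick an execution path based on what the hardware reports. Engines of
// this era kept a hand-ported C++ copy of each vertex shader for exactly
// this kind of fallback.
std::string choosePath(const DeviceCaps& caps, int requiredMajor) {
    if (caps.vertexShaderMajor >= requiredMajor)
        return "gpu";  // run the vertex shader on the card
    return "cpu";      // fall back to the line-for-line C++ port
}
```

Both paths then have to be kept in sync by hand, which is exactly the "big pain in the ass" the parent describes.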
      • DX9 is backwards compatible with even my lowly NV25 and MX cards... It's just like in the days of MMX vs. non-MMX: if you had MMX it ran faster; if you didn't, it would still work for you... just slower.

        Now, in general you are correct - however, the Deus Ex 2 demo refused to run on my girlfriend's PC because it lacked support for pixel shaders (v1.1, iirc). That machine has the latest DX installed, but only has a GeForce 4 MX. My machine, with a Ti 200, runs the demo fine.

        Perhaps it doesn't have
        • That has nothing to do with DirectX.

          Deus Ex 2 specifically uses hardware Pixel Shaders in its engine. If your card doesn't support them, it won't run, because Deus Ex 2 was programmed that way. The GeForce MX cards are not supported.

          What people mean by backwards compatibility in DirectX is that DirectX 9 doesn't break compatibility with games that use earlier versions of DirectX. For instance, I'm typing this on a laptop that has a Mobility Radeon 7500, and I have DirectX 9 installed and can still play Black
      • Only way around this would be if your GPU core was software driven and they could update it. Otherwise to get new DX10 support, you need a DX10 card that was built with the new functionality in mind.

        Or if it was reconfigurable hardware... like an FPGA

        Only problem is that FPGAs at the speeds graphics cards require are not cheap... 1 GHz units typically cost $300+, and those top out at about a million logic units, making for a final end-user cost on the wrong side of $1000, I'm sure...
    • And you are surprised by the prospect because...? This is how it's been for years now. Buy the latest card, get all the latest features, then 6-12 months later it's outdated. You don't have any reason to complain, especially since the next DX version won't ship until Longhorn ships, which will probably be sometime in the year 2020 ;)
  • So while we wait for some great games to make proper use of DX9, we can dream of games on DX10 and only dream of the wonders of DX11.
  • by Anonymous Coward on Sunday December 07, 2003 @10:21AM (#7653292)
    XBOX Next?
    DirectX Next?

    I guess we all know what the Next version of Windows is going to be called! :)
  • by Anonymous Coward
    At this point DirectX is years ahead of OpenGL
    • by lemody (588908) on Sunday December 07, 2003 @10:26AM (#7653309)
      OpenGL and DirectX are different kinds of systems, with DirectX offering interfaces to input devices, sound control, etc. OpenGL is just for graphics!
      Don't take this personally, I always post like this when someone compares these two :)
    • by Molt (116343) on Sunday December 07, 2003 @11:10AM (#7653479)
      I'd say that if you're on about the graphics subsystem of DirectX, then OpenGL is pretty much at the same level if.. and only if.. you are willing to use the standardised extensions [sgi.com]. If you're not using these, expect slowness; if you're using the non-standardised vendor-specific extensions, expect more speed but more difficulty in making it work across the board.
    • I don't think that OpenGL will be "finished" as long as DirectX only works under Windows. There are other operating systems out there and while most support OpenGL in their windowing system, DirectX is only for Windows.

      If you meant OpenGL is dead in the Windows games market, I'd argue that it mostly has been for a while. Yeah, John Carmack uses OpenGL, but most games are implemented in DirectX. It's not like it really even matters, though; actually, rendering code is usually a pretty small portion of a game
    • by Screaming Lunatic (526975) on Sunday December 07, 2003 @02:42PM (#7654687) Homepage
      At this point DirectX is years ahead of OpenGL

      No it's not. With the approval of the ARB_vertex_buffer_object extension and GLSlang, both APIs expose about the same level of functionality. Render to texture is a mess in OpenGL right now. But there are Super Buffers/pixel_buffer_object extensions in the works. And the Super Buffers extension looks like it will cover most of the functionality that is slated for DirectX Next.

      Relevant links:

      http://oss.sgi.com/projects/ogl-sample/registry/

      http://oss.sgi.com/projects/ogl-sample/registry/ARB/vertex_buffer_object.txt

      http://oss.sgi.com/projects/ogl-sample/registry/ARB/shading_language_100.txt

      http://www.opengl.org/about/arb/notes/meeting_note_2003-06-10.html

      http://developer.nvidia.com/docs/IO/8230/GDC2003_OGL_ARBSuperbuffers.pdf

      Note that OpenGL is usually updated once a year at Siggraph. The next version of DirectX is slated for after the release of Longhorn. That'll be 2005 or so.

      Please do not perpetuate the myth that OpenGL is "falling behind" Direct3D. That is plain wrong. And a disservice to both the open source community and the graphics development community.

      • The people who say DirectX has destroyed OpenGL generally don't know what they are talking about. Basically both give you an interface to the graphics hardware and some legacy stuff (e.g. DirectX has a fairly complete software implementation of most stuff, although many programs test the hardware capabilities directly so this becomes irrelevant). There are a couple of problems with the GL interface (render to texture mainly). But there is also the "traditional" OpenGL interface, which is what most people learne
      • One thing that many people might want to consider when comparing DirectX and OpenGL is that DirectX is a suite, and GL an interface to the 3d extensions (and 2d) of your video card.

        However, with that in mind, is there an equivalent suite that incorporates GL? 3D graphics are nice, but I do remember thinking nice things about DirectMusic (situation-themable audio events) and DirectSound. Is there anything we can use to compare to DirectX as a whole?

        Being that I'm currently working on learning/developing
  • Overkill? (Score:2, Interesting)

    by zachusaf (540628)
    With few, if any, games fully supporting DX9, is DX10 a bit of overkill? I'm all for the advancement of technology, but it looks like the cart is coming before the horse, and dragging the horse with it.
    • Re:Overkill? (Score:3, Insightful)

      by JDevers (83155)
      Not really, they are saying this won't be out until at least Longhorn. By the time that comes out, you can bet a LOT of games will fully exploit DirectX 9...
    • Well, it's not like DX10 is going to be released next week or anything!

      The planning for such things always begins long before the final version is released. Right now they are getting input, looking at ways to do things that developers would want, etc. DX10 is probably at least two years away, and games for it farther than that.

  • by Jarrik (728375) on Sunday December 07, 2003 @10:28AM (#7653319)
    Doom 3 was delayed.. again.
    • As was Duke Nukem Forever, in order to add DX10 functionality.
      • Re:In other news... (Score:2, Interesting)

        by DigiShaman (671371)
        I don't think Duke Nukem Forever is ever going to be released. In fact, I would go so far as to say that it's nothing more than an internal R&D project to test the latest game engines for future games that WILL be released to the market. And yes, I'm bitter with sarcasm today

  • Slayers (Score:5, Funny)

    by zeroclip (700917) on Sunday December 07, 2003 @10:30AM (#7653331)
    So DX11 will be "DirectX Try"?
  • if there's one thing a business model can achieve, it's quick and streamlined development on something as critical as directx. gnu/linux desperately needs improvement in this area
  • Who cares? (Score:5, Insightful)

    by Glock27 (446276) on Sunday December 07, 2003 @10:33AM (#7653343)
    Beyond3D has put up an article based on Microsoft's games developers presentations given at Meltdown

    I couldn't care less about this functionality being exposed through a proprietary API.

    My question is: when will it be available in OpenGL 2.x? :-)

    Cross platform is the best way to go with game development...and OpenGL is the only game in town for cross-platform 3D graphics. It is also the official 3D API for Macintosh.

    • Re:Who cares? (Score:3, Insightful)

      by n0k14 (719810)
      cross platform is the best way to go with game development? hah! maybe on consoles, but with the staggering price of game development nowadays it's almost too risky to do cross-platform development. companies porting games to apple have only moderate success, so how are they going to feel developing for an os unproven in game development with users who are used to getting everything for free? (don't worry, i love linux, but i just don't think we should lie to ourselves)
      • Re:Who cares? (Score:5, Interesting)

        by Glock27 (446276) on Sunday December 07, 2003 @10:54AM (#7653425)
        cross platform is the best way to go with game development? hah! maybe on consoles, but with the staggering price of game development nowadays it's almost too risky to do cross-platform development.

        IMO, yes cross-platform is the way to go. If you use the right engine (Torque for instance), you get it for free, less the occasional support call. ;-)

        Look at some of the top games that have been cross-platform:

        • All id games
        • Baldur's Gate Series
        • Warcraft Series
        • Diablo Series
        • Sims Series
        • You Don't Know Jack Series
        • Age of Empires
        • Starcraft
        • Everquest
        • 3-D Ultra Pinball: Thrillride
        • Microsoft Close Combat 2.0: A Bridge Too Far
        • Monopoly
        • Terminus
        • and many more...

        See any successful games there? ;-) And even Microsoft is smart enough to do it, while trying to lock everyone else into Windows/DirectX. Pretty funny, actually...

        so how are they going to feel developing for an os unproven in game development with users who are used to getting everything for free? (dont worry, i love linux, but i just dont think we should lie to ourselves)

        If they get the port essentially for free, and provide it as an "unsupported" extra, they will get a ton of good press on Usenet, the web and so on, from alpha geeks. Look at the reception Baldur's Gate games get here on Slashdot. That's worth it right there! :-)

        • Your list is correct in terms of games ported to other OSes, but not correct in terms of graphics used. Most of those games (The Sims, Age of Empires, etc.) were originally built using Direct3D and then ported to OpenGL for the Mac release; the only notable exceptions I know of are the id games and Blizzard's games.
        • Everquest was Windows-only for a LOOOONG time. It was a Windows/DirectX app from inception. Only long after it had been out (like 3 years) was it ported to other platforms, and that was after its prime.

          There is a real difference between something that was developed as a Windows game and then, because of its success, had effort spent on it to make it work on another platform, and a game that was designed using open standards to run on more than one platform from the beginning.
        • There's no such thing as a free port. Even if you're using largely the same APIs.
      • You forget the Mac OS, a platform where users are *known* for paying for things, and at a premium even!

        If you develop your game from the beginning to be cross platform, you will incur little, if any, development costs for your port because the cost will be part of the development.

        Essentially, the cost is the time and price to design (which is little relative to debugging and testing) plus the time and price to test and debug; and if the game is developed with portability in mind in the first place, you will see fewer bugs and problem
    • Just one more thought on this: it's really too bad the Indrema console didn't make it. That would have been the first OpenGL based console.

      Is there any sort of OpenGL support for PS2? Maybe PS3 will make the jump...

      • >> Is there any sort of OpenGL support
        >> for PS2? Maybe PS3 will make the jump...

        Console makers in general, and Sony in particular, benefit hugely from exclusive titles. They have a lot to lose by making it trivial to port off of their console to other systems.

        Would the PlayStation be the success it is without Sony's relationship with Square?
      • Probably not, but there is linux for PS2... anybody know whether SDL has been ported yet?
  • Horse, THEN Cart (Score:2, Interesting)

    by Gothmolly (148874)
    Doesn't this logic seem backwards?
    With Pixel Shaders 2.0 and 3.0 already a part of DirectX9 this article gives a feel of what to expect from PS/VS4.0 and other DirectX features hardware developers will be expected to deliver with the likes of R500 and NV50.

    Shouldn't hardware vendors develop processing capability, then the software vendors implement the OS support? Or maybe I'm sensitive to the Evil Empire trying to dictate other computing advances through its 'embrace and extend' philosophy.

    Compare thi
    • by Anonymous Coward
      Microsoft aren't dictating to NVidia and ATI what features to put in their next chips, either. NVidia, ATI, other card makers, and graphics programmers, are telling Microsoft what features they need in an API, and Microsoft are releasing APIs that have those features.

      Compare this to OpenGL, which is lagging so far behind that only rare titles take it seriously (Doom3 is the one that springs to mind).

      Note for example that both NVidia and ATI provide better support for DirectX in their drivers than they do
      • by Stiletto (12066) on Sunday December 07, 2003 @12:30PM (#7653959)

        Compare this to OpenGL, which is lagging so far behind that only rare titles take it seriously (Doom3 is the one that springs to mind).

        I can only see one property of OpenGL that is "lagging behind" DirectX: Whiz-bang features.

        Is OpenGL "lagging behind" DirectX in portability? hardware support? scalability?

        I would argue that OpenGL as a general-purpose 3D API is more useful than DirectX solely because it is more widespread. The API is implemented (or implementable) on a more diverse selection of hardware and software platforms than DirectX can ever dream of.

        As an Intel-Windows-cutting-edge-game-only API, DirectX is the way to go, but for everything else, we have OpenGL.
      • Re:Horse, THEN Cart (Score:4, Informative)

        by Glock27 (446276) on Sunday December 07, 2003 @12:41PM (#7654022)
        Compare this to OpenGL, which is lagging so far behind that only rare titles take it seriously (Doom3 is the one that springs to mind).

        Wow, you are very uninformed for someone who was rated +5 Insightful.

        OpenGL exposes new 3D functionality much faster than DirectX, through the OpenGL extension mechanism. It may not be as convenient as having a "standardized" API (and OpenGL 2.0 will address as much of that issue as it can), but it is still better to be able to use new functionality immediately, rather than waiting for the next DirectX release (or worse yet beta) from Microsoft. NVIDIA's drivers even support all of this under Linux.

        As to your "rare titles" comment, see my other post for top games using OpenGL [slashdot.org]. Also reflect on the fact that every id game plus all the games based on id engines (Heretic 1/2, RTCW/ET and many more) all use OpenGL exclusively.

        And guess what, when id releases Doom3, I'm pretty sure it'll raise the bar again. Perhaps by then quite a few people will have shader-capable video cards. ;-)

        For more correct information about OpenGL, feel free to check out the official OpenGL website [opengl.org].
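For readers unfamiliar with the extension mechanism mentioned above: before using an extension, an application checks the driver's extension string, which pre-3.0 OpenGL reports via glGetString(GL_EXTENSIONS) as one space-separated list. A minimal, self-contained sketch of that check (no GL headers assumed; the strings in the example are illustrative):

```cpp
#include <sstream>
#include <string>

// Drivers report extensions as a single space-separated string.
// Exact-token matching avoids false positives where one extension
// name is a prefix of another.
bool hasExtension(const std::string& extString, const std::string& name) {
    std::istringstream iss(extString);
    std::string token;
    while (iss >> token)
        if (token == name) return true;
    return false;
}
```

With extString = "GL_ARB_multitexture GL_ARB_vertex_buffer_object", a query for "GL_ARB_vertex_buffer_object" succeeds, and the application can then load the extension's entry points.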

        • OpenGL exposes new 3D functionality much faster than DirectX, through the OpenGL extension mechanism.

          Unfortunately, each hardware vendor has its own set of extensions they implement and support... which isn't incredibly useful when you're writing software that needs to work with hardware from multiple vendors.
    • Here's my take:

      The game developers are the ones who really want new performance features (sure it will make the hardware manufacturers money but the developers are the ones who are really driving it).

      The game developers don't ever work directly with the graphics card, only the API. So to them extensions to performance are basically extensions to the API (and just demand that users get a card that supports the API).

      The API for DirectX is of course designed by Microsoft who want people to use it becaus

    • I wouldn't say that Microsoft is trying to dictate what Intel does, but they probably try to influence certain decisions that will affect how they program their operating systems. CPU makers and OS makers have to be in bed together.
    • Compare this to CPU design, however - Microsoft doesn't dictate to Intel what extensions to add onto x86. Or do they? (puts on tinfoil hat)

      CPUs are fairly general-purpose. Graphics chips are anything but. As far as instruction sets go, there are so many instructions available to make a task efficient that it looks like the best they can do is just make the registers wider with new "enhanced" instructions to process more data bits per instruction.
    • by BasharTeg (71923) on Sunday December 07, 2003 @11:31AM (#7653608) Homepage
      I try not to make it a habit to flame people, but do you know what you're talking about? Adding new functionality to DirectX *before* the new hardware comes out, means that when you buy your new GeForce FX 9999, you don't have to wait for Microsoft to release a new version of DirectX 6 months later to use the full potential of the card. This has absolutely nothing to do with embrace and extend. This is their proprietary graphics/multimedia API in the first place. How can they "embrace and extend" their own library?

      Your second bit of anti-Microsoft conjecture is no better than your first. When it comes to Microsoft working with Intel to add extensions to the x86 instruction set, so what if they did? Do you think they wouldn't benefit all x86 operating systems? At the level of the instruction set, how would you design into an x86 CPU instructions which only benefit one x86 OS? Yes, Microsoft has worked with Intel on the instruction set, but mostly vice versa. It is Intel who releases the manuals on "how to write an OS for our CPUs." But no matter how they're working together, that is a good thing, not "the evil empire at work."

      Please, learn a little and think a little before you post your knee-jerk anti-MS reaction. There are plenty of legitimate reasons and opportunities to bash Microsoft. The problem I see is a lot of people look like that guy from Can't Hardly Wait who keeps trying to find the right second to start the slow clap.
      • Take a page from your own book - please provide EVIDENCE (not a number of anecdotes) that MS has worked on the x86 instruction set with Intel.

        And please, get a real name. Chapterhouse Dune sucked.
  • Instead of, say, "ultra shade 2000 (tm)", how about "more polys rendered (tm)"? Cuz really, well-drawn 2d trees/grass/foliage is getting kinda lame [no matter how high the bpp is!]

    Really it gets to the point where people are like I want 300fps *and* I want every pixel to be drawn perfectly.

    I'd just go for decent poly/s and a card that doesn't catch on fire! Heck I play most games in 16bpp [yuck! ... though I recall when 15/16 bpp was dreamy] cuz I concentrate on the game, not how nicely phong shaded some poor
  • ... be better off using something like OpenGL or SDL or some other cross-platform (if not Free, free, or combinations thereof) API, if for nothing more than to make porting to Mac or consoles or anything else much easier?
    • game companies tend to be very short sighted in this regard.

      anyways.. earlier it was probably the best way to be, coding the thing again for every platform, as every operation was precious and the systems were so different that you really had to think precisely about what you were coding for. however, as this has turned around in recent years, that's no longer the case, and no matter what system you're developing for you end up coding most (if not all) of the project in higher-level languages (using high-level apis)
    • In theory, yes. As a purely technological decision, it's not all that hard to develop cross-platform apps, even apps that use bleeding-edge features like games.

      But there is more than just a technological cost to bringing games to market. There is the marketing, QA, tech support, packaging, distribution, etc, etc, etc, of bringing games to additional markets. And even if it was just money, (most?) game companies are fairly small outfits; the 'distraction' of other markets might be too much for overworke

  • How about a sneak preview of how many patent licenses it will require to implement?

    No, wait, that would be bad marketing. You have to get everyone excited about it first; then, when everyone's asking for it, the other vendors will want to use it, and *then* the patents come out.

    Ah, screw all this microsoft monopoly crap. I prefer free market capitalism. Give me Free Software any day.
  • Version mania (Score:2, Insightful)

    by hey (83763)
    I have done some work with DirectX and the biggest problem I see is that new versions come out too quickly. Do you want your project to be totally tied to DirectX version N when you know N+1 will be out next year, making your huge project obsolete or requiring a rewrite? For that reason SDL or OpenGL (an API that hardly changes) appeal to me. Who wants to build on shifting sands?
    • Re:Version mania (Score:5, Insightful)

      by Mr_Silver (213637) on Sunday December 07, 2003 @11:42AM (#7653668)
      I have done some work with DirectX and the biggest problem I see is that new versions come out too quickly. Do you want your project to be totally tied to DirectX version N when you know N+1 will be out next year, making your huge project obsolete or requiring a rewrite?

      Disclaimer: I have never looked at or written a piece of code in my life that used DirectX.

      However, your comment makes no sense. All games written for one version of DirectX should work in the later versions. Otherwise you'd have games failing left right and centre and people on here bitching about how they can't update DirectX without killing their favourite game.

      Hell, I have a couple of DirectX 5 and 7 requiring games and they work just fine under v8 and my recently installed 9.

      The only downside to the frequent updates is when you want to take advantage of all the new whizzy things the graphics cards are doing. But I don't think that's a fault of Microsoft, more an indication of the rapid pace of development (since MS merely supports the things the graphics card makers tell them their next cards can do)

      • Re:Version mania (Score:3, Interesting)

        by DeadMeat (TM) (233768)

        However, your comment makes no sense. All games written for one version of DirectX should work in the later versions. Otherwise you'd have games failing left right and centre and people on here bitching about how they can't update DirectX without killing their favourite game.

        (Disclaimer: I have written code for DirectX, but not since DirectX 7.)

        Actually, you do get problems like this to a degree. When you want to get a DirectX interface, you have to go through COM+. COM+ requires that library develo

        • Re:Version mania (Score:3, Interesting)

          by Keeper (56691)
          You use DirectX through COM.

          COM+ was a horrible idea based around building/managing an application by dragging & grouping COM+ components in some funky control panel/application dohicky. COM+ was built on top of COM. Most people like to forget that it ever existed.

          The rest of what you say is essentially correct, though not technically precise. All COM interfaces are required to have a unique identifier associated with them (called a GUID), not just interfaces you want to have different versions of. Normally
          • More mud in the mix with Managed DX9. So far Microsoft has already released an update which broke our compile: even though it was a build-number change, not a minor-version change, the API was incompatible. Long term, though, the managed DLL versioning should prevent such problems and ensure a more stable API of which multiple versions can really co-exist

            Asmo
          • I stand corrected. Personally I've just grouped COM and COM+ together under the category of "APIs that need to be taken out back and shot". (I'm sure COM has its uses, but IMHO DirectX ain't it.)
            • Actually, COM is a very good way to go about solving the problem it attempts to solve: How do you communicate with processes/objects outside of your address space? And how do you do so in a way that allows you to share libraries/code between applications in a standardized way? And do so with code compiled with different compilers/compiler versions?

              Implementing DirectX using COM is actually a very good way to do it. I won't really go into the details on why, but I think with the questions listed above y
    • Re:Version mania (Score:2, Interesting)

      by sithlord2 (261932)


      DirectX is COM-based, so it remains backwards compatible. COM specifies that new versions of a COM component should support the older interfaces. Besides, the only time I remember a drastic change in DirectX's architecture was when they switched from DX7 to DX8, when DirectDraw and Direct3D were merged into DirectX Graphics. And even then you could still use the older interfaces if you wanted to.
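A toy model of the versioning scheme being described, with plain strings standing in for COM GUIDs and hypothetical interface names (the real mechanism is QueryInterface on IUnknown, with binary GUIDs and reference counting):

```cpp
#include <string>

// Old interface revision: clients compiled against this keep working.
struct IDrawish {
    virtual ~IDrawish() {}
    virtual int Blit() { return 1; }
};

// Newer revision adds methods without removing the old ones.
struct IDrawish2 : IDrawish {
    virtual int BlitFast() { return 2; }
};

// The component answers requests for BOTH interface IDs, which is what
// lets a DX7-era caller keep running against a DX8-era component.
struct Component : IDrawish2 {
    void* QueryInterface(const std::string& iid) {
        if (iid == "IID_IDrawish")  return static_cast<IDrawish*>(this);
        if (iid == "IID_IDrawish2") return static_cast<IDrawish2*>(this);
        return nullptr;  // E_NOINTERFACE in real COM
    }
};
```

An old client asks for "IID_IDrawish" and never sees the new methods; a new client asks for "IID_IDrawish2" and gets both. Unknown IDs fail cleanly instead of crashing.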

  • by gspr (602968)
    Notice how some words, such as "OSs", are underlined by the spellchecker in the pictures. Are they too lazy to remove those?
  • Since the earliest days of 3D graphics architectures on the PC, a major bottleneck has been the speed of the bus between main system memory and the graphics hardware, be it AGP, PCI, or some proprietary solution. This is usually the limiting factor when it comes to transferring models and textures of a size that we would like to use when rendering super-realistic 3-D characters for games.

    At Nintendo, we have been surprised that no major graphics vendor has really addressed to an adequate degree this probl

  • by Reteo Varala (743)
    What, you mean it's not going to be called DirectXX?

    It would certainly get someone's attention...
  • by Gldm (600518) on Sunday December 07, 2003 @06:53PM (#7656100)
    The article's really long, and somewhat technical. Here are the highlights in layman's terms for anyone who just wants to know "OK, what should I care about?"

    1. The big change is all memory goes virtual. What this means is that you don't need to load an entire texture to render a subset of its pixels. This is a VERY good thing considering on most textures you're only using a low-level mipmap anyway. Thus, texture memory on the card becomes more like a gigantic L2 or L3 cache that can be used efficiently. Also you can have massive texture spaces without having things go all slow over AGP. 3Dlabs' Wildcat already does this. This was originally mentioned by Carmack in the 3/27/2000 .plan update, which you can find here: http://www.bluesnews.com/cgi-bin/finger.pl?id=1&time=20000308010919

    In addition, geometry is stored virtually as well, as are shaders, which can be loaded into the processor in pages instead of being limited to a small block of instructions that has to fit entirely into the GPU registers. The registers now work more like an L1 cache, and shader programs can be of effectively unlimited size. This means lots of neat special effects will be possible.

    2. Higher-order surfaces (curves) are getting mandated. No more N-Patches vs. TruForm; it's going to use standard curve systems like Bézier splines.

    3. Fur rendering and shadow volumes are going into hardware as part of a new "tessellation processor".

    4. You can have multiple instances of meshes. This means you can take one model, run a few vertex programs on it, and store each result separately. Saves a lot of time later.

    5. Integer instruction set. This is so you don't have to deal with floating-point data when you don't need to. There are times you want simpler data for use in a shader program, and having to pretend everything's a floating-point texture isn't convenient.

    6. Frame buffer current-pixel-value reads. This has been a developer request for a long time. It's not mandatory in the spec, but it can be used for all sorts of stuff. Basically, the GPU can read the current value in the framebuffer into the pixel pipeline without needing to maintain a second copy. This will both save a lot of memory and allow you to do things such as light accumulation more efficiently.
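Some back-of-the-envelope numbers for point 1 (a sketch assuming an uncompressed RGBA8 texture; real cards also use compressed formats): a full 2048x2048 texture is 16 MB at the top mip level alone, but if the renderer only ever samples the 128x128 mip, a paged virtual-texture scheme needs roughly 64 KB resident.

```cpp
#include <cstddef>

// Bytes for one RGBA8 mip level of a square texture.
std::size_t mipBytes(std::size_t side) { return side * side * 4; }

// Total bytes for the full mip chain from `side` down to 1x1.
std::size_t chainBytes(std::size_t side) {
    std::size_t total = 0;
    for (; side >= 1; side /= 2) {
        total += mipBytes(side);
        if (side == 1) break;  // stop before dividing 1 down to 0
    }
    return total;
}
// mipBytes(2048) is 16 MB and mipBytes(128) is 64 KB; the whole chain for
// a 2048x2048 texture is about 21.3 MB, yet only the touched pages need
// to be resident under the virtual-memory scheme described in point 1.
```

That 16 MB vs. 64 KB gap is why treating card memory as a cache over a paged texture space saves so much AGP traffic.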
    • 4. You can have multiple instances of meshes. This means you can take one model, run a few vertex programs on it, and store each result separately. Saves a lot of time later.

      This is really sweet. There have been any number of times I've wanted to store one shader in the light and one on the mesh and just combine them. It looks like this would allow you to run through once with the light shader and then run through on a second pass with your stored mesh on the specific mesh shader. Unless I'm misinterpreting (
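Point 2 in the summary above mentions standard curve systems like Bézier splines. A cubic Bézier segment can be evaluated with de Casteljau's algorithm, which is just repeated linear interpolation between the four control points (a generic sketch, independent of any graphics API):

```cpp
struct Vec2 { double x, y; };

// Linear interpolation between two points at parameter t.
static Vec2 lerp(Vec2 a, Vec2 b, double t) {
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
}

// Evaluate a cubic Bezier curve at t in [0,1] via de Casteljau:
// collapse 4 control points to 3, then 2, then 1.
Vec2 cubicBezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, double t) {
    Vec2 a = lerp(p0, p1, t), b = lerp(p1, p2, t), c = lerp(p2, p3, t);
    Vec2 d = lerp(a, b, t), e = lerp(b, c, t);
    return lerp(d, e, t);
}
```

The curve passes through p0 at t=0 and p3 at t=1, with p1 and p2 shaping it in between; hardware tessellation would evaluate this at many t values to produce triangles.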

  • There doesn't appear to be a single thing in there to address physics. This means that I'm still going to get monsters in the wall, explosions that go through floors, and clothing passing through players; it's just going to look nicer.
  • Don't DirectX games absolutely blow chunks when playing across untrusted networks? I mean, they always have in my experience, but that's limited to the Civilization series of strategy games.

    Configuring a firewall to pass the ridiculous number of ports required [microsoft.com] is a pain in the ass (I actually wrote a script to do it because it's too tedious otherwise) and you still can't have multiple players inside and outside if you're NATted to a single outside address. Well, OK, you can sort of do it if you are will

