Universal3D vs. Real Open Standards

viveka writes "Back in April, Slashdot reported the announcement of a Universal 3D File Format by Intel, Microsoft & others - to be "as open as MP3". Of course, that's not all that open. And this turns out to be the sneaky part. There is a real open standard already - X3D is ISO-ratified, royalty-free, and has multiple open source implementations. U3D is "going to be submitted to ISO" - one day - but right now they're talking to ECMA, which allows royalty-bearing patents. I found this article by Tony Parisi, co-chair of the X3D Working Group, a fascinating insider's picture of the standards wars, along with insights into what it takes to release an online game, what really killed VRML, and why open standards do (and don't) matter. I mean, a royalty-bearing, pseudo-open universal 3D format from Intel and Microsoft? Sorry, guys. That trick doesn't work anymore ;)"
  • by g-to-the-o-to-the-g ( 705721 ) on Saturday August 28, 2004 @04:03PM (#10098438) Homepage Journal
    MS isn't interested in helping the competition; they're trying to push the competition down. As long as they have a monopoly and ignore standards, it becomes even easier for them to retain that monopoly. We hear all the time about how people don't want to use non-MS products due to incompatibilities. I would be very surprised if MS ever actually conforms to web standards and such.
  • by Cyclone66 ( 217347 ) on Saturday August 28, 2004 @04:04PM (#10098445) Homepage Journal
    Ok, seriously, there are too many hyperlinks. Which one is the article? You don't need to hyperlink every single word to get your point across!
  • Are Real Open Standards anything like Real Ultimate Power?
  • Maybe that is why VRML and X3D were not successful. Storing binary data like 3D vector data and texture data in a text file and then compressing the text file to get acceptable file sizes is just plain stupid.

    Binary storage for 3D data makes a lot more sense since it is more compact and easier to parse, and there are also standards such as the IEEE float and double standard.

    But nowadays everything has got to be XML, even if it does not make any sense at all. XML is fine for configuration files and office documents, but for image and vector data it is just not the right tool.
    • by hopethishelps ( 782331 ) on Saturday August 28, 2004 @04:27PM (#10098600)
      Binary storage for 3D data makes a lot more sense since it is more compact and easier to parse

      A binary format for X3D is being defined. X3D supports multiple file encodings describing the same abstract model.

      But nowadays everything has got to be XML, even if it does not make any sense at all

      The XML encoding enables smooth integration with web services and cross-platform inter-application file and data transfer. An excellent idea, surely. See the X3D FAQ [web3d.org] for more details.
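      For anyone who hasn't seen the XML encoding, here is a rough sketch (Python, element names lifted from the public X3D examples - illustrative, not normative) of a trivial scene and how easily ordinary XML tooling produces it:

      import xml.etree.ElementTree as ET

      # build a tiny X3D-style scene: a red box inside a Shape node
      x3d = ET.Element("X3D", version="3.0")
      scene = ET.SubElement(x3d, "Scene")
      shape = ET.SubElement(scene, "Shape")
      appearance = ET.SubElement(shape, "Appearance")
      ET.SubElement(appearance, "Material", diffuseColor="1 0 0")
      ET.SubElement(shape, "Box", size="2 2 2")

      print(ET.tostring(x3d, encoding="unicode"))

      Any language with an XML library can produce or consume something like that, which is the web-services argument in a nutshell.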

    • by RWerp ( 798951 ) on Saturday August 28, 2004 @04:30PM (#10098616)
      I disagree. Storing vector data in text files has the advantage that, in an extreme case, I can always edit the file with just a text editor.
    • I agree! XML is overused just because it seems cool. A binary format is far superior for storing the sheer number of vertices in today's 3D scenes. I skimmed through the X3D samples, and while primitives are as simple as tags, actual geometry is ridiculous! Storing a single component/attribute of a vertex (with as many as 4 components and 10+ attributes) takes about 12 bytes, versus a binary format's 4 bytes (for a standard float). I see X3D has some compression mechanisms, but I can't imagine how those are o
      • Ah, the compression is nearly as I expected:

        http://www.web3d.org/x3d/content/examples/BinaryCompression/index.html

        I'd be interested to see someone convert one of those samples to a binary format.
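        For a rough sense of the sizes involved, here is a throwaway Python comparison of the same random vertex data as X3D-style text versus packed IEEE-754 floats (invented data, so only the order of magnitude means anything):

        import random
        import struct

        verts = [(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
                 for _ in range(10000)]

        # roughly how a Coordinate "point" attribute would carry the values
        text = " ".join(f"{x:.6f} {y:.6f} {z:.6f}" for x, y, z in verts)

        # three little-endian 32-bit floats per vertex
        binary = b"".join(struct.pack("<3f", x, y, z) for x, y, z in verts)

        print(len(text.encode()), "bytes as text")    # roughly 28 bytes per vertex here
        print(len(binary), "bytes as binary")         # exactly 12 bytes per vertex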
      • I don't see that as being enough overhead to avoid using XML. Squirting a complex X3D file at a device will only take a fraction of a second at _today's_ speeds, so why not make it human readable, easily edited and open?
    • 3D and XML (Score:5, Insightful)

      by jefu ( 53450 ) on Saturday August 28, 2004 @06:21PM (#10099345) Homepage Journal
      I knew someone was going to use this article to bring up the same old complaints about XML being inappropriate.

      While the size of some 3D data sets is a concern with XML, XML is otherwise very well suited for such data. It is often irregular (which makes relational databases tough) and hierarchical (with elements sitting at different places in a scene graph). So it fits XML almost perfectly.

      Furthermore, with XSLT, or any of the bindings that enable XML structures to be reflected as objects in a programming language, processing the data becomes easy.

      Finally, you can always edit it manually.

      Binary descriptions are nice, usually compact (not always). But with binary descriptions you always have to worry about floating point formats, endianness and how to represent the data in your program - so for every binary data description you have to write a reader for the data, a writer and a new converter for every output format you might choose. With XML, libraries for reading, writing and converting (XSL is very powerful for that) are being written for most languages so you can use one of those that is already there, or if you do have to write one, you can reuse it for other types of data in the future.

      I've written programs to read and write binary data of more types than I'd care to admit, and I've stared at hex dumps of the data files for way too long. I've had to look at un-documented or under-documented binary formatted files and puzzle out what every bit did more than a few times. (Of course since the DMCA I would never puzzle out undocumented binary data files.)

      Finally, you say "XML is fine for configuration files and office documents", but there are those who say that XML is precisely wrong for those kinds of files. In fact, every time someone mentions XML being used for "Purpose X" on Slashdot, you can expect immediate "XML is completely inappropriate for Purpose X" comments.

      I'm also a bit curious - for the 3D descriptions, how does bzipped XML compare to an equivalent binary file for size?
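      That last question is easy to answer empirically with a crude Python sketch like the one below (made-up data and minimal markup, so treat the printed numbers as a ballpark, not a benchmark):

      import bz2
      import random
      import struct

      points = [random.uniform(-100, 100) for _ in range(30000)]

      # the float data wrapped in minimal XML-ish markup
      xml_doc = ("<Coordinate point='" +
                 " ".join(f"{p:.4f}" for p in points) +
                 "'/>").encode()

      # the same data as packed little-endian 32-bit floats
      binary = struct.pack(f"<{len(points)}f", *points)

      print("raw XML:     ", len(xml_doc))
      print("bzip2 XML:   ", len(bz2.compress(xml_doc)))
      print("raw binary:  ", len(binary))
      print("bzip2 binary:", len(bz2.compress(binary)))

      Whatever the numbers come out to on real scene data is the interesting part; the point is that the experiment takes five minutes.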

      • Yes, but you actually write software. Most of the people who sit on here complaining about XML have no idea how useful it can be, and how it makes interoperation so easy (anything can parse text), because they don't actually write software - they just like to talk like they do.

        Furthermore, it is highly annoying when people regard XML as a config file format, or a way to represent static data. XMLRPC and SOAP (its bulkier sibling) are so useful it's crazy.
        • Well, there is a long history of ASCII zealots trying to elevate file formats to languages so it's not surprising that people might expect XML to be the same.

          Parsing text is not inherently easier than parsing binary, but there's a cultural bias in favor of using a text editor for editing "as God intended".

          XML was introduced as a new standard and a lot of new code has been developed to support it. If a binary standard had been introduced instead, the tools required to edit it would not add a lot of additio
      • by Sycraft-fu ( 314770 ) on Saturday August 28, 2004 @08:46PM (#10100085)
        The issue is the decompressed size. Fact is, you are going to have to decompress the data before you can parse it. Something like X3D is going to be HUGE compared to a binary 3D format. So, that leaves you one of three options when dealing with it:

        1) Decompress just as needed and render. OK, but bzip2 requires a non-trivial amount of processor power to execute, and in any sort of high-performance situation you'll be doing a lot of decompression. You'll end up slowing way down waiting on that.

        2) Decompress all objects for a scene to memory. That'll work, but it requires a massive amount of memory comparatively. Not going to find too many takers for an engine that needs a GB to deal with a scene that normal engines can handle in 128MB or less.

        3) Decompress all objects for a scene to disk. Better, but still going to use a lot more memory as the objects are loaded. Also will be slower, because of more disk access, and slower loading times for a scene because of the decompression process.

        Look, a text-based markup works well for something like the web because the size of the files is not significant compared to the result, and most of the data in the document is text to be displayed anyhow. The same is not true of graphics, especially not in any modern context like a game's 3D engine. You need to be able to get the data into memory fast, and it needs to be as small as possible and still be usable. With UT2004 occupying 6 CDs, and Doom 3 occupying 3 (and being faster when its data is decompressed, though most of it is binary), you do not want a file format that is going to drastically increase the space requirements.

        Notice that there are good open formats that are binary, and for good reason, like PNG and OGG. With good documentation and standardization, they are easy to deal with in a program, yet they occupy little space on disk or in memory and parse quickly. Try and reimplement a graphics format like OGG in XML and see what you get. It'll either be huge, or well compressed, necessitating a decompression step.

        It's a nice thought that all files should be human readable, but it's just not realistic if you want decent performance. After all, why not take it a step further: have all programs work from source and have the computer interpret them on the fly. Well, there's a good reason that's not done for many programs. Even Java compiles to a bytecode, and doesn't run straight from source.

        What is easy to use for a human is not the same as what is easy to use for a computer.
        • You were making sense until you said:

          Try and reimplement a graphics format like OGG in XML and see what you get.

          Heh. *steps on pedantic box*

          Ogg is just a container - a multimedia container. As far as I know, Ogg is not intended to contain just pictures, but it could. It is intended to contain music, animation, and combinations of the two. Its most popular use right now is to contain Vorbis-encoded sound.

          *steps off pedantic box*

          Here's a real problem for you to solve with your binary only format

          • You people that get all giddy about the ease of XML miss the problems of space and speed. XML data may be all nice and human readable, but that doesn't matter. It's not machine readable and it's huge. As a really simple example: if I want to represent a colour in binary, I do it with 3 bytes, one each for red, green and blue. That covers all the values of a 24-bit space, which is what is commonly used. In XML I need up to 8 bytes to represent the same colour data, since it's all text. Also, given the control char
            • If I want to represent a colour in binary, I do it with 3 bytes, one for red, green and blue.

              ...and you have to write a custom tool for every platform to deal with it.

              This conversation is tired already. You've totally forgotten that computers are supposed to help humans, not vice versa. The computer works for you, so put it to work.

              Graphics cards do not speak XML.

              You're right, graphics cards are plagued with proprietary languages. So binary is definitely a better solution, I guess.

              The file format

            • On the contrary, a 24-bit space is usually represented in six bytes of text, not eight. Take a look at HTML. It's in hex values (e.g. FF0000, FFFFFF and 0000FF).

              So that's still twice as big, right? So on disk, just use a compressor like XMill or XGrind (or just plain gzip or bzip2). Those can shrink those nasty XML files down to size nicely.

              As for the parsing argument, are you really planning on loading a raw byte array into memory and expect it to be usable? Aren't you going to error check it? If s
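              To put concrete numbers on the colour example being batted around, here is a trivial Python snippet showing the three representations side by side:

              import struct

              r, g, b = 255, 0, 0
              packed = struct.pack("BBB", r, g, b)     # 3 bytes
              as_hex = f"{r:02X}{g:02X}{b:02X}"        # "FF0000": 6 bytes of text
              as_xml = f'color="#{as_hex}"'            # 15 bytes once markup surrounds it

              print(len(packed), len(as_hex), len(as_xml))   # 3 6 15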
          • NB. I deal with this stuff for a living (in the context of video games), so apologies if some of it seems to hit a nerve with me :)

            Taking some of these points in turn:

            Now, Blender is slated (and may already have) to adopt an xml-based file format. Assuming I can dig up specs on .ase, then I could write a little bit of XSL to do my conversion.

            Yeah, good luck with that. I hope Blender and .ase files share the same co-ordinate bases, otherwise you're going to have to transform the co-ords, texture map

            • Heh, we're not likely to see eye to eye on the whole, but for this part:

              Footnote: I also have a knee-jerk reaction when anyone says 'performance isn't a problem'. That's the sort of thinking that means iTunes runs like a total dog on my 800MHz PC.

              Same problem here. ;)

        • by dtfinch ( 661405 ) * on Sunday August 29, 2004 @03:57AM (#10101377) Journal
          While I agree that XML is not well suited to holding massive amounts of data like that, and I might not do it myself, it's not as bad as it looks. A lot of good compression formats can be decompressed, as a stream using a fixed amount of memory, faster than the data can be read off the disk on modern hardware. Additionally, many good XML parsers can also work on a stream, reducing memory requirements. And XML compresses very well. So the cost of XML may very often be a fixed amount of RAM and a little more disk space.

          The advantage is that it's easy to write tools for compressed XML-based formats. No binary specs to learn. Just decompress the XML, look at it, and you're ready to go. OpenOffice uses a zip-compressed XML format and produces files typically 1/10th the size of Microsoft's binary Office formats. And anyone with an XML parser and a zip (de)compressor can jump into writing tools to read, modify, or create them. And getting into 3D, a lot of people like to make mods, and being able to read and modify the files without any special software will make that a lot easier.

          And nobody expects XML to replace PNG or OGG. They serve totally different purposes. And as compressed as they are, textures and other media tend to dwarf all other data, binary, bloated xml, or otherwise.
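          For the sceptics, here is a sketch (Python) of the streaming approach mentioned above: incremental bzip2 decompression feeding an incremental XML parser, so neither the whole decompressed text nor the whole element tree ever sits in memory. The file name and the Coordinate element are made up for the demo.

          import bz2
          import xml.etree.ElementTree as ET

          parser = ET.XMLPullParser(events=("end",))
          decomp = bz2.BZ2Decompressor()

          with open("scene.x3d.bz2", "rb") as f:
              for chunk in iter(lambda: f.read(64 * 1024), b""):
                  parser.feed(decomp.decompress(chunk))
                  for _event, elem in parser.read_events():
                      if elem.tag == "Coordinate":
                          # hand the point data to the loader/renderer here...
                          elem.clear()  # ...then drop it so memory stays flat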
      • The problems you describe are due to the fact that you were dealing with a lot of incompatible binary representations. A real binary standard will fully document floating point formats, byte order, etc. ASCII text is not endian-free either; it's typically character-by-character big endian, but you could certainly represent it differently and it would still be ASCII. People either adopt the standard or they don't; this is equally true for ASCII and binary representations.

        In the case of XML, the decision was made that being able to ed
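        To illustrate the "fully documented binary" point: once a spec pins down the byte order and the float width, every platform agrees on the bytes. A minimal Python sketch:

        import struct

        value = 1.0
        little = struct.pack("<f", value)   # spec says: little-endian IEEE-754 single
        big    = struct.pack(">f", value)   # same value, opposite byte order
        print(little.hex(), big.hex())      # 0000803f 3f800000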
    • As I recall, VRML had a semi-optional binary format that basically converted the syntax into binary tokens. Very compact.
    • Maybe that is why VRML and X3D were not successful.

      I think it's a bit early to be calling X3D unsuccessful.

      Storing binary data like 3D vector data and texture data in a text file and then compressing the text file to get acceptable file sizes is just plain stupid.

      I would agree that in general you'd want to stick to binary formats for textures, largely because there are a lot of very well-supported image formats out there already, and vast numbers of programs and libraries that will read them. Also, the
    • Binary storage for [...] data makes a lot more sense since it is more compact and easier to parse, and there are also standards such as the IEEE float and double standard.

      Of course. That's why data transfer standards that have been widely used and well-established for the past 20 years, like PostScript, use binary.

      Oh, wait...
  • by MJOverkill ( 648024 ) on Saturday August 28, 2004 @04:13PM (#10098503)
    Remember, the MP3 standard is covered by patents owned by the Fraunhofer Institute and THOMSON, and they enforce their patents.
    • Yeah, that was kind of the point the story submitter was trying to make when they put "as open as MP3" in quotes, and made the crack about a royalty-bearing, pseudo-open universal 3D format. Even if you're not going to RTFA you could at least RTF/.S.
      • Yes, I was responding more to the company's remark "as open as MP3" than to the summary, but I felt it necessary to add to it by pointing out that MP3 patents are ACTIVELY ENFORCED. The MPEG collection of technologies is, technically, a set of open standards. A lot of companies donate patented MPEG technologies but do not enforce those patents - except Fraunhofer Gesellschaft, of course. This is what makes MP3 a pseudo-open format. This was not mentioned in the summary.
    • MP3 is open "The Open Group" way. Think of early Unix advertising before the free Unix-likes got popular. =)

      "Open" in Ancient Computer Marketing Vocabulary means roughly "if you have a little bit of money to cover our expenses and the holiday trips for our executives, you can get a specification from us - and if you pay even more, you can actually use the thing for making money."

      Yes, it was "open" - because you had a chance of getting the specification and license somehow.

      "Closed" was defined as "oh,

  • Collada (Score:3, Insightful)

    by Anonymous Coward on Saturday August 28, 2004 @04:13PM (#10098507)
    Don't forget Collada [collada.org]. This format, headed up by Sony and supported by all the major 3D modelling packages, was first released at SIGGRAPH, and it has a lot of promise.
    • Yeah it's kind of OpenGL ES centric to me, or maybe all the PS3 style demos are geared to that. I don't really like the idea of an XML format, but I guess a lot of people will make a binary format of some kind for storage.

      Well, gzip compresses text very well and can be decompressed at almost no cost. Maybe that's the option to take...

      However to fully support it you need OpenGL 1.5+ for all the shader sections.
    • According to the Collada team, about Collada and X3D [collada.org]:

      "We do not have this handy, and doing a fair comparison will take some time. I am not sure we (The COLLADA team) are the best to answer this question, as we may be seen as biased. But, without waiting for a detailed comparison, I can give you some elements: We asked game developers and modeler companies about X3D, and we could not find one project using X3D as file format. If you are a game developer reading this and using X3D, please speak up now!
  • Almost all competition is a good thing; if there weren't competition we wouldn't be sitting here using computers that can do what they can do, in a world that is as advanced as it is.

    A standard is a good standard if it does its job well and fits in with business demands. Technical superiority isn't always the best predictor of a winner. May the best standard win (which is a no-lose statement, btw ;-)).
    • But recent history has seen the world get behind various 'standards' that suck. For example, the practice of sending documents as MS .doc files. The question of which is technically better was never asked, since this simply resulted from MS's domination and marketing skills. IMO, it can't be called a standard if it's not open.

      I do agree with the gist of your statement though; the best standard _will_ eventually win, it's just that we might have to go through another few rounds of proprietary crap first.

  • by hopethishelps ( 782331 ) on Saturday August 28, 2004 @04:14PM (#10098517)
    I mean, a royalty-bearing, pseudo-open universal 3D format from Intel and Microsoft? Sorry, guys. That trick doesn't work anymore.

    Unfortunately, with Microsoft's money and monopoly of the desktop, that trick might work. In fact it probably will work, unless some of us put together good ideas and good software using the open standard X3D before the bad guys get their bandwagon rolling.

    • Or, do like us, and just not care about MS. We make all our money in applications other than dealing with Microsoft. (FWIW, I'm the architect of Xj3D, one of the links provided in the article.) Our money comes from developing custom applications using our toolkits that happen to support X3D, or providing complete solutions for devices like CAVEs and other large-scale visualisation systems. We get paid to improve open source codebases, something MS is not interested in.

      FWIW, 3DIF is just Shockwave3D. T
  • WTF? When did they... oh, wait. I thought this was Open Standards by Real [real.com]... nevermind :)
  • by Anonymous Coward
    Here is stuff straight from their standards page:

    * Open source, so no licensing issues.
    * Has been officially incorporated within the MPEG-4 multimedia standard.
    * XML support makes it easy to expose 3D data to Web Services and distributed applications.
    * Compatible with the next generation of graphics files - e.g. Scalable Vector Graphics.
    * 3D objects can be manipulated in C or C++, as well as Java.

    Sure looks like everything VRML attempted to be and then some - guaranteed to be another crash and burn.

    "Hats o
    • by Anonymous Coward
      H"Hats off to WildTangent, the only one of the bunch who ever had a working business model"

      That would be WildTangent the spyware manufacturers [google.com]? Yeah, a business model based on user deception and data mining - nice ethical business model.

      That's like saying "hats off to CometCursor and CoolWebSearch for driving the development of toolbars, popups, trojan horses and MSIE exploits".

      • OK, I don't normally respond to "WildTangent is Spyware" -- among other reasons, because it's not my job. It's D----'s job. My job: employee number 10, WildTangent. I'm a programmer, specializing in whatever we need to fill our customers' needs. And I'm irritated enough to want to write about three WildTangent-related points, but I won't. I'm going to talk about the business of 3D.

        In the beginning, nobody quite knew what the 3D market was going to be. E-greeting cards that sing and dance? 3D Bann
        • ...it's opt-in targeted e-marketing!

          To summarize: if your code is slipped into another product and not clearly mentioned in the license and listed in the installer, or if it phones home without telling the user that it's going to do it, IMHO it is spyware. WildTangent (as it was bundled with AIM) fits both those conditions.

          Maybe you're not selling spyware anymore, but you did in the past, and on slashdot that reputation takes a *long* time to live down. Just look at any thread about Realplayer...
  • by Anonymous Coward
    X3D isn't a file format standard. It's some lame "web 3D, let's resurrect VRML with a new name" specification.

    There are at least two open source 3D file format standards that I know of, developed by actual companies in the industry:

    http://www.softimage.com/products/pipeline/dotxsi/ v36/ [softimage.com]
    http://www.tweakfilms.com/main/gto.html [tweakfilms.com]
    • by pyrrhonist ( 701154 ) on Saturday August 28, 2004 @05:14PM (#10098930)
      There are at least two opensource 3d file format standards that I know of developed by actual companies in the industry

      Both of which are proprietary formats.

      Just because a company chooses to make certain applications that use their format open source doesn't make their format a standard. Furthermore, neither of these companies has enough clout to make their proprietary format a de facto standard.

      X3D is being put through the ISO standards process, and U3D is being designed by huge industry players.

      • I can't speak for Softimage, but Tweak Films' GTO is not a "proprietary format". It's released under the LGPL, as evidenced by its FSF listing. There are a number of free tools in the download that read, write, and edit it. We have no aspirations to have it become a standard. On the other hand, there are people in the special effects industry using it right now on real projects.

        The "huge industry players" that have designed the U3D format are not people that use these types of things on a day-in day-out basis

          I can't speak for Softimage, but Tweak Films' GTO is not a "proprietary format". It's released under the LGPL, as evidenced by its FSF listing.

          While it is true that the code is released under the LGPL, this does not prevent the company from dropping the format, switching to a new one, and then not releasing that one under the LGPL. A similar situation happened with AOL's TOC protocol. Standardization prevents this from happening, because usually more than one company or group designs the standard a

    • Don't forget the Wavefront .obj format, which is complex but very well-documented and a de facto standard in the game industry.
      • .obj is great for moving meshes between apps, but it has no support for rigs, skinning/weighting, morph targets, clusters, etc. I highly doubt this X3D format will be able to handle all the possible extra stuff that 3D packages can create.

        For example: I have a fully set-up character (rigged, weighted, morph targets, control panels) in Maya. I need to get this sucker over to, say, Max.

        First the geometry has to come across. Is it NURBS, polys, subdivs? If it's NURBS, is it one seamless surface? Probably not, so I have

  • Anybody who still believes that Microsoft and open standards belong in the same sentence is misguided. Standards are OK with MS, as long as no one else can exert any control over them.
  • by Trailer Trash ( 60756 ) on Saturday August 28, 2004 @04:50PM (#10098744) Homepage

    I mean, a royalty-bearing, pseudo-open universal 3D format from Intel and Microsoft? Sorry, guys. That trick doesn't work anymore ;)

    Why not? Microsoft still has 95% of the browser market; if you think "that trick doesn't work anymore", you're a moron. They're still in a position to dictate standards, and they've shown that they have no qualms about doing so.

    Of course, this is yet another area where there is simply nothing that is truly patentable, but I'm sure they can sucker the idiots at the patent office into giving them a few anyway.

  • by joneshenry ( 9497 ) on Saturday August 28, 2004 @04:55PM (#10098775)
    Tony Parisi doesn't seem to get it--the best way to keep X3D from gaining mindshare was to make it an ISO standard, because almost all ISO standards cannot be freely shared in electronic form and the process takes too long to revise deficiencies. What is really pathetic is that, with all of his experience, Parisi still wasn't able to see that the best way to spread a software technology and overthrow the existing order is to make the standard as freely accessible as RFCs or W3C standards.

    For software ISO standards only "work" with already existing market leaders. And even market leaders can be eventually dragged down by the restrictions of being an ISO standard, such as the deficiencies of C++ leading to the creation of Java and C#. Making a software technology such as X3D an ISO standard before it had any market share was simply madness, and Parisi should have known better.
    • Comment removed based on user account deletion
    • by Mithrandir ( 3459 ) on Saturday August 28, 2004 @10:05PM (#10100462) Homepage
      The point that you have completely missed is that X3D is published to the website. Thus, it costs nothing (unlike most other ISO standards). The ISO spec was ratified just under a month ago; we have to go through one more round of editing (typos only) and wait for ISO to formally publish it before it can appear on the Web3d.org website. Prior to that, if you're a member of the consortium, you have access to it right now.

      On your second paragraph, that's a matter of opinion, and I vehemently disagree with you. Many large organisations will not touch something unless it is an ISO standard. This is particularly true of both European companies and the US military complex. To give you an idea of just how that works - as soon as we had the announcement in hand, we had 2 different branches of the US military formally announce that they require it as their one and only 3D interchange standard. By not going to ISO, we would end up with the same problem as before - a pseudo-standard with multiple incompatible implementations. This solves the problem up front.

      Where Tony and I disagree is that neither X3D nor VRML has any business being a "Web" standard. Where VRML has been hugely successful is out in industry, in places that don't care about the web. If you walk around the floor at any show where they do modelling and scanning, you'll find every single scanner manufacturer uses VRML as their output file format. All the tools that take that raw data in and process it use VRML, etc. None of the success stories even use a web browser in the system. In the future, that is less and less likely to happen too. It's all about the back-end and integration work. X3D has been succeeding there for a couple of years and will continue to succeed.
      • Hey Mithrandir,

        Where Tony and I disagree is that neither X3D nor VRML has any business being a "Web" standard. Where VRML has been hugely successful is out in industry, in places that don't care about the web.

        Actually, I agree completely. Sorry if I gave any other impression. If you go re-read things, you'll see that I took great care to characterize X3D, and what our movement is trying to do now, as being about "real time" and "communication", not about the Web. The Web is like 10 to 20 percent of what this is about, IMO. Tony
  • by Anonymous Coward
    Going out on a limb here, but couldn't it also be that there is NO USE for 3d on the internet? Most people won't even dive deeper than 2 clicks to get where they want to be, and you think that they are going to walk down a virtual street into a virtual store and manually look around? No chance in hell, snow blind be damned.
  • radians? (Score:4, Interesting)

    by j1m+5n0w ( 749199 ) on Saturday August 28, 2004 @06:03PM (#10099250) Homepage Journal

    One of my gripes with VRML is that angles are specified in radians, and it looks like X3D is the same. Maybe it makes the transform math simpler, but arghhh! Who in the world decided that radians were more intuitive to work with than degrees? Do they think no one will ever write 3D models by hand [georgefox.edu]? I like to have "turn this a quarter turn" work out to a rational number.

    Maybe there's some way to set the default input mode to degrees, and someone will enlighten me.

    -jim

    • Right-angle rotations are better specified as matrices, where all the coefficients are 0 and ±1. I have certainly been burned by math errors where exactly 90 turned into approximately pi/2, that turned into 1e-9 in the cos(), and this error was later magnified into completely wrong output.

      Also, use M_PI_2 in your code if you can (of course Microsoft did not put that in their math.h header file; copy it from the Linux header file). I would expect any 3D language to provide a symbol that produces that value wi
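      A quick Python illustration of the rounding problem described above (double precision, so the leak is around 1e-16 rather than 1e-9, but it is the same effect):

      import math

      angle = math.radians(90)          # the quarter turn is now only approximately pi/2
      print(math.cos(angle))            # ~6.1e-17 instead of exactly 0

      exact_quarter_turn = [[0, -1],
                            [1,  0]]    # the same rotation as an exact 0/±1 matrix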
    • Who in the world decided that radians were more intuitive to work with than degrees?

      Well, given that radians are based on the actual mathematics of angles (i.e., one radian is a "straight angle"), I guess God did. Or whatever deity you believe in. Atheists will have to blame it on random chance, and I'm not sure about agnostics.

      Seriously, degrees (and minutes and seconds) are just like other "traditional" units of measure (like yards and feet and inches), in that some humans in the past cooked them up
      • Well, given that radians are based on the actual mathematics of angles (i.e., one radian is a "straight angle"), I guess God did. Or whatever deity you believe in. Atheists will have to blame it on random chance, and I'm not sure about agnostics.

        Huh? Pi radians is a straight angle. One radian is about 57 degrees, hardly a straight angle.

        While I agree that radians are related to a natural property of circles (specifically, the length of a one radian arc is equal to the radius of the arc), it

  • That's why everyone's got one!
  • My name is on the original VRML specs. Yes, really. One thing that I argued for, and lost, was how the VRML version of fraction_changed was going to work. X3D does it the same damn way VRML wound up doing it.

    fraction_changed events output a floating point value in the closed interval [0, 1]. At startTime the value of fraction_changed is 0. After startTime, the value of fraction_changed in any cycle will progress through the range (0.0, 1.0]. At startTime + N × cycleInterval, for N = 1, 2, ..., (i
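    For what it's worth, here is how I read the behaviour described in that excerpt, as a small Python sketch (my interpretation of the quoted text, not the normative VRML/X3D TimeSensor wording):

    def fraction_changed(now, start_time, cycle_interval):
        # 0.0 exactly at startTime, values in (0.0, 1.0] within each cycle,
        # and exact multiples of cycleInterval report 1.0 rather than wrapping to 0.0
        if now <= start_time:
            return 0.0
        frac = ((now - start_time) % cycle_interval) / cycle_interval
        return 1.0 if frac == 0.0 else frac

    print(fraction_changed(0.0, 0.0, 2.0))   # 0.0 at startTime
    print(fraction_changed(0.5, 0.0, 2.0))   # 0.25
    print(fraction_changed(2.0, 0.0, 2.0))   # 1.0 at startTime + cycleInterval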

  • Crazy (Score:5, Informative)

    by shplorb ( 24647 ) on Sunday August 29, 2004 @01:31AM (#10101126) Homepage Journal
    There'll never be a universal standard for 3D because it's so application-specific. Some applications work with polygons and some work with parametric objects. It's the reason why only 3ds max can read .max files - objects and modifiers are represented parametrically, and only the plugins that generated them know how to create them.

    Then of course, the rendering applications like to have their own formatting of data for speed and efficiency issues. A DirectX game will have data stored in an optimal format that's different from say how a PS2 game will.

    Using XML is ridiculous, it's a terrible waste of space and introduces a large processing overhead before the data is ready for rendering. There's a reason games often store 3D data in the format the platform directly processes - so it can be read off disk and immediately blasted to the screen.
    • Re:Crazy (Score:2, Insightful)

      by La Gris ( 531858 )

      An open XML format like X3D is not meant for everyday storage or for keeping internal 3D data.

      It is there to provide a standard, easy-to-parse interchange and long-term archive format for 3D data.

      Implementation-specific 3D data can still live in whatever internal format suits a given application and its hardware.

      Plugin-specific and parametric 3D data can easily be integrated into the X3D document later by using namespaces (a small sketch of that follows below). It makes much more sense that way.

      Think of XML as an alienX<--->alienZ protocol where aliens
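      As a rough sketch of the namespace idea (Python; the namespace URIs and the mytool element are invented purely for illustration):

      import xml.etree.ElementTree as ET

      doc = """\
      <Scene xmlns="urn:example:x3d-like" xmlns:mytool="urn:example:mytool">
        <Shape>
          <Box size="2 2 2"/>
          <mytool:bevelModifier segments="3" width="0.05"/>
        </Shape>
      </Scene>
      """

      root = ET.fromstring(doc)
      print([elem.tag for elem in root.iter()])   # the foreign-namespace element parses like any other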

      • But the whole point of the format was so that they could make web-games or whatever. Clearly, they were and still are barking up the wrong tree.

        The point is, a universal format is a stupid idea because graphics is a constantly evolving, application specific field.
  • by wombatmobile ( 623057 ) on Sunday August 29, 2004 @09:31AM (#10102206)

    2D graphics data expressed as XML can be gzipped very efficiently.

    You can see this in these examples [svgmaker.com] of Scalable Vector Graphics (SVG). The same documents in any other available 2D format are generally larger.

    And yet, when un-gzipped, these SVG files are verbose XML text. You can see that by right-clicking any of those examples in the Adobe viewer and selecting View Source.

    SVG is a good example of how XML can be implemented efficiently over the wire (gzipped into an efficient file size) and yet accessed by the programmer at either end with no more than a text editor.
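    The effect is easy to reproduce. A quick Python sketch with an invented, deliberately repetitive SVG fragment (real drawings will compress differently, but the markup overhead is exactly what gzip eats):

    import gzip

    svg = ('<svg xmlns="http://www.w3.org/2000/svg">' +
           "".join(f'<rect x="{i * 10}" y="0" width="8" height="8" fill="#3366CC"/>'
                   for i in range(500)) +
           "</svg>").encode()

    print(len(svg), "bytes raw")
    print(len(gzip.compress(svg)), "bytes gzipped")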

  • "I mean, a royalty-bearing, pseudo-open universal 3D format from Intel and Microsoft? Sorry, guys. That trick doesn't work anymore ;)"

    Wait - you guys can see this is a scam when it's ECMA accreditation of U3D, but you can't see it's a scam with ECMA accreditation of .NET?
