W3C launches Binary XML Packaging

Spy der Mann writes "Remember the recent discussion on Binary XML? Well, there's news. The W3C just released the specs for XML-binary optimized packaging (XOP). In summary, they take binary data out of the XML, and put it in a separate section using MIME-Multipart. You can read the press release and the testimonials from MS, IBM and BEA."
  • by Dancin_Santa ( 265275 ) <DancinSanta@gmail.com> on Thursday January 27, 2005 @10:55PM (#11500247) Journal
    I was drowning in debt. There was nowhere to turn. My wife left me, my friends all left me. Even my dog, he left me too. I had to do something.

    That's when I found Binary XML. They were able to help with the debt. They got the creditors off my back and got me back on my feet.

    Thanks Binary XML!

    (I thought this was going to be about a standardization of compressing XML files that got rid of the excess bloat in the markup.)
    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Thursday January 27, 2005 @11:20PM (#11500387)
      Comment removed based on user account deletion
      • Re:More bloat! (Score:5, Insightful)

        by Anonymous Coward on Friday January 28, 2005 @12:07AM (#11500595)

        So did I. Then I looked at that example [w3.org] and my heart sank. What the hell! 12 lines of bloated crap text turned into 46+ lines of worse bloated crap!

        The examples given in the article don't include the binary data, for brevity. The problem that exists now is that binary data has to be encoded into a form compatible with the charset of the document, which usually means base64. This increases the size of the binary data considerably (base64 adds about a third), and also costs CPU cycles to encode it.

        Being able to send the binary data in a separate MIME payload means it doesn't need to be encoded in this manner, which is a big help for any reasonably sized binary resource. It also means the resources become first-class MIME objects and can have associated headers, which provides additional benefits.
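
        To put a number on that overhead, a quick Python sketch (the 300 kB size is just an assumed example, not from the spec):

        import base64, os

        blob = os.urandom(300_000)          # stand-in for a 300 kB binary resource
        encoded = base64.b64encode(blob)    # what you'd have to embed in the XML today
        print(len(blob), len(encoded))      # 300000 400000
        print(len(encoded) / len(blob))     # ~1.33: base64 turns every 3 bytes into 4 chars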


      • What the hell is wrong with just gzipping it?
        It's just another encoding that happens to be source-language agnostic and to provide redundancy elimination.

        You have no problem with the overhead of parsing binary XML, but dictionary lookups and tree rotations involved in decoding a compressed file.. that's out of the question?

        Not to mention the added benefit that a standard compression layer shrinks not just the tags, but the content as well.

        Look, stop thinking of gzip (or bzip, or whatever), as a "compression
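
        For what it's worth, the redundancy-elimination point is easy to demonstrate in Python (the document here is made up):

        import gzip

        xml = b"<items>" + b"".join(
            b"<item><name>Name%d</name></item>" % i for i in range(1000)
        ) + b"</items>"
        print(len(xml), len(gzip.compress(xml)))  # the repeated tags all but vanish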
        • "You have no problem with the overhead of parsing binary XML, but dictionary lookups and tree rotations involved in decoding a compressed file.. that's out of the question?"

          huh... last time I checked, binary was the language the computer natively understood, and it didn't need to be parsed or processed in any way by software.

          Also, it seems to me that he did have a problem with the parsing of the XML part.
          • Re:More bloat! (Score:3, Insightful)

            by Laxitive ( 10360 )
            Uhm, you still need to parse the XML structure.

            Technically, ASCII is binary, too. 'A' is 65, which is 01000001. Binary XML will not do away with parsing. The tags will still be there, the content will still be there. Only the restriction that the tags must be an alphanumeric string will be lifted.

            Making things "binary" doesn't magically remove the burden of parsing. You know the binary executables you run? The system loader loads them.. and parses them, and arranges them in memory the way they need to be a
            • by starm_ ( 573321 )
              Yes, but usually when one refers to binary, one refers to the preferred language of the computer, not some transliteration or translation of it that adds yet another conversion step to the data.

              And as I suggested above, he did not like the XML tags either, calling them "12 lines of bloated crap" and all.

              I feel like I'm talking to a two-year-old. I don't know what else to say. If you can't comprehend that binary is much faster to parse than XML, there's nothing I can do. Oh, I give up, you're right. I propose to ch
              • What I meant to add was:

                <ASM instruction="JMP"><PARAMETER type="32bitAdress">A3D2</ASM>
                <ASM instruction="NOP"></ASM>
                <ASM instruction="ADDA"><PARAMETER type="32bit Integer">D22A</ASM></ASM> ...
              • Re:More bloat! (Score:4, Insightful)

                by bit01 ( 644603 ) on Friday January 28, 2005 @02:40AM (#11501113)

                If you can't comprehend that binary is much faster to parse than XML, there's nothing I can do.

                Where is your numerical proof that binary is much faster to parse than text? It is amateurish to just assume this is true. Good parsers are damn fast and can operate in O(n) time.

                Of course binary may be faster. I doubt that it will be much faster when compared to a decent parser and when you realise that the binary format should be platform agnostic for word size, endianness and forward and backward compatibility.

                For instance, gzip'ed text files can sometimes be much faster to access than uncompressed binary files because it reduces the amount of file IO. e.g. 64 bits of binary to encode the number 1 rather than 8 bits of text.

                While compression increases CPU usage, the disk is so much slower - and the CPU might otherwise be idle waiting for it - that compression can lead to an overall win. The same may apply to a slow network link. Unless you measure, it is difficult to know. I've lost count of the number of binary formats I've seen that in a hex dump had vast numbers of zero bytes and were thus highly inefficient. The people who work at a "high level" designing such file formats without checking such simple things are poor programmers. Even when using indexes, the saving of a single extra random disk/network access can sometimes justify a huge amount of CPU usage.
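
                The 64-bits-versus-8-bits point, spelled out in Python (the 64-bit field width is the assumption here):

                import struct

                print(len(struct.pack("<q", 1)))  # 8 bytes: the number 1 as a 64-bit binary integer
                print(len(b"1"))                  # 1 byte: the same number as text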

                ---

                Don't be a programmer-bureaucrat; someone who substitutes marketing buzzwords and software bloat for verifiable improvements.

            • Re:More bloat! (Score:3, Insightful)

              by interiot ( 50685 )
              You're talking about two different kinds of parsing. Breaking out opcodes is hugely different from counting '<' and '>' characters 30,000 times in a row, just to find one little bit of information buried in the middle of text, but you just don't know where [slashdot.org].

              Why are databases fast? Indexes. What do all XML databases do? Store XML internally in a way that machines read much faster, but makes it a pain for humans to update. Indexes. So if you have all these computer programs passing around data,

              • Re:More bloat! (Score:3, Insightful)

                by Unordained ( 262962 )
                Databases (typically relational) aren't just fast because of indices; they can also assume more about the structure of the data. A table (relation) is a set of tuples, each one with the same attributes, each one with a single value (however complex it might happen to be, in spite of what OODBMS people think.) When you've got that, you only need to store the meta-structure of the relation once, in the relation header. Then you can assume all sorts of stuff about what's to follow, and you can optimize the hel
          • I use a trinary computer, you insensitive clod !

        • There is no binary XML here. The XML is not compressed (go ahead and gzip it as well if that is what you want).

          This is about packaging other binary data _within_ XML. RTFA
      • ..if you're going to transfer a small fixed header, XML is not for you. If you actually have no binary data of significance, XOP is not for you. Let's say you want to include a 300kB picture in your XML. Your choices are:

        1. External link (unpractical)
        2. XML/Base64 encoding (~400kB)
        3. XOP/binary encoding (~300kB)

        In that case, your 30+ lines of extra code are completely irrelevant. That being said, I was under the impression that you could do this already by sending your binary data in a "document fragment"
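
        Back-of-envelope for those numbers, in Python (base64 maps every 3 input bytes to 4 output characters, plus a little more for line breaks):

        size = 300_000        # the 300kB picture
        print(size * 4 // 3)  # base64 payload: 400000 bytes before line breaks
        print(size)           # XOP: the raw 300000 bytes travel in their own MIME part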
      • The point, I believe, is to let you serialize actual binary data in an XML format without having to uuencode it into a CDATA struct.
      • Why isn't XML like the following? It certainly has far fewer characters, plus it is similar to the most popular programming languages:

        envelope(soap="http://www.w3.org/2003/05/soap-envelope", xmlmime="http://www.w3.org/2004/11/xmlmime") {
          body {
            data(m="http://example.org/stuff") {
              photo(contentType="image/png") {
                /aWKKapGGyQ=
              }
            }
            sig(contentType="application/pkcs7-signature") {
              Faa7vROi2VQ=
            }
          }
        }

        It's easy to parse, and editing tools need small modifications to handle it.

    • My wife left me, my friends all left me. Even my dog, he left me too. I had to do something.

      Your life is a country song. For better results, try playing it backwards.

      I got my wife back, my car back, my house back, and a full bottle of whiskey at the end!
  • by seanadams.com ( 463190 ) * on Thursday January 27, 2005 @10:59PM (#11500268) Homepage
    The tech industry seems really starved for ideas lately.

    Binary file formats are hard.
    Let's use XML because it's easier.
    No wait... let's represent that XML in a more efficient binary format.
    Ah yeah that's the ticket - the best of both worlds!

    Now let me just fire up my code-morphing processor which, through emulation, achieves x86 compatibility with "low" power consumption. Never mind that it's slower overall and has worse MIPS/mW than an underclocked x86 - look Ma, we *invented* something!!!!

    There are some real technical problems out there... why are people chasing non-problems like XML?
    • by tomhudson ( 43916 ) <barbara...hudson@@@barbara-hudson...com> on Thursday January 27, 2005 @11:13PM (#11500352) Journal
      There are some real technical problems out there... why are people chasing non-problems like XML?
      Because they're hacks more into buzzword bingo and "selling the next big thing"?

      Whatever happened to the virtues of simplicity, like a file containing a header record detailing the field names, and rows containing the data in either fixed-length or delimited form? Damn fast to implement, debug, read from and write to. Parsing? What parsing? Read the first line, split it to get your headers, and read 1 line per record.

      Ideal for data exchange. Easy to manipulate via javascript on the client. Simple to display and manipulate via the DOM (Document Object Model). Not resource-hungry. Handles both text and binary data. Dirt easy on the server.

      I ran a test to compare, and I'm able to select, format, and serve 1000 records this way in less time than 100 records in simple HTML, never mind xml. By doing this, the client can page through, say, 25 records at a time without having to hit the server every few seconds to see the next/prev pages.
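
      Presumably something like this - a minimal Python reader for that kind of file (tab-delimited is an assumption; the parent doesn't say):

      def read_records(path):
          with open(path) as f:
              headers = f.readline().rstrip("\n").split("\t")
              for line in f:  # one line per record
                  yield dict(zip(headers, line.rstrip("\n").split("\t")))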

      • What you are talking about is CSV. CSV is great, but it's only any good for table structured data. You can't implement a tree or any arbitrary nested structure like you can in XML.
        • What you are talking about is CSV. CSV is great, but it's only any good for table structured data. You can't implement a tree or any arbitrary nested structure like you can in XML.

          Well, I was actually talking about fixed-length records as well (even quicker to manipulate - no complicated parsing involved, random access for r/w, etc). Need some data?

          fseek(FH_IN, HEADER_LENGTH + RECORD_SIZE * (desired_record_no - 1), SEEK_SET);
          if ((bytes_read = fread(ibuff, 1, RECORD_SIZE, FH_IN)) == RECORD_SIZE) {
              // do processing
          } else {
              // handle error
          }

      • I generally agree with you except:

        1) Endianness will probably cause problems when you least want them.

        2) Parsing wide-character string data can be a pain.

        I mostly think XML is 95% overrated and 5% genuine usefulness, but in a world of people who have never heard of a big-endian computer despite holding a CS/CE/EE degree, it's a tough call.

        You know, I think colleges should start offering 4-year degrees in XML. That way we would be assured of having a few people in the world who actually know how to use i
        • " but, in a world of people who have never heard of a big-endian computer regardless of a degree CS/CE/EE, it's a tough call.
          "

          The solution to that is: Standardize on an endianness for binary serialisation and let the computer that doesn't follow the standard do the conversion.

          See? That wasn't hard. No need for XML.
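
          Which is more or less what network byte order already gives you - a small Python sketch:

          import struct

          wire = struct.pack("!I", 1)         # "!" = big-endian "network" order, fixed sizes
          value, = struct.unpack("!I", wire)  # every platform decodes it the same way
          print(wire.hex(), value)            # 00000001 1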
          • See? That wasn't hard

            Hehe, look at the standardization processes between companies and you'll see that it's more than hard. ;-) After all these years, we're still living with disagreements on what the line-ending character should look like.
            • You mean LF vs CR/LF? Well, CR/LF has a higher resolution than LF, and was introduced later (Windows), so I would have to say that Unix needs to catch up and standardize to CR/LF.

              It's true, though, that an ASCII character standard would be far more useful than XML ;)
      • The only good thing to come out of Electronic Arts was their EA IFF 85 data format standard. It is more or less to binary what XML is to text.
      • You are absolutely correct. What the industry needed was a standard format for data exchange, not the bloat that XML is.
      • by Chris_Jefferson ( 581445 ) on Friday January 28, 2005 @06:09AM (#11501841) Homepage
        Whatever happened to the virtues of simplicity, like a file containing a header record detailing the field names, and rows containing the data in either fixed-length or delimited form? Damn fast to implement, debug, read from and write to. Parsing? What parsing? Read the first line, split it to get your headers, and read 1 line per record.

        Then of course you have the problem that your data wants to be variable length. Then you want to have the delimiter actually appear in the data, so you have to invent escape codes. Then in some lines you want to allow multiple occurrences of some of the parameters, so you put in some basic markup. Then you want to be sure that any data users enter is of the correct format, so you write a verifier. Then you are basically back at XML again.

        XML isn't that great. However, taken at face value, it saves time and programming errors, the same way I wouldn't expect to have to write my own doubly-linked list or hash table. Neither is complicated, but my language should come with one pre-written which is safer and faster than one I could knock together.
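
        The delimiter-in-the-data step above is real enough that most languages ship a canned answer for it; a small Python sketch:

        import csv, io

        buf = io.StringIO()
        csv.writer(buf).writerow(["one, with a comma", 'and "quotes"'])
        print(buf.getvalue())  # the writer quotes and escapes so you don't invent codes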

        • All the problems you mention were solved decades ago.
          1. data wants to be variable length
            Not a big deal. You don't necessarily need embedded escape codes (though they work well) - you can also use overflow buckets like databases have used for, say, 30 years
          2. Then you want to have the delimiter actually in the data, so you have to invent escape codes.
            regexes make this easy to implement.
          3. Then in some lines you want to allow multiple occurrences of some of the parameters so you put in some basic markup
            Not
    • by asb ( 1909 )

      You, and whoever modded you up as "interesting", are idiots.

      This standard is not about representing XML in binary format.

      This standard is about representing binary content in an XML document in binary format.

      See, previously, if one wanted to include binary data in an XML file one had to Base64 encode it. This takes space and processor time.

      This standard moves the bloated Base64 content into a pure binary MIME object.

      Maybe you should have RTFA first, eh?

    No wait... let's represent that XML in a more efficient binary format.

      Except that's not what they're doing at all. They're encoding binary data IN an XML document. They're using a principle similar to how one would go about attaching a file to an email.
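
      The email analogy can be taken almost literally. A rough Python sketch of the packaging idea (the element names and cid are invented, and the stdlib will still base64 attachments for mail transport, whereas XOP's multipart can carry the bytes raw):

      from email.message import EmailMessage

      msg = EmailMessage()
      msg.set_content('<data><photo href="cid:photo1"/></data>', subtype="xml")
      msg.add_attachment(b"\x89PNG\r\n...", maintype="image", subtype="png",
                         cid="<photo1>")  # binary part, referenced from the XML by cid
      print(msg.get_content_type())       # multipart/mixed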
  • by noidentity ( 188756 ) on Thursday January 27, 2005 @11:05PM (#11500307)
    Here's my binary XML-like file format which gives the best of both text and binary file formats. It's human readable and efficient at the same time! Finally, an end to the text-versus-binary wars. Here's an example file:

    The following data is in binary.
    UH)(&T^( @#t79nui**&tb x9#@ $Y*_@$ji[P{O@JIOHXIOU$HIIU#$hiuoHOP$UJ [etc.]
    • Something else I hate about this whole thing is the perpetuation of "binary" to mean "non-plaintext". I mean Christ, are we to infer that plaintext data is represented in ternary while everything else can only be expressed in binary? Yep, no hexadecimal for non-text, no sirree...

      -b
  • This seems like it would be an ideal fit for services such as Flickr, as it would allow images (or other binary media files) to be sent with XML data - in a compact binary format.
    • Exactly. It seems like a way to have a "text" file that is easily parsed (all the XML info -- in this case possibly a description, comments, image metadata, etc.), yet binary info (a JPEG-compressed image) fits alongside for when you want it. One file with all the goodies.

      How this is different from simply base64-encoding the image inside a tag is yet to be seen. Perhaps because it's a standard?
      • I would assume because base64 coding of binary data bloats its size (I think up to 40% additional size over the unencoded binary) and takes time to encode/decode. If you were able to put a marker in an element that says "binary blob 100 goes here" and include binary blob 100 in some other area that is pure binary, then you would have the binary data without encoding overhead.
  • by Anonymous Coward on Thursday January 27, 2005 @11:09PM (#11500322)
    As a software developer I find this particularly good.

    While I myself would prefer to write a binary protocol and send the data through a TCP socket, I can no longer do that.

    When we land big contracts at work that deal in government and health, the key thing they need now is interoperability with others. What does this mean? XML. Whether or not you like it, XML is here to stay. It's what everyone is pushing.

    Therefore we had to adapt and start using it. Not just for B2B, our rich desktop clients now communicate with the server using XML web services.

    The problem we've encountered is sending binary data. Right now we have to encode the data in base64 XML, which uses lots of resources. I will give this a closer look, but it looks particularly good.
    • Yes, but this is what ASN.1 encoding is for. It's a structured, self-describing encoding scheme that works very well for structured data. What advantages does this binary XML have over ASN.1? Both require external descriptions to attach meaning to the data.

      In your case, ASN.1 is what you should be using, not XML in the first place.
    • All this does is create a standardized process for what it is you're doing. You won't see any improvements with this, because you have to encode the binary data in MIME form.
  • Uhhh... (Score:5, Informative)

    by Phexro ( 9814 ) on Thursday January 27, 2005 @11:10PM (#11500327)
    Unless I'm horribly misreading the specification, it appears to be a way to package up XML documents and binary data that they reference into a neat package with MIME - not a way to convert a (text) XML document into a binary one.
  • I'm a bit confused... reading the document, it seems that the difference between XML and XOP is just where the data is:

    XML:

    <mylabel>(text)</mylabel>
    <mydata>(stuff in binary)</mydata>

    XOP:

    <mylabel>(text)</mylabel>
    <mydata>"hey, there's stuff in binary here, id 1!"</mydata>
    ---- MIME ---
    Binary ID 1: (stuff in binary)

    Is this right? So the benefit is just standardizing the binary representation using MIME? But that doesn't make the tags less verbose... so how is

  • by IHateSlashDot ( 823890 ) on Thursday January 27, 2005 @11:22PM (#11500399)
    I can't believe all of the replies making fun of this because they think it's a binary representation of XML. Didn't anyone read the spec that was referenced in the summary?

    This is simply a way to reference binary data from within an XML document and to have that binary data included in the same payload (using MIME).

    Passing binary data in XML is a big problem. Everybody just invents their own method of doing it (although most are just variations on the theme presented here).

    There is a need for this specification, but it is not groundbreaking or even particularly /.-newsworthy.

    • This is simply a way to reference binary data from within an XML document and to have that binary data included in the same payload (using MIME).

      And you find this less absurd?!?
  • Critiques (Score:5, Informative)

    by Effugas ( 2378 ) * on Thursday January 27, 2005 @11:29PM (#11500432) Homepage
    Ummm... it's "OK". This is probably the least ambitious Binary XML spec imaginable. That may actually be good, but I don't know. Let's see what's up here...

    First of all, it's completely impossible to stream this format. All the binary chunks have to be read at some point in the future, once the actual non-opaque XML content is complete. In a stream, that never happens. (Of course, XML isn't the most stream-friendly protocol... you can't validate a stream.)

    Secondly, this isn't wonderful for large files either; you're constantly seeking for binary data that can be many megabytes away. We solve this in web pages by having the images be completely separate (binary) files.

    Thirdly, it's telling that they used a PNG as a data type. Besides being yet another file format that needs its own custom binary parser (heh, I like PNG, I'm just complaining about it in the XML whinespace), it's big and simple and there's just one there. One of the things I really liked about the various Binary XML formats was the degree to which they expressly typed things like arrays of floating-point values or little-endian integers. Converting values between binary and string format is an enormously painful process, one that frankly I'm astonished hasn't received CPU acceleration at this point. Every other Binary XML format has seriously thought about how to efficiently but correctly manage large arrays of such values. XOP just says... heh... you wanna dump a lot of data efficiently? Check your typing at the door. Feel free to bring a buffer-overflow-ridden parser in with you if you like, though.

    Don't get me wrong, there's a fundamental simplicity to XOP, and I can certainly understand how it's appealing. But it seems to go so massively against what XML represents that I'm not entirely sure XOP-encoded content deserves to be compliant with the very regulations that forced XML adoption in the first place: opaque formats are too expensive to maintain for any amount of time, therefore either self-describe or don't get deployed. A self-describing document that says "All performance-critical content is opaque" seems rather counter to this spirit.
  • by WasterDave ( 20047 ) <davep.zedkep@com> on Thursday January 27, 2005 @11:36PM (#11500464)
    "Remember the recent discussion on Binary XML? Well, this has nothing to do with it, but we are proud to present a standard for larding out XML even more before attaching it to an email."

    I, for one, welcome our new bandwidth eating plaintext overlords.

    Dave
  • For those who didn't RTFA:

    The main application of this XML-referencing-binary-attachments scheme is SOAP, and that means web services.

    In other words, you can simplify your God-help-me-XML-handling-and-parsing code into something maybe 10% simpler. This means leaving the binary stuff OUT OF THE XML PARSER and putting it into the upper levels of processing. Cleaner, faster.

    Also, it helps adaptive compression (gzip) by tightening up the textual data - remember web services are about information transfer, not stora
  • by Camel Pilot ( 78781 ) on Friday January 28, 2005 @12:27AM (#11500685) Homepage Journal
    I am currently writing a XUL client/server application. I am using the XMLHttpRequest function. However, instead of processing XML data, which is very slow - especially when you need to parse a data set several times a second - I started sending data structures as JavaScript code instead. This, I believe, is what Google Suggest does also.

    In addition, the server code is written in Perl, so for storing status and configuration information I used serialized Perl data structures, and processing requirements fell dramatically. With serialized script you still have the clear-text editing and inspection capabilities, without the speed and space issues. For example, instead of:
    <container>
      <title name="title"/>
      <item><name>Name1</name></item>
      <item><name>Name2</name></item>
      <description>Bla bla</description>
    </container>

    You have:

    {
      title => "title",
      item => [ { name => "Name1" }, { name => "Name2" } ],
      description => "Bla bla"
    }
    It seems like serialized script code, whether in Perl, Python, or JavaScript, provides the benefits of XML without the headaches.
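
    For what it's worth, the same record expressed via the JSON flavor of that idea, in Python:

    import json

    record = {"title": "title",
              "item": [{"name": "Name1"}, {"name": "Name2"}],
              "description": "Bla bla"}
    print(json.dumps(record))                        # readable like XML, parsed with one call
    print(json.loads(json.dumps(record)) == record)  # True: lossless round-trip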
  • Find me an XML file that is not already represented as binary data. Oh, not looking so revolutionary now, is it?

    Wait, you say this allows xml to reference binary data? I say "href" attribute, bi-atch, look it up.

    You say, but no, it allows you to send the binary data along in the same stream / document? Check out multipart/mime. It's been around a long time.

    Here's a wild thought. Have the XML file reference its binary resources by relative filenames. Tar the XML file together with the resources. Now pay
  • Full Circle (Score:2, Insightful)

    by roman_mir ( 125474 )
    The circle is complete. We started with binary formats, moved to XML for readability purposes, and then switched XML back to binary for speed.

    Obviously someone needs a knock on the head - when you design your application, don't you think about such things as the balance between performance and maintainability first, and then implement what is better suited to your specific case? Obviously not! Just a little while ago everyone and their grandmother switched to XML for whatever reason, but then they realized:
  • I was just involved in writing a routine that involved Base64-encoding something to be included in XML, then gzipping the whole XML file to recover the space the encoding tacks on. Now I can just do an XOP file. Awesome!
  • Wouldn't it make more sense to include the B for Binary, which is the essential purpose of the new "standard"? Plus, XBop sounds more natural when spoken than XOP does, and it's way more fun too! :)
  • XML has become at least two things since its inception:

    1. an abstract structure consisting of (possibly-nested) elements and their corresponding attributes.
    2. a human-readable representation of that structure

    The interesting part of the story is that #2 came first. Since then, the W3C has recommended the Infoset [w3.org] abstract concept.

    For the developers out there, think of how often you parse the "angle brackets" yourself. Most everyone these days (yes, I know there are exceptions) uses an API which presents ele
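
    Along these lines, with Python's bundled DOM parser (the one-element document is made up):

    from xml.dom.minidom import parseString

    doc = parseString('<m><photo contentType="image/png"/></m>')
    photo = doc.documentElement.firstChild    # no angle-bracket counting in sight
    print(photo.getAttribute("contentType"))  # image/png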

  • by phunqe ( 592716 ) on Friday January 28, 2005 @02:17AM (#11501059)
    Reminds me of a meeting I had a couple of years ago with some representatives for one of the largest market making houses in the US.
    Basically we were promoting an automated trading system, and the first question I get is...

    "Does it use XML?"

    There you have it.
  • by Anonymous Coward
    It's too early yet. I'm waiting until MSBinary_XML comes out

    I hear it's going to introduce 263 special MS tags and nodes and extra layers into the standard that only work in MS Word on Windows XP. It won't validate as XML anymore, but who cares. You will use a special version of FrontPage to do this.

    The files will be a little bigger too; MSBinaryXML will add approx. 257k thanks to the special proprietary MS extensions, but it will have superior functionality compared to other types.

    It will be particu
  • by Kopretinka ( 97408 ) on Friday January 28, 2005 @03:28AM (#11501275) Homepage
    These specs (XOP and MTOM) were created because Web Services people wanted to be able to add binary attachments to XML messages (in SOAP). Initially the attachment technologies (like SOAP with Attachments [w3.org]) worked by just slapping the binary data alongside the XML message, without a clearly defined processing model for the receiver. Now, with XOP, attachments are logically in the XML document but physically transported outside it, without the bloat of base64 or other XML-safe encodings. It's important to notice that XOP is just an optimization of the situation where binary data is put inside an XML document.
  • There are two types of engineers: those who can get the job done, and those who will try to use the buzzwords of the day to get the job done. The former tend to be better engineers, while the latter tend to look better at the beginning of the development process.

    XML will come full circle when true binary XML is a W3C standard. People will be using high-level GUIs to generate text-based XML files, which will be converted into binary XML. On the other end, somebody will receive binary XML, convert it
  • by Drog ( 114101 ) on Friday January 28, 2005 @10:04AM (#11503382) Homepage
    There's a lot of XML-bashing going on here from people talking about how XML is just a buzzword and how XML is not necessary. Sure it's a buzzword, and sure it's unnecessary in some situations. But that doesn't make it useless.

    I create data-driven web apps for a living (i.e. data-driven graphics, UI and text via SVG and HTML), and I firmly believe that XML is the way to go for such creations. It offers a hierarchical structure that is excellent for temporarily storing data pulled from a database, which can then be converted to HTML or SVG or some UI markup (XUL, XForms, or your own thing) via XSLT.

    I don't really care that XML is human-readable - I like the fact that because it is extremely well structured, it is easy to create with authoring applications as well as easy to manipulate in real time with script (i.e. manipulating its DOM).

    I have long wished for a true binary XML spec to make the transmission and parsing/decoding quicker, and this spec isn't it. But I think one day we'll have it, and that won't mean that we've "come full circle" and therefore XML is useless. It just means that we'll have the best of both worlds--speed plus standardized, hierarchical data structures.

  • Microsoft is committed to MTOM as the definitive solution for including opaque data in XML and SOAP messages, and we plan to implement support for MTOM across our XML-aware product line. -- Don Box, Architect, Microsoft Corporation

    Yeah, right... It will be MSMTOM and won't work with anything BUT IE and M$ products. Look at what they did with their version of XML.

"I've finally learned what `upward compatible' means. It means we get to keep all our old mistakes." -- Dennie van Tassel

Working...