Google Submits VP8 Draft To the IETF 156
An anonymous reader writes "Google has submitted an Internet Draft covering the bitstream format and decoding of VP8 video to the Internet Engineering Task Force. CNET's Stephen Shankland writes, 'Google representatives published the "VP8 Data Format and Decoding Guide" at the IETF earlier this month, but that doesn't signal standardization, the company said in a statement. The document details the VP8 bitstream — the actual sequence of bytes into which video is encoded. "We submitted the VP8 bitstream reference as an IETF Independent RFC [request for comments] to create a canonical public reference for the document," Google said. "This is independent from a standards track." The IETF document could help allay one concern VP8 critics have raised: that VP8 is defined not by documentation of the bitstream but rather by the source code of the software Google released to implement VP8. But the IETF document still plays a subordinate role to that source code.'"
Venue choice? (Score:4, Interesting)
Is there any significance to the fact that Google chose IETF instead of ISO (where MPEG-LA and M$ submitted H.264 and OOXML)?
Re: (Score:1)
OK, I just posted a similar question below, and checking Wikipedia seems to imply that they work with ISO:
The Internet Engineering Task Force (IETF) develops and promotes Internet standards, cooperating closely with the W3C and ISO/IEC standards bodies and dealing in particular with standards of the TCP/IP and Internet protocol suite.
So is this meant to bypass ISO standardisation and get it put straight into an ISO standard? That's a pretty imaginative way to play the game; kudos to them.
In a way, they
Re: (Score:2)
So is this meant to bypass ISO standardisation and get it put straight into an ISO standard? That's a pretty imaginative way to play the game; kudos to them.
No, they did this because it's much easier to put out an IETF RFC than it is to make an actual ISO standard. In terms of anything resembling an actual, official standard, Google is starting from a very weak position with regards to WebM. By going through the IETF, they can try to make WebM actually appear as though it's an open standard, whereas H.264 actually is an open standard.
Also, stating that the IETF cooperates closely with the ISO does not imply that creating an IETF standard somehow grants ISO stan
Re: (Score:2)
Pretty spot on except for:
It's not open in the sense that you can't actually write either of these [encoder and decoder], as much of the vital mathematics required are subject to patents.
That is false. You can write both of these. You just have to license the patents, and those licenses are openly available to all parties.
and:
So it depends if you consider 'open' to mean access to the specification, or the legal right to use that access.
Anyone can buy the legal right to use that access. H.264 is the very definition of an open standard (while WebM is not). It's just an example of one that is patented.
You're right that patents limit the ability to completely freely use H.264, but no one is stopped from licensing those patents. They are openly available to one and all.
Re: (Score:2)
Re: (Score:3)
Also, stating that the IETF cooperates closely with the ISO does not imply that creating an IETF standard somehow grants ISO standard status.
Indeed. But the notion that a standard must be from ISO to "count", is of course incorrect -- the canonical example being TCP/IP, which utterly trounced the competing ISO-standardized protocols.
Re: (Score:2)
Also, stating that the IETF cooperates closely with the ISO does not imply that creating an IETF standard somehow grants ISO standard status.
Indeed. But the notion that a standard must be from ISO to "count", is of course incorrect -- the canonical example being TCP/IP, which utterly trounced the competing ISO-standardized protocols.
It trounced them on technical superiority. There goes your example. WebM is technically inferior to H.264.
Re:Venue choice? (Score:4, Interesting)
- RFC 2026 [rfc-editor.org]
There aren't that many actual IETF standards. The standards process isn't even a standard. HTTP is only a draft standard. RFC 1918 (which defines the 10.0.0.0/8 - 172.16.0.0/12 - 192.168.0.0/16 private IP addresses) is only a proposed standard, yet was published in 1996, and is in universal use.
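As an aside, the three RFC 1918 ranges named above are easy to work with programmatically. A minimal sketch using Python's standard `ipaddress` module (the `is_private` helper name is my own, not from any RFC):

```python
import ipaddress

# The three private ranges defined by RFC 1918:
RFC1918_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_private(addr: str) -> bool:
    """Return True if addr falls in one of the RFC 1918 private ranges."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918_NETS)
```

For example, `is_private("192.168.1.10")` is True while `is_private("8.8.8.8")` is False; note that /12 means 172.16.0.0 through 172.31.255.255, a detail people routinely get wrong.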
Re:Venue choice? (Score:5, Informative)
Yes, there is. Read up on the history of MSFT's OOXML going through ISO. It says all there is to be said about ISO certification.
IETF may have its own politics (same as any standards body). However, out of all standards bodies it is the one which is probably the least corrupt.
Re:Venue choice? (Score:5, Informative)
I think it's pretty clear why they created an IETF RFC.
Re: (Score:3)
IETF also has items like RFC1149: A Standard for the Transmission of IP Datagrams on Avian Carriers
In other words, there is no filter; it seems anyone can submit anything to the IETF. My main concern over the WebM "specification" is best summarized by the great analysis at http://x264dev.multimedia.cx/archives/377 [multimedia.cx]:
Re: (Score:2)
IETF also has items like RFC1149: A Standard for the Transmission of IP Datagrams on Avian Carriers
So the fact that the IETF RFCs have a history of April Fools' jokes makes it bad? Perhaps they see the date and allow it because it's a good April Fools' joke. It's also amusing that you chose that one, because someone actually did implement it :). It had a latency of 56 minutes, if I remember correctly.
Re: (Score:2)
Re:Venue choice? (Score:5, Informative)
IETF RFCs are just that - requests for comments. Anyone can publish one. The IETF assigns them a number, and they are public, but that's all. The IETF does not necessarily endorse them, they just publish them so that they can get feedback.
Within the set of RFCs there are some that are designated 'standards track'. These are ones that will eventually become IETF-endorsed standards. Most Internet-related standards are defined by a set of standards-track RFCs. These have a number of requirements, such as being free to implement (no known lurking patents) and having two existing, interoperable, independent implementations.
In contrast, some are informational RFCs, which basically just document existing practice. A company often releases one of these to let everyone else know what they are doing. It's basically a central location for publishing documentation.
Unlike a submission to ISO, this is not a request for standardisation, it's just a slightly more formal way of publishing documentation than popping it up on your own web server.
It's worth noting that publishing an informational RFC is sometimes the first step towards getting something adopted on the standards track. If I were in charge at Google, I would invite the IETF to form a video encoding working group and take control of the evolution of WebM.
Re: (Score:2)
A standards organization that allows competing standards to battle it out is completely worthless in that respect. They're supposed to pick winners and losers otherwise you don't get an interoperable s
Re: (Score:2)
Really? I thought the point was to be able to say that this program uses XYZ standard, then any program using XYZ standard should be compatible with that program. A standards body does not dictate what you can and can't use. Truth is nobody can.... It's called freedom.
Re:Venue choice? (Score:5, Informative)
The IETF is the correct body for something like this, not ISO.
ISO is a standards body, and the function of a standards body in every other industry is to take multiple incompatible implementations of a concept, figure out the best of each and combine them into a single common standard that everybody can support. Politics are an inherent part of it, since the entity whose current products are closest to the eventual standard stands to do well financially. Look at how OpenGL is developed for an example of a proper standardization process. Companies implement the standard, then add extensions to provide new features and give themselves a competitive advantage. Then, at the next standards meeting, OpenGL is enhanced to a common base by taking these extensions and making them part of the next version of the standard.
But for some bizarre reason, software types view standardization as just a giant design process (except design by committee, an extremely political committee). If HTML and CSS were to follow normal standardization procedures, for example, Firefox, Opera, Chrome, Safari and even IE would be free to extend HTML however they want, and then every couple of years the best extensions from all would be combined and rolled into the next version of HTML.
The IETF is the correct body for VP8, because VP8 doesn't need standardization. There are no multiple competing implementations that need to be brought into alignment. It exists, it works, fait accompli. This is the process by which most successful Internet protocols were created. Maybe in the future when people have new ideas about how VP8 can be enhanced, it'll need a standardization process. But for the time being, all we need are the details, published openly and clearly, so anybody can implement it.
Standardization is about evolution, not intelligent design.
Re: (Score:2)
If HTML and CSS were to follow normal standardization procedures, for example, Firefox, Opera, Chrome, Safari and even IE would be free to extend HTML however they want, and then every couple of years the best extensions from all would be combined and rolled into the next version of HTML.
That's pretty much what happens with HTML and CSS. The canvas element in HTML5 and the transform property in CSS3 were initially created and implemented by Apple in WebKit, and later adopted by the W3C.
Re: (Score:2)
And how many incompatible extensions to H.264 were there in its development? By being a standard, participating groups were able to extend H.264, but they didn't have to go through the process jpmorgan laid out. The ISO process allowed the industry to extend MPEG-4 to what it is today. Such a thing is not currently feasible with WebM.
Re: (Score:2)
It is not desirable for WebM, it was specifically designed to be simple and easy to implement (in the sense that there aren't huge amounts of optional features). Having a million profiles and 3D video might be good for MPEG-4, but it is counterproductive to WebM's current goals (baseline for the video tag, at least I'm pretty sure that's where Google is
Re: (Score:2)
Which does absolutely nothing to defend your statements that somehow ISO standards (like H.264) are created by a bunch of companies going around making proprietary standards and all vying to have their standard adopted officially.
To your specific response here, that's exactly why WebM is technologically inferior to H.264.
Re: (Score:2)
You are putting words in my mouth, good day.
You can't debate yourself out of a paper bag.
Re: (Score:2)
Apparently you've never seen how Open Source projects work. There are plenty of incompatible modifications that compete with each other, often replaced, sometimes permanently forked, etc.
On the other hand, I'm unaware of multitudes of competing and incompatible MPEG-4 implementations out in the wild during the standardization process. I am aware of different and not-entirely compatible profiles within the standard, but the standardization process was open, and not merely a competition amongst shipping techn
Re: (Score:2)
You're looking at too narrow a scope. We need standard video codecs, because we have multiple competing implementations of "video codec" that need to be brought into alignment. We have a good one, H.264, but the patents on it displease industry members so they're trying to make another good one without patents -- which is okay, because we often have multiple standards when there are tradeoffs to be made.
Much as the industry doesn't need an independently published international standard for "Duracell form
Let us be honest about H.264 (Score:3)
Is there any significance to the fact that Google chose IETF instead of ISO (where MPEG-LA and M$ submitted H.264 and OOXML)?
Let us be honest about H.264. Where it comes from and how it is used.
H.264/MPEG-4 AVC is a block-oriented motion-compensation-based codec standard developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG). It was the product of a partnership effort known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IE
Remember the details of OOXML certification? (Score:2)
That's current "ISO style", regrettably.
ISO has severely compromised itself by following no standards at all, when certifying certain proposals as a "standard".
I understand Google wanted to stay clear of such an extra-corrupt "standards body" that works under close control by one of Google's main competitors.
Source code is fine! (Score:5, Insightful)
Well, you know, as long as it's not terrible code.
Once upon a time, the RFC for IP and the BSD code base (that *everyone* used) differed in some subtle way. W. Richard Stevens was the first guy to notice, years after both were written.
Guess what happened? They changed the standard.
Re: (Score:2)
Source code is nothing more than a rigorous specification. In fact, it is so rigorous that you can write a translator which automatically translates the specification into a binary you can run on your platform.
No specification will stop a vendor from writing proprietary extensions. If they publish updated source code, then whoever wants to can catch up.
Also, isn't the VP8 source code freely usable?
Re: (Score:2)
Source code makes for a crappy specification.
Take a line that reads 4 bytes that determine the length of a section. Is the field big-endian or little-endian, or does the application have to be compatible with both? Is it supposed to be signed or unsigned? If signed, what does a negative size mean? Can it be 0, and if so, how is that handled? Are there any special values that aren't a literal length but indicate something else? Like 0xffff being used to indicate it's a section with a size >4GB and the real size i
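The ambiguity described above is easy to make concrete: the same four bytes yield different lengths depending on choices (endianness, signedness) that a prose spec has to state outright, but a reference decoder only answers implicitly. A minimal sketch with a hypothetical length field:

```python
import struct

raw = b"\x00\x00\x01\x00"  # a hypothetical 4-byte length field

# Four plausible readings of the same bytes -- a written spec must pick one:
be_unsigned = struct.unpack(">I", raw)[0]  # big-endian unsigned    -> 256
le_unsigned = struct.unpack("<I", raw)[0]  # little-endian unsigned -> 65536
be_signed   = struct.unpack(">i", raw)[0]  # big-endian signed      -> 256
le_signed   = struct.unpack("<i", raw)[0]  # little-endian signed   -> 65536
```

Reading the source tells you which reading the reference implementation happens to use, but not which readings are allowed, forbidden, or reserved.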
Re: (Score:2)
> Source code makes for a crappy specification. [....] But that only tells you what the code thinks it is.

...and if the code *is* the specification, then you know what the specification says.
If you re-implement the code so that it behaves differently, you are not following the specification.
Now, it's entirely possible that we're talking about crappy code here -- but like I said earlier, good code makes a fine spec, IMHO.
Code is almost always deterministic, i.e. it will only do one thing. I have read man
Re: (Score:2)
Well, you know, as long as it's not terrible code.
Once upon a time, the RFC for IP and the BSD code base (that *everyone* used) differed in some subtle way. W. Richard Stevens was the first guy to notice, years after both were written.
Guess what happened? They changed the standard.
The world is far worse off without Mr. Stevens.
Re: (Score:2)
> The world is far worse off without Mr. Stevens.
Truer words have never been spoken.
So what does this mean? (Score:1)
Google could have just released documentation providing the specification, so how does the IETF help? So that they can call it IETF.4628 (or however IETF standards are named)? Or are they looking to make it an internet standard, rather than just a video standard like H.264?
If Microsoft was doing this (Score:1)
If Microsoft were doing what Google is attempting to do, we would all be screaming bloody murder
Re:If Microsoft was doing this (Score:5, Insightful)
If microsoft was doing what google is attempting to do we would all be screaming bloody murder
Google produces open source, makes Linux software, and gives away free web services, and doesn't care if you block ads, which would be trivial to detect and act upon when you're talking about Google's architecture. Microsoft has been convicted of abuse of its monopoly position. It is utterly unreasonable to treat Google the same as convicted criminal Microsoft.
Re: (Score:2)
I mean, that sounds like what you are saying: that Google can do this, and it's awesome, but it cannot be tolerated if we do a 's/Google/Microsoft/'?
Re: (Score:2)
So far, the comments (such as yours) seem to be about why Microsoft is evil, and not about why this move from Google should not be considered bad.
If Microsoft did this very thing, would you be rejoicing the move or would you be screaming bloody murder?
He's been asked and hasn't answered. Now you have been asked.
Re: (Score:2)
If Microsoft did this very thing, would you be rejoicing the move or would you be screaming bloody murder?
He's been asked and hasn't answered. Now you have been asked.
No, I answered the actual question. When Google takes an action that looks suspicious but could benefit everyone, they are probably doing something that will benefit everyone, as suggested by prior performance. When Microsoft does something that looks suspicious but could benefit everyone, they are without exception doing something that will benefit them and fuck everyone else over. We don't know for sure what their goals are yet, and this action could turn out to be positive or negative. As I stated previo
Re:If Microsoft was doing this (Score:5, Interesting)
If microsoft was doing what google is attempting to do we would all be screaming bloody murder
What, you mean producing a standard that actually matches the implementation and irrevocably granting free use of the necessary patents to everyone? How do you know how people would respond? Microsoft has never done that. They've done the exact opposite, though...
Re: (Score:2)
First off, if Microsoft submitted the specifications for the Word format for standardization, in full, with none of that "implement this like Word 97" crap they pulled with OOXML and without any patent over it whatsoever, I guarantee you half the posts would be "crap, Microsoft is actually doing something nice for a change?" with the other half being "how long until it's included in OOo?".
And secondly, there's no viable standard for online video yet so the comparison isn't apropos, the MPEG lovefest on Slashdot notwi
Re: (Score:2)
The Google lovefest on Slashdot is clouding your judgement.
Go ahead and tell yourself that, but in reality, preconceived stereotypes are clouding your judgment.
Haven't you noticed by now that a lot of people on Slashdot are against software patents? That they don't like money being forcibly removed from their wallets just because they tried to give something freely to the world? And that many like to be able to use a Free Software desktop without being treated like second-class citizens due to frivolous IP nonsense?
Put two and two together, man. It's pretty clear w
Re: (Score:2)
Most people on Slashdot these days are obnoxious. That's why Slashdot has lost most of its good commenters.
Case in point.
You go ahead and keep destroying Slashdot, one comment at a time. Just keep insulting people who disagree with you by calling them "stupid". Then you can turn Slashdot into your own little echo cha
Re:If Microsoft was doing this (Score:4)
If microsoft was doing what google is attempting to do we would all be screaming bloody murder
What Google is doing is far from ideal technically, but they have given us reasonable grounds to believe that their intentions are honourable: code that we can use freely, and a patent grant with no strings attached.
The technical shortcomings can be forgiven in view of the need to challenge H.264 quickly, and the need to work around patents held by others. I wish we had a codec with the technical qualities of H.264 and the legal qualities of VP8, but we don't. H.264 is irrelevant to me if I can't use it for legal or economic reasons.
When Microsoft has done something similar (like .NET, OOXML or ActiveX) there have usually been details in the fine print that either tie the technology to other Microsoft products or make it legally dangerous to use. What they have done in the past is not comparable to what Google is doing now.
Even if Microsoft were to reform their behaviour completely, they would quite rightly be scrutinised very closely because of their past misdeeds.
Re: (Score:2)
If Microsoft announced a non-patent encumbered (to the best of their knowledge) codec and released the specification of the format to the IETF, we would not be screaming bloody murder.
I'd be out in the streets, speaking in tongues, for surely if this happened, the rapture would be upon us.
Re: (Score:2)
If microsoft was doing what google is attempting to do we would all be screaming bloody murder
When Microsoft releases their software for Linux under terms that allow it to be included in Debian, we'll be able to judge the truth of this claim. I'm not sure whether you're right or wrong, but I think hell is likely to freeze over before we'll ever find out! :)
Microsoft's history of backstabbing their partners and doing everything they can to lock in their users makes them a very different proposition from Google. I suspect that at least, some people would be shouting "it's a trap" if Microsoft did so
Re: (Score:2)
Why? Releasing something with an irrevocable royalty-free license is not a bad thing to do at all.
Hey Google? You want to win this war? (Score:5, Interesting)
Take some of that immense R&D budget that you have and put a team of programmers on the task of getting VP8 encode/decode acceleration via OpenCL/CUDA.
The x264 team is sitting back and saying it can't be done; meanwhile, a university has already posted the code for a modified x264 that uses the GPU to accelerate the pyramid search. The race has already started.
If x264 is further improved for GPU support and this makes it into FFMPEG, then the race is over...
Re: (Score:2)
I doubt that's going to be particularly difficult for 3rd party devs to do - the libavcodec version is only 1400 lines. I'm sure that there's enough reusable code for decoder blocks floating around to assemble an OpenCL version of it...
Re: (Score:2)
Forgive me for being understandably skeptical when a competitor of a product says something can't be done.
Re: (Score:2)
Hardware acceleration is essential for wide adoption, but in mobile devices to a far greater extent than desktops. If Android devices can be targeted by OpenCL, then Google should definitely put a lot of effort into that. It would also be very interesting if there were a good OpenCL implementation of VP8 and iOS supported OpenCL, since that would hurt Apple's position that H.264 must be used for all the devices that have hardware support for it.
boolean entropy coder (Score:2)
"essentially the entire VP8 data stream is encoded using a boolean entropy coder."
Well, duh.
Re: (Score:2)
A boolean coder is a special case that has an alphabet of exactly 2 symbols, which is not required... or even common.
Most compressors use an alphabet of 256 symbols.
Why do people who don't know data compression make ignorant comments about data compression?
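For the curious, a boolean (two-symbol) entropy coder can be sketched in a few lines. This is a hypothetical textbook version using exact fractions for clarity; VP8's actual coder works with fixed-point integer state and per-context probabilities, not this form.

```python
from fractions import Fraction

def bool_encode(bits, p0):
    """Encode a bit sequence given a fixed probability p0 of a 0 bit.
    Returns a Fraction inside the final interval (the 'codeword')."""
    low, width = Fraction(0), Fraction(1)
    for b in bits:
        if b == 0:
            width *= p0            # take the lower sub-interval
        else:
            low += width * p0      # skip past the 0 sub-interval
            width *= (1 - p0)
    return low + width / 2         # any point in [low, low + width) works

def bool_decode(code, n, p0):
    """Recover n bits from a codeword produced by bool_encode."""
    low, width = Fraction(0), Fraction(1)
    out = []
    for _ in range(n):
        split = low + width * p0   # boundary between the 0 and 1 sub-intervals
        if code < split:
            out.append(0)
            width *= p0
        else:
            out.append(1)
            low = split
            width *= (1 - p0)
    return out
```

The decoder needs the bit count and the same probability model as the encoder; the more skewed p0 is, the narrower the interval shrinks per bit and the fewer bits the codeword ultimately needs.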
Re: (Score:2)
The ignorant gp poster probably thought "boolean entropy coder" was just a synonym for "lossy encoder." I didn't know what it meant and didn't hastily jump to that conclusion.
Smart move, Google... (Score:2)
Creating an RFC was a very smart move for Google.
First off, remember that when WebM burst onto the scene, Google made it pretty clear that they didn't want to monkey around with improving the VP8 codec. Sure, maybe it could be improved, but they basically said that they just wanted to leave it as-is and have people start using the darn thing. The benefit of having it in use out in the wild outweighed any delays.
So, by submitting VP8 to the IETF as an RFC, they're not (necessarily) revisiting the question of
Re: (Score:1)
Not the same thing.
But I understand your point about wanting to avoid another VHS-versus-Betamax or HD DVD-versus-Blu-ray format confusion. It would have a negative impact on consumers, especially those who barely know how to use computers ("How do I make the window fill the whole screen?"). Or those wondering why they can't play the VP8-encoded video in their iGadget, since it only supports MPEG formats.
Re: (Score:2)
Of course, Blu-Ray was announced before [wikipedia.org] HD-DVD...
Re: (Score:2)
H.264 is closed and not compatible with an open web. The winner needs to be VP8, or the web will be less open.
Re: (Score:2)
H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).
(sure, I'm probably lying, but can you prove it?)
Re: (Score:1)
H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).
How many lawyers do you have, how much do you pay them, and how good are they?
Re: (Score:2)
How many lawyers do you have, how much do you pay them, and how good are they?
Hi; I'm mrnobo1024's partner in this. Our IP budget for 2011 is about 10MUSD, but for 2012 we have dedicated 85 billion dollars. That may seem like quite a bit, but you should know a) that the dollar is expected to fall to about 10% of its current value and b) we also have a bunch of suits lined up against companies using MS Windows. This still means that we have 80% of the top lawyers in IP working directly for us and will have well over a billion 2010 dollars to go after the MPEG-LA licensee list.
We eve
Re: (Score:2)
H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).
(sure, I'm probably lying, but can you prove it?)
That's ridiculous. You can use that to argue against the adoption of anything. "HTML infringes on a patent I own, don't use it or I'll sue you!" "Keyboards infringe on a patent I own, don't use them or I'll sue you!" "You can't prove me wrong, so I win!"
The burden of proof is on you.
Re: (Score:2)
I have. There have been independent analyses that call the patent status of VP8 into question.
http://arstechnica.com/open-source/news/2010/05/google-support-aside-webm-carries-patent-risks-from-mpeg-la.ars [arstechnica.com]
Re:WebM will never catch on (Score:4, Informative)
Interesting read, and it brings up two points which need repeating, specifically as to how VP8 does its intra-frame prediction.
From the linked article, Jason Garrett-Glaser (one of the developers of x264) had this to say:
The other interesting point is the fact that the more a company discusses specific patents, the more they legally expose themselves to potential 'willful' violation in patent claims.
This is probably why no one is saying much of anything until everyone is ready to lay their cards on the table.
Re: (Score:2)
LOL.
Re: (Score:2)
That's ridiculous. You can use that to argue against the adoption of anything.
Exactly! Now you see the value of the patent system for companies such as Microsoft and Apple.
Linux gaining too much ground on the corporate arena? "the Linux kernel may or may not infringe on over 200 of our patents, and we may or may not decide to sue everyone that uses it". VP8 threatening their plan for dominance of the online video market? "VP8 may or may not infringe on our patents, and we may or may not decide to sue people over it, eventually".
And the best part? given how corrupt the system is, appr
Re: (Score:1)
WebM/VP8 probably infringe on several MPEG-LA patents
Which parts of WebM/VP8 do you think probably infringe on which patents, and why?
Re: (Score:1)
I'm not privy to the technical details, but the U.S. patent office hands out patents to pretty much anybody who asks for them, and I'm willing to bet MPEG-LA has way more patents than Google has.
Re: (Score:3)
I'm not privy to the technical details
Why not? VP8 is public and has two independent implementations (libavcodec and libvpx) for you to look at. All patents are public. All H.264 patents are listed clearly on the MPEG-LA's website. It's easy to validate any claim of patent infringement. So, unless you are just spreading FUD, point to the patent and point to the part of VP8 that it infringes.
Re: (Score:3)
Here you go [arstechnica.com].
From the article:
"VP8's intra prediction is basically ripped off wholesale from H.264," he wrote. "This is a patent time-bomb waiting to happen. H.264's spatial intra prediction is covered in patents and I don't think that On2 will be able to just get away with changing the rounding in the prediction modes."
Re: (Score:1)
3) Mobile hardware has H.264 compatibility built-in, not so for WebM,
Will this argument DIAF, please. There are basically NO H.264 decoder ASICs in general mobile use, they're all DSPs with an H.264 codec implemented in software. Since WebM/VP8 is so similar to H.264, it would take quite some effort to create an evil DSP that can decode H.264 but can't be made to decode VP8 with comparable features/resolution/bitrate, and I assure you the general-purpose DSP in your smartphone will handle it just fine.
Re:WebM will never catch on (Score:5, Informative)
VP8 probably infringes on several MPEG-LA patents
Does it? The MPEG-LA has not produced any patents that it infringes, On2 presumably checked the (easy-to-find) list of MPEG-LA patents before shipping VP8, and the MPEG-LA is currently asking people to come forward with patents that cover VP8 - not something it would need to do if it already had a large pool of them.
Google has not offered to indemnify anybody who uses WebM
The MPEG-LA does not offer indemnity either. This was demonstrated quite well a couple of months ago when MPEG-LA licensees were sued for patent infringement over H.264.
Mobile hardware has H.264 compatibility built-in, not so for WebM
Most 'H.264 hardware' is really a DSP with a few things like [I]DCT in hardware. This same hardware can be used for VP8 (it's typically already used for MPEG-2 and MPEG-4 ASP).
The media companies have encoded their content in H.264, they can't be bothered to re-encode it to WebM
YouTube is owned by Google, and they're going to be making everything WebM soon. I wouldn't be surprised if they only made the low-quality versions H.264 in the future and required WebM for the higher-quality encodings. This would let them keep iPhone users happy (low quality encoding isn't such a problem on a tiny screen), while forcing desktop users to install a WebM plugin.
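For reference, the [I]DCT mentioned above is a fixed linear transform, which is why the same DSP block can serve several codecs. Below is a naive, unoptimized sketch of a 1-D orthonormal DCT/IDCT pair; real decoders use scaled integer variants, and VP8's own transform differs in detail, so treat this purely as an illustration of the math being shared:

```python
import math

def dct_1d(samples):
    """Forward DCT-II with orthonormal scaling."""
    n = len(samples)
    out = []
    for k in range(n):
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        s = sum(samples[x] * math.cos(math.pi * (2 * x + 1) * k / (2 * n))
                for x in range(n))
        out.append(scale * s)
    return out

def idct_1d(coeffs):
    """Inverse DCT (DCT-III) -- the kind of fixed kernel a video DSP
    implements once and reuses across MPEG-2, MPEG-4 ASP, H.264, etc."""
    n = len(coeffs)
    out = []
    for x in range(n):
        s = coeffs[0] / math.sqrt(n)
        for k in range(1, n):
            s += math.sqrt(2 / n) * coeffs[k] * \
                 math.cos(math.pi * (2 * x + 1) * k / (2 * n))
        out.append(s)
    return out
```

A 2-D block transform is just this applied to rows and then columns, which is why exposing only the transform in hardware still saves most of the work.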
Re: (Score:2)
YouTube is owned by Google, and they're going to be making everything WebM soon. I wouldn't be surprised if they only made the low-quality versions H.264 in the future and required WebM for the higher-quality encodings. This would let them keep iPhone users happy (low quality encoding isn't such a problem on a tiny screen), while forcing desktop users to install a WebM plugin.
Three things wrong/silly that I can see with that statement.
Re: (Score:2)
Few iPhone users use the YouTube website, as there is a native YouTube app preinstalled on each device.
What does that have to do with how Google stratifies their encodings?
Re: (Score:2)
Well, a plugin would only be required for IE users, and only until the sheer momentum of Chrome, Mozilla and Opera forces them to implement it anyway. They don't have much to gain from H.264, so it's unlikely they'd continue the holy war to the end, unlike Apple.
Re: (Score:2)
IE would not need a browser plugin. All they would need is the proper DirectShow/Media Foundation filters and codecs, and it would simply support WebM (it would also cause Windows Media Player and many other Windows apps to support WebM). Just like Safari would merely need the proper QuickTime codecs for WebM to just work.
Graphics Card Manufacturers (Score:2)
Most 'H.264 hardware' is really a DSP with a few things like [I]DCT in hardware. This same hardware can be used for VP8 (it's typically already used for MPEG-2 and MPEG-4 ASP).
At what level is the h.264 decoding provided by the graphics card manufacturers? Do you get a show_h264(x1,y1,x2,y2,&stream_buffer) function or are only those hardware transforms exposed and you have to ship your own decoder?
I get that the DSP can handle WebM's essential bits, but how much buy-in is required from the graphics card manufacturers?
Re: (Score:2)
To the best of my knowledge, full-blown PCs generally don't have DSPs for that sort of thing; they simply use GPGPU techniques, which could work even without GPU manufacturer support, though I believe the manufacturers do provide special support for some common codecs.
However, for mobile products the DSP is often completely separate from the 3D graphics chip, and is often built into the CPU. (Well, not into the ARM core itself, but on the same piece of silicon.)
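The fixed-function block the grandparent describes is, at heart, a separable 2-D transform. A float-precision sketch of a textbook 4x4 DCT-II round trip is below; note that real codecs (MPEG-2, MPEG-4 ASP, VP8) each specify bit-exact integer approximations of this, so this is only an illustration of the shared building block, not any codec's actual transform.

```python
# Textbook 4x4 DCT-II forward/inverse round trip, illustrating the kind of
# separable 2-D transform block shared across MPEG-2, MPEG-4 ASP and VP8.
# Real codecs use bit-exact integer versions; this float sketch is not any
# codec's actual transform.
import math

N = 4

def dct_matrix(n):
    # Orthonormal DCT-II basis: row k, column i.
    m = []
    for k in range(n):
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        m.append([scale * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                  for i in range(n)])
    return m

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

C = dct_matrix(N)
Ct = transpose(C)

def fdct2(block):   # forward: C * X * C^T
    return matmul(matmul(C, block), Ct)

def idct2(coeffs):  # inverse: C^T * Y * C (C is orthogonal)
    return matmul(matmul(Ct, coeffs), C)

block = [[(i * N + j) % 7 for j in range(N)] for i in range(N)]
restored = idct2(fdct2(block))
assert all(abs(restored[i][j] - block[i][j]) < 1e-9
           for i in range(N) for j in range(N))
```

Because the forward and inverse are just matrix multiplies, the same hardware multiplier array can serve several codecs; only the coefficient constants and rounding rules differ per standard.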
Sample code for accelerating common codecs
Re: (Score:2)
GPUs these days are quite programmable; I'm pretty sure there are codec implementations using video cards that are independent of the card manufacturers. That said, both AMD and Nvidia are listed as supporters [webmproject.org], so they have probably bought in.
Re: (Score:2)
(This message is offtopic and a question for TheRaven64)
I have been reading your comments on /. for some time and I always find your posts detailed and interesting. Do you happen to maintain a blog or something similar?
Re:WebM will never catch on (Score:4, Insightful)
Also, Sun took years to open source Java, yet Google open sourced VP8 in months, indicating to me that Google was sloppy and didn't do their due diligence.
Do you have any idea how meaningless that comparison is? The problem with open sourcing Java was that some parts were owned by Sun, some were licensed from third parties, and the licensed parts had to be either relicensed or replaced. In contrast, On2 was already shipping VP8 and had been working on it - specifically working around patents to produce it - since before Google bought them.
It's also worth pointing out that, not only are you comparing completely unrelated things, you are comparing completely unrelated sizes of things. The Java code is a couple of orders of magnitude bigger than the VP8 code.
MPEG-LA indemnifies users for the patents they own, not for patents outside their patent pool, which is way more than Google is offering to do.
Uh, what? You don't actually know the difference between indemnifying and licensing, do you? MPEG-LA and Google both offer licenses to their patents. MPEG-LA has a complex fee scale; Google grants you 'a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable' license (quoted from the patent grant itself) to all of the patents that they own, or will acquire in the future, related to VP8.
Neither indemnifies you against damages from infringing on third-party patents.
More hoops to jump through.
Yes, shockingly, you actually do need to write some code to support new features. Not much (the libavcodec implementation is about 1,400 lines of code, reusing existing decoder building blocks), but slightly more than none.
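The reuse the comment above describes (a small new decoder built from a framework's existing blocks) can be sketched as below. This is a toy: the function names and stub bodies are invented, and libavcodec's real internals look nothing this simple.

```python
# Toy sketch of how a new decoder reuses a framework's existing building
# blocks (entropy decoding, inverse transform, loop filter), so the new
# code stays small. Names are invented; this is not libavcodec's real API.

def boolean_decode(bitstream):   # shared entropy-decoding block (stub)
    return list(bitstream)

def inverse_transform(coeffs):   # shared transform block (stub)
    return [c * 2 for c in coeffs]

def loop_filter(pixels):         # shared deblocking block (stub)
    return [min(255, p) for p in pixels]

def vp8_decode_frame(bitstream):
    """The only *new* code: glue the shared blocks together for VP8."""
    coeffs = boolean_decode(bitstream)
    pixels = inverse_transform(coeffs)
    return loop_filter(pixels)

print(vp8_decode_frame([1, 2, 3]))  # [2, 4, 6]
```

Because the heavy lifting lives in the shared blocks, the per-codec glue can stay in the low thousands of lines, which is the point being made about the ~1,400-line libavcodec VP8 decoder.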
I'm going to get modded down as a troll again for saying this (even though every one of my posts on this topic has been sincere, and labelled "troll" by reactionary Slashdotters), but Google doesn't own any of the content beyond users' home videos. The RIAA, the MPAA, and game studios produce most of the content that people are interested in, anything longer than a 10-minute clip, and they're not going to re-encode it in WebM. Even Apple couldn't bring those companies to their knees, so what makes you think Google will?
You've not visited YouTube recently, have you? They stream TV shows [youtube.com] and movies [youtube.com] (most of them only in the USA, currently), with the consent and cooperation of the studios that own them. Google provides all of the infrastructure for this, including the choice of format.
Re: (Score:2)
You do realize that H.264 has the same problem? If someone were waiting for H.264 hardware decoding to saturate the market before coming forward with a patent that covers it, the MPEG-LA couldn't do anything to help anyone. Everyone would get sued for tons of money, and no one could argue against it. All it takes is a single patent that covers H.264. So H.264 is also a ticking time bomb; why shouldn't people stay away from it too?
Re: (Score:2)
Google hasn't offered to indemnify anybody, as MPEG-LA has.
Who exactly are you claiming that MPEG-LA have offered to indemnify, and what do you claim that they have offered to indemnify them against?
From their FAQ [mpegla.com]:
Re: (Score:2)
So you are saying that SCO had no patents or copyrights whatsoever? What utter nonsense. Of course they did. And so your logic dictates that "since SCO had real patents, any claim they make about patents must be true."
The MPEG-LA doesn't indemnify jack shit. So