FFmpeg's VP9 Decoder Faster Than Google's
An anonymous reader writes "A VP9 video decoder written for FFmpeg, FFvp9, now holds the title of world's fastest VP9 video decoder. FFvp9 is faster than Google's de facto VP9 decoder found in libvpx, but this doesn't come as much of a surprise: a few years back FFmpeg also produced a VP8 decoder faster than Google's, in both single- and multi-threaded performance."
Faster is not necessarily better: Quality matters. (Score:3, Interesting)
Re: (Score:2, Informative)
If you were talking about VP8 you might have a point, but VP9 is widely superior to h.264 in a number of areas and competitive with h.265.
In short, before you write about what a loser someone is, you should have a sound idea of what you are babbling about.
The true believer mods-up "Informative" (Score:2)
If you were talking about VP8 you might have a point, but VP9 is widely superior to h.264 in a number of areas and competitive with h.265.
No links = No proof.
Re: (Score:2)
VP9 is *almost* as good as h.264, not h.265. h.265 is much much better than both. You need to recheck your facts.
Re: (Score:2)
Factually false. VP9 is approximately on par with h.264 by design. The goal of VP9 is not to improve quality over h.264, but to shrink the amount of data needed. Its end goal is to be less data-intensive than h.265.
As such, it's strictly inferior to h.265, which aims to improve quality while reducing data requirements. The current implementation, as far as I know, is inferior to HEVC in all areas. Of course, both implementations are very much a work in progress at this stage.
Re: (Score:3)
100 years ago, nothing supported H.264 in hardware either, yet here it is. I know, let's waste money making hardware for codecs that aren't standards yet!
Well HEVC decoding does have hardware support and is rolling out to consumer devices right [androidos.in] now [hdtvtest.co.uk]. The shift to 4K which requires new hardware for everybody is probably the only chance at making a new standard, otherwise you'll have a non-trivial number of consumers with non-VP9 devices which means HEVC will be the de facto standard. Google has made a lot of bluster about their codec before, but YouTube still serves up video in H264. Unless they get really serious about pushing VP9 in hardware and very soon, h
Re: (Score:2)
VP8 didn't have a chance, because H.264 was long established and popular YEARS before VP8/WebM was even released. H.264 had 7 years of no competition, so yeah, it caught on, and VP8 didn't.
This time, though, VP9 was released head-to-head with H.265. Things could be very different. While companies start coming up with H.265 decoders, VP9 will be right there next to it, completely free to im
Re: (Score:2)
Because developing software for it, as well as possibly dedicated silicon (if you're not talking about GPU decoding but actual dedicated hardware as is done in mobile in many cases today) is far from free.
Re: (Score:2)
The software will be there. Google already requires Android devices be able to decode WebM, and I would expect them to update that to include VP9/Opus sometime in the future. Not to mention they provide a built-in media player to handle the formats.
Re: (Score:2)
I don't think you quite understand the complexity of the task at hand. Just because a huge company supports it doesn't mean that it will be there. Google itself is not exactly awesome with video players today (one of the big reasons to buy Samsung, for example, is its media player, which is vastly superior to Google's own and supports far more formats).
Specialists who actually can do necessary software design and programming in the field are rare, and most of them are already employed under very go
Re: (Score:2)
As one of those specialists, and part of the ffmpeg crowd, I can safely say you're utterly wrong on just about every count.
Re: (Score:2)
Okay.
I guess then we'll be seeing tens or hundreds of ffmpeg-like encoders that are free (as in beer), free (of bugs), actually functional, and with functional GUI front ends.
Makes me wonder why we didn't see that back in the DivX days, or in the h.264 days going on right now, if you are indeed correct and I'm wrong. What's holding all these people back?
Re: (Score:2)
100 years ago, nothing supported H.264 in hardware either, yet here it is.
100 years ago, there were no 3G phones. H.264 has been in hardware since the first 3G phones came out. To overcome this installed base, a streaming service that relies on VP9 would have to ship all-new phones to subscribers who happen to have purchased a phone before VP9 hardware support became common. How would it make a profit?
Re: (Score:2)
That's only partially true. Some parts of h.264 decoding were indeed in the phones, but early ones were cripplingly limited. I have an older 3G phone that can only decode h.264 MP at 180p. Anything more than that and it will not decode. Installing a player with a software decoder enables decoding of more, but at hilarious speeds of fractions of a frame per second.
Re: Faster is not necessarily better: Quality matt (Score:5, Insightful)
I haven't seen dropped frames in video in longer than that... on my desktop. My AMD E-350 based netbook, on the other hand... when it runs into something incompatible and can't do hardware decoding, it gets bad.
Besides, even if you have a decently powerful laptop, each second your CPU spends in higher performance states costs you battery runtime. Faster code gives you less heat and longer battery life for free.
Screen is the limiting factor (Score:2)
even if you have a decently powerful laptop, each second your CPU spends in higher performance states costs you battery runtime.
Yes, but is it significant? I thought the screen was the limiting factor, and the help file for that just says "reduce the backlight brightness".
Re: (Score:2)
Yes, it is. If you were paying attention, Intel's Haswell managed to increase "light use" battery life (where the system spends most of its time idle) by 50% just by reducing the idle power drawn by the processor and platform.
If the processor has that much battery life impact when doing NOTHING, you can imagine it's of major importance to keep it idle as much as humanly possible.
Re: (Score:2)
I'm not sure why more desktops and notebooks don't have hardware decoders for most of the popular formats.
They do. Most desktops, notebooks, and smartphones have hardware decoders for the most popular format. VP8/9 isn't it.
Re: (Score:2)
Not entirely sure what CPUs you have been using this decade, but unless you have always been on a very powerful server/gaming-grade machine, or only decoded stuff encoded with older codecs, you are presenting an impossible scenario.
CPU power is very much behind the curve even today and has been so for the last two decades at the very least when it comes to real-time decoding of cutting-edge video compression technology. It remains one of the barriers preventing widespread adoption of h.265 - in fact much of t
Re: (Score:2)
...yeah, but when did you last run out of CPU when decoding stuff? (Counting on you not having done it on a Raspberry...)
However, since this is software we're talking about, it's entirely plausible that Google's version looks crappier too....
Re: (Score:1)
Agreed. And am I the only person in the world who still laments the transition to digital broadcast TV because analogue degraded far more gracefully? It was always a worse picture rather than a jerky/stalled picture, and worse sound rather than broken-up sound.
And, no, watching TV over the Internet is not better at all: it's an utterly inefficient waste of bandwidth, and subject to the Internet's lack of decent QoS: e-m waves OTA don't suffer congestion!
All that before the problem of having 400 crap ch
Re: (Score:2)
e-m waves OTA don't suffer congestion!
That's because the FCC or its foreign counterpart has limited the selection of available programs to ease congestion.
All that before the problem of having 400 crap channels rather than 4 good ones.
What you define as a crap channel doesn't necessarily match what someone else defines as a crap channel.
Re: (Score:2)
I gave up on TV when analog OTA was switched off (in a country that used SECAM). Switching to digital should have made great sense because of the spectrum savings, but the 10 or 15 or so new channels are mostly garbage, the equivalent of shovelware. So the spectrum was polluted back anyway.
You have to buy a flimsy digital tuner piece of shit to get TV on an old set instead of using nothing, and live with two remotes instead of one (and it takes up a scarce SCART input). The quality went down, because y
Re: (Score:2, Funny)
The difference is because it doesn't send every twentieth frame to the mothership so they can target you with ads for buttplugs.
Or whatever else prominently features in the movie you're watching. Obviously. Just an example.
Re: (Score:3)
Why are you watching the Justin Bieber movie again?
Re: (Score:2)
he made a movie?
Re: (Score:2)
Yeah, it was on Amazon's Instant Video service for a while (it still might be, but it has moved far enough away from the movies I want to watch that I don't care enough to look for it).
Re:Faster is not necessarily better: Quality matte (Score:5, Funny)
Stop pulling examples out of your ass.
Re: (Score:2)
Now that's what you call an "analogy"!
Re: (Score:2, Informative)
They're talking about decoding, where the output is either correct or it's not. Unlike encoding, where you can trade speed for quality and where a bad implementation can achieve little of either and still be 'correct'.
Re:Faster is not necessarily better: Quality matte (Score:5, Informative)
This is false. Decoding for modern video formats is strictly defined, and all decoders must produce bit-perfect output. You can add as many filters as you want after that, but that's a postprocessing step in the video player and has nothing to do with the decoder. Things like in-loop filters are strictly defined as part of the decoding process and must be there for the decoder to be considered correct.
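As a concrete illustration, here is one way you could check that yourself. This is a minimal sketch, assuming an ffmpeg build that includes both its native VP9 decoder and libvpx, with "test.webm" as a placeholder file; framemd5 is ffmpeg's per-frame MD5 report format:

    import subprocess

    def frame_md5s(decoder_opts, infile):
        # Decode infile and return ffmpeg's per-frame MD5 report.
        cmd = ["ffmpeg", "-nostdin", "-v", "error", *decoder_opts,
               "-i", infile, "-f", "framemd5", "-"]
        return subprocess.run(cmd, capture_output=True, text=True,
                              check=True).stdout

    # With no -c:v option ffmpeg picks its native decoder (ffvp9);
    # "-c:v libvpx-vp9" before -i forces Google's libvpx decoder.
    native = frame_md5s([], "test.webm")
    libvpx = frame_md5s(["-c:v", "libvpx-vp9"], "test.webm")
    print("bit-exact match" if native == libvpx else "outputs differ")

If both decoders are correct, every frame hash matches, regardless of which one is faster.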
Re: (Score:2)
Better decoders add filters to reduce compression artefacts
You can add as many filters as you want after that, but that's a postprocessing step in the video player and has nothing to do with the decoder.
Let me rephrase: Subjectively better decoders aren't pure decoders at all. They're combinations of a (possibly hardware-accelerated) decoder with an (also possibly hardware-accelerated) additional post-processing filter that enhances subjective video quality. All video player front-ends using this decoder-plus-filter component benefit from the filter's enhancement.
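Sketched out, that separation looks something like the following. Every name here is made up for illustration; the point is only that the bit-exact decode stage and the cosmetic filter stage are independent components:

    def decode(bitstream):
        # Stand-in for a real, bit-exact decoder (ffvp9, libvpx,
        # hardware): every conforming decoder must return identical
        # frames for the same input.
        return [bytes(bitstream)]  # dummy single "frame"

    def deartifact(frame):
        # Optional player-side enhancement filter. Not part of the
        # codec spec; players are free to swap it out or skip it.
        return frame

    def play(bitstream, postprocess=None):
        for frame in decode(bitstream):
            yield postprocess(frame) if postprocess else frame

    # A "subjectively better decoder" is just play(bits, deartifact).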
Re: (Score:2)
That's only true when you have an intact bitstream.
Recovering from dropped or badly ordered packets is usually different between decoders.
Re: (Score:1)
Core 1 Duo machines are perfectly usable if the software is just written sanely.
Bad choice of example. Core 1 machines are pretty much garbage thanks to one stupid choice by Intel: the return to 32-bit. 64-bit was already standard at that point. We could have had Windows 7 as a 64-bit exclusive if it weren't for the fucking Core 1 and similar-timeframe Atoms, which meant there were 32-bit-only systems still under warranty.
Fuck Intel for releasing those things.
Re: (Score:3)
Fuck you. Windows 7 32-bit is needed for all those Athlon XP and Pentium 4 machines still around. Enough of them will be turned into botnets in a couple of months already.
Businesses would have gotten all pissy anyway without the 32-bit version, which allows running Windows 3.1 software or 32-bit software with random silly issues (and Windows 2000/XP drivers too).
Re: (Score:2)
I'm sure you can run bleeding-edge Linux with LXDE and Gnumeric on that Celeron 333, and edit spreadsheets just fine. That Celeron is better than a Raspberry Pi: it has about the same CPU performance, and better disk and networking I/O (done over a PCI bus, whereas the Pi does it all on a single USB port). If we could buy a cheap-ass H264 etc. PCI decoding card, the Celeron would be surprisingly current.
Re: (Score:3)
Dude. It has the same hash value. What do you think?
Re: (Score:1)
I don't know. MD5 is no longer considered a collision-free hash. It is entirely plausible that the VP9 implementation would output the same MD5, but not the same output, for a few frames. They should really upgrade to SHA-256 or stronger.
Re: (Score:1)
I don't know. MD5 is no longer considered a collision-free hash. It is entirely plausible that the VP9 implementation would output the same MD5, but not the same output, for a few frames.
Really? I think you do not know what you are talking about. There is a higher chance of the sun going nova tomorrow than of two decoders' outputs producing the same MD5 hashes by accident.
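Back-of-the-envelope support for that, assuming the standard birthday bound of roughly n^2 / 2^129 for an accidental collision among n hashes:

    # Even hashing a trillion frames, the accidental-collision odds
    # are absurdly small. Engineered collisions are a different story,
    # but a buggy decoder doesn't get to engineer its output.
    n = 10**12            # number of frame hashes in a huge test suite
    p = n * n / 2**129    # birthday-bound collision probability
    print(f"p ~ {p:.1e}") # ~ 1.5e-15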
Re: (Score:2)
So, is the quality of the output equivalent, or has it suffered due to compromises made for the speed increase?
It probably just means the reference implementation wasn't optimized very much.
Re: (Score:2, Informative)
Today's formats are different from the MPEG2 and DivX ;-) of yore and generally require decoding to be a bit-exact process. There is only one correct way to decode a frame. If a decoder gives a different result, it doesn't just have "bad quality", it's wrong. This has obvious benefits compared to the days where you had to be lucky and hope that your decoder used the same DCT implementation as the encoder, or suffer suboptimal results. However, since this fact isn't very well known yet
Video decoders offer filters to hide artifacts (Score:2)
Re: (Score:2)
Also, how often does it crash?
The current crash-record holder on my computer is ffmpeg.
Re: (Score:2)
I'm not a video expert, but I did write an H.261 codec once.
I don't think it's practical in a VP9 decoder to save time by cutting quality. The Huffman decoding stuff all needs to be bit-exact. The DCT is pretty standard; you would just get a fast implementation of DCT and use that.
I suppose you could sleaze the mixing and filtering steps but the results would probably be so horrible that nobody would want to see it... part of how video decoding works is to refer back to previously-decoded images. The way
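For reference, here is the textbook DCT-II those transforms descend from, as a naive sketch. VP9 actually specifies exact integer transforms, which is precisely why a decoder can't substitute a sloppy floating-point version like this one and stay bit-exact:

    import math

    def dct_ii(x):
        # Naive O(N^2) DCT-II of a 1-D block, e.g. one row of pixels.
        n = len(x)
        return [sum(x[i] * math.cos(math.pi * (i + 0.5) * k / n)
                    for i in range(n))
                for k in range(n)]

    # A flat block puts all its energy in the DC coefficient:
    print(dct_ii([64] * 8))  # [512.0, ~0, ~0, ...]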
Whoo-hoo! (Score:1)
Ever faster porn!
Re: (Score:2)
Or less fan noise while watching porn.
Re: (Score:2, Insightful)
Or less fan noise while watching porn.
If you want less fan noise you should go with a format your system has hardware decode support for. That would not be VP9.
Re: (Score:2)
We're on a site traditionally slanted towards Linux and FLOSS. Even when we have decoding hardware, it's not supported most of the time. There's even recent news about Ubuntu 14.04 specifically not supporting some of it, depending on your hardware and drivers!
Is that a feature or a bug? (Score:3)
Re:Good, Fast and Cheap... Pick Any Two (Score:5, Insightful)
My understanding is that there is no room for decode artifacts in this - you either do it right, or it's not a proper decoder. This is a proper decoder, so will produce identical output to the google standard one. I believe there are test streams with md5s for the test frames, and this decoder passes the tests.
So, it's free, and it's correct, and it's fast. I think you have pre-conceived prejudices which are in this case wrong ;-)
From my perspective, faster is good for low power devices, so if this helps spread decent video codecs to more devices, that's a win.
Re:Good, Fast and Cheap... Pick Any Two (Score:4, Informative)
A decoder is indeed normally specified with bit-level output requirements. Two different decoders thus generate exactly the same decoded output. Some hardware decoders do generate "wrong" output, but it is either that or 3-4 times as much battery drain. It is also not so important whether all 18 bits of output are right when watching a full-HD movie on a 320x480 screen.
"De Facto"? (Score:1)
decoding may be faster, but encoding is still drea (Score:4, Insightful)
This may be OK for Google, which encodes a video once and then sends it to many, many customers (YouTube); the bandwidth savings pay for the increased CPU cost.
But for most users, that's just not acceptable. Until they get the speed up to a reasonable level, we'll keep using what works: x264 or VP8.
Re:decoding may be faster, but encoding is still d (Score:4, Informative)
Most users never encode a single video in their life. (Except for cameras on devices, and who is doing 4k video on their phone these days?)
And if encoding takes 50x longer, that's 50x the resources Google needs to keep up with the workflow.
So you have it totally backwards.
Not to mention that we are talking about 4k-targeted codecs, so you should be comparing to H.265, not H.264. The additional computations for encoding H.265/VP9 are there to reduce bandwidth requirements. If you don't care about bandwidth, feel free to generate a 5GB H.264 video.
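The trade-off is tunable, though. A sketch, assuming an ffmpeg build with libvpx: the encoder's -deadline and -cpu-used options trade compression efficiency for encode speed, and the file names and bitrate here are placeholders:

    import subprocess

    # Higher -cpu-used = faster encode, somewhat worse compression.
    # In "good" deadline mode libvpx uses cpu-used values 0..5.
    for cpu_used in (0, 2, 5):
        subprocess.run(["ffmpeg", "-y", "-i", "in.y4m",
                        "-c:v", "libvpx-vp9", "-b:v", "2M",
                        "-deadline", "good", "-cpu-used", str(cpu_used),
                        f"out_cpu{cpu_used}.webm"], check=True)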
WebRTC (Score:2)
We will likely see an increase of client-side encoding VP9 through WebRTC.
Let others debate the extent of use, but browser-based video chat and screencasting will likely increase. This should pop many I've-never-encoded-a-video-before cherries.
The usage of VP9 will not be solely for 4k videos.
ffmpeg/Google? (Score:3)
Google uses ffmpeg quite a lot through YouTube. I wouldn't be surprised if they'd contributed quite a lot to the ffmpeg codebase, fixing bugs and performance issues. How much of this did Google's staff actually write?
Go ffmpeg! (Score:1)
I love H.264. I do not love that it's proprietary, nor do I believe it should be. x264? I hope someone can create a free codec equivalent or superior to H.264, so I give props to Google for giving it a go with VP8 and VP9.