

AV1 is Supposed To Make Streaming Better, So Why Isn't Everyone Using It? (theverge.com) 39
Despite promises of more efficient streaming, the AV1 video codec hasn't achieved widespread adoption seven years after its 2018 debut, even with backing from tech giants Netflix, Microsoft, Google, Amazon, and Meta. The Alliance for Open Media (AOMedia) claims AV1 is 30% more efficient than standards like HEVC, delivering higher-quality video at lower bandwidth while remaining royalty-free.
Major services including YouTube, Netflix, and Amazon Prime Video have embraced the technology, with Netflix encoding approximately 95% of its content using AV1. However, adoption faces significant hurdles. Many streaming platforms including Max, Peacock, and Paramount Plus haven't implemented AV1, partly due to hardware limitations. Devices require specific decoders to properly support AV1, though recent products from Apple, Nvidia, AMD, and Intel have begun including them. "In order to get its best features, you have to accept a much higher encoding complexity," Larry Pearlstein, associate professor at the College of New Jersey, told The Verge. "But there is also higher decoding complexity, and that is on the consumer end."
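Taken at face value, AOMedia's 30% figure translates directly into bitrate. A back-of-envelope sketch (the function name and the 10 Mbps example are illustrative, not from the article; real savings vary heavily with content and encoder settings):

```python
# Back-of-envelope: what a claimed 30% efficiency gain over HEVC means for bitrate.
# Illustrative only; actual gains depend on content, resolution, and encoder tuning.

def equivalent_av1_bitrate(hevc_kbps: float, efficiency_gain: float = 0.30) -> float:
    """Bitrate AV1 would need for comparable quality, if it really is
    `efficiency_gain` more efficient than HEVC."""
    return hevc_kbps * (1.0 - efficiency_gain)

if __name__ == "__main__":
    # A 10 Mbps 4K HEVC stream would need roughly 7 Mbps in AV1.
    print(equivalent_av1_bitrate(10_000))
```

Multiplied across millions of concurrent streams, that per-stream reduction is where the CDN savings argument comes from.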
They Don't Care (Score:1)
Clearly the other streaming services just don't care. Bandwidth is too low on their cost matrix for even a significant savings to matter to them. They just don't want to bother with the complexity of setting up another encoding system. It's not like they have to switch; they just need to encode shows in both, and then stream to customers with hardware that supports the new codec. If they just did this for their top hits, they could save a ton of bandwidth with minimal effort.
Re: (Score:2)
Re:They Don't Care (Score:5, Insightful)
No, it's lack of hardware support. I called in to an AOM webinar at the end of October entitled "Is Real-Time AV1 Ready For Prime Time?". Meta, Google, and Microsoft, for instance, all complained that there isn't enough hardware support, e.g. only in higher-end phones or some GPUs. Agora didn't mind so much because most of their users were on desktop browsers and could cope with CPU decoding.
There's also the cost factor. A streaming service isn't going to switch off AVC or HEVC streams just because it has started to use AV1. It wants to support the long tail of users who can't decode AV1, so adopting a new codec increases bandwidth, storage, and CDN costs, as well as making the encoding and packaging pipelines more complex and expensive. Furthermore, many vendors that have already moved from AVC to HEVC (or added it) aren't going to switch to AV1 because the savings aren't big enough; they're more likely to wait for something better, such as VVC.
Complaining that usage is not widespread enough seems to be common and it shows a lack of understanding of codec adoption. HEVC usage is still growing and it was standardised in 2013. AV1 needs a few more years.
Give it more time.
Re:They Don't Care (Score:4, Informative)
Nah. Three issues:
- AV1 playback hardware is not even across the board, particularly in mobile phones and smart TVs. The iPhone 15 Pro (2023) is the first iPhone that supports it. No Android device guarantees hardware support, and considering how utterly trash Android hardware can be, the experience is not uniform; the Pixel 6 and Samsung S21 (both 2021 devices) were among the first with AV1 decode.
- AV1 encoding support only exists in cards most people don't have (RTX 40xx/50xx, Intel Arc series).
- AV1 playback iGPU support is only available in Intel 11th gen and later.
So the AV1 content that really needs the codec (e.g. 4K HDR, 1080p60+) can already be served by fixed-function hardware that most of this stuff has, e.g. HEVC (H.265), and that hardware has already been paid for.
Now, that said, for Google, Netflix, Amazon, Disney+, etc., you aren't going to encode the same show four times (H.264, H.265, AV1, plus whatever the media was mastered in); you're going to encode the show once at a high bitrate that makes sense to convert to these other codecs, in HDR and non-HDR modes.
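The "encode once, fan out to several codecs" idea can be sketched as a small helper that builds per-codec ffmpeg invocations from one high-bitrate mezzanine. The encoder names are the common ffmpeg ones (libx264, libx265, libsvtav1); file names and the CRF value are made up for illustration, and a real pipeline would add per-resolution ladder rungs:

```python
# Sketch: fan one mezzanine file out to several delivery codecs.
# Assumes an ffmpeg build with libx264, libx265, and libsvtav1; file
# names and the quality value are illustrative only.

ENCODERS = {
    "h264": "libx264",
    "hevc": "libx265",
    "av1": "libsvtav1",
}

def ffmpeg_args(mezzanine: str, codec: str, crf: int = 28) -> list[str]:
    """Build (but don't run) an ffmpeg command for one delivery rendition."""
    return [
        "ffmpeg", "-i", mezzanine,
        "-c:v", ENCODERS[codec],
        "-crf", str(crf),
        "-c:a", "copy",          # keep the mezzanine audio as-is
        f"out_{codec}.mp4",
    ]

if __name__ == "__main__":
    for codec in ENCODERS:
        print(" ".join(ffmpeg_args("mezzanine.mov", codec)))
```

Each extra codec is one more pass over the whole catalog, which is exactly the encoding-cost objection raised elsewhere in this thread.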
Unfortunately, until the current smartphone upgrade cycle is done, the majority of devices out there are not going to support AV1, and anyone who wants AV1 as the default has to wait. Twitch and YouTube have been talking about AV1, but never send you AV1 in any situation, even when your hardware supports it.
Re: (Score:2)
Hardware decode support isn't critical for all devices. For example, an iPhone 14 or 15 non-Pro can play AV1 just fine with software decoding, because it has more than enough compute available; it's just much less efficient. That doesn't really change the big picture: lots of devices don't have the processing power to brute-force it, so decode support isn't widespread enough. It just means the situation isn't *quite* as bad.
Easy (Score:4, Interesting)
Re:Easy (Score:4, Informative)
If you have a RDNA2 or above GPU, or an Intel 11th gen / AMD Ryzen 6000 or above CPU, they support AV1 decoding.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Well, OK, but in that case it's not AV1 to blame. I wonder if your machine could even hardware decode HEVC.
Re: (Score:2)
What about my old GeForce GTX 750 Ti and 8800 GT?
Re: (Score:3)
Re: Easy (Score:3)
I can't speak to every codec because I haven't tested them all, but Nvidia does a great job with H.265. I was disappointed with their H.264, but the 265 is peachy.
Re: (Score:2)
Re: (Score:2)
I have a 4060 Ti, but I'm not honestly sure if they're using different algorithms on different GPUs or if they just run faster on newer hardware. Both H264 and H265 were very fast, but for a given bitrate the H265 just craps on the H264. I did not compare to other encoders, though. I was just finding something I could live with that would play on my TV.
how similar are the subscribers bases (Score:3)
Prime Video has a lot of users because it is bundled. Netflix is probably the biggest name in streaming.
Is there any reason to think the same general population has peacock or paramount+?
I have had Paramount off and on so I could see the latest Star Trek, but other than that it seems like a lot of back catalog that was not especially worth revisiting. I have no idea what the user makeup is or isn't, but maybe the people who want to pay to watch old comedies from the '80s and sitcom reruns from the '90s overlap with the people whose hardware can't use AV1?
It's really not that much better than VP8/VP9, and you'll need a pretty recent Apple device to get hardware support for AV1. Maybe the percentage of subscribers who could use it just isn't there for the niche streamers.
Re: (Score:2)
We get both Paramount and Peacock because of bundles. The latter is standard for any long time Comcast customer. The former... I don't remember what bundled it (not our phone, I know that, I want to say it was a store's "plus" service) but that's how we have it.
But yeah, they're bundled for a lot of people.
Re: (Score:2)
That actually might explain it right there. My parents still have 'cable' even though it has been IPTV underneath for over a decade. Still, the Comcast STB they have was probably last replaced at least five to seven years ago.
I bet that is true for a lot of subscribers and given the ownership situation with Peacock that probably means a huge portion of their viewership doesn't have a device that does AV1.
AV1 making streaming better? (Score:2)
It will make services stick with an idea longer than a handful of episodes before cancelling?
Complexity (Score:2)
AV1 is insanely complex to decode compared to H.264 or even H.265, and very few devices have a hardware decoder.
The trade-offs of lower bandwidth and no royalties really don't give any advantage at all, at least not now.
It's device support (Score:5, Informative)
And seriously nothing else. Everything else in the article is just fluff. I worked for a streaming company when H.265 was taking hold, and it was no different. We fairly regularly surveyed all of the devices using our service to weigh how many of them would actually benefit if we started encoding our whole library in H.265.
It's the same thing: Apple in particular was really slow to adopt AV1. No Apple TV supports it, and only very new iPads, iPhones, and Macs do. (All of the ones for sale today do, but most streamers' customers will still be on older ones.) Android and PCs are a bit better, but not by much. Big streaming services will start to adopt it once they can say something like 40%+ of their users will get a good experience from it.
Even if it saves 50% bandwidth, that isn't worth some very expensive re-encode of a huge library if only 20% of users get the benefit. Just like H.265 ten years ago, the encoders are still really resource-intensive but getting better every day, so they wait.
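That trade-off is simple arithmetic: only the fraction of clients that can decode the new codec actually saves anything. A sketch (the 20%/50% numbers are the hypothetical ones from the comment above, not real measurements):

```python
# Expected fleet-wide bandwidth saving from adding a codec:
# only clients that can decode it contribute any saving at all.

def fleet_savings(adoption_fraction: float, per_stream_saving: float) -> float:
    """Overall bandwidth reduction across all streams served."""
    return adoption_fraction * per_stream_saving

if __name__ == "__main__":
    # 20% of devices decoding AV1, each saving 50% bandwidth -> 10% overall.
    print(f"{fleet_savings(0.20, 0.50):.0%}")
```

Whether 10% overall justifies re-encoding an entire catalog is exactly the judgment call each service is making differently.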
Re: (Score:2)
Re: (Score:2)
Even if it saves 50% bandwidth, that isn't worth some very expensive reencode of a huge library if only 20% of users get that benefit.
And this is the kind of mindset that shoots most businesses in the foot. It doesn't matter that not every customer can receive AV1; you serve each of them whatever the best version is for their hardware. If they can receive AV1, the company immediately saves money on bandwidth, which is more expensive than encoding. AV1 hardware will only increase in availability over time, so you might as well be somewhat ready. It also helps to have content people actually want to watch; Max, Peacock, Paramount, and Apple TV are awful for that.
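"Serve whatever the best version is for their hardware" is essentially preference-ordered negotiation. A minimal sketch (the codec names and preference order are assumed for illustration, not taken from any real service's logic):

```python
# Minimal codec negotiation: try the best-compressing codec first,
# fall back to whatever the client can actually decode.

PREFERENCE = ["av1", "hevc", "h264"]  # best compression first

def pick_codec(client_codecs: set[str]) -> str:
    """Return the best codec this client supports; H.264 is the
    universal fallback since effectively everything decodes it."""
    for codec in PREFERENCE:
        if codec in client_codecs:
            return codec
    return "h264"

if __name__ == "__main__":
    print(pick_codec({"h264", "av1"}))   # av1
    print(pick_codec({"h264", "hevc"}))  # hevc
    print(pick_codec({"h264"}))          # h264
```

In practice this negotiation happens via the codecs advertised in the streaming manifest (DASH/HLS) rather than an explicit function call, but the selection logic is the same shape.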
Not that much better than VP9 (Score:3)
Re: (Score:2)
I played with svt-av1/ffmpeg about a year ago and my conclusions were about the same as yours, or worse. Depending on the source material, the space/bandwidth savings were between about 15% and -10% (yes, 10% larger than the source material), while power consumption and CPU utilization were much larger than with software H.264/H.265.
Re: (Score:2)
Was that on 1080p or 4K? From what I remember, AV1's major benefit would have been at 4K, but it has taken time for 4K to become the standard. Even now, while some new content is 4K, older content is at best upscaled. As a matter of need, AV1 just hasn't been a priority, hence the slow adoption rate.
It parallels the discrepancy between Blu-ray and DVD adoption. Most consumers got immediate benefits upgrading from VHS to DVD. There were some benefits to upgrading to Blu-ray, but many still had older collections on DVD.
Re: (Score:2)
Last time I tried the svt-av1 encoder in ffmpeg, it still didn't pass captions/subtitles through like libx264/libx265 do. Getting those into the final encode is a requirement for me, and I don't like having to do several extra steps to make that happen.
Also, only one of my three Roku units supports AV1 playback, so I haven't bothered for that reason either. libx265 at defaults looks worse than libx264, so everything I've transcoded for archiving is with libx264.
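For what it's worth, ffmpeg's automatic stream selection picks at most one stream per type, which is often why subtitle tracks go missing; mapping everything explicitly with `-map 0` and copying subtitles with `-c:s copy` carries them all through. A sketch that just builds the command line (file names illustrative; whether subtitle copy works also depends on the output container):

```python
# Sketch: an ffmpeg command that keeps every subtitle track while
# re-encoding the video to AV1. `-map 0` selects all input streams
# (ffmpeg's default picks at most one per type); `-c:s copy` passes
# subtitles through untouched. File names are illustrative.

def av1_with_subs(src: str, dst: str) -> list[str]:
    return [
        "ffmpeg", "-i", src,
        "-map", "0",
        "-c:v", "libsvtav1",
        "-c:a", "copy",
        "-c:s", "copy",
        dst,
    ]

if __name__ == "__main__":
    print(" ".join(av1_with_subs("movie.mkv", "movie.av1.mkv")))
```

Matroska output (`.mkv`) is the safe choice here, since it accepts most text and bitmap subtitle formats without conversion.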
Re: (Score:2)
20% smaller than VP9 is still in the ballpark. TFA says 30% vs h.264. Not to mention your tests don't mean anything because any real video encoding/decoding is not going to be purely CPU.
Slow and slower? (Score:2)
Re: (Score:2)
Hardware support (Score:2)
A lot of older hardware that people may still be watching on doesn't support AV1 decoding. Only very recent gen video cards support AV1 encoding.
Also, realistically, it's just not THAT much better, IMHO. H.265 is a significant step above H.264, but AV1 vs. H.265 just isn't as dramatic (quick Google searching says H.265 offers a 50% reduction in file size versus equivalent-quality H.264, whilst AV1 typically only yields about a 12% reduction versus H.265).
Personally for my Jellyfin library where I archive
Re: (Score:1)
This absolutely is the issue. I'd need to pick up at least an Nvidia 4060 or a Radeon RX 7900 to get hardware encoding. We still have people fighting tooth and nail to stay on Windows 10 because their hardware doesn't support Windows 11.
It will probably take about 10 years for support to be widespread enough to matter. Throw in support being subpar (as it's new) and H.265 will hang on for a while. H.265 is likely to hang on just as well as GIF and JPEG did, through the sheer momentum of how much already exists; AV1 will end up taking over only gradually.
More expensive and more trouble (Score:2)
Encoding is more expensive and slower, and you get more bugs with all the random setups people use to play videos. H264 just works.
Hardware (Score:2)
Support for hardware decoding is still pretty new but it will get there eventually. Nobody will notice the changeover except for obsessive nerds.
Make Streaming Better (Score:2)
How about including all the old titles in your on-line catalog? I'd even put up with a bit more buffering delay if I could only just watch the movie.
DRM (Score:2)
Publishers want a format that can do DRM. Even when they don't use DRM yet, they may want it later, or have some files protected and some not. Guess which formats work with common DRM and which don't.
Energy consumption? (Score:2)
I'm talking about the total: encoding, decoding, file transfer (but let's forget the cost of renewing hardware).
Genuine question. Does anyone know?