Netflix Uses AI in Its New Codec To Compress Video Scene By Scene (qz.com) 67

An anonymous reader shares a Quartz report: Annoying pauses in your streaming movies are going to become less common, thanks to a new trick Netflix is rolling out. It's using artificial intelligence techniques to analyze each shot in a video and compress it without affecting the image quality, thus reducing the amount of data it uses. The new encoding method is aimed at the growing contingent of viewers in emerging economies who watch video on phones and tablets. "We're allergic to rebuffering," said Todd Yellin, a vice president of innovation at Netflix. "No one wants to be interrupted in the middle of Bojack Horseman or Stranger Things." Yellin hopes the new system, called Dynamic Optimizer, will keep those Netflix binges free of interruption when it's introduced sometime in the next "couple of months." He was demonstrating the system's results at "Netflix House," a mansion in the hills overlooking Barcelona that the company has outfitted for the Mobile World Congress trade show. In one case, the image quality of a 555 kilobits per second (kbps) stream was matched by one using roughly half that bandwidth.
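The "analyze each shot, compress each shot differently" idea can be illustrated with a toy sketch. This is not Netflix's actual Dynamic Optimizer; the complexity scores and bitrate range are made-up stand-ins for whatever the real system measures.

```python
# Toy per-shot bitrate allocation: spend bits where the content needs
# them, instead of one fixed rate for the whole title. The complexity
# scores and kbps range below are hypothetical.

def bitrate_for_shot(complexity, floor_kbps=100, ceiling_kbps=555):
    """Map a 0.0-1.0 complexity score to a bitrate in kbps."""
    span = ceiling_kbps - floor_kbps
    return round(floor_kbps + complexity * span)

# Hypothetical shots: a static dialogue scene, a mid-motion scene,
# and a fast action scene.
shots = [("dialogue", 0.1), ("walking", 0.5), ("car chase", 0.9)]
plan = {name: bitrate_for_shot(c) for name, c in shots}
```

A constant-bitrate encode would give all three shots the ceiling rate; the per-shot plan gives the quiet dialogue scene far fewer bits at (in principle) the same perceived quality.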
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • AI? (Score:3, Insightful)

    by jgotts ( 2785 ) <jgotts AT gmail DOT com> on Thursday March 02, 2017 @10:20AM (#53961511)

    Why are they calling it AI? That's silly.

    It's just an improved encoding scheme with better algorithms.

    Nothing new to see here. We've been improving video encoding schemes since we started encoding video.

    • Because AI is this year's great new marketing term, like 3d printing was a few minutes ago, like cloud computing was a few hours ago...

      All it means is someone can charge more for something, somewhere, while we poor suckers get used to what [in my opinion] will most certainly be dumbed-down, lower-rez imaging.

      Watch. In a few years they'll be pushing what, to my eyes, looks like 8-bit pixellated graphics as high quality and it will be accepted. But I'm a cynic.

      • In addition to AI being trendy, there are robots, specifically the idea that robots are going to replace jobs.

        Machines and automation have been replacing jobs since the dawn of the Industrial Revolution, and the trend has become more aggressive since the invention of the microprocessor.

        For some reason people seem to think that there's a new phenomenon where Lt. Commander Data and Bender are going to be replacing jobs.

    • Re: (Score:3, Informative)

      by TheRaven64 ( 641858 )
      Hi, welcome to 2017. AI is now defined by the media to mean 'thing using algorithms'. In related news, algorithm is now defined to mean 'scary thing the reader probably doesn't understand'.
    • Well no, that is not silly at all. Modern video codecs are in fact a toolbox of different techniques, each better suited to one or another type of scene:
      - action movies, which favor motion fidelity over fine detail
      - still sequences
      - sequences where only part of the scene is moving.

      Using a deep neural network to
      1- identify the type of scene and adjust codec settings on the fly
      2- compare the rendering to the original uncompressed version
      3- readjust if necessary and learn from the situation

      would be a breakthrough.
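The classify/compare/readjust loop described above can be sketched in a few lines. The `encode` function here is a toy stand-in for a real codec plus a real perceptual metric (e.g. VMAF); its formula and the target score are invented for illustration.

```python
# Hedged sketch of a closed-loop encoder: raise the bitrate for a scene
# until a (toy) quality metric clears a target, so hard scenes get more
# bits and easy scenes get fewer.

def encode(scene_complexity, bitrate):
    # Toy quality model: rises with bitrate, falls with complexity.
    return min(100.0, 100.0 * bitrate / (bitrate + 400.0 * scene_complexity))

def tune_bitrate(scene_complexity, target_quality=90.0, step=50, max_kbps=2000):
    """Increase the bitrate stepwise until quality reaches the target."""
    bitrate = step
    while encode(scene_complexity, bitrate) < target_quality and bitrate < max_kbps:
        bitrate += step
    return bitrate
```

Run on a calm scene (complexity 0.1) and a busy one (0.9), the loop settles on a much lower rate for the calm scene, which is the whole point of per-scene optimization.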

    • The improved algorithms are being driven by machine learning. They trained it to recognise when a scene needs a higher bitrate to look good, so that humans don't have to guide the encoder.

      I don't necessarily agree that all machine learning or neural networks merit the term "AI", but nor do I class software that must be trained on a dataset to function as an "algorithm", except in the broadest sense.

  • by jpellino ( 202698 ) on Thursday March 02, 2017 @10:24AM (#53961537)
    middle-out.
  • You say "AI"? (Score:4, Insightful)

    by bobbied ( 2522392 ) on Thursday March 02, 2017 @10:32AM (#53961597)

    I don't think that word means what you think it means...

  • VBR isn't rocket science, and not new. Great that they're using it. GPU transcoding is really helping these days.

    • by kriston ( 7886 )

      Exactly. DISH Network and DirecTV have been using this since the early 1990s and big-dish C-Band has used it even longer.

      I fail to see what is new here, and, from what I understand, I'm even more surprised that Netflix wasn't aware of this technology from the very beginning.

    • VBR within a frame is relatively new, or at least not talked about much. This is about which macroblocks to give more bandwidth to based on their content and relative importance within the frame.
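A crude illustration of the idea: split the frame budget across macroblocks in proportion to how "busy" each block is. Real encoders use far richer complexity and importance models; block variance here is just a simple stand-in.

```python
# Toy per-macroblock bit allocation: flat regions (sky, dark walls) get
# few bits, detailed regions get most of the frame budget.

def block_variance(block):
    mean = sum(block) / len(block)
    return sum((p - mean) ** 2 for p in block) / len(block)

def allocate_bits(blocks, frame_budget_bits):
    variances = [block_variance(b) for b in blocks]
    total = sum(variances) or 1.0  # avoid division by zero on an all-flat frame
    return [round(frame_budget_bits * v / total) for v in variances]

flat = [128, 128, 128, 128]   # smooth region: near-zero variance
busy = [0, 255, 30, 200]      # detailed texture: high variance
bits = allocate_bits([flat, busy], 1000)
```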

  • by Solandri ( 704621 ) on Thursday March 02, 2017 @10:53AM (#53961711)
    Netflix does use AI in developing the video compression algorithm. The problem with encoding videos with lossy algorithms is that video quality is a subjective thing. You need a person to watch it and tell you how good the video quality looks. This makes it rather slow and difficult to do A/B testing, not to mention how boring it is watching the same clips over and over with different encoding.

    Netflix got around the problem by using machine learning to teach a computer to recognize when video quality looked good. They had a bunch of people watch videos with different compression and rate the quality, then treated those ratings as gospel for the AI. It analyzed the different videos and decided for itself which features were associated with good quality. Once the computer was generating the same ratings as people, they had a rapid way to do A/B testing, which let them optimize their compression algorithm in much less time than having humans rate video quality.

    I'm not sure why the summary links to some popular news article that talks in general terms about Netflix using AI, instead of linking to the actual Netflix page describing exactly what they did [netflix.com]. This used to be the sort of technical detail you'd expect from Slashdot submissions.
  • Call it anything you want: "Netflix uses bagels to compress video." I don't really care. I just wish they would take a closer look at the darkest parts of a scene and stop compressing the hell out of them. Visible banding in gradients ruins every single scene.

    • Maybe your display brightness/contrast settings are wrong?

      • by cetan ( 61150 )

        I view streaming content on a variety of devices off of a perfectly acceptable cable internet connection and I still see the compression, but the worst of it is seen on the "main" family TV. Netflix offers the best experience (followed by Amazon Video, followed by the truly horrific Google Play), but it's still there.

        I fully admit that I am not a hardcore video guy and not obsessed with tweaking a bunch of TV settings, so there is indeed room to make adjustments. That said, I'm very happy with up-scaled DVD.

    • by zlives ( 2009072 )

      They do a better job on their 4K stuff for sure, but still no comparison to 4K media. I wonder if they compress based on bandwidth availability.

  • by Anonymous Coward

    "No one wants to be interrupted in the middle of Bojack Horseman or Stranger Things."
    Actually, if I am ever watching Bojack Horseman... interrupt me any way possible. Use bullets if necessary.

  • by BrendaEM ( 871664 ) on Thursday March 02, 2017 @11:03AM (#53961789) Homepage
    Soon Netflix may have only a single video in 1000 categories, with compression reducing it to a single frame.
  • Except in edge cases, videos don't stutter because they take slightly more bandwidth than you have available. They stutter because the buffers aren't deep enough to ride out network jank, and my understanding is that streaming providers use shallow buffers for content-protection reasons (it's not like you're going to suddenly switch streams 45 minutes into a movie).

    Put another way, the difference between a 500 kbps stream and a 250 kbps stream isn't going to improve your rebuffering experience on a link with megabits of spare capacity.
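The parent's point can be checked with a toy playback simulation. The link-speed trace and buffer sizes below are invented; the model just plays one-second segments and counts how often the buffer runs dry.

```python
# Toy rebuffering model: a jittery link whose *average* speed comfortably
# exceeds the bitrate. A deeper buffer absorbs the jank; halving the
# bitrate with a shallow buffer does not.

def count_stalls(bitrate_kbps, buffer_s, throughput_kbps):
    """Play 1-second segments; a stall happens when the buffer empties."""
    cap = buffer_s * bitrate_kbps   # buffer capacity in kilobits
    buffered = cap                  # start with a full buffer
    stalls = 0
    for tput in throughput_kbps:
        buffered = min(cap, buffered + tput) - bitrate_kbps
        if buffered < 0:
            stalls += 1
            buffered = 0
    return stalls

# Hypothetical per-second link speeds: fast with periodic 50 kbps dips.
link = [800, 50, 50, 50, 800, 800, 800, 800] * 4

shallow = count_stalls(500, buffer_s=1, throughput_kbps=link)
deep = count_stalls(500, buffer_s=10, throughput_kbps=link)
half = count_stalls(250, buffer_s=1, throughput_kbps=link)
```

In this toy run the deep buffer never stalls, while halving the bitrate on the shallow buffer stalls exactly as often as the full-rate stream, which is the parent's argument in miniature.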

    • Tangentially related, it's rather infuriating that (at least on a Chromecast) going back 30 seconds requires re-buffering. Perhaps, as you say, this is due to content protection reasons.

      In any event, it makes missing a bit of dialog a frustrating experience -- I'd love a "skip back ten seconds and turn on subtitles temporarily" button, with all the content already buffered...
  • Take a scene of a mother breast-feeding. Which blocks of the image a male viewer finds interesting can be totally different from what a female viewer does. The AI may choose to drop detail from one block rather than another based on its training set (or on what it thinks the viewer cares about). Essentially, the viewer is now served only what the server thinks will be liked. That is the producer doing the choosing rather than the consumer. Not sure if it's a good thing or bad, but at times we want to see everything.
  • >> compress it without affecting the image quality,

    If the compression used is in any way lossy, affecting image quality is by definition inevitable.

  • Please stop calling simple algorithms AI. Algorithm != Artificial Intelligence. Stop losing our language to the losers in the marketing department. Respectfully, Huge Dilbert Fan
  • by fahrbot-bot ( 874524 ) on Thursday March 02, 2017 @02:06PM (#53963553)

    "We're allergic to rebuffering," said Todd Yellin, a vice president of innovation at Netflix. "No one wants to be interrupted in the middle of Bojack Horseman or Stranger Things."

    Or porn. "Yes, yes, yes..." (buffering ...) [ Nooooooooooooooo.... ]
