LGPL H.265 Codec Implementation Available; Encoding To Come Later

New submitter Zyrill writes "The Stuttgart-based German company Struktur AG has released a free and open source implementation of the H.265 codec, also known as 'High Efficiency Video Coding (HEVC),' which is now available on GitHub. At the same video quality, H.265 promises roughly half the bitrate of H.264. Resolutions up to 8K UHD (7680 × 4320 px) are also supported. The software is licensed under the LGPL. Quoting from the homepage, where the software is also available for download: '[This software] is written from scratch in plain C for simplicity and efficiency. Its simple API makes it easy to integrate it into other software. Currently, libde265 only decodes intra frames; inter-frame decoding is under construction. Encoding is planned to be added afterwards.'"
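To put the headline claim in concrete terms, here is a quick back-of-the-envelope sketch in Python. The 8 Mbit/s figure for 1080p H.264 is an illustrative assumption, not a number from the article:

```python
# Rough illustration of the claimed ~50% bitrate saving at equal quality.
# The 8 Mbit/s 1080p H.264 bitrate below is an assumption for illustration.

def size_gb(bitrate_mbps, minutes):
    """File size in GB for a stream at the given bitrate and duration."""
    return bitrate_mbps * 60 * minutes / 8 / 1000  # Mbit -> MB -> GB

h264_mbps = 8.0               # assumed 1080p H.264 bitrate
h265_mbps = h264_mbps * 0.5   # same quality at roughly half the bitrate

print(size_gb(h264_mbps, 120))  # 7.2 (GB for a two-hour film)
print(size_gb(h265_mbps, 120))  # 3.6
```

Halving the bitrate halves the file size (or the bandwidth) for the same running time, which is the whole appeal for streaming and 4K/8K content.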
  • Codec? (Score:5, Funny)

    by Hatta ( 162192 ) on Thursday September 05, 2013 @11:37AM (#44766183) Journal

    If there's no encoding, isn't it just a dec?

    • Re: (Score:1, Troll)

      Dunno why this is moderated funny. Apparently today's batch of mods doesn't actually know that the word codec is an abbreviation of coder/decoder.
    • Re:Codec? (Score:5, Informative)

      by Sulik ( 1849922 ) on Thursday September 05, 2013 @12:33PM (#44766715)
      Since it's intra-frame only, it's not even a 'dec' -> maybe just a 'duh'
    • Re:Codec? (Score:4, Informative)

      by adisakp ( 705706 ) on Thursday September 05, 2013 @12:33PM (#44766721) Journal

      If there's no encoding, isn't it just a dec?

      Nice joke. Although in the current vernacular of media encoding and playback, codec is commonly used to describe modules or libraries that provide EITHER encoding or decoding (or both). For example, if you get a "codec" pack to play media formats, it's often just the decoders. And when you list codecs in FFMPEG, it tells you whether it supports encoding or decoding as separate flags.

    • Re:Codec? (Score:5, Informative)

      by Alsee ( 515537 ) on Thursday September 05, 2013 @01:00PM (#44766973) Homepage

      It's not even that. The current version is basically just a glorified slideshow viewer.

      The way most video codecs work is they start by storing a full picture once every second or two. These are called key-frames, or intra-frames. The frames in between key-frames are called inter-frames, and this is where 90+% of the real work of a codec happens. These frames are stored as a short description of how the current frame is different than the last key-frame. Instead of storing the full picture you just describe what parts of the picture are moving, or if part of the picture is getting brighter or darker, or if colors are shifting.

      Currently, libde265 only decodes intra frames, inter-frame decoding is under construction.

      It's essentially a slideshow viewer, showing something akin to a series of JPEG pictures. Basically the entire CODEC is missing, the part that compresses and decompresses all the video frames in between.
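The key-frame/inter-frame idea described above can be sketched as a toy in Python. This is only an illustration of "full picture now and then, differences in between"; a real codec like H.265 uses motion-compensated prediction and entropy coding, not raw per-pixel diffs, and the interval and function names here are made up:

```python
import numpy as np

KEYFRAME_INTERVAL = 30  # e.g. one intra frame per second at 30 fps (illustrative)

def encode(frames):
    """Store a full picture periodically (an I-frame); otherwise store only
    the per-pixel difference from the previous frame, a crude stand-in for
    the motion-compensated inter frames of a real codec."""
    stream = []
    for i, frame in enumerate(frames):
        if i % KEYFRAME_INTERVAL == 0:
            stream.append(("I", frame.copy()))           # intra: full picture
        else:
            stream.append(("P", frame - frames[i - 1]))  # inter: diff vs previous
    return stream

def decode(stream):
    """Rebuild each frame: copy I-frames, apply P-frame diffs to the previous frame."""
    out, prev = [], None
    for kind, data in stream:
        prev = data.copy() if kind == "I" else prev + data
        out.append(prev)
    return out
```

A decoder that only handles the `"I"` branch is exactly the "glorified slideshow viewer" described above: it can show every 30th picture and nothing in between.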

      -

      • I'm admittedly totally nitpicking.

        Is it really as little as every second or two? I thought typically it was at least 2/second, at least for MPEG 2. (The wikipedia page mentions every 15th frame, then shows a shorter sequence in its IBP example.)

        Also, is it really "how the current frame is different than the last key-frame"? Isn't it "how the current frame is different than the PREVIOUS frame"? (and the previous frame of course would require decoding from the previous I-frame up through that frame). Bas

        • by Alsee ( 515537 )

          Is it really as little as every second or two?

          That is configurable, and typically ranges between about half a second and ten seconds. In fact, I've been seeing about ten seconds between key-frames in some of the movies on my cable TV. It's visible as a faint fog that creeps up in very dark scenes and abruptly vanishes to black at the next key-frame. The sudden jump is distracting, but it's even more distracting consciously realizing I'm "watching a CODEC" rather than "watching a movie." Chuckle.

          Isn't it "how the current frame is different than the PREVIOUS frame"?

          Yeah, my description was a little sloppy.

          -

          • by dave420 ( 699308 )
            You also forgot to mention that an inter-frame can describe its difference against future frames, too.
  • I have never seen really stable frame-rates in video replay without hardware acceleration, even if CPU load is only a few percent. And this is only more true with higher resolution and tighter compression. Is H.265 basically similar enough to H.264 that the hardware acceleration in our GPUs can be applied to it?
    • I think the problem is drivers. Not until the standard gets more use will the GPU makers add support for the new format.

      Other than that, I'm sure the actual decoding work could be implemented on their hardware. Really, as long as the work is massively parallel, they can get the job done with a quickness.

    • by Khyber ( 864651 ) <techkitsune@gmail.com> on Thursday September 05, 2013 @11:50AM (#44766361) Homepage Journal

      "I have never seen really stable frame-rates in video replay without hardware acceleration"

      My only recommendation is to stop using substandard hardware or switch to a better software player. I've been doing 1080p video rendering in software just fine using VLC since the days of my 2.4GHz P4 with a 64MB GeForce 2 and 2GB PC2700 DDR1.

      • by chmod a+x mojo ( 965286 ) on Thursday September 05, 2013 @12:23PM (#44766635)

        Let me know when I can slap a desktop-class processor in my Nexus 10 / netbook / other portable device that doesn't chug down battery like my i5 laptop (which lasts maybe 5 hours doing light work at 50% brightness). And before you say anything about 1366x768 and down-scaling, the N10 at least has a higher resolution than 1080p.

        There are tons of devices out there that need to be able to hardware-decode anything above 720p H.264. That is the same reason I absolutely hate that more and more video is being released in the 10-bit H.264 format, supposedly for smaller file sizes without visual distortions. Especially by the idiots who massively over-bitrate their encodes: not only can very few devices hardware-decode 10-bit, but I can transcode to "shitty" Xvid with smaller file sizes (literally shaving GBs off 1080p encodes) and no visual quality loss.

        If you are going to encode with huge bitrate overages, you might as well use 8-bit, which pretty much anyone and everyone nowadays can easily decode....

        • by Anonymous Coward

          > Let me know when I can slap a desktop-class processor in my Nexus 10 / netbook / other portable device that doesn't chug down battery like my i5 laptop (which lasts maybe 5 hours doing light work at 50% brightness). And before you say anything about 1366x768 and down-scaling, the N10 at least has a higher resolution than 1080p.
          >
          > There are tons of devices out there that need to be able to hardware-decode anything above 720p H.264. That is the same reason I absolutely hate that more and more video is being released in the 10-bit H.264 format, supposedly for smaller file sizes without visual distortions. Especially by the idiots who massively over-bitrate their encodes: not only can very few devices hardware-decode 10-bit, but I can transcode to "shitty" Xvid with smaller file sizes (literally shaving GBs off 1080p encodes) and no visual quality loss.
          >
          > If you are going to encode with huge bitrate overages, you might as well use 8-bit, which pretty much anyone and everyone nowadays can easily decode....

          I suppose you're an anime fan? Well, you can always re-encode the 10-bit to 8-bit and use that.
          But I agree, it's pretty shitty that most fansubbers have gone with a format that can't be decoded entirely in hardware. Even on the desktop, only professional cards like the Nvidia Quadro have hardware decoding for 10-bit. Everyone else has to rely on software.
          On the other hand, most HD rips of films are done in 8-bit, so no worry there.

          • Actually, it's not only anime groups; some Blu-ray rip groups are starting to release more and more 10-bit. I ran across a few rips that were 10-bit, so I couldn't watch them on the netbook or Nexus. And I do transcode to either 8-bit or, as I said, Xvid. Either that or I have to watch it on a more powerful but less battery-efficient (portable) machine or a desktop.

        • Re: (Score:3, Interesting)

          by Khyber ( 864651 )

          "Let me know when I can slap a desktop class processor in my Nexus10 / netbook / other portable device that doesn't chug down battery like my i5 laptop"

          The processors in your Nexus 10 and netbook are likely already far more powerful than my old P4. The Atom D510 smashes the shit out of a 3.2GHz HT-enabled P4 in the 3DMark CPU bench, and totally dominates in Sandra (excepting the memory bandwidth test.)

          Which means you should've been able to do 1080p stutter-free for AGES on CPU alone. Your software is the issue here,

        • Re: (Score:2, Redundant)

          by jedidiah ( 1196 )

          > Let me know when I can slap a desktop class processor in my Nexus10

          Well that's what you get for using lame hardware and trying to pretend that it's anything but a bad joke.

          ARM hardware sucks. Plenty of us will happily point out this fact given any opportunity. You don't even need to use the most extreme examples to demonstrate this either.

          "Mobile" devices are like a blast from the past (90s) when it comes to performance.

        • The only video that is using Hi10p to any real extent is anime. If you are watching HD anime on a netbook, or really anything other than a desktop, then you will probably be derided by the encoders for using your 'shitbox', and they will probably suggest you stick with crunchyroll.
          • by Guspaz ( 556486 )

            To be fair to them, animated content, with its frequent use of simple gradients, does get a bigger advantage out of 10-bit encoding (avoiding banding) than film content does. Then again, the ffdshow deband filter (GradFun2DB) does a fantastic job at sorting out banding issues on playback without degrading image quality regardless of content type, so that certainly limits the utility of 10-bit encoding.

        • And before you say anything about 1366x768 and down-scaling the N10 at least has a higher resolution than 1080P.

          So instead we're talking about upscaling any video you might be trying to play, since nobody publishes video in anything higher than 1080p for wide distribution.

          Still CPU intensive.

      • by AvitarX ( 172628 )

        Really?

        I've always had obvious stutter (apparent when credits are rolling), and was never able to get even mildly acceptable playback at Blu-ray-level bitrates without hardware that was dramatically more powerful.

        • by Khyber ( 864651 )

          You probably got a crappy encode with a bitrate far higher than it needed to be or you're using a crappy codec.

          • by AvitarX ( 172628 )

            It was a Blu-ray rip that I tested for high bitrate.

            The quality of the encoding may be low, but commercial movies (purchased legally) are part of what I want to view.

            Short of DVD or Blu-ray players, I have yet to see smooth scrolling text or credits (Netflix on my PS3 is the worst for this, with software-rendered VLC on older hardware coming next, even on an SD H.264 conversion off of a Blu-ray rip).

            I've found that sub 10 Mbps leads to pretty bad artifacting at 1080P during high-contrast (lightning strikes, HBO'

        • If you observed it during a credits sequence, it is almost certainly due to the refresh rate of your display/video card not being matched to the refresh rate of the source. (Credits sequences are very easy to encode and decode and shouldn't run into a bottleneck anywhere.)

          The best way to start analysing this issue is to use Media Player Classic Home Cinema or Black Edition (a fork of MPC-HC), and check the stats with Ctrl+J (using Custom EVR as the renderer):
          https://trac.mpc-hc.org/attachment/ticket/2682/screensho [mpc-hc.org]
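As a side note on that refresh-rate mismatch: a small sketch (the function is mine, not from any player) showing why 24 fps content judders on a 60 Hz display while 30 fps content does not. Each source frame is held for an uneven number of refresh cycles (2:3 pulldown), which is visible as judder in smooth motion like scrolling credits:

```python
def pulldown_pattern(fps_src, hz_display, n_frames):
    """Number of display refresh cycles each of the first n_frames source
    frames is shown for, using integer arithmetic only."""
    return [
        (i + 1) * hz_display // fps_src - i * hz_display // fps_src
        for i in range(n_frames)
    ]

print(pulldown_pattern(24, 60, 4))  # [2, 3, 2, 3]  -> uneven hold times, judders
print(pulldown_pattern(30, 60, 4))  # [2, 2, 2, 2]  -> even hold times, smooth
```

This is why matching the display refresh to the source frame rate (or a clean multiple of it) matters more for smooth credits than decoder speed does.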

      • I believe that VLC has been using vdpau, vaapi, and xvmc for hardware offloading since that rig could have been classified as state of the art, so you most likely have been using hardware acceleration the whole time.
        • by Khyber ( 864651 )

          "VDPAU (Video Decode and Presentation API for Unix) is an open source library (libvdpau) and API originally designed by Nvidia for its GeForce 8 series and later GPU hardware"

          Note, my rig had a GeForce 2 MX, so no to VDPAU and VAAPI. XvMC would only be available on a GeForce 2 card (if at all) using the proprietary Linux blob (which only supports up to the 7000-series GPUs), and it would not be available in Windows 2000 due to the locked-down DMA for video memory.

          Actually, checking the wikis on all three of what you

    • by jedidiah ( 1196 )

      > I have never seen really stable frame-rates in video replay without hardware acceleration

      Then you're not trying hard enough.

  • Don't get me wrong, I find these technical improvements interesting: the same quality as H.264 at half the bitrate is impressive. However, I also think about the flip side of the coin when (if?) this format gets popular: currently I have several devices with hardware support to decode H.264. Think iPhone, iPad, Apple TV, a Western Digital streamer, a Raspberry Pi. With this new codec they'll need to decode in software, which esp. for the portable iDevices (and I assume current Android devices as well) will ha
    • by unixisc ( 2429386 ) on Thursday September 05, 2013 @11:52AM (#44766381)

      So far, the FSF promoted Ogg-Theora as the standard that they wanted to push as the liberated standard for video encoding & decoding. Since h.264 is more standardized, it's good that they have at least an FOSS equivalent of it - if it can decode existing h.264 encoded videos out there, it's off to a good start.

      What I wonder is - if it's LGPL, how different is LGPL3 from GPL3? The FSF made radical changes in version 3, and made GPL3 almost unusable for anyone who wants to lock things in hardware. Is LGPL3 any looser in terms of allowing hardware locking of the code than GPL3? Also, Ogg Theora itself - is it GPL3?

      Also, would the new standard be supportable under HTML5?

      • by steveha ( 103154 )

        Theora uses a BSD-style license.

        http://www.theora.org/faq/#14 [theora.org]

        WebM also uses a BSD-style license.

        http://www.webmproject.org/about/faq/#licensing [webmproject.org]

        IMHO, if you are trying to make a standard for media encoding, it just makes sense for the reference code to be BSD-licensed. The point of GPL is to make sure that people can't lock users in to a proprietary code base, with no way to make changes; with a media format, the users can always grab their own copy of the reference code. (And a proprietary version that is

    • by slaker ( 53818 ) on Thursday September 05, 2013 @11:52AM (#44766389)

      As long as you have an intermediary to transcode to a supported format, why is that a problem? Plex does a perfectly fine job right now delivering h.264 with AAC audio to less capable mobile devices that I own, as do a number of DLNA servers that are scattered around my apartment. Presumably if you're watching on a device with sub-optimal functionality, you're going to be less concerned about overall source fidelity in the first place; it's not like you care that you aren't getting the full bit rate and eight channel audio from your blu-ray sources when you're watching them on a 4" iThing screen with a $10 pair of headphones.

      • Thanks for your comment. Yes it is true, transcoding could be an acceptable workaround. I guess that I am overly lazy as I try to transcode as little as possible - in fact I only did it 2 or 3 times in the last few years.
        • The point is that Plex does it on-demand and doesn't store the resulting transcode.

        • by slaker ( 53818 )

          Plex is a literal Swiss Army knife for this stuff. As long as you have a reasonably powerful back end like a retired Core i or even a Core 2 Quad, you're probably good to go for arbitrary amounts of real-time transcoding in a typical home setup. If your Plex Media Server is on some kind of ARM- or Atom-powered NAS, you have work to do, but even then there are pretty straightforward tools (e.g. Format Factory on Windows or Handbrake on whatever) that can whip a media collection into shape if you absolutely n

    • by steveha ( 103154 )

      currently I have several devices with hardware support to decode h.264... With this new codec they'll need to decode in software

      Maybe not.

      The hardware support for H.264 is probably in the form of general-purpose DSP (Digital Signal Processing) combined with some code. Basically, your devices have some sort of DSP capability, and someone wrote DSP code to offload much of the decoding work from the general-purpose CPU to the DSP core(s) [wikipedia.org].

      So, it is probably possible to write additional code for H.265 and offer

  • Decoding is simple (Score:3, Informative)

    by Anonymous Coward on Thursday September 05, 2013 @11:53AM (#44766391)

    Decoding is simple: just implement the spec. Encoding is much more difficult: there is no spec, only a requirement that your encoded output can be decoded by a spec-compliant decoder.

    • by omnichad ( 1198475 ) on Thursday September 05, 2013 @01:46PM (#44767413) Homepage

      You've got that entirely backwards. Encoding requires you only to handle your own implementation - with as little or as many of the features as you decide to implement. Decoding means you have to handle the quirks of the entire spec and possibly the bugs of other encoders.

      • You've got that entirely backwards.

        Only pedantically speaking.

        Pedantically speaking you could just encode a bunch of I frames and have done with it.

        In practice, making an encoder that is remotely worth making is far harder.
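The point that "the spec defines the decoder" can be illustrated with a made-up run-length format: both a dumb and a clever encoder are valid, because validity only means the one normative decoder accepts the output. (The format and function names are invented for the sketch, nothing here is HEVC):

```python
# Toy "codec spec": only the decoder is normative. Any stream of
# (count, value) pairs that it accepts is, by definition, a valid encoding.

def decode(stream):
    """The normative decoder: expand each (count, value) pair."""
    out = []
    for count, value in stream:
        out.extend([value] * count)
    return out

def naive_encoder(data):
    # Spec-compliant but does no compression at all -- the run-length
    # analogue of "just encode a bunch of I-frames and have done with it".
    return [(1, v) for v in data]

def better_encoder(data):
    # Collapses runs: smaller output, decoded by the exact same decoder.
    stream = []
    for v in data:
        if stream and stream[-1][1] == v:
            stream[-1] = (stream[-1][0] + 1, v)
        else:
            stream.append((1, v))
    return stream
```

Both encoders are "correct"; all the engineering effort (and all the quality difference) lives in how clever the encoder is, which is the thread's point about real encoders being the hard part.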

  • by Anonymous Coward

    Currently, libde265 only decodes intra frames, inter-frame decoding is under construction.

    Wait... so basically they have implemented an H.265 decoder which only works if the (supposedly existing) encoder is configured to use only I-frames.
    How is this news? It is like saying you implemented something but actually only 25% of it works...

    • by Pieroxy ( 222434 )

      They implemented MJPEG decoding in 2013... Not worthy of much to be honest. Shitty bitrate!

  • Patents. (Score:5, Interesting)

    by brunes69 ( 86786 ) <`gro.daetsriek' `ta' `todhsals'> on Thursday September 05, 2013 @11:56AM (#44766425)

    An open source library in the codec world is meaningless if the codec itself is covered by patents, because this means that no one can use the library in any country that enforces software patents. Last I saw H.265 is blanketed by over 500 patents. And in this case it's even worse than H.264 because they're not all held by one group, but by all kinds of different groups who all say they want royalties.

    • Re:Patents. (Score:5, Insightful)

      by Anonymous Coward on Thursday September 05, 2013 @12:11PM (#44766543)

      You are wrong. It's not meaningless, because in the country where I live we don't have software patents. I understand how you may feel it's meaningless for you though but that doesn't mean that it's meaningless in general. Also, I believe that this is distributed in source form and thus is not violating any patents. This makes it even less meaningless.

      • Re:Patents. (Score:4, Interesting)

        by tlambert ( 566799 ) on Thursday September 05, 2013 @03:49PM (#44768269)

        > You are wrong. It's not meaningless, because in the country where I live we don't have software patents. I understand how you may feel it's meaningless for you though but that doesn't mean that it's meaningless in general. Also, I believe that this is distributed in source form and thus is not violating any patents. This makes it even less meaningless.

        This is why New Zealand is region code 4, but pretty much all the content is region code 1. You've been sent to "we can't enforce software patents on you so you don't get content until after everyone else has paid for it" Coventry. It's also why the same content costs more there: out of spite, by the content production companies.

    • The thing is that if Struktur AG holds all the patents involved, then under the GPL (and LGPL) the authors automatically grant all recipients a license to those patents (if I understand the legalese of the GPL correctly).
    • Yes, meaningless. Tell that to XVID, x264, and LAME. All very successful Open Source projects implementing patent minefields. Sometimes it's more about the fun of coding than pushing an ideology.
    • IOW: we have another JPEG2000: A great idea destroyed by greed.

      In this case, the math and numbers are owned by a bunch of different entities.

      • by Goaway ( 82658 )

        h.264 is massively patented, and it is huge success. JPEG2000 was a failure because it wasn't actually good enough to be worth dealing with the patents for, unlike h.264.

    • It's certainly NOT "meaningless" to have an open source codec for a patented standard. They still get the benefits of multiple vendors focusing all their efforts on a common code base. And let's not forget that not every country enforces software patents like the US does.

      That said, it's sad that here on /. we've had a couple of stories about H.265 with NO mention of VP9, which Google has frozen the bitstream on, claims matches or slightly exceeds H.265 across the board, and has a working decoder in Chrome...

    • by jrumney ( 197329 )

      An open source library in the codec world is meaningless if the codec itself is covered by patents, because this means that no one can use the library in any country that enforces software patents.

      I don't see how that makes it meaningless in the other 90% of the world that has sane IP laws.

  • by WestonFire22 ( 2736813 ) on Thursday September 05, 2013 @12:21PM (#44766623)
    The other day Telestream announced the availability of an open source H.265 encoder via the x265.org site. Guess we don't have to wait for "Encoding to come later". See here: http://www.telestream.net/company/press/2013-09-03.htm [telestream.net]
  • by Anonymous Coward

    libvpx contains the BSD-licensed VP9 codec (here [webmproject.org]), which is not encumbered by patents. VP9 is the successor to VP8 (the current WebM codec) and an HEVC-level competitor. But Google has not paid for a slashvertisement, so no article about VP9 here.

  • Guess this is different from yesterday's story in Streaming Media, Telestream Helps Launch Open Source x265/HEVC Project [streamingmedia.com], which offers an H.265 encoder available under either a GPL or commercial license.

  • A BSD-licensed C++ reference implementation of both HEVC/H.265 encoding and decoding is available. This implementation is focused on research and features, not on speed. The encoder especially is very slow; a few frames per minute is completely normal, even on good machines.
    But many people have optimized the reference implementation by adding SIMD and parallelizing the decoder, and have reached real-time 4K decoding and >100 fps for 1080p decoding, even on low-end machines.
    A C-based reimplementation is
