Expanded AMD HDMI 2.1 Support Is Coming To Linux (gamingonlinux.com)

AMD is preparing expanded HDMI 2.1 support for Linux, following earlier delays after the HDMI Forum rejected an open source implementation of HDMI 2.1 on the grounds that the specification is proprietary technology. As GamingOnLinux reports, AMD developer Harry Wentland submitted a patch series to the Linux kernel mailing list, noting that it brings "HDMI FRL support to the amdgpu display driver" and that "DSC is still being tested and will be sent out later."

A forum post on Phoronix from an AMD driver developer also said "a full implementation will ultimately be available once the patches are ready and have completed compliance testing."

  • by PPH ( 736903 )

    Maybe now, I can do away with all these HDMI 2.1 to whatever Chinese adapters.

  • Proprietary irony. (Score:4, Insightful)

    by MachineShedFred ( 621896 ) on Monday May 04, 2026 @04:46PM (#66127718) Journal

    Anyone else getting a chuckle out of the HDMI Consortium, the current ruling kings of proprietary technology and implementations now that the MPEG LA tripped over their own genitals and made themselves redundant, rejecting open source implementations as "proprietary technology"?

    Fuck HDMI. I still don't know why people are using that shitty port when DisplayPort is royalty-free, has better specs, is supported by everyone, and can be used over other cables that don't have all the licensing attachment (USB-C for example).

    • Well in my case, my Graphics adapter does not offer Display Port and I'm not about to buy a new one at current prices. My monitor supports both and this will be good news for me - once it comes.

    • by drnb ( 2434720 ) on Monday May 04, 2026 @05:10PM (#66127750)

      I still don't know why people are using that shitty port when DisplayPort is royalty-free, has better specs, is supported by everyone, and can be used over other cables that don't have all the licensing attachment (USB-C for example).

      HDMI is the new VGA. It allows the use of TVs, older monitors, etc. I have an ancient 1080p monitor plugged into a PC as a secondary display. The overhead projector used for presentations is often HDMI.

      • by thegarbz ( 1787294 ) on Monday May 04, 2026 @06:59PM (#66127876)

        HDMI is the new VGA.

        No, not really. HDMI is still very much the active and dominant standard in the entire AV industry; it just isn't in the computer monitor industry. Saying it's VGA implies it's some kind of legacy tech, but the reality is that right now in 2026, if you go to your local Walmart and buy the latest fancy TV, you'll find 100% of them use HDMI not as a legacy connector, but as the current latest hot tech.

        • by bn-7bc ( 909819 ) <bjarne-disc@holmedal.net> on Tuesday May 05, 2026 @03:13AM (#66128282) Homepage
          I think the OP might have meant that HDMI is what replaced VGA; it was just imprecisely worded, so you misunderstood the intent and gave correct info based on the misunderstanding. No harm done, you were both more or less correct.
          • Oh right, as in ubiquitous rather than legacy. Yeah I agree with that. Everything seems to have some HDMI capability.

        • by drnb ( 2434720 )

          HDMI is the new VGA.

          No, not really. HDMI is still very much the active and dominant standard in the entire AV industry; it just isn't in the computer monitor industry. Saying it's VGA implies it's some kind of legacy tech, but the reality is that right now in 2026, if you go to your local Walmart and buy the latest fancy TV, you'll find 100% of them use HDMI not as a legacy connector, but as the current latest hot tech.

          Apologize if I was not clear, I am not comparing VGA in 2026 to HDMI in 2026; I am comparing 2026 HDMI to the VGA of a past decade. As in, in their respective timeframes, the VGA or HDMI connector could always be counted on to be there.

          • by drnb ( 2434720 )

            Apologize if I was not clear ...

            "Apologies" not "Apologize", damn autocorrect. I apologize for missing the typo. I am not expecting an apology. I understand the very reasonable above interpretation I did not intend. :-)

      • ... I still use it esp. with my OmniCube KVM from Y2K! :/

    • Because it is ubiquitous.

      But because of the pesky licenses, for example, the Commander X16 retro computer uses a good old VGA port. :)

      https://www.commanderx16.com/ [commanderx16.com]

      • The better option would have been SDI over fiber

        • by bn-7bc ( 909819 )
          I agree, but no one outside of pro AV/TV production (and of course us geeks) has ever heard of SDI. And does SDI have any DRM? If it doesn't, Hollywood would not accept it on consumer AV equipment, because piracy. Also, and do correct me if I'm wrong, but isn't SDI rather expensive to implement just because of the bandwidth requirements and other details?
          • by sxpert ( 139117 )

            At which point the tech world should tell Hollywood to take a hike and stop trying to impose their broken-by-design, nonsensical DRM schemes that only do one thing: frustrate the user with broken nonsense.

    • by thegarbz ( 1787294 ) on Monday May 04, 2026 @06:56PM (#66127868)

      I still don't know why people are using that shitty port when DisplayPort is royalty-free, has better specs, is supported by everyone

      Not only is there plenty of hardware out there without DisplayPort, but if you at all want to connect to a TV, your choice is either HDMI or go pound sand. DP is supported by precisely 0% of the AV industry. As for why the AV industry uses it: HDMI has features specific to the AV industry that DP lacks, such as an Audio Return Channel, which is critical if using an audio system that relies on input switching. It also has CEC baked into the standard, and it supports longer cable runs without resorting to active amplification or conversion.

      and can be used over other cables that don't have all the licensing attachment (USB-C for example).

      The ability to use USB-C for DisplayPort relies on the implementation of DP Alternate Mode on the USB host, which not all devices have and which is really only common on mobile devices. This isn't some magic solution. Interestingly, HDMI Alternate Mode was something that existed for a solid 7 years before vendors gave up because no one used it. But more relevant: there simply isn't a need to support USB alt mode directly when HDMI can piggyback off the DP signal with an active adapter, especially when the active circuitry is small enough to fit in the HDMI connector. (Yes, I connect my laptop to TVs via USB-C; it's a normal thing you can do through the same port which supports DP.)

      • by jsonn ( 792303 ) on Monday May 04, 2026 @08:36PM (#66128018)
        The only reason the AV industry prefers HDMI is the DRM bit. Everything else is a gimmick.
        • False, there's nothing preventing DP from having DRM. In fact, DP carries the same HDCP info. On the other hand, the DP standard doesn't cover the required functionality I listed from HDMI. It could, but the group behind it is not interested in expanding it in that way.

          • by jsonn ( 792303 )
            Given that nothing in the DisplayPort world cares about it, it is irrelevant whether it could be implemented or not. Hollywood demands it; if it doesn't exist in reality, it doesn't matter that it's possible in theory.
            • I don't know what point you're trying to make. The same DRM is present in HDMI and DP. Just go open the NVIDIA control panel and see that the DP link to your monitor is currently HDCP compliant to verify this yourself.

              The reality is Hollywood has nothing to do with the fact that HDMI is on TVs. The standard itself has direct benefits grown from the desires of the AV industry itself. DP doesn't. That's really all there is to it. You won't get TVs with DP until remote CEC codes universally transmit over it.

    • Fuck HDMI. I still don't know why people are using that shitty port when DisplayPort is royalty-free, has better specs, is supported by everyone, and can be used over other cables that don't have all the licensing attachment (USB-C for example).

      I have one computer that only has DP, and the TV only supports HDMI.
      So I'm forced to use a DP to HDMI cable.
      I'm convinced that unless I have the TV on before I turn the PC on, the TV will display 'no signal'.
      Every time this happens: Ctrl-Alt-Del, reboot, and the TV then works.

    • by bn-7bc ( 909819 )
      Because any consumer AV product (and that includes PC monitors) is DOA if it doesn't have HDMI, as it is everywhere already; it's called inertia and backwards compatibility. HDMI won because it was everywhere from the introduction of HD onwards, so people already had cables if they needed a quick replacement (or they could get standard lengths relatively cheap at the nearest TV store/electronics outlet). Most people (at least at the start) did just fine with HDMI 1.0, when you don't need multili
  • The last time it happened, Liam was amused to be called "a fansite" :D

  • Absolutely sick of my TVs not behaving like monitors.

    They power off alright, albeit not as quickly as monitors, but I've gotta find the remote or press the power button to turn them back on, AND they take a while to boot.

    Would be so much nicer if I could move my mouse or press buttons on my keyboard to wake the whole setup from sleep instead of needing to also wake the displays.

  • You keep using that word. I do not think it means what you think it means.

  • Why is Linux support (doing the math, if it can't do 2.1...) 13 years behind the specification?
