CES 2022 Will Introduce HDMI 2.1a, Another Confusing New Spec (theverge.com)

An anonymous reader shares a report: The HDMI standards are a mess. HDMI 2.1, in particular, is a uniquely frustrating mess, with haphazard support among TV manufacturers, cable makers, and devices that makes setting up, say, 120Hz gaming on a PS5 or Xbox Series X a uniquely harrowing experience. Fortunately, the HDMI Forum is swooping in ahead of CES with its latest revision to the HDMI specification stack, HDMI 2.1a, which is here to make everything better and simpler... I'm kidding, of course. It's gonna make things more complicated. It's a new HDMI standard, what on earth did you expect?

Let's start with the good: HDMI 2.1a is an upcoming revision to the HDMI 2.1 stack and adds a major new feature, Source-Based Tone Mapping, or SBTM. SBTM is a new HDR feature that offloads some of the HDR tone mapping to the content source (like your computer or set-top box) alongside the tone mapping that your TV or monitor is doing. SBTM isn't a new HDR standard -- it's not here to replace HDR10 or Dolby Vision. Instead, it's intended to help existing HDR setups work better, either by letting the content source better optimize what it passes to the display, or by removing the need for the user to manually calibrate their screen for HDR, since the source device configures content for the specific display. It could also help when there's a mix of content types, like for streamers (who could have an HDR game playing alongside a window of black and white text), letting each area of content be displayed appropriately.
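As a rough illustration of the idea only -- the actual SBTM algorithm lives in the HDMI 2.1a spec and isn't reproduced here -- the Python sketch below shows a source device shaping scene luminance for a specific display's peak brightness. The knee-based curve and all the nit figures are assumptions for the example:

    # Illustrative sketch, not the real SBTM algorithm. It just shows the idea:
    # the *source* shapes luminance for a specific display's peak brightness
    # (which it could learn from the display's metadata), instead of leaving
    # all tone mapping to the TV.

    def source_tone_map(nits: float, display_max_nits: float,
                        knee: float = 0.75) -> float:
        """Pass through luminance the display can show; roll off the rest."""
        k = knee * display_max_nits
        if nits <= k:
            return nits                       # within the display's range
        excess = nits - k
        headroom = display_max_nits - k
        # soft roll-off: highlights compress instead of clipping at the peak
        return k + headroom * excess / (excess + headroom)

    # The same 1,000-nit highlight prepared for a 600-nit TV vs a 2,000-nit one:
    for display_max in (600.0, 2000.0):
        print(display_max, round(source_tone_map(1000.0, display_max), 1))

Running it, the 600-nit TV gets the highlight compressed to roughly 568 nits instead of clipped, while the 2,000-nit display receives it untouched.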

  • With its variety of connectors and standards? Quick: can a microscope with a USB output be connected to a screen with a USB input and produce a picture?

    • > Quick: can a microscope with a USB output be connected to a screen with a USB input and produce a picture?

      Yes but the picture might come out of the printer next door because the mouse has an integrated print server and wifi.

  • by wertigon ( 1204486 ) on Wednesday December 29, 2021 @11:25AM (#62125175)

    Seriously, everyone is already transitioning to DisplayPort simply because the HDMI standard leaves no room to grow. Let HDMI have the same status as the DVI and VGA connectors: useful, but deprecated.

      • by jd ( 1658 )

        That is not dead which can eternal lie.
        And with strange aeons even death may die.

    • by Luthair ( 847766 )
      Is there anyone actually releasing DisplayPort televisions? Even if they were, these whiners would have the exact same complaint that the displays don't support everything the standard supports.
    • by jd ( 1658 )

      I use DisplayPort, but I fear that as it's much more closed than HDMI, we'll have another Betamax vs VHS war with the inferior but more common standard winning. HDMI 2.1a is 5/8ths the bandwidth of DisplayPort 2.0, in theory (48Gbps vs 77.37Gbps), although in practice you won't often see either running at full speed for a while. If you're playing a game, a computer would really want to be running PCIe 3.0 to guarantee enough bandwidth for DP 2.0. I don't believe set-top boxes are remotely close to high-end
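      A quick back-of-the-envelope check on those link figures (an illustrative Python sketch; blanking and encoding overhead are ignored, and note the 48 Gbps is HDMI's raw rate while 77.37 Gbps is DP 2.0's payload rate):

          def video_gbps(width, height, hz, bits_per_channel, channels=3):
              """Uncompressed pixel data rate in Gbit/s (blanking ignored)."""
              return width * height * hz * bits_per_channel * channels / 1e9

          uhd120 = video_gbps(3840, 2160, 120, 10)   # 4K 120 Hz, 10-bit 4:4:4
          print(f"4K120 10-bit: {uhd120:.1f} Gbps")  # ~29.9 -> fits both links
          full8k = video_gbps(7680, 4320, 60, 10)    # 8K 60 Hz, 10-bit 4:4:4
          print(f"8K60 10-bit:  {full8k:.1f} Gbps")  # ~59.7 -> DP 2.0 only,
                                                     # or HDMI 2.1 with DSC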

      • I don't have any devices that have a DisplayPort input or output. However, based on everything I've read, it seems to be superior to HDMI in many ways, especially regarding bandwidth. Other than some audio features, is there anything useful that DisplayPort doesn't support that HDMI does?
        • Multiple screens thanks to daisy chaining. Hence its increasing popularity in business.
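          For a sense of how a daisy chain budgets bandwidth (an illustrative Python check; the monitor modes are assumptions, and the link figure is DP 1.4 HBR3's payload rate):

              # MST daisy-chaining shares ONE link's bandwidth across every
              # display in the chain, so the sum of all streams has to fit.
              def gbps(w, h, hz, bpc=8):
                  return w * h * hz * bpc * 3 / 1e9   # uncompressed, no blanking

              chain = [gbps(2560, 1440, 60), gbps(2560, 1440, 60)]  # two QHD
              total = sum(chain)                                    # ~10.6 Gbps
              print(f"{total:.1f} Gbps, fits HBR3 (25.92 Gbps): {total < 25.92}")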

        • Having read the HDMI 2.1 spec myself a few years back, I can tell you that there are many differences from DisplayPort. However, you need to be aware that many features are optional in either protocol. For example, there was a time when many computer displays or GPUs did not support audio through DP but did support audio through HDMI. There was also a brief period when some devices supported Ethernet over HDMI, something that made little sense when I saw it introduced in HDMI 1.4.

          In any case, HDMI supports

        • by jd ( 1658 )

          DisplayPort supports up to three ports. It can drive two monitors at the resolution and refresh rate (and with better colours) that HDMI 2.1a achieves with a single monitor. You also have to drop to 8K HDMI to get similar colour quality. DisplayPort also lets you drive multiple monitors off a single port. The most displays DP seems to handle is 8, running at 8K with compression. HDMI limits you to one monitor per port, and the best HDMI devices have only two ports.

          (DP is designed to have multiple displays d

      • You are right that the standards are incompatible. However, they are meant for different purposes: DisplayPort is for computers, HDMI for everything else. As with the 16:9 aspect ratio, I fear computer screens are adopting video standards for economies of scale.

        There is no relationship between bandwidth on the motherboard and video output. Many video codecs, like H264 and H265, are implemented in hardware, so uncompressed images only appear inside the GPU. Similarly, set-top boxes and BD players only need

      • If you're playing a game, a computer would really want to be running PCIe 3.0 to guarantee enough bandwidth for DP 2.0

        With games, you load all your assets onto the card ahead of time and then use a minuscule amount of bandwidth for draw calls. The PCI-e bandwidth used for 1080p and 8K will be mostly identical.
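        Rough numbers behind that claim (illustrative Python, not profiled data; the draw-call count and payload size are order-of-magnitude assumptions):

            pcie3_x16 = 15.75        # usable PCIe 3.0 x16 bandwidth, GB/s
            draw_calls_per_frame = 5_000
            bytes_per_draw = 256     # commands + constants, rough guess
            fps = 120

            gb_per_sec = draw_calls_per_frame * bytes_per_draw * fps / 1e9
            print(f"{gb_per_sec:.2f} GB/s, "
                  f"{100 * gb_per_sec / pcie3_x16:.1f}% of PCIe 3.0 x16")
            # ~0.15 GB/s, about 1% of the link -- output resolution never
            # enters into it, because finished frames leave over the display
            # cable, not over PCIe.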

        • by jd ( 1658 )

          Hmm, ok, I'll take that one back, then, at least until I find a multi-monitor game that actually needs to access main memory a lot.

          • I suspect the games that get the biggest boost from Resizable BAR will also be the ones that get the biggest boost from more bandwidth, because they are the ones doing a lot of texture streaming.
  • Obligatory XKCD (Score:4, Informative)

    by sTERNKERN ( 1290626 ) on Wednesday December 29, 2021 @11:54AM (#62125245)
  • I've always been a resolution nerd, chasing every little improvement. I had a LaserDisc player and an S-VHS recorder. There was also an "S-VHS" cable standard, and some TVs had the plug in the back. When we got our first HDTV (with a tube, 200 lb!) I had the LaserDisc and a rented Jurassic Park sequel, and went to some trouble to switch back and forth between the regular composite plug and the cool S-VHS plug.

    After about three rounds of looking at the same opening scene with both, we concluded that, yes,

    • by AvitarX ( 172628 )

      Probably not if it's calibrated, but on my crummy HDR TV, calibration makes a pretty big difference in games.

    • by _merlin ( 160982 )

      S-VHS [wikipedia.org], short for Super VHS, is a video cassette format using the VHS form factor, but with a higher luminance carrier frequency to get better luma resolution (chroma resolution is not improved over VHS).

      S-Video [wikipedia.org], short for separate video, is a video signalling scheme using separate Y (luma) and C (chroma) signals, typically using 4-pin mini DIN connectors for cabling.

      S-Video will give you more luma and chroma bandwidth than a composite signal, and can result in a better picture, but you need a source that
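      A toy signal model of the difference (Python with NumPy; not a real NTSC implementation, just the Y/C-mixing idea, with made-up amplitudes):

          import numpy as np

          fs = 13.5e6                        # sample rate, Hz
          t = np.arange(0, 64e-6, 1 / fs)    # one ~64 us scan line
          luma = 0.5 + 0.2 * np.sin(2 * np.pi * 3.4e6 * t)    # fine luma detail
          chroma = 0.1 * np.sin(2 * np.pi * 3.579545e6 * t)   # modulated colour

          composite = luma + chroma   # one wire: Y and C share the spectrum, so
                                      # the TV must filter them apart, losing
                                      # luma detail near the 3.58 MHz subcarrier
          s_video = (luma, chroma)    # two wires: Y and C never mix at all

          print(f"composite swing:  {np.ptp(composite):.2f}")
          print(f"clean luma swing: {np.ptp(s_video[0]):.2f}")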

    • by jd ( 1658 )

      On standard TV sets? Probably not. On large UHDTV screens? Quite possibly. On some TV standard that will be invented in 5-10 years' time? Almost certainly.

      On top of that, you've got eyesight weirdness. I've known people who grew up on NTSC and couldn't handle the flicker of PAL, others who grew up on PAL and couldn't handle the coarser NTSC. And yet others who couldn't tell the difference.

      The main problem will be cable companies, who compress as much as possible. And sometimes even more. If you've never see

  • This is a bit offtopic, but is anyone else extremely frustrated with HDMI 2.1 already? Devices that support it, such as next-gen gaming consoles as well as newer TVs, have been out for over a year and yet I still can't find an AV processor that properly supports it. The problem is that if I buy an AV processor that doesn't support HDMI 2.1, then I'll have to hook up my devices to my TV and use eARC to get the audio to my AV processor, but eARC on my TV doesn't support passthrough of certain audio standard
    • Disclaimer: I don't have any devices as recent as yours, but I feel your pain. So let me take a trip down memory lane. One of the main features of HDMI 1.3 was the introduction of Deep Colour, a way to send more than 8 bits per channel without resorting to 4:2:2 chroma subsampling. For more than a year, the PS3 was the only source known to use Deep Colour. PCs were in a poorer state regarding anything similar - you had to use an Nvidia Quadro and an OpenGL application, hoping that your very expensive monitor co
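      The trade-off Deep Colour avoided, in quick arithmetic (an illustrative Python sketch; the sampling-mode table is standard video math, not text from the HDMI spec):

          def bits_per_pixel(bpc, subsampling):
              # luma gets one sample per pixel; chroma samples are shared
              shared = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
              return bpc * (1 + shared)

          print(bits_per_pixel(8,  "4:4:4"))   # 24.0 -- baseline
          print(bits_per_pixel(10, "4:2:2"))   # 20.0 -- deeper, but colour
                                               #         detail halved
          print(bits_per_pixel(10, "4:4:4"))   # 30.0 -- Deep Colour: deeper AND
                                               #         full colour, for more
                                               #         link bandwidth
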
  • by claiming our new super duper device is 2.1a compliant without even having to implement any of the features!
    • by bws111 ( 1216812 )

      Except that is not allowed. You cannot use version numbers unless you specify exactly which features are supported.

        • by bws111 ( 1216812 )

          Wait, you actually believed any of that idiocy? While that moron claimed that the 'HDMI Licensing Association' encourages manufacturers to label their devices as 'HDMI 2.1' even if they don't support any features, what they actually say [hdmi.org] is the exact opposite: "You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities

  • FTFA: "...having the source device configure content for the specific display."
    I really hope this means we'll soon be able to position multiple HDMI streams on one large display: a background stream, program streams, and Nest/Ring/security camera windows all on one big screen.
    LCD/plasma/OLED TVs began to lose PIP (Picture-in-Picture) capability over a decade ago, so it's about time that feature came back to us, the new consumers.
