CES 2022 Will Introduce HDMI 2.1a, Another Confusing New Spec (theverge.com)
An anonymous reader shares a report: The HDMI standards are a mess. HDMI 2.1, in particular, is a uniquely frustrating mess, with haphazard support among TV manufacturers, cable makers, and devices that make setting up, say, 120Hz gaming on a PS5 or Xbox Series X a uniquely harrowing experience. Fortunately, the HDMI Forum is swooping in ahead of CES with its latest revision to the HDMI specification stack, HDMI 2.1a, which is here to make everything better and simpler... I'm kidding, of course. It's gonna make things more complicated. It's a new HDMI standard; what on earth did you expect?
Let's start with the good: HDMI 2.1a is an upcoming revision to the HDMI 2.1 stack and adds a major new feature, Source-Based Tone Mapping, or SBTM. SBTM is a new HDR feature that offloads some of the HDR tone mapping to the content source (like your computer or set-top box) alongside the tone mapping that your TV or monitor is doing. SBTM isn't a new HDR standard -- it's not here to replace HDR10 or Dolby Vision. Instead, it's intended to help existing HDR setups work better by letting the content source better optimize the content it passes to the display, or by removing the need to have the user manually calibrate their screens for HDR by having the source device configure content for the specific display. Other use cases could be for when there's a mix of content types, like for streamers (who could have an HDR game playing alongside a window of black-and-white text), letting the source optimize each area of content separately.
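To make the idea concrete, here's a minimal sketch of what "tone mapping at the source" means. Everything in it is hypothetical: HDMI 2.1a defines the signalling between source and display, not this API, and the Reinhard-style curve is just one common textbook operator, not anything mandated by the spec.

```python
# Illustrative sketch of source-side tone mapping (SBTM). The function
# names, the 600-nit figure, and the curve are all hypothetical.

def read_display_max_nits() -> float:
    """Pretend to read the display's peak luminance from its HDR metadata.
    A mid-range HDR TV might report something like 600 nits."""
    return 600.0

def tone_map(scene_nits: float, display_max: float) -> float:
    """Compress scene luminance into the display's range with a simple
    Reinhard-style operator; real sources use fancier curves."""
    return scene_nits / (1.0 + scene_nits / display_max)

# With SBTM, the source (PC, set-top box) runs something like this per
# pixel or region and signals that the output is already shaped for the
# display, so the TV can skip its own guesswork:
display_max = read_display_max_nits()
for scene_nits in (100.0, 1000.0, 4000.0):
    out = tone_map(scene_nits, display_max)
    print(f"{scene_nits:6.0f} nits in -> {out:5.1f} nits out")
```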
Re: (Score:1)
That's because it isn't.
The HDMI spec is just that - an HDMI spec. The version number is the number of the current spec. There is no such thing as an "HDMI 2.1a cable." There is such a thing as a cable that meets the HDMI 2.1a spec, and such a cable may not support all HDMI 2.1a features. It's up to the user to determine what features they need across their display, device, and cable. The features are clearly marked rather than rolled up into blanket "compatibility levels."
If you're looking for an "HDMI 2.
Re: (Score:2)
I would agree with the criticism of people looking for an "HDMI 2.1a device"; they should be looking at the features they're interested in, not at what spec version a product claims. Nor is this new: specs support 3D, for example, but most displays did not, even when 3D sets were still being actively manufactured.
However, it is relevant for cables, which do have a bandwidth rating that matters.
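To make the cable point concrete, a rough back-of-the-envelope sketch (ignoring blanking intervals and audio, so the numbers are ballpark; the 16b/18b factor is the coding overhead of HDMI 2.1's FRL signalling):

```python
# Why a cable's bandwidth rating matters: uncompressed video either fits
# in the link's usable data rate or it doesn't.

def data_rate_gbps(w, h, hz, bits_per_channel=10, channels=3):
    """Uncompressed pixel data rate in Gbps, ignoring blanking."""
    return w * h * hz * bits_per_channel * channels / 1e9

hdmi21_usable = 48 * 16 / 18   # 48 Gbps raw FRL, 16b/18b coding -> ~42.7

for name, mode in [("4K120", (3840, 2160, 120)), ("8K60", (7680, 4320, 60))]:
    rate = data_rate_gbps(*mode)
    verdict = "fits" if rate <= hdmi21_usable else "needs DSC compression"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict} on a 48G-rated cable")
```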
Re: (Score:2)
Jesus, do you realize what you're expecting me to do when shopping for a new display?
Basically a multi-dimensional feature table (sketched below):
Display features x connectivity features x cable capabilities x source features.
Add to that potential future source features.
This is fucking insane.
Now, I will admit I did do such a table when I got my new display. But no. This isn't the way.
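For what it's worth, most of that table reduces to a set intersection; a toy sketch with made-up feature names:

```python
# The feature set you actually get is the intersection of what the source,
# the cable, and the display each support. All names below are made up
# for illustration.

source  = {"4K120", "HDR10", "VRR", "eARC"}
cable   = {"4K120", "HDR10", "VRR", "eARC", "8K60"}  # "Ultra High Speed" class
display = {"4K120", "HDR10", "eARC"}                 # this panel lacks VRR

usable = source & cable & display
print("what you actually get:", sorted(usable))
# -> ['4K120', 'HDR10', 'eARC'] -- VRR is lost because one link lacks it
```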
Re: (Score:2)
Re: (Score:2)
This decision isn't the hard one. Getting it to work once you have the display is.
For instance: PS4 + VR, and a 4K HDR display. The VR set kills HDR. So I keep reconnecting cables back and forth.
Not to mention I had a cable issue (old HDMI cable -> no CEC).
Now if you add an A/V receiver into the mix, depending on the way you connect it, some audio formats go through and some don't. Also, the A/V receiver needs to support the same HDMI feature set as the TV, or else you lose something.
I'm not talking theoreticall
As confusing as USB? (Score:2)
With its variety of connectors and standards? Quick: can a microscope with a USB port be connected to a screen with a USB input and produce a picture?
Re: (Score:2)
> Quick, can a microscope with a USB be connected to a screen with a USB input and produce a picture?
Yes but the picture might come out of the printer next door because the mouse has an integrated print server and wifi.
How about just letting the standard die? (Score:5, Insightful)
Seriously, everyone is already transitioning to DisplayPort simply because the HDMI standard has no headroom left. Let it have the same status as DVI and VGA connectors: useful, but deprecated.
Re: How about just letting the standard die? (Score:2)
"What is dead may never die"
Re: (Score:2)
That is not dead which can eternal lie.
And with strange aeons even death may die.
Re: (Score:2)
Re: (Score:2)
I use DisplayPort, but I fear that as it's much more closed than HDMI, we'll have another Betamax vs VHS war with the inferior but more common standard winning. HDMI 2.1a is 5/8ths the bandwidth of DisplayPort 2.0, in theory (48Gbps vs 77.37Gbps), although in practice you won't often see either running at full speed for a while. If you're playing a game, a computer would really want to be running PCIe 3.0 to guarantee enough bandwidth for DP 2.0. I don't believe set-top boxes are remotely close to high-end
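Sanity-checking the ratio with the figures quoted in the comment (usable data rates as given, not raw line rates):

```python
# Pure arithmetic on the numbers given above.
hdmi_21 = 48.0    # Gbps, HDMI 2.1 FRL raw rate
dp_20   = 77.37   # Gbps, DisplayPort 2.0 maximum payload

print(f"ratio: {hdmi_21 / dp_20:.3f} (5/8 = {5 / 8:.3f})")
# -> ratio: 0.620 (5/8 = 0.625), so "5/8ths" is about right
```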
Re: (Score:2)
Re: How about just letting the standard die? (Score:2)
Multiple screens thanks to daisy chaining. Hence its increasing popularity in business.
Re: (Score:2)
In any case, HDMI supports
Re: (Score:3)
DisplayPort supports up to three ports. It can handle two monitors at the same resolution and refresh rate (but with better colour) as HDMI 2.1a does with a single monitor. You also have to drop to 8K HDMI to get similar colour quality. DisplayPort also allows you to have multiple monitors driven off a single port. The most displays DP seems to handle is 8, running at 8K with compression. HDMI limits you to one monitor per port, and the best HDMI devices only have two ports.
(DP is designed to have multiple displays d
Re: How about just letting the standard die? (Score:2)
Any USB 3 port that isn't Thunderbolt supports DisplayPort 3.1. Thunderbolt 3 ports only support DP 2.
Re: (Score:2)
There is no relationship between bandwidth on the motherboard and video output. Many video codecs are implemented in hardware, like H.264 and H.265, so uncompressed images only ever appear inside the GPU. Similarly, set-top boxes and BD players only need
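A quick illustration of why the bus barely matters for hardware-decoded video; the bitrate is the commonly quoted UHD Blu-ray peak, and the output figure ignores blanking:

```python
# The bus carries the compressed bitstream; the GPU's decoder produces
# the uncompressed frames. Figures are ballpark, not measurements.

bitstream_mbps = 128  # commonly quoted UHD Blu-ray peak video bitrate

# 4K60, 10 bits per channel, RGB, ignoring blanking:
uncompressed_mbps = 3840 * 2160 * 60 * 3 * 10 / 1e6

print(f"over the bus:       ~{bitstream_mbps} Mbps")
print(f"out of the decoder: ~{uncompressed_mbps:,.0f} Mbps "
      f"({uncompressed_mbps / bitstream_mbps:.0f}x larger)")
```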
Re: (Score:2)
With games, you load all your assets onto the card ahead of time and then use a minuscule amount of bandwidth for draw calls. The PCIe bandwidth used for 1080p and 8K will be mostly identical.
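A rough sketch of that claim; every size below is an illustrative ballpark, not a measurement:

```python
# Assets live in VRAM after loading, so per-frame PCIe traffic is mostly
# command submission. All numbers are illustrative ballparks.

draw_calls_per_frame = 5_000   # a busy frame
bytes_per_draw_call  = 256     # command + state changes, ballpark
fps                  = 120

per_second = draw_calls_per_frame * bytes_per_draw_call * fps
print(f"~{per_second / 1e6:.0f} MB/s of draw-call traffic")
# -> ~154 MB/s: a sliver of PCIe 3.0 x16's ~16 GB/s, and it doesn't
#    grow with output resolution.
```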
Re: (Score:2)
Hmm, ok, I'll take that one back, then, at least until I find a multi-monitor game that actually needs to access main memory a lot.
Re: (Score:2)
Obligatory XKCD (Score:4, Informative)
Like S-VHS? (Score:2)
I've always been a resolution nerd and sought out every little improvement. I had a LaserDisc player and an S-VHS recorder. There was also an "S-VHS" cable standard, and some TVs had the plug in the back. When we got our first HDTV (with a tube, 200 lb!) I had the LaserDisc and a rented Jurassic Park sequel, and went to some trouble to switch back and forth between the regular composite plug and the cool S-VHS plug.
After about three rounds of looking at the same opening scene with both, we concluded that, yes,
Re: (Score:2)
Probably not if it's calibrated, but on my crummy HDR TV, calibration makes a pretty big difference in games.
Re: (Score:1)
S-VHS [wikipedia.org], short for Super VHS, is a video cassette format using the VHS form factor, but with a higher luminance carrier frequency to get better luma resolution (chroma resolution is not improved over VHS).
S-Video [wikipedia.org], short for separate video, is a video signalling scheme using separate Y (luma) and C (chroma) signals, typically using 4-pin mini DIN connectors for cabling.
S-Video will give you more luma and chroma bandwidth than a composite signal, and can result in a better picture, but you need a source that
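For reference, the Y (luma) that S-Video carries on its own pin is just a weighted sum of RGB; here's the standard BT.601 weighting used for SD video:

```python
# BT.601 luma from gamma-corrected RGB in [0, 1]; these are the standard
# weights for SD video.

def luma_bt601(r: float, g: float, b: float) -> float:
    return 0.299 * r + 0.587 * g + 0.114 * b

print(luma_bt601(1.0, 1.0, 1.0))  # white -> 1.0
print(luma_bt601(0.0, 1.0, 0.0))  # pure green -> 0.587 (green dominates luma)
```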
Re: (Score:2)
On standard TV sets? Probably not. On large UHDTV screens? Quite possibly. On some TV standard that will be invented in 5-10 years' time? Almost certainly.
On top of that, you've got eyesight weirdness. I've known people who grew up on NTSC and couldn't handle the flicker of PAL, others who grew up on PAL and couldn't handle the coarser NTSC. And yet others who couldn't tell the difference.
The main problem will be cable companies, who compress as much as possible. And sometimes even more. If you've never see
A Different Kind of HDMI Frustration (Score:2)
Re: (Score:2)
Hey but now we can make more money (Score:1)
Re: (Score:2)
Except that is not allowed. You cannot use version numbers unless you specify exactly which features are supported.
Re: (Score:1)
Re: (Score:2)
Wait, you actually believed any of that idiocy? While that moron claimed that the 'HDMI Licensing Association' encourages manufacturers to label their devices as
'HDMI 2.1' even if they don't support any of its features, what they actually say [hdmi.org] is the exact opposite: "You can only use version numbers when clearly associating the version number with a feature or function as defined in that version of the HDMI Specification. You cannot use version numbers by themselves to define your product or component capabilities
PIP (Picture-In-Picture). Does HDMI 2.1a make it? (Score:1)
FTFA: "...having the source device configure content for the specific display."
I really hope that this means that we'll soon be able to put/position multiple HDMI streams on one large display, like a background stream, program streams, and Nest/Ring/Sec camera windows on one big display.
LCD/plasma/OLED TVs began to lose PIP (Picture-in-Picture) capability over a decade ago, so it's about time that feature came back to us, the new consumers.