2025 Was the Beginning of the End of the TV Brightness War (theverge.com) 56
The television industry's brightness war may have hit its inflection point in 2025, the year TCL and Hisense released the first consumer TVs capable of 5,000 nits under specific settings -- a figure that would have seemed absurd not long ago when manufacturers struggled to reach 2,000 nits. LG introduced Primary RGB Tandem OLED technology, moving from a three-stack panel design to a four-stack red-blue-green-blue configuration that the company claims can achieve 4,000 nits. The technology appears in the LG G5, Panasonic Z95B and Philips OLED950 and OLED910.
RGB mini-LED also emerged as a new category. The technology uses individual small red, green and blue LED backlights instead of white or blue LEDs paired with quantum dots. Hisense demonstrated it at CES 2025, TCL announced its Q10M for China, and Samsung unveiled its own version called micro-RGB. These sets range from $12,000 to $30,000. Sony has confirmed it will debut RGB TV technology in spring 2026. HDR content is currently mastered at a maximum of 4,000 nits. The situation echoes the audio industry's loudness war, The Verge points out, which peaked with Metallica's heavily compressed Death Magnetic in 2008.
I love the brightness of the Hisense U8 line. (Score:2)
I believe it (Score:2)
I bought a new TV this year--just a cheap direct-lit LCD model--and the backlight is so eye-searingly bright that I ended up turning it down to 30%, and then turning off HDR as well. I don't need the headache.
Besides, I'm not convinced HDR is anything more than a software gimmick unless you have an OLED display.
Re: I believe it (Score:2)
Whether HDR is real or not depends on both brightness and color depth. Many TVs will take 10-bit input but have an 8-bit panel and simply aren't capable of displaying subtle gradations. Then they do fake HDR. This can still bring improvements (they can do better dithering; my LG43UT8000 actually does get noticeably better in so-called HDR+ mode, which is really fake HDR), but they still don't have more dynamic range.
Re: (Score:2)
HDR is about the transfer function.
The standard gamma power function has limited dynamic range.
Your TV will support HLG, DV, HDR10, or HDR10+, which all have high dynamic range transfer functions.
Gradations are a separate problem.
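To make "high dynamic range transfer function" concrete, here's a rough Python sketch of the PQ (SMPTE ST 2084) EOTF that HDR10, HDR10+ and Dolby Vision build on. The constants come from the published standard, but the example values and framing are mine, so treat it as illustrative rather than a reference implementation.

# PQ (SMPTE ST 2084) EOTF: maps a normalized 0..1 signal to absolute
# luminance up to 10,000 nits, spending most code values on darker tones.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Normalized PQ signal (0..1) -> absolute luminance in nits."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for s in (0.1, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
# 0.50 maps to only ~92 nits and 0.75 to ~980 nits: the top quarter of the
# signal range covers everything from ~1,000 up to 10,000 nits.

Compare that with SDR's power-law gamma, where a half-level signal is already about a fifth of peak brightness; that reallocation of code values toward the dark end is what gives PQ its much larger dynamic range.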
Re: (Score:2)
I've already discussed this with you as much as I'm going to, since you are confused about every point. But I will address this:
Your TV will support HLG, DV, HDR10, or HDR10+, which all have high dynamic range transfer functions.
That's the opposite of what you said last time, so please do fuck off forever.
Re: (Score:3)
I've already discussed this with you as much as I'm going to, since you are confused about every point. But I will address this:
I'm not remotely confused- you are just wrong, and you are very defensive over being wrong.
That's the opposite of what you said last time, so please do fuck off forever.
Incorrect. Though through the lens of your illiteracy, I wouldn't be surprised if you thought that.
Re: (Score:2)
That's interesting, didn't know about the distinction between a 10-bit panel and an 8-bit panel. This explains why I'm seeing annoying artifacts with HDR enabled, like posterization in really bright scenes. Thanks.
Re: (Score:2)
HDR generally looks fine on an 8-bit panel (using temporal dithering) up to a couple thousand nits. It looks fine without temporal dithering up to ~400 nits.
A 4K 10-bit HDR signal requires far more bandwidth than anyone is actually streaming to you.
How far they reduce the bitrate is what determines the posterization you see (quantization errors in tone mapping).
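For anyone curious what temporal dithering actually does, here's a minimal Python sketch; the function name and the simple 4-frame pattern are my own invention to illustrate the idea, not how any particular panel vendor implements it.

# An 8-bit panel fakes a 10-bit level by alternating between its two nearest
# 8-bit codes across frames, so the time-average the eye integrates matches
# the 10-bit target (there are 4 ten-bit steps per 8-bit step).
def dither_10bit_to_8bit(value_10bit: int, frame: int) -> int:
    base, frac = divmod(value_10bit, 4)
    # Show the higher code on `frac` out of every 4 frames.
    return min(base + (1 if frame % 4 < frac else 0), 255)

target = 513  # a 10-bit gray level between 8-bit codes 128 and 129
frames = [dither_10bit_to_8bit(target, f) for f in range(8)]
print(frames)                         # [129, 128, 128, 128, 129, 128, 128, 128]
print(sum(frames) / len(frames) * 4)  # 513.0 -- the 10-bit level, on average

Real panels use fancier spatio-temporal patterns to keep the flicker invisible, but the averaging principle is the same.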
Re: (Score:2)
Besides, I'm not convinced HDR is anything more than a software gimmick unless you have an OLED display.
You don't need the infinite contrast or colour fidelity of OLED to exceed what SDR content is capable of. Gamuts marginally larger than sRGB already induce banding with the old 8-bit SDR standard. Whether you personally notice it or not isn't relevant. Displays (even low-contrast displays from 15+ years ago) exceeded what the signal was capable of, so creating a new standard for signalling content is not a software gimmick.
8k resolution (Score:2)
8K resolution can enable in-home immersive (IMAX-like) experiences. Therefore we need to get to 8K resolution and also eliminate (the visibility of) the space between pixels (use a diffuser sheet?). After that, work on cost reduction/manufacturing.
Re:8k resolution (Score:5, Funny)
eliminating (visibility of) the space between pixels
I think you might be sitting too close to your TV.
Re: (Score:2)
4k streams are still hard to come by from basically anywhere, and now you think that the providers that don't even manage past 1080p (being every single broadcast source except certain pro sports events where they might bother to do a 4k broadcast) can look even shittier through upscaling to a resolution that is 16x larger?
How about we fix the bottlenecks that prevent the previous standard from being fully realized before we worry about the next one that requires 4x the bandwidth that already isn't being delivered?
Apparently we need it (Score:1)
Amen! (Score:2)
I've always hated this. TV shows and games that are FAR too dark to be able to see anything at all, even in a dimly lit room.
Poor lighting and the incomprehensible audio of today's shows make closed captions absolutely mandatory. I was starting to think it was just me getting old. But when Apple TV added automatic captions that appear if you skip back in a show at all and last for 30 or so seconds, I realized that this is a widespread problem with the shows and not my sight or hearing.
Re: (Score:1)
Didn't your mom tell you that you'd go blind?
Re: (Score:2)
Metallica's Death Magnetic (Score:2)
I guess if you play retro games (Score:1)
On the other hand, a lot of old pixel art is specifically designed for CRTs and scan lines, and I've yet to find a filter that doesn't fix that by having little bits of black lines or dots throughout the image. Even the fancy pixel shader stuff doesn't quite pull it off.
Re: (Score:1)
I so miss my ViewSonic G90.
Re: (Score:3)
CRTs are not, from what I have read, seen, or remember, brighter than any recent LCD display. Sure, they were brighter than the early LCDs, but not anything recent and nice. Not even close.
Re: (Score:3)
This might be cool because CRTs were much much brighter than LCDs
The early 00s called and want their comment back. CRTs typically had a peak brightness of around 100 cd/m2. Cheap arse bottom-tier LCDs were brighter than this back in the 00s. CRTs had better dynamic range, but their brightest high-end models (around 500 cd/m2) were surpassed in brightness by LCDs quite early in the LCD development cycle. Whole new standards had to be created for movie mastering since consumer LCD TVs completely smashed the 100 cd/m2 brightness that was typical of CRTs and thus the primary mastering reference.
I have the opposite problem (Score:2)
I watch tv at night and don't need a TV that can be seen under flood lights. I need one that is much more dimmable.
Re: (Score:2)
Exactly! Who watches TV in a brightly-lit room? I watch TV in a dark or dimly lit room, 8 feet away from my 10+ year old 50" 1080p tv, with the brightness turned down to 50%. Even on my old, inexpensive LED backlit LCD panel TV, setting the brightness at 100% is way too bright in a dark room, and I doubt very much that that's anything near 2000 nits. Probably more like 200.
Re: (Score:3)
Exactly! Who watches TV in a brightly-lit room? I watch TV in a dark or dimly lit room, 8 feet away from my 10+ year old 50" 1080p tv, with the brightness turned down to 50%. Even on my old, inexpensive LED backlit LCD panel TV, setting the brightness at 100% is way too bright in a dark room, and I doubt very much that that's anything near 2000 nits. Probably more like 200.
If you are talking SDR content 200 nits is about right. A peak brightness in the thousands is only relevant for HDR content, where that kind of brightness will only appear for a very small amount of time in a very small portion of the screen. The overall brightness can still be quite low even with those peaks and HDR content is typically expected to be viewed in a dark room.
Re: (Score:2)
I watch tv at night and don't need a TV that can be seen under flood lights. I need one that is much more dimmable.
Viewing in a dark room is typically how HDR content is supposed to be viewed, that's why viewing it in a lit room makes it appear so dim. The peak brightness can be very high but only happens in a very small portion of the screen and for a short time.
SDR content is not going to reach anywhere near peak brightness so all these thousands of nits being advertised are irrelevant in that context.
Re: (Score:2)
Turn the brightness down. You literally don't have a problem beyond your inability to use your own remote. The problem only exists in the opposite direction.
Generally, don't need (Score:2)
But 5000 nits is nice on the deck of a yacht.
What does brightness matter? (Score:2)
What I find distracting is the bands of darkness because digital compression does not have a smooth dark profile.
I know I have better sight than most in the dark (talking about actual real life), but I can't believe people don't notice that the shadow in the dark consists of only three gray values.
In short, screw TV colour encoding cutting bits from the spectrum, use all of 0-255 of every byte.
Re: (Score:2)
Human vision is far more sensitive to differences in dark tones than in bright ones. Failing to account for this by applying a power law (gamma curve) to the display results in pretty nasty imagery.
The fix to your problem is actually to remove colors from the code points of the byte (a gamma-corrected 8-bit curve has ~170 luminosity values, 10-bit ~700).
Read more here. [krita.org]
The "Linear TRC" is what it looks like if you "use all 0-255 of every byte".
HDR throws out the old power-law gamma curve (precisely because of its inefficiency/wasted luminosity values) in favor of either PQ or HLG.
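That code-allocation point is easy to check. A quick Python sketch (the gamma value of 2.2 and the 1% cutoff are just illustrative choices of mine):

# With a linear TRC ("use all of 0-255 of every byte"), almost no codes land
# in the shadows -- which is exactly the three-gray-values banding the parent
# complains about. A power-law (gamma) TRC spends far more codes there.
GAMMA = 2.2  # typical SDR display gamma

def linear_trc(code: int) -> float:
    return code / 255.0               # byte mapped straight to light

def gamma_trc(code: int) -> float:
    return (code / 255.0) ** GAMMA    # byte mapped through a gamma curve

dark_linear = sum(1 for c in range(256) if linear_trc(c) <= 0.01)
dark_gamma = sum(1 for c in range(256) if gamma_trc(c) <= 0.01)
print(dark_linear)  # 3: only three codes cover the darkest 1% of luminance
print(dark_gamma)   # 32: the gamma curve gives that same range 32 codes

Three codes for the entire darkest 1% of the picture is exactly where those chunky shadow bands come from.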
Not Loudness War Redux. (Score:4, Interesting)
The Loudness Wars were compression of dynamic range- reduction. Increasing display brightness is the literal exact opposite- allowing for larger dynamic range.
Where the Loudness Wars sought to increase the loudness of a medium with a fixed dynamic range, TVs are increasing the dynamic range so that a TV-equivalent Loudness War doesn't need to happen.
Article was written by an idiot.
Re: (Score:2)
The Loudness Wars were compression of dynamic range- reduction. Increasing display brightness is the literal exact opposite- allowing for larger dynamic range. Where the Loudness Wars sought to increase the loudness of a medium with a fixed dynamic range, TVs are increasing the dynamic range so that a TV-equivalent Loudness War doesn't need to happen. Article was written by an idiot.
I don't disagree, at least based on the summary. How is "there's new technology this year and next that is brighter than ever before" somehow either an end of escalating brightness or an inflection point? That's not what either of those means.
Re: (Score:3)
That would indeed be horrible. But that's not what's happening. TVs are getting better at reproducing studio-mastered content is what is happening.
Re: (Score:2)
The problem is always the same - the technology gets abused. With brightness wars the content tries to blind you, and looks bad if you try to compress the dynamic range. Or the other extreme, it gets colour graded for high end sets that can show a lot of detail in dark areas, and people complain that on their SDR LCD everything is black.
The BBC experimented with a second sound stream that made dialogue clearer for a while, then abandoned it.
Re: (Score:2)
The problem is always the same - the technology gets abused. With brightness wars the content tries to blind you, and looks bad if you try to compress the dynamic range.
There are no Brightness Wars.
Increasing peak luminosity of an HDR-capable TV does not increase the brightness of the content.
If you're arguing that there is a Brightness War in the mastered content- then this article is not about that.
The Loudness Wars were about making shit sound louder on shit with fixed dynamic range.
This is about allowing brighter content to be displayed faithfully without engaging in brightening the entire display.
Or the other extreme, it gets colour graded for high end sets that can show a lot of detail in dark areas, and people complain that on their SDR LCD everything is black.
HDR content is not shown on SDR displays.
The BBC experimented with a second sound stream that made dialogue clearer for a while, then abandoned it.
Sound is a separate problem.
Re: (Score:2)
There's no visual fidelity downside to having a higher peak brightness on a display, only potential upsides, just like there's no aural fidelity downside to having a higher sampling rate, only potential upsides.
Contrast this with "The Loudness Wars", where there is very much an audio fidelity downside, since the dynamic range decreases rather than increases, as higher sampling rates and brightness allow.
If you're aw
Re: (Score:2)
All HDR is, is that instead of going from, say, 0 to 255 for brightness, we can go from 0 to 1023. Your SDR content stays mapped to 0-255 as it always was, but your HDR content can now use 256-1023 as brightness values, which go much brighter.
And that's how you notice HDR video - it can appear normal, but then the scene walks out into sunlight and it's a lot brighter than it could be before.
The problem is that most HDR displays are crap - some even claim HDR when they can't do it (DisplayHDR 400, for example, is used on many laptops). This just means it can do up to 400 nits.
Re: (Score:2)
All HDR is, is that instead of going from, say, 0 to 255 for brightness, we can go from 0 to 1023. Your SDR content stays mapped to 0-255 as it always was, but your HDR content can now use 256-1023 as brightness values, which go much brighter.
This is absolutely incorrect.
First and foremost, PQ and HLG have entirely different curves than SDR gamma.
And that's how you notice HDR video - it can appear normal, but then the scene walks out into sunlight and it's a lot brighter than it could be before.
Also completely incorrect.
HDR is not about making a scene brighter. It's about making parts of the screen brighter while still retaining detail in very dim parts.
The problem is that most HDR displays are crap - some even claim HDR when they can't do it (DisplayHDR 400, for example, is used on many laptops). This just means it can do up to 400 nits.
DisplayHDR 400 looks great on an OLED laptop.
A laptop is never more than a half meter or so from your face. It doesn't need 1000 nits to be absolutely blinding.
And remember- brightness isn't about brightness of the entire scene. It's small parts of the screen.
Re: (Score:2)
Or the other extreme, it gets colour graded for high end sets that can show a lot of detail in dark areas, and people complain that on their SDR LCD everything is black.
It seems to be the fashion now that TV is basically unviewable unless you have a cinema-grade setup, due to everything being muddy brown and grey in low lighting. But that's OK, because they also make it impossible to tell what's going on by having mumbled, quiet speech with high levels of background noise.
If they really want to win the consumer over... (Score:2)
Diminishing returns (Score:4, Interesting)
The problem TV manufacturers are up against is that it's hard to make a convincing reason to get a new one anymore.
I purchased my first "flat panel" in 2010. It was a 42in 1080p unit. It was a big deal at the time that it had an LED backlight. It would be laughable today, but it was amazing next to the 32in CRT that it replaced. I upgraded in 2019 to a 65in 4k unit with local dimming and HDR- that was a huge upgrade. When my daughter threw an object and broke the screen last month, I was forced to buy a new TV. I bought a new 2025 model-year set with all the bells and whistles. And you know what? If someone had snuck into my house and swapped them in the middle of the night, I'm not sure I would have noticed. I'm sure in a side-by-side comparison I would notice that the darks are a little darker and the light parts are a little lighter. I'd probably notice the 120hz refresh rate in the new one. But for day-to-day viewing? It's just not a compelling upgrade.
So manufacturers have pretty much gone as far as they can go on brightness and HDR. Hard to say what else they can do to make the viewing experience better/different. 8k isn't noticeable unless your screen is absurdly large. I suppose "absurdly large" is one new thing: you can get 100in TVs for consumer-accessible prices now. But anything much over 75in starts to dominate the room and ends up looking pretty gaudy in a normal living room.
Re: (Score:2)
I think that's more an attempt at staying at the top end of the VESA DisplayHDR specs in all viewing situations.
It's an actual fidelity improvement- and the whole name of the game for these things is high fidelity.
The size thing- I do think that's where it's really at.
As you mentioned, 100" TVs are literally the norm now (my sister just bought one- I'm still here with a plebeian 65")
Of course, I literally couldn't reasonably fit a 100" in my living room.
I'll confess my ignorance (Score:2)
What the hell is a "nit?" I assume we're not talking about lice eggs here...
Re:I'll confess my ignorance (Score:4, Informative)
What the hell is a "nit?" I assume we're not talking about lice eggs here...
In your defence, it's one of those commie foreign things like every other unit in the metric system. Also in your defence, it's a deprecated unit that for some reason just won't die. Nit is the old name for the current SI unit candela per square meter (1 nit = 1 cd/m^2).
Re: (Score:2)
Is it deprecated? There's no unit that replaces it. As you mentioned, it's 1 cd/m^2.
We can deprecate meters too while we're at it, as long as we're allowed to use an arbitrary number of other units to define them.
Re: (Score:2)
Re: (Score:2)
Is it deprecated? There's no unit that replaces it. As you mentioned, it's 1 cd/m^2.
You literally just pointed out the unit which replaced it. Candela is the SI unit.
Re: (Score:2)
cd/m^2 (candela per square meter) is a measure of a different thing than the candela itself- it's like replacing grams with grams per cubic centimeter. It makes no sense.
I can see why "nit" is still commonly used.
Re: (Score:2)
No, it's a reflection that every combination of standard units doesn't need its own unit. Nits fell out of favour a long time ago; only monitors brought the term back, and that despite the fact we were using cd/m^2 prior. There's no reason for it to be used. They are the same unit.
We don't have units for many common combinations in the world.
AI slop article (Score:2)
Nothing in TFS or TFA points towards any end of increasing brightness specs.
Nothing about having capability to make brighter pictures has anything remotely to do with the loudness war (which had to do with compressed dynamic range). TFS even notes that mastering is not hitting the limits of the TVs (which is what the loudness war was doing with digital audio).
It's time the Verge upgraded its version of ChatGPT.
108 nits ought to be enough for anybody (Score:2)
That's what Dolby Cinema theaters achieve, and those are amazing.
I believe my home theater with an Optoma UHD65 projector and 106" screen achieves about the same nits, but it doesn't look nearly as good as Dolby Cinema in terms of black level and colors. Certainly, the HDR on it is nothing to write home about. There is more to a good quality picture than just nits.
Re: (Score:2)
1) Theaters are not high dynamic range*
2) Theaters are nearly black inside.
3) Theaters are reflected light, not emitted.
These all factor into it, but primarily- Dolby Cinema actually just isn't that good. They tone map the masters down for cinema, and leave them closer to the masters for home HDR releases.
Your Optoma UHD65 is not HDR. Sure, it can process an HDR10 signal- but it tone maps it down.
My TV is bright enough (Score:1)