LG Develops OLED Monitor That Can Hit 480Hz Refresh Rate (pcmag.com) 95
LG says it has developed a 27-inch OLED gaming monitor that can reach an incredibly high 480Hz refresh rate, promising to usher in an "era of OLEDs featuring ultra-high refresh rates." From a report: LG says it achieved the 480Hz rate on a QHD 2,560-by-1,440-resolution display. Other vendors, including Alienware and Asus, have also introduced PC monitors that can hit 500Hz, but they did so using IPS or TN panels at a lower 1,920-by-1,080 resolution. OLED panels, on the other hand, are known for offering stunning color contrast and true blacks, resulting in top-notch picture quality.
The 480Hz refresh rate will be overkill for the average gamer, but the ultra-high refresh rate could appeal to competitive players, for whom latency and smooth gameplay matter. LG adds that the 27-inch OLED monitor features a 0.03-millisecond response time. The OLED panel should also be easier on the eyes during long playthroughs. "The company's Gaming OLEDs emit the lowest level of blue light in the industry and approximately half the amount emitted by premium LCDs," LG says. "This reduction in blue light not only minimizes eye fatigue but also eliminates flickers, providing gamers with more comfortable and enjoyable gaming sessions."
Re: Well (Score:2)
Re: (Score:2, Insightful)
It may be quantifiable, but when the target is human visual perception, 480 refreshes a second is way way beyond superfluous.
Display tech companies have started trying to convince everyone they should be able to tell a world of difference between frame rates north of 120 FPS, but it's just not the case.
Re: (Score:2)
Depends on the situation. I remember not too long ago companies running 120Hz and trying, and failing, to get decent 3D TV performance.
At 480Hz you could use four sets of customized 3D glasses and overlay four different Mario Kart views on the same screen, each running at 120Hz. The TV without glasses would be a jumbled mess, but each player could have a unique full-screen view and couldn't cheat. Keeping the glasses and monitor in sync would be a headache, but possible.
Can't think of many uses beyond that though.
Re: (Score:2)
I never met a pair of shutter glasses that sufficiently mitigated cross-talk, so I'm not too optimistic about the chances of that working even that well at 480Hz cycling.
That assumes a game company would even bother doing the development to support such a niche case on niche equipment, particularly since the 3D TV attempt broadly failed and tanked the chances of people bothering to own that equipment.
Re: (Score:1)
The LED screen used for filming shows like The Mandalorian has a 180Hz refresh rate, which enables up to six different cameras (at 30FPS each) to see completely different lighting and perspectives around the same real-life actors and props, or three cameras at 60FPS, etc. It's not cheap, but cutting-edge tech always becomes mainstream eventually.
https://youtu.be/u7pu1cQBqtQ?f... [youtu.be]
Re: (Score:2)
I suspect those cameras had actual hard, opaque physical shutters actuating at 30fps. That can be handled in a scenario where the mics, if any, are far away, the shutters are buried in large cameras, audio production takes care of any potential residual noise, and vibration doesn't matter too much. On the face, though, we've had to settle for LCD shutters that electronically "close", and they don't quite get fully transparent nor fully opaque.
Re: (Score:2)
It may be quantifiable, but when the target is human visual perception, 480 refreshes a second is way way beyond superfluous.
It is not superfluous for sample-and-hold displays if the goal is to reduce motion blur while keeping brightness up. You do not have to compute 480 different frames per second to get fluid animations if each frame is only shown for a very short time and the picture is kept black for the rest of the time. But if you keep the OLEDs alight all the time, then the fewer frames per second there are, the more motion blur there is. So accepting the same amount of motion blur, you can either have less FPS but lower
Re: (Score:2)
I wonder if such a display would be fast enough to emulate an interlaced display, so I could watch an interlaced video without running yadif2x on it.
Re: (Score:2)
If the interlaced material was shot by a came
Re: (Score:2)
Average gamer? (Score:4, Insightful)
Show me the science that says this might benefit ANY gamer. Maybe they just need some oxygen-free wire to hear their enemies approaching.
Re:Average gamer? (Score:5, Funny)
Re: (Score:1)
Re: Average gamer? (Score:2)
Re: (Score:1)
Re: (Score:2)
You are confused. That video is about the latency between input and output, not the framerate (and no, they are not the same thing).
As the guy says at the beginning, current touch screen systems have about a 100ms delay between when the touch occurs and when the result is displayed. Believe it or not, current touch screen systems are updating their display at much faster than 10 Hz. The latency demonstrated in the video is almost entirely due to the processing that goes into detecting the touch and deter
Re: Average gamer? (Score:1)
Re: (Score:2)
Question is did you do a blind comparison?
I have seen several videos setting up blind comparisons and asking the users to guess which refresh rate was faster. Basically over 120 Hz it seemed they were no better than randomly guessing.
Re: Average gamer? (Score:1)
Re: (Score:2)
Question is whether you did a blind test, where you didn't *know* you were looking at 120Hz v. 240 Hz. Also doing it on the same monitor to control for other potential qualitative differences between the models other than refresh rate.
When told up front the refresh rates, people magically could see the improvement, despite not being able to pick them out. I recall one case where the person got it wrong and then after being told he says "oh wait, *now* I see how it's better".
We are very much inclined to wa
Re: Average gamer? (Score:1)
Re: Average gamer? (Score:1)
Re: (Score:2)
If I recall the military paper, it was a bit different in general. They wanted to see how short an interval a discontinuity in what you see can be before it's impossible to even notice. So you are watching a landscape and, for a tiny fraction of a second, some very distinct thing flashes on screen. Will you notice the discontinuity? Will you know what the frame was? Unanswered was even if you could identify the frame in retrospect, did you know what it was in the moment (e.g. latency before your brain actually pro
Re: (Score:1)
Re: (Score:2)
Re: (Score:2)
I'm pretty sure an individual number doesn't exist at all independent of the material being viewed. Visible flickering is gone by 90Hz, beyond that the speed of objects determines how visible the framerate is. If your framerate is too low you get to make a tradeoff between strobing and persistence of vision motion blur artifacts. Strobing (think CRTs or VR headsets) is visible as duplicate images, persistence of vision motion blur makes the image blurred (typical LCD, although some gaming monitors support s
Re: (Score:3)
Note that it highly depends on the content.
If you had, say, an LCD with 90 screen updates every second, *but* you turn off the backlight 75% of the time, it will look as smooth as 360Hz. Why? Because when the eyes are not getting any data from the dark screen, the brain fills in automatically, and the pixels transitioning from one state to another are masked by the lack of a backlight.
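The tradeoff described above can be sketched with some back-of-the-envelope arithmetic. On a sample-and-hold display, perceived motion blur is roughly the object's speed multiplied by how long each frame stays lit (its persistence). All numbers below are illustrative assumptions, not specs for any particular panel:

```python
def blur_px(speed_px_per_s: float, refresh_hz: float, duty: float = 1.0) -> float:
    """Approximate blur width in pixels for an object moving at the given
    speed, when each frame is lit for `duty` of the refresh period."""
    persistence_s = duty / refresh_hz
    return speed_px_per_s * persistence_s

# A 90Hz panel lit only 25% of the time blurs like a 360Hz full-persistence one:
print(blur_px(1000, 90, duty=0.25))   # ~2.78 px
print(blur_px(1000, 360, duty=1.0))   # ~2.78 px
```

The catch, as the parent notes, is brightness: the shorter the duty cycle, the dimmer the image, which is exactly why a genuinely fast full-persistence refresh is attractive.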
Re: (Score:2)
Re: (Score:2)
They probably detected that somewhere around 400Hz was the limit of perception but estimated 800 due to some variation on the Nyquist theorem.
Re: (Score:2)
It's a diminishing returns thing for sure, but we can quantify the benefit. The maximum delay between action in the game arriving at your eyes includes the refresh period plus the input latency. When we lower the refresh period (by increasing the frequency/refresh rate), we lower that maximum delay. Going from 240 Hz to 480 Hz, you'd gain 2ms. How important is 2ms?
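The 2ms figure above is just the difference in frame times, which is easy to verify (plain arithmetic, no monitor specs involved):

```python
def frame_time_ms(hz: float) -> float:
    """Worst-case wait for the next refresh: one full refresh period."""
    return 1000.0 / hz

# 240Hz -> ~4.17 ms per frame; 480Hz -> ~2.08 ms per frame
gain_ms = frame_time_ms(240) - frame_time_ms(480)
print(round(gain_ms, 2))  # ~2.08
```

Note the gain halves with each doubling of the refresh rate, which is the diminishing-returns curve the parent describes.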
Re: (Score:2)
Well, the fastest ever recorded human reaction time is 100 ms.
Re: (Score:2)
Remember that all of this is additive. The human reaction time is added to the latencies from your computer equipment. So if you're a pro gamer with 100ms reaction times, then reducing your max monitor latency by 2ms might bring your overall delay from 150ms to 148ms (1-2%). Personally, I don't think it would make much of a difference for someone like me.
I think it's kind of like golf where people would rather spend money on equipment than time to develop their skills. And with 480 Hz, the issue of the
Re: (Score:2)
It's unlikely any gamer has a reaction time of 100ms. 100 is the fastest ever measured (it was a drag racer). About 200 seems to be what most people reach with training, but Nvidia claims some pros get down to 150ms. So training can make more than an order of magnitude more difference than one of these monitors in the worst case. It might be useful if you've actually minimized your reaction time, but if you're the mass-market consumer that mass-market consumer electronics are made for you'd probably be be
Re: Average gamer? (Score:2)
That's only in full reaction to visual stimulus, from eyes receiving the photons to your hand moving. And even then, anticipation of an event goes all the way down to 10ms reaction. And even though that might be reaction time, actual subconscious perception of visual stimulus is much faster and even varies a lot depending on which part of the eye received it, and delaying that only adds yet more of a delay to reaction time. So yeah, I'd say 2ms could indeed make a difference, depending on the person.
Re: (Score:2)
Your eyes receiving photons and your hand moving is what happens when you play computer games.
Re: (Score:2)
For a game, seeing 480 frames per second instead of 120 probably doesn't have any benefit (especially since most of the frames would be of an interpolated state between the "actual" frames), but after the GPU renders a frame it has to wait for the next display refresh to actually present it. So even for a game running at 30 frames per second, with a faster refresh rate tha
Re: (Score:2)
It might be useful as a return back to stereoscopic 3D.
3D might be dead, but there's a vocal minority trying to push VR around, and another minority that is still pushing 3D movies. After all, they may not move as many tickets as 10 years ago, but you can still buy tickets to 3D showings - meaning people view 3D often enough to justify doing 3D. And unlike say, 4K where you get 1080p and others "for free", 3D requires work to do, so the fact it's still around means there's still money.
(Of course, it helps t
Re: (Score:2)
Show me the science that says this might benefit ANY gamer.
Gamer? The whole point of this TV is not to improve gaming, it's to improve sales of RTX 4090s. God knows my 3 year old upper middle-end graphics card can barely get 45fps on my 4k screen on some modern games even with DLSS running.
Re: (Score:2)
LTT did test high refresh rates and found some small gains. The returns are diminishing though.
Blue light panic is such a crock (Score:1)
Congrats, LG, by buying into the blue light nonsense you've taken your top of the line product and made it look like an old restaurant TV that's been facing the sun for the past decade. Embarrassing for you.
Re: (Score:3)
Re: Blue light panic is such a crock (Score:2)
Re: Blue light panic is such a crock (Score:2)
Re: (Score:2)
Nothing you said negates anything I said. That nice OLED will look like shit with the blue filtered out. I don't care how dark the blacks are if the overall color spectrum is fucked up to appeal to hypochondriacs.
Re: Blue light panic is such a crock (Score:2)
Re: (Score:2)
I don't know if it's a scare. If you watch TV right before bed, you may care more about getting to sleep easily afterward than you do about color accuracy. Eye fatigue at work involves blue light too - if you're in front of a screen for long hours, you'll have less eye pain and dry eye if you block some of it. It's not something I do, but it's about comfort more than permanent damage - it doesn't hurt you.
I know high blue light no longer means that there's also lots of UV that you don't see but that used
Re: (Score:2)
Care to revist this? [slashdot.org]
Re: (Score:2)
I don't think I've ever visited that comment in the first place. You probably just need to say what you mean.
But red light is far more important than just blue light. Scotopic vision ("night" vision) is enhanced by melatonin and red light does not suppress melatonin production because rods aren't very sensitive to red. Rods are most sensitive to blue-green and blue, so a well designed blue filter should give you more than just a warm picture. A red shift / blue filter may not do as much, but when I read
Re: (Score:2)
Click the link, and after reading at least the summary, revisit your statement that pretends like blue light is a "thing" that is somehow any different from light in general.
I wonder when we'll drop the notion... (Score:2)
... of refresh rates and just let software update arbitrary portions of the screen whenever they're ready without having to push a full framebuffer? Especially makes sense in the context of raytracing, wherein you can prioritize important parts of the scene or areas with more motion for more frequent updates (or higher resolution)**.
What's the point of having to dump whole frames at once to the output device?
** In a way, that's kind of what the Apple Vision Pro is doing, only more so, by using eye tracking
Re:I wonder when we'll drop the notion... (Score:4, Interesting)
The point is to prevent "tearing". A lot of effort has been put into solving that problem.
If the pixels were updated in some random or semi-random pattern on the screen, it would probably be unnoticeable. But I suspect that either a lot of architectural changes would be needed in both software and hardware, or you would effectively have to achieve a 480Hz full-screen refresh rate anyway, unless you did things like attaching an address to each pixel output so the rendering device didn't need to assume sequential pixels should be drawn sequentially.
Re: (Score:2)
Well, we can't really drop the idea of refresh rates for a while because of a couple things, I think: ;)
1. It takes time for the pixels themselves to change state. With LED-type technology this can be far faster than human light response (which is ~60Hz, varying quite a bit by individual, age, and all that), but it's a factor. LCD tech is slower, but LEDs were fast enough back in the '80s to transmit megabit speeds unintentionally - you could read Ethernet traffic off the status light.
2. There's on
Re: (Score:2)
Re: (Score:2)
First, because you need to at least consider the worst case. It is *highly* common for the full frame to update, so you need some metric of worst-case how many frames can be pushed through, as governed by factors including data bandwidth to the display and the physics of the elements manipulated to present the display, among other things.
Once you account for "worst case", well, not much point in doing restricted screen updates when you have the bandwidth for worst case in a dedicated 48Gbps connection to push your p
Re: (Score:2)
Basically you need to fuse the GPU and the monitor. The VRAM needs to have 2 buses - one for R/W to the GPU (as is now) and one read only (but possibly much faster) going to the panel. So the GPU just renders and puts pixels in the VRAM and the "panel" reads them whenever and changes the corresponding pixel(s). The key difference is that this type of access doesn't have to be sequen
Included Eyeball Upgrade? (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
neither, I just see my cursor move
Re: (Score:2)
LCDs don't work that way. The backlight is constant.
You'd need a CRT monitor to see the flicker. Or an old-style fluorescent tube light (whose ballast is locked to the mains power frequency). Incandescent bulbs, sadly, retain too much heat to flicker much.
Re: (Score:2)
Re: (Score:2)
CRT will cover this with phosphor fade. Cheap Christmas lights with a half wave rectified circuit will demonstrate it well enough.
Re: (Score:2)
Most people can't see 60Hz flicker, and definitely cannot see 70Hz flicker.
Old VDUs had low refresh rates like these and didn't have this issue; below 70Hz you could see the refresh of the whole screen, but they could do smooth, judder-free motion with no issues, as the screen updates at the same rate everywhere - a 120Hz VDU is smoother than butter.
It's not that you can see flicker of the screen refresh, it's that you can see the latency of the refresh across the screen as different pixels update at
Re: (Score:2)
There are benefits above 60Hz - the nausea effect below 90Hz in VR is a key indicator that 60Hz isn't enough, even if it's not at a conscious level.
I'm just glad that now, high-quality CFLs and LEDs mercifully flicker at a kHz or so, rather than at the 50/60Hz of the country's power grid as they did a couple of decades ago. I get so many fewer headaches. Now if they can just improve the spectra...
Re: (Score:2)
This is a misunderstanding.
Lets split this into two things: can you see the difference, and does it make you a better player.
I moved from 60Hz -> 85Hz -> 144Hz -> 280Hz and each time, I could see the difference. I know everyone on here is a contrarian, so let me expand. 280Hz is just a nicer experience to me: everything is very fluid feeling, there are fewer artifacts.
You can see a difference because your eyes are an additive input device and games can only render instants in time. You can't identi
Re: (Score:2)
The ones I have at the moment can only handle a 60Hz maximum rate.
You definitely need an eyeball upgrade. The difference between 60Hz and 120Hz is trivially visible. Going above 120Hz is starting to be of dubious value.
There's a reason 100% of manufacturers of VR equipment consider 60Hz displays unsuitable for their headsets, the differences are well and truly perceivable.
Re: (Score:2)
Re: (Score:2)
Above 90Hz you cannot see refresh-rate flicker - you can see pixel-update-delay flicker, tearing, etc., as pixels take different times to update.
On an old CRT the whole screen takes the same time to update, but you could often see the flicker due to the 50/60Hz refresh rate.
Some had 100/120Hz refresh and you simply cannot see any flicker.
Re: (Score:2)
To be fair, a 120Hz phone screen looks better to me. When scrolling, it's easier to track where I was reading, and even continue reading.
Re: (Score:2)
Hey bro, I know you are smart, but your perspective is skewed.
60FPS is plenty fast for passively ingesting content.
For less passive activities, a 60hz update to what you see is too slow. If you were driving a car in the Real World and had glasses on that updated your view of the outside world 60 times a second, you would be able to drive just fine; however, you would notice the limitations.
Resolution vs. Hz (Score:2)
Isn't resolution more important past a certain crossover point? With higher resolution, if an object moves a smaller degree of arc, it will show as a changed pixel on the display. Of course, comparing high res to ultra-high res, your eye can't detect that. But most gamer displays are low resolution.
Re: (Score:2)
A semi-serious question. (Score:2)
Does that really mean anything for the average human user? My understanding is that once you get above 120Hz, you really aren't going to see any difference as the frame rate goes up from there. So why develop these super-high refresh rates? Are we already planning for the day where we're obsolete and we need to impress whatever robotic overlords we have to keep improving refresh rates for their end-users, which will be other robots? Is there a real-world use-case for these higher refresh rates that I don't
Re: (Score:2)
Does that really mean anything for the average human user? My understanding is that once you get above 120Hz, you really aren't going to see any difference as the frame rate goes up from there. So why develop these super-high refresh rates? Are we already planning for the day where we're obsolete and we need to impress whatever robotic overlords we have to keep improving refresh rates for their end-users, which will be other robots? Is there a real-world use-case for these higher refresh rates that I don't get? Even gamers can't see better than their eye allows. What's the goal?
I've got a 4k monitor that can hit 144Hz and I have to say... after years of playing a game (Destiny 2) at 60 FPS, pushing 120+ has had zero visible effect. I don't really buy the "it's smoother" story even at that rate. The only way I can tell it's any different is that the FPS counter in the corner says it is. And yes, I'm 100% confident my monitors are set up correctly and running at the provided frame rate.
This 480Hz stuff has got to be like painting green rings around your CDs to make the digital
Re: (Score:2)
People want video beyond the eye's abilities, much like people who demand "HD Audio" - raw, uncompressed, 32-bit, 192-kHz sampled audio. It's *way* beyond our ability to hear. It's also *way* beyond our ability to reproduce (outside of explosives and rocketry).
The reality is, for audio, 12-bit/30 kHz covers everything we'd really consider musical. (That also includes the noise floor, by the way). Funny thing about that: It's within the capabilities of an analog LP record. Ever wonder why people still swear
Re: (Score:2)
records can reproduce higher frequencies than 15kHz.
Also, with digital it's a bit different than the analog frequency response. Sure, my tape deck may have -3dB at 18kHz or whatever, but it is different than digital with 36kHz sample rate. Two reasons.
1.there still is some signal at the higher frequencies, it's just attenuated.
2. Due to how digital works, for 36kHz sample rate, you need a filter that passes 17999Hz and drops 18000Hz completely. Such a steep filter may not be possible and various attempts wi
Re: (Score:2)
why? (Score:2)
see title.
Say what? (Score:2)
"This reduction in blue light not only minimizes eye fatigue but also eliminates flickers, providing gamers with more comfortable and enjoyable gaming sessions."
How in hell does reducing blue light eliminate flickering? Am I missing something, or are LG's press release writers totally clueless about the technology they're writing about?
I also question the relationship between blue light and eye fatigue. I know blue light at the wrong time of day is said to disrupt circadian rhythms by suppressing melatonin; maybe the "eye fatigue" goes along with the whole-body fatigue which results from poor sleep? I don't see any other possible causal relationship here, given tha
Re: (Score:2)
Re: (Score:2)
Unless you're old enough that you need your pupils to contract to focus at a reasonable distance. Then you need more light but just not the kind that inhibits melatonin.
My kid asking for this... (Score:2)
In 3... 2... 1...
I got them OLED blues (Score:3)
Re: (Score:2)
Can I play Duck Hunt on it? (Score:1)
The problem with most modern displays and Duck Hunt is that the display buffers frame data before displaying it, so the NES Zapper mechanism of showing a hitbox-frame and simultaneously checking the Zapper's light-sensor to register a hit does not work correctly since the displayed frame on the LCD is delayed. With a theoretical response time of 0.03ms, I wonder if it will now be possible to attach a high-speed RF-demodulator to this monitor and play Duck Hunt on an original NES...
Of course, it may be nece
Re: (Score:1)
To clarify the buffering thing: I'm not saying this display doesn't buffer frames like any other, what I am saying is that if it processes 8 frames within the time-span of a single NES rendered frame, this buffering may become indistinguishable from realtime CRT beam-scanning to the NES.
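The rough numbers behind that point are easy to check. The NTSC NES outputs a frame roughly every 1/60th of a second (~60.1fps), so a 480Hz panel refreshes about eight times per NES frame. The values below are illustrative back-of-the-envelope figures, not measurements of LG's panel:

```python
NES_FPS = 60.0988    # approximate NTSC NES frame rate (an assumption, commonly cited)
MONITOR_HZ = 480
RESPONSE_MS = 0.03   # LG's quoted pixel response time

nes_frame_ms = 1000 / NES_FPS                    # ~16.64 ms per NES frame
refreshes_per_nes_frame = MONITOR_HZ / NES_FPS   # ~8 monitor refreshes per NES frame

print(round(nes_frame_ms, 2), round(refreshes_per_nes_frame, 2))
```

With ~8 refreshes and a 0.03ms response time inside each NES frame, buffering one monitor frame costs only a small fraction of the Zapper's detection window, which is the parent's point about it becoming indistinguishable from CRT beam-scanning to the NES.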
Getting Out of Control (Score:2)
I'd be happier... (Score:2)
I'd be happier with an OLED screen that was capable of displaying text that wasn't a blurry mess.
Lies. (Score:2)
Can't wait for micro LED.