LG Develops OLED Monitor That Can Hit 480Hz Refresh Rate (pcmag.com) 95

LG says it has developed a 27-inch OLED gaming monitor that can reach an incredibly high 480Hz refresh rate, promising to usher in an "era of OLEDs featuring ultra-high refresh rates." From a report: LG says it achieved the 480Hz rate on a QHD 2,560-by-1,440-resolution display. Other vendors, including Alienware and Asus, have also introduced PC monitors that can hit 500Hz, but they did so using IPS or TN panels at a lower 1,920-by-1,080 resolution. OLED panels, on the other hand, are known for offering stunning color contrast and true blacks, resulting in top-notch picture quality.

The 480Hz refresh rate will be overkill for the average gamer. But the ultra-high refresh rate could appeal to competitive players, where latency and smooth gameplay matter. LG adds that the 27-inch OLED monitor features a 0.03-millisecond response time. The OLED panel should also be easier on the eyes during long playthroughs. "The company's Gaming OLEDs emit the lowest level of blue light in the industry and approximately half the amount emitted by premium LCDs," LG says. "This reduction in blue light not only minimizes eye fatigue but also eliminates flickers, providing gamers with more comfortable and enjoyable gaming sessions."

  • Average gamer? (Score:4, Insightful)

    by Lije Baley ( 88936 ) on Thursday January 04, 2024 @11:57AM (#64131441)

    Show me the science that says this might benefit ANY gamer. Maybe they just need some oxygen-free wire to hear their enemies approaching.

    • by UnknowingFool ( 672806 ) on Thursday January 04, 2024 @12:07PM (#64131471)
      Only if those cables are Monster cables. Anything less will never transport those 1s and 0s effectively.
    • The benefit of 480 Hz is plainly visible on the desktop when moving the mouse.
      • This. 480 is probably getting to the point of diminishing returns, but if you run over 100Hz and then go back to a 60Hz panel (or 4K30 on a shit adaptor) you will certainly feel like the computer is broken because the cursor jumps around rather than fluidly tracking inputs.
        • The benefit is clearly visible all the way up to 1000 Hz. https://www.youtube.com/watch?... [youtube.com]
          • You are confused. That video is about the latency between input and output, not the framerate (and no, they are not the same thing).

            As the guy says at the beginning, current touch screen systems have about a 100ms delay between when the touch occurs and when the result is displayed. Believe it or not, current touch screen systems are updating their display at much faster than 10 Hz. The latency demonstrated in the video is almost entirely due to the processing that goes into detecting the touch and deter

            • I know, it's not the right analogy. The difference in smoothness is very similar. There is a day and night difference between 120 and 240 Hz when angular movement is involved like zooming in with a rifle.
              • by Junta ( 36770 )

                Question is did you do a blind comparison?

                I have seen several videos setting up blind comparisons and asking the users to guess which refresh rate was faster. Basically over 120 Hz it seemed they were no better than randomly guessing.

                • I have both 120 Hz and 240 Hz displays for different purposes. The difference is actually bigger on the desktop than in most games. In strategy games where you pan the map around a lot, similar to desktop usage, there is a huge difference.
                  • by Junta ( 36770 )

                    Question is whether you did a blind test, where you didn't *know* you were looking at 120Hz v. 240 Hz. Also doing it on the same monitor to control for other potential qualitative differences between the models other than refresh rate.

                    When told up front the refresh rates, people magically could see the improvement, despite not being able to pick them out. I recall one case where the person got it wrong and then after being told he says "oh wait, *now* I see how it's better".

                    We are very much inclined to wa

                    • If it's hard to tell the difference then the content being viewed doesn't benefit from it. There are scientific papers by the military showing we can see the difference up to 960 Hz or something after which they stopped the test. I don't need an RCT for something that's blatantly obvious.
                    • Also the people in that Linus Tech Tips video who couldn't tell the difference were all noobs.
                    • by Junta ( 36770 )

                      If I recall the military paper in general, it was a bit different. They wanted to see how short an interval a discontinuity in what you see before it's impossible to even know. So you are watching a landscape and for a tiny fraction of a second some very distinct thing flashes on screen. Will you notice the discontinuity? Will you know what the frame was? Unanswered was even if you could identify the frame in retrospect, did you know what it was in the moment (e.g. latency before your brain actually pro

                    • Good point. I've never tried 480 Hz, only 240 Hz on a 24" monitor. Another important factor which is ignored is the size of the display. Even 60 Hz is smooth on a small 3" display because the distance moved by the projection on the retina is tiny. When sitting up close to a 32" monitor, the same animation would traverse a huge portion of your retina making every frame visible.
    • Years ago, I saw a presentation claiming that it would take 800 fps for a display's refresh rate not to register subliminally. That was an extrapolation, as nothing produces 800 fps (AFAIK). I can't remember how they determined how much people were aware of the refresh rates. I think they measured brain activity of a person looking at the real world as a baseline. The subject was shown displays of increasing frame rate, and the researchers noted how the brain reacted differently. Apparently, there was some b
      • I'm pretty sure an individual number doesn't exist at all independent of the material being viewed. Visible flickering is gone by 90Hz, beyond that the speed of objects determines how visible the framerate is. If your framerate is too low you get to make a tradeoff between strobing and persistence of vision motion blur artifacts. Strobing (think CRTs or VR headsets) is visible as duplicate images, persistence of vision motion blur makes the image blurred (typical LCD, although some gaming monitors support s

        • by Junta ( 36770 )

          Note that it highly depends on the content.

          If you had, say, an LCD with 90 screen updates every second, *but* you turn off the backlight 75% of the time, it will look as smooth as 360 Hz. Why? Because when the eyes are not getting any data from the dark screen, the brain is filling in automatically, and the pixels transitioning from one state to another are masked out by the lack of a backlight.
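
          The arithmetic behind that comparison, as a rough sketch (the 75% off-time figure is the hypothetical above, not a real monitor spec):

          # Motion blur on a sample-and-hold display scales with how long each frame stays visible.
          frame_90 = 1000 / 90        # ~11.1 ms frame period at 90 Hz
          visible = frame_90 * 0.25   # backlight on only 25% of the time -> ~2.8 ms of persistence
          frame_360 = 1000 / 360      # ~2.8 ms frame period at 360 Hz sample-and-hold
          print(visible, frame_360)   # roughly equal "smear windows", hence the similar perceived smoothness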

          • Having shorter on-times reduces the blur but introduces other artifacts. 90Hz with strobing looks perfectly sharp but you see individual positions of moving objects which is not pleasant. You get a similar effect with PWM lights. The most visible example I often encounter is a PWM headlamp in the rain or snow - you see individual "freeze-frame" droplets instead of smooth blur lines. It's hard on the eyes.
      • They probably detected that somewhere around 400Hz was the limit of perception but estimated 800 due to some variation on the Nyquist theorem.

    • It's a diminishing returns thing for sure, but we can quantify the benefit. The maximum delay between an action in the game and its arrival at your eyes includes the refresh period plus the input latency. When we lower the refresh period (by increasing the frequency/refresh rate), we lower that maximum delay. Going from 240 Hz to 480 Hz, you'd gain 2ms. How important is 2ms?
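
      A back-of-the-envelope check of that 2ms figure (plain frame-period arithmetic, nothing monitor-specific assumed):

      # Worst case, a finished frame waits one full refresh period before it is shown.
      period_240 = 1000 / 240          # ~4.17 ms per refresh at 240 Hz
      period_480 = 1000 / 480          # ~2.08 ms per refresh at 480 Hz
      print(period_240 - period_480)   # ~2.08 ms shaved off the worst-case wait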

      • by ceoyoyo ( 59147 )

        Well, the fastest ever recorded human reaction time is 100 ms.

        • Remember that all of this is additive. The human reaction time is added to the latencies from your computer equipment. So if you're a pro gamer with 100ms reaction times, then reducing your max monitor latency by 2ms might bring your overall delay from 150ms to 148ms (1-2%). Personally, I don't think it would make much of a difference for someone like me.
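
          A sketch of that budget, using the same illustrative numbers (100ms reaction plus an assumed ~50ms of other system latency):

          reaction_ms = 100                 # hypothetical pro-level reaction time
          other_system_ms = 50              # assumed input/render/display pipeline latency
          saved_ms = 2                      # worst-case gain from 240 Hz -> 480 Hz
          before = reaction_ms + other_system_ms    # 150 ms total
          after = before - saved_ms                 # 148 ms total
          print(100 * saved_ms / before)            # ~1.3% improvement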

          I think it's kind of like golf where people would rather spend money on equipment than time to develop their skills. And with 480 Hz, the issue of the

          • by ceoyoyo ( 59147 )

            It's unlikely any gamer has a reaction time of 100 ms. 100 is the fastest ever measured (it was a drag racer). About 200 seems to be what most people reach with training, but Nvidia claims some pros get down to 150 ms. So training can make more than an order of magnitude more difference than one of these monitors in the worst case. It might be useful if you've actually minimized your reaction time, but if you're the mass market consumer that mass market consumer electronics are made for you'd probably be be

        • That's only in full reaction to visual stimulus, from eyes receiving the photons to your hand moving. And even then, anticipation of an event goes all the way down to 10ms reaction. And even though that might be reaction time, actual subconscious perception of visual stimulus is much faster and even varies a lot depending on which part of the eye received it, and delaying that only adds yet more of a delay to reaction time. So yeah, I'd say 2ms could indeed make a difference, depending on the person.

          • by ceoyoyo ( 59147 )

            Your eyes receiving photons and your hand moving is what happens when you play computer games.

    • Show me the science that says this might benefit ANY gamer. Maybe they just need some oxygen-free wire to hear their enemies approaching.

      For a game, seeing 480 frames per second instead of 120 probably doesn't have any benefit (especially since most of the frames would be of an interpolated state between the "actual" frames), but after the GPU renders a frame it has to wait for the next display refresh to actually present it. So even for a game running at 30 frames per second, with a faster refresh rate tha

    • by tlhIngan ( 30335 )

      It might be useful as a return to stereoscopic 3D.

      3D might be dead, but there's a vocal minority trying to push VR around, and another minority that is still pushing 3D movies. After all, they may not move as many tickets as 10 years ago, but you can still buy tickets to 3D showings - meaning people view 3D often enough to justify doing 3D. And unlike say, 4K where you get 1080p and others "for free", 3D requires work to do, so the fact it's still around means there's still money.

      (Of course, it helps t

    • Show me the science that says this might benefit ANY gamer.

      Gamer? The whole point of this TV is not to improve gaming, it's to improve sales of RTX 4090s. God knows my 3 year old upper middle-end graphics card can barely get 45fps on my 4k screen on some modern games even with DLSS running.

    • by AmiMoJo ( 196126 )

      LTT did test high refresh rates and found some small gains. The returns are diminishing though.

  • Congrats, LG, by buying into the blue light nonsense you've taken your top of the line product and made it look like an old restaurant TV that's been facing the sun for the past decade. Embarrassing for you.

    • How can it emit "less blue light" than other monitors? Does that mean it's incapable of displaying accurate colours?
      • I think that blue light is a very markety term in this instance, but go to the TV store and look at an OLED side by side with a Samsung quantum dot display of comparable specs. The non-OLED looks like shit and will be excessively bright with no contrast on the low end. The OLED has actual dark black colors. I went to the store to buy a quantum dot Samsung panel and was standing there with this other panel that looked 50x better and it was the LG OLED and it's all I'll buy now. I don't like to feel like
    • I don't know if it's a scare. If you watch TV right before bed, you may care more about getting to sleep easily afterward than you do about color accuracy. Eye fatigue at work involves blue light too - if you're in front of a screen for long hours, you'll have less eye pain and dry eye if you block some of it. It's not something I do, but it's about comfort more than permanent damage - it doesn't hurt you.

      I know high blue light no longer means that there's also lots of UV that you don't see but that used

        • I don't think I've ever visited that comment in the first place. You probably just need to say what you mean.

          But red light is far more important than just blue light. Scotopic vision ("night" vision) is enhanced by melatonin and red light does not suppress melatonin production because rods aren't very sensitive to red. Rods are most sensitive to blue-green and blue, so a well designed blue filter should give you more than just a warm picture. A red shift / blue filter may not do as much, but when I read

          • Click the link, and after reading at least the summary, revisit your statement that pretends like blue light is a "thing" that is somehow any different from light in general.

  • ... of refresh rates and just let software update arbitrary portions of the screen whenever they're ready without having to push a full framebuffer? Especially makes sense in the context of raytracing, wherein you can prioritize important parts of the scene or areas with more motion for more frequent updates (or higher resolution)**.

    What's the point of having to dump whole frames at once to the output device?

    ** In a way, that's kind of what the Apple Vision Pro is doing, only moreso, by using eye tracking

    • The point is to prevent "tearing". A lot of effort has been put into solving that problem.

      If the pixels were updated in some random or semi-random pattern on the screen it would probably be unnoticeable, but I suspect either a lot of architectural changes would be needed in both software and hardware, or you would effectively have to achieve a 480Hz full-screen refresh rate anyway, unless you did things like attach an address to each pixel output so the rendering device didn't need to assume sequential pixels should be drawn sequentially.

    • Well, we can't really drop the idea of refresh rates for a while because of a couple things, I think:
      1. It takes time for the pixels themselves to change state. Now, going to LED-type technology this can be far faster than human light response (which is ~60 Hz, varying quite a bit by individual, age, and all that), but it's a factor. LCD tech is slower, but LED was fast enough back in the '80s to transmit megabit speeds unintentionally - you could read Ethernet traffic by the status light. ;)
      2. There's on

    • We haven't even finished migrating to Wayland and you already want a whole new display system? I'm fine with that as long as we call it "Yet Another Display System".
    • by Junta ( 36770 )

      First because you need to at least consider the worst case. It is *highly* common for the full frame to update, so you need some worst-case metric of how many frames can be pushed through, as governed by factors including data bandwidth to the display and the physics of the elements manipulated to present the image, among other things.

      Once you account for "worst case", well, not much point in doing restricted screen updates when you have the bandwidth for worst case in a dedicated 48Gbps connection to push your p
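
      For a sense of scale, a rough worst-case bandwidth estimate for this panel (uncompressed, ignoring blanking intervals, link encoding, and DSC):

      width, height, hz = 2560, 1440, 480
      bits_per_pixel = 24                                # 8 bits per channel; use 30 for 10-bit color
      gbps = width * height * bits_per_pixel * hz / 1e9
      print(gbps)                                        # ~42.5 Gbps at 8-bit, ~53 Gbps at 10-bit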

    • It might be technically possible, but it requires such big architectural changes that I doubt we will see it anytime soon...
      Basically you need to fuse the GPU and the monitor. The VRAM needs to have 2 buses - one for R/W to the GPU (as is now) and one read only (but possibly much faster) going to the panel. So the GPU just renders and puts pixels in the VRAM and the "panel" reads them whenever and changes the corresponding pixel(s). The key difference is that this type of access doesn't have to be sequen
  • by Roger W Moore ( 538166 ) on Thursday January 04, 2024 @12:07PM (#64131469) Journal
    Does it come with an included eyeball upgrade? The ones I have at the moment can only handle a 60Hz maximum rate.
    • by ELCouz ( 1338259 )
      You might need an eyeball upgrade. 60 Hz flickering is clearly visible to the human eye. It's even worse when sources of illumination are in motion.
      • Just tried it with my laptop. Even waving it around shows zero flicker. Perhaps it is because I'm originally European so technically my eyes are only rated for 50Hz.
        • That's because your display is sample-and-hold and doesn't actually flicker but instead holds the same image for the entire frame time. Now try moving your cursor instead. Do you see a smooth blur or many individual cursor flashes? 480Hz isn't even enough to get rid of all these very obvious strobing artifacts.
        • by sl3xd ( 111641 )

          LCDs don't work that way. The backlight is constant.

          You'd need a CRT monitor to see the flicker. Or an old-style fluorescent tube light (whose ballast is locked to the mains power frequency). Incandescent bulbs, sadly, carry over too much latent heat to flicker much.

          • Exactly, if this were a CRT monitor it might make some sense, although I would have other questions then! However, I have to say that even a properly operating old-style fluorescent tube had a barely noticeable flicker at 50 Hz, and even an LED powered by a simple diode rectifier has no noticeable flicker at 100-120 Hz, which is a quarter of this screen's refresh rate.
          • A CRT will cover this with phosphor fade. Cheap Christmas lights with a half-wave rectified circuit will demonstrate it well enough.

      • Most people can't see 60Hz flicker ...and definitely cannot see 70Hz flicker

        Old VDUs had low refresh rates like these and didn't have this issue. Below 70Hz you could see the refresh of the whole screen, but they could do smooth, judder-free motion with no issues, as the screen updates at the same rate across the screen - a 120Hz VDU is smoother than butter

        It's not that you can see flicker of the screen refresh, it's that you can see the latency of the refresh across the screen as different pixels update at

    • by sl3xd ( 111641 )

      There are benefits above 60 Hz - the nausea effect below 90 Hz in VR is a key indicator that 60 Hz isn't enough, even if it's not at a conscious level.

      I'm just glad that now, high-quality CFLs and LEDs mercifully flicker at a kHz or so, rather than the 50/60 Hz of the country's power grid as they did a couple of decades ago. I get so many fewer headaches. Now if they can just improve the spectra...

    • This is a misunderstanding.

      Let's split this into two things: can you see the difference, and does it make you a better player.

      I moved from 60Hz -> 85Hz -> 144Hz -> 280Hz and each time, I could see the difference. I know everyone on here is a contrarian, so let me expand. 280Hz is just a nicer experience to me: everything is very fluid feeling, there are fewer artifacts.

      You can see a difference because your eyes are an additive input device and games can only render instants in time. You can't identi

    • The ones I have at the moment can only handle a 60Hz maximum rate.

      You definitely need an eyeball upgrade. The difference between 60Hz and 120Hz is trivially visible. Going above 120Hz is starting to be of dubious value.

      There's a reason 100% of manufacturers of VR equipment consider 60Hz displays unsuitable for their headsets: the differences are well and truly perceivable.

      • For VR (just like the CRTs of old, for the same reason) 90Hz is the bare minimum for "the flicker doesn't bother me anymore". The jump to 120Hz or 144Hz makes it feel much smoother, but you can still see individual frames. My estimation is that the "individual frames" problem won't go away until about 1000Hz for normal head/eye movement speeds. Objects moving so fast that they would still produce visible individual frames at those frame rates could be motion blurred in software with no visible differenc
        • Above 90Hz you cannot see the refresh-rate flicker - you can see the pixel-update-delay flicker, tearing, etc., as pixels take different times to update

          On an old CRT the whole screen takes the same time to update, but you could often see the flicker on most due to the 50/60Hz refresh rate. Some had 100/120Hz refresh and you simply cannot see any flicker

    • by AmiMoJo ( 196126 )

      To be fair, a 120Hz phone screen looks better to me. When scrolling, it's easier to track where I was reading, and even continue reading.

    • Hey bro, I know you are smart, but your perspective is skewed.

      60FPS is plenty fast for passively ingesting content.

      For less passive activities, a 60hz update to what you see is too slow. If you were driving a car in the Real World and had glasses on that updated your view of the outside world 60 times a second, you would be able to drive just fine; however, you would notice the limitations.

  • Isn't resolution more important at a certain crossover point? With high resolution, if an object moves a smaller degree of arc it will still show as a changed pixel on the display. Of course, if comparing high res vs. ultra-high res, your eye can't detect that. But most gamer displays are low resolution.

    • No. For objects in motion the vast majority of blur is caused by the mismatch of screen refresh and tracking eye motion. To get rid of that blur you'd want refresh rates so high that the difference between adjacent frames is only 1 pixel.
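
      To put a number on that, here's a sketch with an arbitrary example speed (an object crossing this panel's width in half a second):

      screen_px = 2560                   # horizontal resolution of the panel
      px_per_sec = screen_px / 0.5       # example: crossing the screen in 0.5 s = 5120 px/s
      hz_for_1px_steps = px_per_sec      # 1 pixel of movement per frame would need 5120 Hz
      step_at_480 = px_per_sec / 480     # ~10.7 px jump between adjacent frames at 480 Hz
      print(hz_for_1px_steps, step_at_480)
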
  • Does that really mean anything for the average human user? My understanding is that once you get above 120Hz, you really aren't going to see any difference as the frame rate goes up from there. So why develop these super-high refresh rates? Are we already planning for the day where we're obsolete and we need to impress whatever robotic overlords we have to keep improving refresh rates for their end-users, which will be other robots? Is there a real-world use-case for these higher refresh rates that I don't

    • Does that really mean anything for the average human user? My understanding is that once you get above 120Hz, you really aren't going to see any difference as the frame rate goes up from there. So why develop these super-high refresh rates? Are we already planning for the day where we're obsolete and we need to impress whatever robotic overlords we have to keep improving refresh rates for their end-users, which will be other robots? Is there a real-world use-case for these higher refresh rates that I don't get? Even gamers can't see better than their eye allows. What's the goal?

      I've got a 4k monitor that can hit 144Hz and I have to say... after years of playing a game (Destiny 2) at 60 FPS, pushing 120+ has had zero visible effect. I don't really buy the "it's smoother" story even at that rate. The only way I can tell it's any different is that the FPS counter in the corner says it is. And yes, I'm 100% confident my monitors are set up correctly and running at the provided frame rate.

      This 480Hz stuff has got to be like painting green rings around your CDs to make the digital

      • by sl3xd ( 111641 )

        People want video beyond the eye's abilities, much like people who demand "HD Audio" - raw, uncompressed, 32-bit, 192-kHz sampled audio. It's *way* beyond our ability to hear. It's also *way* beyond our ability to reproduce (outside of explosives and rocketry).

        The reality is, for audio, 12-bit/30 kHz covers everything we'd really consider musical. (That also includes the noise floor, by the way). Funny thing about that: It's within the capabilities of an analog LP record. Ever wonder why people still swear
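
        For reference, the standard rules of thumb behind those numbers (roughly 6 dB of dynamic range per bit, usable bandwidth of half the sample rate):

        bits, sample_rate = 12, 30_000
        dynamic_range_db = 6.02 * bits + 1.76    # ~74 dB for 12-bit quantization
        bandwidth_hz = sample_rate / 2           # 15 kHz Nyquist limit
        print(dynamic_range_db, bandwidth_hz)
        # Compare 32-bit/192 kHz: ~194 dB and 96 kHz, far beyond human hearing.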

        • Records can reproduce frequencies higher than 15kHz.

          Also, with digital it's a bit different than the analog frequency response. Sure, my tape deck may have -3dB at 18kHz or whatever, but it is different than digital with 36kHz sample rate. Two reasons.
          1. There still is some signal at the higher frequencies, it's just attenuated.
          2. Due to how digital works, for 36kHz sample rate, you need a filter that passes 17999Hz and drops 18000Hz completely. Such a steep filter may not be possible and various attempts wi

    • 120Hz is fine if your objects move slowly enough that the difference between adjacent frames is only 1 pixel. That's not very fast motion at 120Hz.
  • see title.

  • "This reduction in blue light not only minimizes eye fatigue but also eliminates flickers, providing gamers with more comfortable and enjoyable gaming sessions."

    How in hell does reducing blue light eliminate flickering? Am I missing something, or are LG's press release writers totally clueless about the technology they're writing about?

    I also question the relationship between blue light and eye fatigue. I know blue light at the wrong time of day is said to disrupt circadian rhythms by suppressing melatonin; maybe the "eye fatigue" goes along with the whole-body fatigue which results from poor sleep? I don't see any other possible causal relationship here, given tha

    • The blue light and melatonin thing is way overhyped anyway; what you actually want is dim screens before sleep, not necessarily yellow ones.
      • Unless you're old enough that you need your pupils to contract to focus at a reasonable distance. Then you need more light but just not the kind that inhibits melatonin.

  • In 3... 2... 1...

  • by Chelloveck ( 14643 ) on Thursday January 04, 2024 @01:16PM (#64131753)
    Emits less blue light than any other monitor? Is it really dim, or does it just have shitty color balance?
  • The problem with most modern displays and Duck Hunt is that the display buffers frame data before displaying it, so the NES Zapper mechanism of showing a hitbox-frame and simultaneously checking the Zapper's light-sensor to register a hit does not work correctly since the displayed frame on the LCD is delayed. With a theoretical response time of 0.03ms, I wonder if it will now be possible to attach a high-speed RF-demodulator to this monitor and play Duck Hunt on an original NES...

    Of course, it may be nece

    • To clarify the buffering thing: I'm not saying this display doesn't buffer frames like any other, what I am saying is that if it processes 8 frames within the time-span of a single NES rendered frame, this buffering may become indistinguishable from realtime CRT beam-scanning to the NES.
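
      The frame-count arithmetic behind that claim (treating NES/NTSC output as a nominal 60 Hz):

      nes_hz, panel_hz = 60, 480                 # NTSC NES is ~60.1 Hz in practice
      ratio = panel_hz / nes_hz                  # 8 panel refreshes per NES frame
      nes_frame_ms = 1000 / nes_hz               # ~16.7 ms per NES frame
      panel_frame_ms = 1000 / panel_hz           # ~2.08 ms per panel refresh
      print(ratio, nes_frame_ms, panel_frame_ms)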

  • Alright, the refresh rates are getting out of control. As someone who wants a great monitor for productivity, potentially an OLED, I'd rather that you focus on:
    • Higher brightness - bonus points if you can do it without washing out the colors using white OLEDs
    • Higher resolution for crisper text
    • Speaking of text, better subpixel layout. The non-standard layout in current OLEDs screws up font rendering in most operating systems
    • Larger, curved monitors. A 57" dual-UHD OLED would provide some nice competitio
  • I'd be happier with an OLED screen that was capable of displaying text that wasn't a blurry mess.

  • by PJ6 ( 1151747 )
    Anyone who's used a CRT knows that LCD refresh rate is always a lie [testufo.com].

    Can't wait for micro LED.
