Technology

PC and Laptop Displays Are Working Toward 480 Hz

An anonymous reader shares a report: If you've ever looked at a 360 Hz monitor and thought, "This isn't fast enough," here's something to look forward to. While we've seen monitor prototypes surpass 360 Hz, the highest native refresh rate you'll find on a PC display these days, it seems that AU Optronics (AUO) is working on panels that'll be available with an even snappier 480 Hz refresh rate. Of course, not many would look at a screen updating with new information 360 times every second as lagging. But for very fast-paced action -- like in a competitive game where words and items whizz by in an instant or where a few milliseconds of delay could be the difference between a win and a loss -- more speed may be imperative.
  • Decreasing returns (Score:5, Insightful)

    by UnknownSoldier ( 67820 ) on Friday May 13, 2022 @02:29PM (#62530560)

    I'm all for 120+ Hz displays but isn't this well into decreasing returns and marketing snake oil at this point?

    • The only "return" I could envision is a really bad "soap opera effect", and me cranking it down to 120hz or lower to try to get rid of it.

      I don't care what others think, but I can't stand watching theater movies that don't look 24fps. I just can't; the soap opera effect really screws with me, and makes me feel like I am watching some cheap videotaped garbage.

      • "Soap opera effect"? What do you mean? I've never heard that term before.
        • Not the poster, but... "Soap opera effect is consumer lingo for a visual effect caused by motion interpolation, a process that high definition televisions use to display content at a higher refresh rate than the original source." [techtarget.com]

          If a television screen has a refresh rate of 120Hz (120 frames per second) but the television is going to display film that was recorded at the standard 24 frames per second, the vendor must figure out a way to fill in an extra 96 frames each second.

          One way to do this is to have the television repeat each film frame five times (5 x 24 = 120). Another way is to have a computer program in the television digitally analyze consecutive frames and use the data to create intermediary frames. The insertion of these frames is called interpolation, and they are what cause the soap opera effect.

          So it is a monitor taking a cinematic feed (24 FPS) and creating fake interframes that make it look like a video feed or like Jackson's The Hobbit [gizmodo.com].
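
          To make the two approaches concrete, here is a minimal Python sketch of both strategies. It's illustrative only: frames are assumed to be NumPy arrays, and real TVs use motion-compensated interpolation rather than this naive linear cross-fade.

              import numpy as np

              def repeat_pulldown(frames):
                  """Repeat each film frame 5 times (5 x 24 = 120): no soap opera effect."""
                  out = []
                  for frame in frames:
                      out.extend([frame] * 5)
                  return out

              def naive_interpolate(frames):
                  """Synthesize 4 in-between frames per pair by linear blending -- a crude
                  stand-in for the motion-compensated "fake interframes" described above."""
                  out = []
                  for a, b in zip(frames, frames[1:]):
                      for k in range(5):  # k = 0 keeps the original frame
                          t = k / 5.0
                          out.append(((1 - t) * a + t * b).astype(a.dtype))
                  out.append(frames[-1])
                  return out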

          • by Ormy ( 1430821 )
            Wait, I'm familiar with the term soap opera effect, but I thought it referred to any movie or TV content with a high framerate, regardless of whether it was filmed natively in a high frame rate or processed by interpolation software after filming. You're saying the term only refers to interpolated content? In my experience the two look very similar. People were even complaining about the soap opera effect in The Hobbit, which was filmed natively in 48fps and not interpolated; were they all misusing the nomenclature?
            • by lsllll ( 830002 )
              You have it right. It's the framerate (sometimes achieved via interpolation from lower framerates, sometimes native high framerates). It sucked back in 1985 and it still sucks today. I don't get this fascination with 4K, 8K, and life-like materials on TV. I want to be immersed in the story, not distracted because I can see blackheads on actors' faces. When studios can't deliver, they fake it and trick the viewer instead.
              • by Ormy ( 1430821 ) on Friday May 13, 2022 @05:59PM (#62531242)
                Thanks for the reply. While I agree that extra spatial resolution beyond 4K is pretty pointless, I always welcome extra temporal resolution. Yes I said it, I like the soap opera effect, I realise I'm in the minority with this view but I know what I like.
              • by cusco ( 717999 )

                Decided that I didn't want an 8K television when I saw a closeup on an actress and the makeup was clearly visible.

          • by Kremmy ( 793693 )
            Shouldn't this have flipped by now? With monitors and TVs being high definition and full frame rate for a long time, the higher framerate should become normalized with the lower framerate taking on an unreal effect.
        • Exactly what it sounds like; daytime television in North America was recorded on video cameras at native NTSC rates (60 Hz interlaced), while cinematic films standardized at 24 frames per second decades before video signals were commonplace. Seeking to emulate the latter, fancier shows would (and still do) stick to the lower frame rate. As a result, many soap operas have a noticeably higher frame rate than more expensive productions, and details of facial movement, body language, and other rapid motion are more visible, resulting in a more lifelike but overall flimsier and less weighty experience.
          • by Ormy ( 1430821 )

            ...resulting in a more lifelike but overall flimsier and less weighty experience.

            I realise we are in the minority, but there are those of us who disagree with that opinion (and it is a subjective opinion, not an objective fact, btw). More lifelike to me means more immersive; I feel like I'm in the room with the actors rather than watching them on a screen, and why would you not want that extra immersion? As such I use software on my media PC to interpolate all my video content in realtime up to whatever framerate I choose, currently 120fps to match my display. Watching anywhere else feels…

            • Yes, it absolutely is the product of what you were exposed to first, plus the usual admixture of other personal experiences. The phenomenon is completely subjective. At best you can only really hope to understand the alternative viewpoint and where it comes from. It's a form of stylization, just as much as the style of an artist in a visual medium.
              • Not entirely I think.

                High def, high frame rate is undeniably more realistic. The problem is that reality is the sound stage you're filming on, not the false reality you're trying to convey.

                • That's certainly a concern for many productions, but not all of them; there are plenty of shows and films shot on location where they had the option of high frame rates, but decided realism was less important than producing a nostalgically "cinematic" experience.
                  • by Ormy ( 1430821 )

                    but decided realism was less important than producing a nostalgically "cinematic" experience.

                    Exactly the kind of decision that irks me. If they had shot in high frame rate, it would be trivially easy for someone like you who dislikes the soap opera effect to simply discard half (or whatever fraction) of the frames. Whereas for me to interpolate frames is not so trivial; yes, the software is available, but it introduces minor artifacts and consumes significant CPU power, which means significant electricity cost. It's akin to a producer deciding to film in 480p rather than 2160p for that nostalgic VHS feel…

                    • Stepping down the framerate is not actually quite as simple as discarding frames; motion blur has to be recreated to avoid a choppy, stop-motion look. There is no commercial procedure for doing so yet (I was only able to find a single preprint from 2018, which considered only two frames at a time and had no sense of progression).

                      The comparison to frame resolution brings up another important issue, however, which is that adding data to the film is more expensive for production, regardless of which dimension…
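
                      As a rough illustration of the point above, averaging source frames can fake some of the missing shutter blur in a way plain frame-dropping cannot. A toy Python sketch, assuming a 120 fps source, a 24 fps target, and NumPy-array frames; real tools would need motion-aware blur, as noted above:

                          import numpy as np

                          def decimate_naive(frames_120):
                              """Keep every 5th frame (120 / 24 = 5): sharp but choppy, stop-motion look."""
                              return frames_120[::5]

                          def decimate_with_blur(frames_120, shutter_frames=3):
                              """Average the first few frames of each group of 5, roughly mimicking
                              the blur a partially open 24 fps camera shutter would have captured."""
                              out = []
                              for i in range(0, len(frames_120) - 4, 5):
                                  group = np.stack(frames_120[i:i + shutter_frames]).astype(np.float32)
                                  out.append(group.mean(axis=0).astype(frames_120[0].dtype))
                              return out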

                    • by Ormy ( 1430821 )
                      Hmm. Fair point on both counts.
              • There are films where the director switched to different film stocks (16mm, 35mm...) to give a certain feel to scenes. It's subtle but very effective.

                The SOE would completely ruin what the director was intending to convey.

        • by hawk ( 1151 )

          >"Soap opera effect"? What do you mean?

          It's the phenomenon of dying when the actor that plays you asks for too much money for the next season.

          It can also be used for evil twins that somehow went unnoticed for the first thirty or fifty years of people's lives, too . . .

      • by lsllll ( 830002 )
        Holy shit! So there is a term for this! Soap Opera Effect! I remember arguing with my aunt in 1985 when she had just gotten a Sony Trinitron and her saying how nice soap operas were on it, to which I replied that I preferred movies because they didn't seem as sharp as the soap operas and you actually felt like you were watching something as opposed to being there.
      • by Tailhook ( 98486 )

        I don't care what others think, but I can't stand watching theater movies that don't look 24fps.

        What others think is that indulging hang-ups like this is a sure sign of an insufferable douche best avoided.

        • It's not "being a douche" or some hipster bullshit. I literally can't stand the effect.

          A while back, I was watching some old movies on YouTube and they quietly and suddenly forced motion interpolation on every video without any way to turn it off. I tried far and wide to find a way to get these films to play as they were originally filmed, including checking my system settings to see if I had accidentally ticked something. Some film scenes were actually ruined because this processing wasn't done right…

      • To be fair, these are generally for games, which don't have the same issue. Also, the soap opera effect isn't really from high framerates; it's from motion interpolation.
        • Back in the old days, I played Duke Nukem on a PC where, while the game played fine, it ran a few FPS shy of 30, giving it a movie-like quality.

          Later, I bought a much faster PC, and I got frame rates as high as 100fps or more. There was no motion interpolation at work, but everything looked "glassy", the soap opera effect taken to the extreme, and the atmospheric movie feel of the game vanished.

          It didn't matter much because I was just playing a video game, but the SOE was clearly there.

          Motion…

      • by Z00L00K ( 682162 )

        Since the human eye still won't really be able to perceive update rates above about 50 Hz, the update frequency figures are just marketing figures.

    • It is approaching that point of diminishing returns, and it absolutely isn't a regular consumer device, but it's still not quite snake oil.

      For non-competitive gaming it's completely unnecessary. The frame time at that refresh rate, about 2 milliseconds, is shorter than the simulation update interval of many games. Settings like v-sync help most people there, slowing the game and the display down to match the monitor's refresh.

      In competitive gaming, the image tearing gives a slight advantage. When the image begins…
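
      To put numbers on that "about 2 milliseconds": frame time is just the reciprocal of the refresh rate, and each step up buys less. A quick Python sanity check:

          # 60 -> 144 Hz saves ~9.7 ms per frame; 360 -> 480 Hz saves only ~0.7 ms.
          for hz in (60, 120, 144, 240, 360, 480):
              print(f"{hz:3d} Hz -> {1000 / hz:5.2f} ms per frame")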

      • by Ormy ( 1430821 )

        official timers were measured with quantum timers (more precise than atomic clocks)

        Atomic clocks are routinely used in experiments requiring nanosecond accuracy; that's a few orders of magnitude quicker than anything the Olympics will ever need. I find it likely that "quantum timers" and atomic clocks are actually the same thing.

      • "highly competitive gaming (or with Olympic sports) knows that those tiny benefits can make the difference in a match. When esports prizes are reaching millions of dollars a microsecond advantage, or even a few millisecond advantage, can offer a slight edge"

        Too bad none of this can upgrade the human mind, and this will lead to eventual burnout or even severe mental disorders.

        It becomes a drug, always trying to get an even better millisecond advantage over the other players. Even machines will flake…

      • by gweihir ( 88907 )

        None of that matters. The competitive gamers are just a superstitious bunch who believe some non-effect gives them an advantage. It does not. The measurement equipment in Tokyo was a tech demo with no real-world impact or meaning.

    • by AmiMoJo ( 196126 )

      Linus Tech Tips did some semi-scientific experiments and IIRC concluded that there was a slight advantage to higher frame rates, but I think they only went up to 144 Hz.

      • by Luckyo ( 1726890 )

        Fun part is that while you're right that LTT did those tests, you're wrong on content as usual.

        This is the video in question:

        https://www.youtube.com/watch?... [youtube.com]

        Tests went up to 300FPS at 60Hz and 240 FPS at 240Hz. They conclude that there's a very significant advantage to going from 60Hz to 144Hz. And that there's less of an advantage going from 144Hz to 240Hz but the advantage is still clearly present and observable.

        • by gweihir ( 88907 )

          Observable != means anything.

          I can easily "observe" frame rates up to 1000Hz or so using interference effects. That does not mean it gives me any advantage because above 50Hz or so the different images just merge together in the human eye.

          Higher frequencies are just about separating fools and their money.

    • by Luckyo ( 1726890 )

      Yes. But "decreasing returns" means "returns exist". I.e. it's still better.

      And in terms of business, this is smart, because the biggest profit margins are in the highest-tier products, where you can charge a lot for fairly minimal performance improvements. This is very visible on the other side of the gaming business, with "how many FPS you get per unit of currency" in GPUs and CPUs. Higher-end stuff is far less efficient in terms of the cost of each unit of performance you get out of it than mid-range models…

      • by gweihir ( 88907 )

        Sure. "Better, but does not matter". Same as a trash bag that gives you its performance on the moon printed on the bag is theoretically "better".

    • Those slow, throbbing, flickering 360 Hz displays hurt my eyes.
    • by tlhIngan ( 30335 )

      I'm all for 120+ Hz displays but isn't this well into decreasing returns and marketing snake oil at this point?

      Of course, but gamers are the next generation of audiophools -- they've got money, and lots of it. Some of the biggest gamers that stream make a ton of money, so why not part them from it?

      The point of this is to get people to upgrade to the latest hot gear and pay a premium for it out of the belief that it's needed. Doesn't matter if it only gives you a fraction of a percent improvement; if you can get…

    • My monitor goes to 280hz, and my PC is capable of displaying that mostly consistently in Overwatch.

      When I went from 80 Hz to 144 Hz, it brought a hands-down improvement in my gameplay. When I went from 144 Hz to 280 Hz, it wasn't really clear that I played better, but it absolutely *felt* better, and I can very easily tell the difference if I turn it back down. If 480 Hz makes things *feel* better then I'm all for it, even if it doesn't give me an advantage.

      Will a practiced competitive e-sports murder machine see…

      • by gweihir ( 88907 )


        When I went from 80 Hz to 144 Hz, it brought a hands-down improvement in my gameplay.

        No. It did not. It made you _think_ there was an improvement and you might well have tried harder and that may have given you some improvement. But that is it unless you are an alien. You can see the difference, but you cannot use it.

    • The very concept of "refresh rate" should have been obsolete 10 years ago. After all, the whole idea behind an LCD is that the image is stable unless it's forced to change.

      Plasma displays are a different story, of course. It's amazing what plasma screens look like in slow motion.

    • by gweihir ( 88907 )

      I am at 60Hz and see absolutely no reason for more. 120Hz is already well into decreasing returns and makes no sense at all in most situations. But it does keep the need for fast hardware up. Same with resolutions 4k and higher. It is basically a scam and serves to separate those with no understanding but money to spare from that money.

      • by dohzer ( 867770 )

        4k lets me put far more information on the screen. Like four different source files, or a fuckload of VHDL simulation waveforms.
        Probably not all that useful for gaming though.

    • by mjwx ( 966435 )

      I'm all for 120+ Hz displays but isn't this well into decreasing returns and marketing snake oil at this point?

      Hi,

      It seems you're unfamiliar with the PC gaming peripherals market. Snake oil is almost all we've got.

  • It takes an absolute beast of a machine -- a very expensive beast -- to do a smooth 120 on modern games. Anything more than that without more efficient code or another three doublings of computing power is just silly. And I don't see gaming studios putting those executive bonuses into efficient code any time this year.
    • What about your desktop though? Surely everyone wants 480 FPS in MS Word.

    • Once you get to refresh rates that high, it's more about the UI than it is realtime 3D or gaming.

    • by N1AK ( 864906 )
      A lot of the bigger e-sports titles aren't particularly demanding, or at least can be run effectively without settings that slow them heavily. I got a 165 Hz screen because the game I play most can easily hit 240 fps on relatively modest hardware. I'd never seen the point in more than 60 Hz before, but now, when the game occasionally drops back to 60 Hz after updates, it feels genuinely uncomfortable to play.
    • by Xenx ( 2211586 )
      These monitors are designed for professional FPS gamers. They are, in fact, capable of running these games near these speeds. You have to remember they play the game differently than the average person. They're optimizing for best performance.
      • by Luckyo ( 1726890 )

        Not really. You don't need a beast of a machine to run CS:GO at 240 or even 360 FPS. The same goes for the second most popular game, DOTA 2.

        And the rest of the top 25 chart put together doesn't have many more players than those two. And when you look at non-Steam games with top-tier popularity, those are going to be games in the same vein of computational requirements: League of Legends, World of Tanks, etc.

        • by Xenx ( 2211586 )

          Not really. You don't need a beast of a machine to run CS:GO at 240 and even 360 FPS.

          You're either agreeing with my point, or missing my point. Most gamers, even high end non-pro, don't need 300+fps. There is no practical gain to be had that high. They can push it, as I said. They just don't need it.

    • by Luckyo ( 1726890 )

      Depends on what you call "modern games". The most played games on Steam? My GTX 970 can run most of those at 120fps, easy, because they include CS:GO, Dota 2, PoE, Rust, GTA5 and TF2. That's the majority.

      https://steamcharts.com/top [steamcharts.com]

      Of the remaining games on the top 10 list, the only ones that really can't be run at 120fps on modern, not-too-expensive hardware are those that have some built-in technical limitation against it. A stable 240 FPS would probably be the line where you start to run into problems being able to…

    • by Ormy ( 1430821 )

      It takes an absolute beast of a machine -- a very expensive beast -- to do a smooth 120 on modern games.

      True, but a few of the FPS games played competitively are not exactly modern and so can happily run at several hundred fps on decent-spec modern hardware. CS:GO, for example.

  • Yeah (Score:4, Funny)

    by Malays2 bowman ( 6656916 ) on Friday May 13, 2022 @02:42PM (#62530624)

    "If you've ever looked at a 360 Hz monitor and thought, "This isn't fast enough," here's something to look forward to."

    Setting up an appointment with your local shrink to figure out why you are obsessive-compulsive to this extreme. He may very well set you up with some meds to deal with this at little or no cost to you.

  • Seems like this refresh rate issue would drop into that skit easily.
    • And I want the cowbell to have that deep, low tone sound with the proper reverb and fade, and the master tape better have every single vibration of that bell on it. If the studio engineer can't pull it off then he will really fear the reaper.

  • But why? (Score:5, Informative)

    by Kobun ( 668169 ) on Friday May 13, 2022 @02:44PM (#62530632)
    I have not heard of any advancements in gray to gray response times on LCDs. I was under the impression that this capped frames per second at around 120. Is this just some marketing goon jamming a higher frequency chip into a monitor to be able to print an impressive number on the box?
    • I have not heard of any advancements in gray to gray response times on LCDs. I was under the impression that this capped frames per second at around 120. Is this just some marketing goon jamming a higher frequency chip into a monitor to be able to print an impressive number on the box?

      Most gaming monitors have sub-6ms GtG response times, so they aren't limited to 120Hz. Now, as to 360Hz and beyond, that is an entirely different question. Even monitors with quoted 1ms times often measure closer to 5ms.
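
      Back-of-the-envelope, a panel's measured GtG time caps the refresh rate at which one transition can finish before the next frame arrives. A rough Python sketch; real panels vary per transition, and quoted figures are often optimistic:

          def max_useful_refresh_hz(gtg_ms):
              """Refresh rate at which one frame interval equals the GtG time."""
              return 1000.0 / gtg_ms

          for gtg_ms in (1.0, 3.0, 5.0, 8.0):
              print(f"{gtg_ms:.0f} ms GtG -> ~{max_useful_refresh_hz(gtg_ms):.0f} Hz before transitions overlap")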

    • by Luckyo ( 1726890 )

      The main technological advancement there has been in pushing overdrive voltage switching to go even faster; the most aggressive overdrive modes needed for really fast refresh rates typically result in overshooting the target color. This can cause some color artifacting even on really fast TN monitors, significant and noticeable color artifacting on IPS monitors, and very large and easily noticeable "things in motion flash in the wrong color before settling on their correct color" artifacts on VA monitors.

  • I *have* looked at an underpowered PC and said, "Sure would be nice if I could buy a USB-to-HDMI converter that could selectively refresh rectangles on the screen -- e.g., web pages, spreadsheets, terminal windows -- and downclock to whatever frame rate the PC could handle, at the expense of increased lag and update artifacts, to drive an HDMI monitor of any resolution".
  • Okay stupid question time... Can a 144Hz display be run at 120Hz without adversely affecting the overdrive system?

    • Bro u gotta liquid cool your liquid crystal display but then you can overclock it to ONE HUNDRED BILLION MEGAHERTZ
  • There is definitely a decreasing benefit as you go higher, but this is a necessary improvement if we ever want to get closer to the smoothness of reality on screen and in VR. It's not noticeable whatsoever if nothing is moving, but if there is movement on screen, or in VR when you move your head, there is still a crazy amount of improvement to be had.
    • If you want to notice the lack of smoothness, just move your mouse around the screen and see all the gaps it makes. It's like when video was on a CRT screen with grain and horrible resolution -- you didn't care AT ALL, but now if you go back and watch those videos or play a game from the 90s, it is unbearable to deal with the low fps. Despite how acceptable current technology is now, the improvements in 20 years' time will make everyone wonder why we didn't always have 480 Hz or better. Obviously our limitations…
    • I think response time and consistency of the framerate is more important than maximum framerate.
  • The grey to grey response of LCD is pathetic, so a 480 Hz display is useless. Forget gaming, you would still see trailing when you move the mouse pointer across the screen. That never happened with a CRT display.

    • That never happened with a CRT display.

      Of course it did. Phosphors take time to fade to black, as anyone who has ever taken a photo of a CRT will attest. At the frame rates we're talking about, you can draw multiple frames before the first one fades.

      CRTs were better than most run of the mill LCDs, but you really need to take your rose coloured glasses off.

      • I happen to have an ancient iMac G3, and I just checked it out. No visible mouse trailing whatsoever. The flicker you see in photos/film of a CRT is actually a testament to how quickly the phosphor fades. Clearly the phosphor decay to black is within a millisecond, whereas the LCD might go to "grey" fairly quickly but then decays to black over a much longer period of time.
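
        Modeling both behaviours as simple exponential decays shows why the CRT looks trail-free. A toy Python comparison; the time constants are illustrative assumptions, not measurements of any particular display:

            import math

            PHOSPHOR_TAU_MS = 0.05  # assumed fast CRT phosphor: decays well inside 1 ms
            LCD_TAIL_TAU_MS = 4.0   # assumed slow tail of an LCD pixel heading to black

            for t_ms in (0.5, 1.0, 2.0, 8.0):
                crt = math.exp(-t_ms / PHOSPHOR_TAU_MS)
                lcd = math.exp(-t_ms / LCD_TAIL_TAU_MS)
                print(f"after {t_ms:3.1f} ms: CRT at {crt:.2%} of peak, LCD tail at {lcd:.2%}")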

  • With VGA & DVI cables & KVMs. :P

    • Hehe.

      I switched to 100 Hz on my Zenith CRT monitor after noticing flicker out of the corner of my eye at lower refresh rates.

      There definitely is a difference between 120 Hz and 60 Hz, just not as noticeable as the difference between a crappy 30 Hz and 60 Hz.

    • That's funny. I too am not a gamer and thus not using gaming products. What else don't you do? I also don't knit or paint my nails.

  • by fluffernutter ( 1411889 ) on Friday May 13, 2022 @04:30PM (#62530980)
    Do they make a ten-blade razor yet? I always swore I would start shaving again when they got to it.
    • Protip: if you see a set of disposable razors with a rather crappy handle next to a set of razors with the rubber "sports car" handle, same count/brand/price, get the pack with the crappy handle. The companies put the inferior blades in the race car version.

  • Personally, I can more easily see flicker in my peripheral vision (looking at something just outside the area I usually focus on). I don't have references, but as I remember it... yes, some people can see flicker at very high rates like this. And for VR displays, where a small panel takes up a lot of your field of view, it might help. Similar to tinier and tinier resolutions (some displays are zoomed in with optics).

    Or what about the old "show one field to the left, then the next to the right"? That…

  • by PPH ( 736903 ) on Friday May 13, 2022 @05:18PM (#62531146)

    ... that people can tell the difference at much beyond 100 Hz when they can't tolerate LED or fluorescent lamps or have to walk out of movie theaters suffering massive migraines.

    There is an old anecdote about a new video card that came out featuring a 60 Hz refresh rate, back when 30 or 45 Hz was the norm. All the autists ran out and bought one, claiming that the annoying flicker had finally gone away and how marvelous the new cards were. But a few months later it was discovered that the Windows drivers had a bug which refused to actually set the card to anything faster than 45 Hz, even when 60 was selected*. Suddenly, everyone who loved their new cards was crying "Muh headaches!" just because someone told them that it was the same as the old hardware.

    *I believe the bug was discovered when some Linux users ran the old display probe program necessary to fetch parameters and hand edit them into the old X config files.

    • "that people can tell the difference at much beyond 100 Hz when they can't tolerate LED or fluorescent lamps or have to walk out of movie theaters suffering massive migraines."

      This can happen because some of these lights are literally flashing at mains frequency, without proper capacitors or high-persistence phosphors to greatly reduce or eliminate the flickering.

      This is a well-known hazard in industrial settings because it can cause running machinery to appear stationary.

  • I want a FreeSync/G-Sync or variable refresh rate display on a battery-powered device that can go down to 1 Hz, so that the multi-megabyte framebuffer is not being scanned hundreds of times a second. I'd also like the kick-off for new frames on VRR to be very short, so that scan-out can begin immediately after render.

    On a tile-based renderer, where all the vertices are scanned, collected, and assigned to tiles very early in the render, you could almost begin scan-out before rendering is complete. If only the…

  • This reminds me of high-end audio component pitches. They're selling stuff you can't possibly perceive.

    If nothing else, no video card is updating frames at anything close to that rate. And I defy you to demonstrate that showing a baddie move 4 msec sooner is going to make a difference. Humans just can't perceive things that fast: our reaction times are measured in tens to hundreds of msec.
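
    For scale: on average, a freshly rendered frame waits half a refresh interval before scan-out, so the most a 360-to-480 Hz jump can shave is well under a millisecond. A quick Python sanity check:

        def avg_scanout_wait_ms(hz):
            """Mean time a freshly rendered frame waits for the next refresh."""
            return 0.5 * 1000.0 / hz

        gain_ms = avg_scanout_wait_ms(360) - avg_scanout_wait_ms(480)
        print(f"average latency saved: {gain_ms:.2f} ms")  # ~0.35 ms vs. 150-250 ms reaction times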
