PC and Laptop Displays Are Working Toward 480 Hz
An anonymous reader shares a report: If you've ever looked at a 360 Hz monitor and thought, "This isn't fast enough," here's something to look forward to. While we've seen monitor prototypes surpass 360 Hz, the highest native refresh rate you'll find on a PC display these days, it seems that AU Optronics (AUO) is working on panels that'll be available with an even snappier 480 Hz refresh rate. Of course, not many would look at a screen updating with new information 360 times every second as lagging. But for very fast-paced action -- like in a competitive game where words and items whizz by in an instant or where a few milliseconds of delay could be the difference between a win and a loss -- more speed may be imperative.
Decreasing returns (Score:5, Insightful)
I'm all for 120+ Hz display but isn't this well into decreasing returns and marketing snake oil at this point?
Re: Decreasing returns (Score:2)
The only "return" I could envision is a really bad "soap opera effect", and me cranking it down to 120hz or lower to try to get rid of it.
I don't care what others think, but I can't stand watching theater movies that don't look 24fps. I just can't; the soap opera effect really screws with me, and makes me feel like I am watching some cheap videotaped garbage.
Re: (Score:2)
Re: (Score:2)
Not the poster, but... "Soap opera effect is consumer lingo for a visual effect caused by motion interpolation, a process that high definition televisions use to display content at a higher refresh rate than the original source." [techtarget.com]
If a television screen has a refresh rate of 120Hz (120 frames per second) but the television is going to display film that was recorded at the standard 24 frames per second, the vendor must figure out a way to fill in an extra 96 frames each second.
One way to do this is to have the television repeat each film frame five times (5 x 24 = 120). Another way is to have a computer program in the television digitally analyze consecutive frames and use the data to create intermediary frames. The insertion of these frames is called interpolation and they are what cause the soap opera effect.
So it is a monitor taking a cinematic feed (24 FPS) and creating fake interframes that make it look like a video feed or like Jackson's The Hobbit [gizmodo.com].
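A rough sketch of the two approaches in Python, with plain numbers standing in for whole frames; the linear blend below is only a stand-in for the motion-compensated interpolation real TVs actually use:

    # 24 fps -> 120 Hz, two ways; numbers stand in for frames
    def duplicate(frames_24):
        # repeat each source frame 5 times (5 x 24 = 120); adds no new information
        return [f for f in frames_24 for _ in range(5)]

    def interpolate(frames_24):
        # insert 4 synthetic frames between each pair -- the "soap opera" path;
        # a plain linear blend stands in for motion-compensated interpolation
        out = []
        for a, b in zip(frames_24, frames_24[1:]):
            out += [a + (b - a) * k / 5 for k in range(5)]
        return out + [frames_24[-1]]

    print(duplicate([0, 10, 20]))    # [0, 0, 0, 0, 0, 10, 10, 10, 10, 10, 20, ...]
    print(interpolate([0, 10, 20]))  # [0, 2.0, 4.0, 6.0, 8.0, 10, 12.0, ...]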
Re: (Score:2)
Re: (Score:2)
Re: Decreasing returns (Score:4, Interesting)
Re: (Score:2)
Decided that I didn't want an 8K television when I saw a closeup on an actress and the makeup was clearly visible.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
...resulting in a more lifelike but overall flimsier and less weighty experience.
I realise we are in the minority but there are those of us that disagree with that opinion (and it is a subjective opinion, not an objective fact btw). More lifelike to me means more immersive, I feel like I'm in the room with the actors rather than watching them on a screen, why would you not want that extra immersion? As such I use software on my media PC to interpolate all my video content in realtime up to whatever framerate I choose, currently 120fps to match my display. Watching anywhere else feel
Re: (Score:2)
Re: (Score:2)
Not entirely I think.
High def, high frame rate is undeniably more realistic. The problem is that reality is the sound stage you're filming on, not the false reality you're trying to convey.
Re: (Score:2)
Re: (Score:2)
but decided realism was less important than producing a nostalgically "cinematic" experience.
Exactly the kind of decision that irks me. If they had shot in high frame rate, it would be trivially easy for someone like you who dislikes the soap opera effect to simply discard half (or whatever fraction) of the frames. Whereas for me, interpolating frames is not so trivial: yes, the software is available, but it introduces minor artifacts and consumes significant CPU power, which means significant electricity cost. It's akin to a producer deciding to film in 480p rather than 2160p for that nostalgic VHS feel, d
Re: (Score:2)
Stepping down framerate is not actually quite as simple as discarding the frames; motion blur has to be recreated to avoid a choppy, stop-motion look. There is no commercial procedure for doing so yet (I was only able to find a single preprint from 2018, which considered only two frames at a time and had no sense of progression.)
The comparison to frame resolution brings up another important issue, however, which is that adding data to the film is more expensive for production, regardless of which dimension
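For what it's worth, the crude workaround video editors do offer for going the other way (high frame rate down to 24 fps with some blur) is frame blending: averaging each group of source frames to fake a longer shutter. A minimal sketch, and explicitly not the motion-aware reconstruction the parent says doesn't exist commercially:

    # blend groups of high-rate frames down to a lower rate; numbers stand in for frames
    def downsample_with_blur(frames, factor=5):
        # factor 5 maps 120 fps -> 24 fps; averaging all 5 mimics a fully open shutter
        # (film's usual 180-degree shutter would blend only about half of them)
        return [sum(frames[i:i + factor]) / factor
                for i in range(0, len(frames) - factor + 1, factor)]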
Re: (Score:2)
Re: Decreasing returns (Score:2)
There are films where the director switched to different film stocks (16mm, 35mm...) to give a certain feel to scenes. It's subtle but very effective.
The SOE would completely ruin what the director was intending to convey.
Re: (Score:2)
>"Soap opera effect"? What do you mean?
It's the phenomenon of dying when the actor that plays you asks for too much money for the next season.
It can also be used for evil twins that somehow went unnoticed for the first thirty or fifty years of people's lives, too . . .
Re: (Score:2)
Re: (Score:3)
I don't care what others think, but I can't stand watching theater movies that don't look 24fps.
What others think is that indulging hang-ups like this is a sure sign of an insufferable douche best avoided.
Re: Decreasing returns (Score:2)
It's not "being a douche" or some hipster bullshit. I literally can't stand the effect.
A while back, I was watching some old movies on Youtube and they quietly and suddenly forced motion interpolation on every video without any way to turn it off. And I tried far and wide to find a way to get these films to play as they were originally filmed, including checking my system settings to see if I accidentally ticked something. Some film scenes were actually ruined because this processing wasn't done right
Re: (Score:2)
Re: Decreasing returns (Score:2)
Back in the old days, I played Duke Nukem on a PC where, although the game played fine, it ran a few FPS shy of 30, giving it a movie-like quality.
Later, I bought a much faster PC, and I got frame rates as high as 100fps or more. There was no motion interpolation at work, but everything looked "glassy", the soap opera effect taken to the extreme, and the atmospheric movie feel of the game vanished.
It didn't matter much because I was just playing a video game, but the SOE was clearly there.
Motion
Re: (Score:2)
Since the human eye still won't really be able to perceive the update rate above about 50Hz, the update frequency figures are just marketing figures.
Re: Decreasing returns (Score:2)
Boosting specs higher than necessary, beyond what humans can perceive, is wasteful. It's marketing over engineering.
Re: Decreasing returns (Score:2)
If this was in a VR display, it could help people not vomit all over the place.
Such as it is, it's just marketing hype.
Re: Decreasing returns (Score:2)
"It is a shame that Peter Jackson went with 48 FPS and not 96 FPS for The Hobbit but yeah the soap opera effect is REALLY bad in it."
I don't know why studios can't dynamically change the FPS so fast pans don't look like crap, and we don't have to suffer the soap opera effect throughout the whole movie.
Watching something like The Hobbit and seeing the SOE is really jarring and distracting, and I never could get used to it.
Re: (Score:3)
It is approaching that point of diminishing returns and it absolutely isn't a regular consumer device, but still not quite into snake-oil settings.
For non-competitive gaming it's completely unnecessary. The frame time at that refresh rate, about 2 milliseconds, is quicker than the simulation update on many games. Settings like v-sync help most people there, slowing down the game and the display based on the monitor's refresh.
In competitive gaming, the image tearing gives a slight advantage. When the image beg
Re: (Score:2)
official timers were measured with quantum timers (more precise than atomic clocks)
Atomic clocks are routinely used in experiments requiring nanosecond accuracy; that's a few orders of magnitude quicker than anything the Olympics will ever need. I find it likely that 'quantum clocks' and atomic clocks are actually the same thing.
Re: (Score:2)
Quantum clocks are a subtype of atomic clock [wikipedia.org] that are unchanged in accuracy but offer more precision than traditional atomic clocks; the distinction is the higher precision.
Writeups from the Tokyo Olympics seemed to be a mix of PR push and actual precision, and they tried to deliver both. "Of course we love to experiment and often push at the boundaries just to see how far we can go. But there is always something practical driving it." [digitaltrends.com] They had all kinds of high precision timing devices on b
Re: Decreasing returns (Score:2)
"highly competitive gaming (or with Olympic sports) knows that those tiny benefits can make the difference in a match. When esports prizes are reaching millions of dollars a microsecond advantage, or even a few millisecond advantage, can offer a slight edge"
Too bad none of this can upgrade the human mind, and this will lead to eventual burnout or even severe mental disorders.
It becomes a drug, always trying to get an even better millisecond advantage over the other players. Even machines will flake
Re: (Score:2)
None of that matters. The competitive gamers are just a superstitious bunch that believe some non-effect gives them an advantage. It does not. The measurement equipment in Tokyo was tech-demos with no real-world impact or meaning.
Re: (Score:2)
Linus Tech Tips did some semi-scientific experiments and IIRC concluded that there was a slight advantage to higher frame rates, but I think they only went up to 144hz.
Re: (Score:2)
Fun part is that while you're right that LTT did those tests, you're wrong on content as usual.
This is the video in question:
https://www.youtube.com/watch?... [youtube.com]
Tests went up to 300 FPS at 60Hz and 240 FPS at 240Hz. They conclude that there's a very significant advantage to going from 60Hz to 144Hz. And that there's less of an advantage going from 144Hz to 240Hz, but the advantage is still clearly present and observable.
Re: (Score:2)
Observable != means anything.
I can easily "observe" frame rates up to 1000Hz or so using interference effects. That does not mean it gives me any advantage because above 50Hz or so the different images just merge together in the human eye.
Higher frequencies are just about separating fools and their money.
Re: (Score:2)
Yes. But "decreasing returns" means "returns exist". I.e. it's still better.
And in terms of business, this is smart, because the biggest profit margins are in the highest tier products, where you can charge a lot for fairly minimal performance improvements. This is very visible on the other side of the gaming monitor business, with "how many FPS you get per unit of currency" in GPUs and CPUs. Higher end stuff is far less efficient in terms of the cost of each unit of performance you get out of it than mid end mode
Re: (Score:2)
Sure. "Better, but does not matter". Same as a trash bag that gives you its performance on the moon printed on the bag is theoretically "better".
Re: (Score:2)
Re: (Score:2)
Of course, but gamers are the next generation audiophool - they've got money, and lots of it. Some of the biggest gamers that stream make a ton of money so why not part them from the money?
The point of this is to get people to upgrade to the latest hot gear and pay a premium for it out of the belief it's needed. Doesn't matter if it only gives you a fraction of a percent improvement, if you can get
Re: Decreasing returns (Score:2)
Everyone knows painting a racing stripe on your car makes it go faster.
Re: (Score:2)
Indeed. Stupid people with money. I am still amused by the audio-idiots that think the universally really bad valve amplifiers somehow sound "better". That is delusion at its finest.
Re: (Score:2)
My monitor goes to 280hz, and my PC is capable of displaying that mostly consistently in Overwatch.
When I went from 80hz to 144hz, it brought a hands-down improvement in my gameplay. When I went from 144hz to 280hz, it wasn't really clear that I played better, but it absolutely *felt* better and I can very easily tell the difference if I turn it back down. If 480hz makes things *feel* better then I'm all for it, even if it doesn't give me an advantage.
Will a practiced competitive e-sports murder machine see
Re: (Score:2)
When I went from 80hz to 144hz, it brought a hands-down improvement in my gameplay.
No. It did not. It made you _think_ there was an improvement and you might well have tried harder and that may have given you some improvement. But that is it unless you are an alien. You can see the difference, but you cannot use it.
Re: (Score:2)
Re: (Score:2)
The very concept of "refresh rate" should have been obsolete 10 years ago. After all, the whole idea behind an LCD is that the image is stable unless it's forced to change.
Plasma displays are a different story, of course. It's amazing what plasma screens look like in slow motion.
Re: (Score:2)
I am at 60Hz and see absolutely no reason for more. 120Hz is already well into decreasing returns and makes no sense at all in most situations. But it does keep the need for fast hardware up. Same with resolutions 4k and higher. It is basically a scam and serves to separate those with no understanding but money to spare from that money.
Re: (Score:2)
4k lets me put far more information on the screen. Like four different source files, or a fuckload of VHDL simulation waveforms.
Probably not all that useful for gaming though.
Re: (Score:2)
I'm all for 120+ Hz display but isn't this well into decreasing returns and marketing snake oil at this point?
Hi,
It seems you're unfamiliar with the PC gaming peripherals market. Snake oil is almost all we've got.
Re: (Score:2)
No you don't. Your uncanny valley is purely due to you not being used to seeing the world like that in 2 dimensions. Add a 3rd dimension (e.g. put on a VR headset) and you'll be perfectly fine.
Re: Decreasing returns (Score:2)
120+ Hz will be soap city, but not uncanny to me.
What would really give me the heebie jeebies is seeing two objects clip through each other in a game that looks almost indistinguishable from reality, and a VR headset would make it 10 times worse.
I guess my phobia comes from back in the day when I watched Star Trek TNG "In Theory", heard that blood-curdling scream from a female officer who was on her way to check out some damage to the ship, and La Forge rushing back to see that Starfleet officer fus
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
OLED is both far too expensive for what it offers and it suffers from burn in just a year or two down the line of being used as a PC monitor. As such, it's just not an option for me. I'd prefer to invest the money into a better GPU to drive the monitors instead.
TN's problem with viewing angles bothers me when in vertical orientation. I like using a triple monitor setup with the one on the right being vertical for messaging software. When that one is TN, turning my head to watch other monitors causes clear change
Re: (Score:2)
Re: (Score:2)
Re: Decreasing returns (Score:2)
"To add to this, once you get used to 120Hz (as I did with overclocking CRT monitors), 24fps will actually cause you some discomfort. I gave up on TV in 2000 in part because I found it looking utterly atrocious."
I actually have no problem watching content with different frame rates, but at the same time a 70 inch TV starts to look no different to me than a 5 inch screen about a foot from my face. Maybe some people are way more sensitive than others to this type of thing.
Some people reported feeling
Re: (Score:2)
You have to go way beyond 120 to get past the eye perceiving the framerate rather than the motion.
A cursory look at the biological processes within the retina will show that the temporal resolution of the human eye depends on the luminance incident on the retina (which is a function of the brightness of the scene being viewed and the current diameter of the pupil). I.e. if you make the image brighter and the surroundings darker, the eye can tolerate a lower framerate without "perceiving the framerate rather than the motion" as you put it. That's why the judder of 24fps is less noticeable in the cinema
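The textbook relation behind that luminance dependence is the Ferry-Porter law: critical flicker-fusion frequency rises roughly linearly with the log of retinal illuminance. A sketch with illustrative coefficients (the real values vary by study and by retinal location):

    import math

    def flicker_fusion_hz(retinal_illuminance_td, a=12.5, b=35.0):
        # Ferry-Porter: CFF ~ a * log10(L) + b; a and b here are placeholders
        return a * math.log10(retinal_illuminance_td) + b

    for L in (1, 10, 100, 1000):  # retinal illuminance in trolands
        print(L, "td ->", round(flicker_fusion_hz(L), 1), "Hz")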
I dinnae have the power, Captain (Score:1)
Re: (Score:2)
What about your desktop though? Surely everyone wants 480 FPS in MS Word.
Re: (Score:2)
I for one get headaches if I look at anything less than a 480Hz display.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Once you get to refresh rates that high, it's more about the UI than it is realtime 3D or gaming.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Not really. You don't need a beast of a machine to run CS:GO at 240 or even 360 FPS. The same goes for the second most popular game, Dota 2.
And the rest of the top 25 chart put together doesn't have that many more players than those two. And when you look at non-Steam games with top-tier popularity, those are going to be games in the same vein of computational requirements: League of Legends, World of Tanks, etc.
Re: (Score:2)
Not really. You don't need a beast of a machine to run CS:GO at 240 or even 360 FPS.
You're either agreeing with my point, or missing my point. Most gamers, even high end non-pro, don't need 300+fps. There is no practical gain to be had that high. They can push it, as I said. They just don't need it.
Re: (Score:2)
Depends on what you call "modern games". The most played games on Steam? My GTX970 can run most of those at 120fps, easy, because those include CS:GO, Dota2, PoE, Rust, GTA5 and TF2. That's the majority.
https://steamcharts.com/top [steamcharts.com]
Of the remaining games on the top 10 list, the only ones that really can't be run at 120fps on modern, not too expensive hardware are those that have some in-built technical limitation against it. Stable 240 FPS would probably be the line where you start to run into problems being able to
Re: (Score:2)
It takes an absolute beast of a machine -- a very expensive beast -- to do a smooth 120 on modern games.
True but a few of the FPS games played competitively are not exactly modern and so can happily run at several hundred fps on decent-spec modern hardware. CS:GO for example.
Yeah (Score:4, Funny)
"If you've ever looked at a 360 Hz monitor and thought, "This isn't fast enough," here's something to look forward to."
Setting up an appointment with your local shrink to figure out why you are obsessive-compulsive to this extreme. He may very well set you up with some meds to deal with this at little or no cost to you.
I need more cowbell! (Score:2)
Re: I need more cowbell! (Score:2)
And I want the cowbell to have that deep, low tone sound with the proper reverb and fade, and the master tape better have every single vibration of that bell on it. If the studio engineer can't pull it off then he will really fear the reaper.
But why? (Score:5, Informative)
Re: (Score:2)
I have not heard of any advancements in gray to gray response times on LCDs. I was under the impression that this capped frames per second at around 120. Is this just some marketing goon jamming a higher frequency chip into a monitor to be able to print an impressive number on the box?
Most gaming monitors have sub-6ms GtG response times. That isn't limiting them to 120Hz. Now as to 360Hz and beyond, that is an entirely different question. Even monitors with quoted 1ms times often measure closer to 5ms.
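To put those numbers side by side: the frame period is just 1000/Hz, so a transition that really takes 5 ms starts spanning multiple refreshes at the high end. A quick sketch:

    GTG_MS = 5.0  # the "measures closer to 5 ms" figure above

    for hz in (120, 240, 360, 480):
        frame_ms = 1000 / hz
        print(f"{hz} Hz: {frame_ms:.2f} ms per frame; "
              f"a {GTG_MS} ms transition spans ~{GTG_MS / frame_ms:.1f} frames")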
Re: (Score:2)
The main technological advancement there has been the ability to push overdrive voltage switching even faster, which in the most aggressive overdrive modes needed for really fast refresh rates typically results in overshooting the target color. This can cause some color artifacting even on really fast TN monitors, significant and noticeable color artifacting on IPS monitors, and very large, easily noticeable "things in motion flash in the wrong color before settling on their correct color" artifacts on VA monitors.
Not exactly, no (Score:2)
144 vs 120 (Score:2)
Okay stupid question time... Can a 144Hz display be run at 120Hz without adversely affecting the overdrive system?
Re: (Score:2)
Awesome!!! (Score:2)
Re: (Score:2)
Re: (Score:2)
Forget it (Score:2)
The grey to grey response of LCD is pathetic, so a 480 Hz display is useless. Forget gaming, you would still see trailing when you move the mouse pointer across the screen. That never happened with a CRT display.
Re: (Score:3)
That never happened with a CRT display.
Of course it did. Phosphors take time to fade to black, as anyone who has ever taken a photo of a CRT will attest. At the frame rates we're talking about, you can draw multiple frames before the first one fades.
CRTs were better than most run of the mill LCDs, but you really need to take your rose coloured glasses off.
Re: (Score:2)
I happen to have an ancient iMac G3, and I just checked it out. No visible mouse trailing whatsoever. The flicker you see in photos/film of a CRT is actually a testament to how quickly the phosphor fades. Clearly the phosphor decay to black is within a millisecond, whereas the LCD might go to "grey" fairly quickly but then decays to black over a much longer period of time.
Re: (Score:2)
Re: Forget it (Score:2)
Maybe gleaming arsenic based wallpaper will come back in style.
Still using 60 - 75 Hz here. (Score:2)
With VGA & DVI cables & KVMs. :P
Re: (Score:2)
Hehe.
I switched to 100 Hz on my Zenith CRT monitor due to noticing flicker out of the corner of my eye at the lower refresh rate.
There definitely is a difference between 120 Hz and 60 Hz, just not as noticeable as the crappy 30 Hz and 60 Hz difference.
Re: Still using 60 - 75 Hz here. (Score:2)
Wow I remember that too. I forgot about it for a very long time before reading your comment.
I always found that effect to be a bit unsettling.
Re: (Score:2)
That's funny. I too am not a gamer and thus not using gaming products. What else don't you do? I also don't knit or paint my nails.
Ten bladed razor (Score:3)
Re: Ten bladed razor (Score:2)
Protip: If you see a set of disposable razors with a rather crappy handle next to a set of razors with the rubber "sports car" handle, same count/brand/price, get the pack with the crappy handle. The companies put the inferior blades in the race car version.
Peripheral vision can notice higher rates. (Score:2)
Personally I can more easily see flicker in my peripheral vision (when looking at something just outside the area I'm focusing on). I don't have references, but as I remember it... yes, some people can see flicker at very high rates like this. And for VR displays, where a small panel takes up a lot of your field of view, it might help. Similar to the push for tinier and tinier pixels (some displays are magnified with optics).
Or what about the old "show one field to the left, then the next to the right"? Tha
Re: (Score:2)
Notice? Yes. Use it? No.
I'll believe .. (Score:3)
... that people can tell the difference at much beyond 100 Hz when they can't tolerate LED or fluorescent lamps or have to walk out of movie theaters suffering massive migraines.
There is an old anecdote about a new video card that came out featuring a 60 Hz refresh rate, back when 30 or 45 Hz was the norm. All the autists ran out and bought one, claiming that the annoying flicker had finally gone away and how marvelous the new cards were. But a few months later it was discovered that the Windows drivers had a bug which refused to actually set the card to anything faster than 45 Hz, even when 60 was selected*. Suddenly, everyone who loved their new cards was crying "Muh headaches!" just because someone told them that it was the same as the old hardware.
*I believe the bug was discovered when some Linux users ran the old display probe program necessary to fetch parameters and hand edit them into the old X config files.
Re: I'll believe .. (Score:2)
"that people can tell the difference at much beyond 100 Hz when they can't tolerate LED or fluorescent lamps or have to walk out of movie theaters suffering massive migraines."
This can happen because some of these lights are literally flashing at mains frequency, without proper capacitors or high-persistence phosphors to greatly reduce or eliminate the flickering.
This is a well known hazard in industrial settings because it can cause running machinery to appear to be idle.
I want to go the other way (Score:2)
I want a FreeSync/GSync or variable refresh rate display on a battery powered device that can go down to 1 Hz so that the multi megabyte framebuffer is not being scanned hundreds of times a second. I'd also like it if the kick off for new frames on VRR is very short, so that scan out can begin immediately after render.
On a tile based render where all the vertices are scanned and collected and assigned to tiles very early in the render, you could almost begin scan out before rendering is complete. If only th
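A back-of-the-envelope sketch of the scan-out traffic at stake, assuming a 2560x1600 panel at 24 bits per pixel (purely illustrative numbers):

    WIDTH, HEIGHT, BYTES_PER_PIXEL = 2560, 1600, 3   # illustrative panel
    frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL   # ~12.3 MB per scan-out

    for hz in (1, 60, 120, 480):
        gbit_s = frame_bytes * hz * 8 / 1e9
        print(f"{hz:3d} Hz -> {gbit_s:.2f} Gbit/s of raw scan-out bandwidth")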
Needs some Monster cables (Score:2)
This reminds me of high-end audio component pitches. They're selling stuff you can't possibly perceive.
If nothing else, no video card is updating frames at anything close to that rate. And I defy you to demonstrate that showing a baddie move 4 msec sooner is going to make a difference. Humans just can't perceive things that fast: our reaction times are measured in tens to hundreds of msec.
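The arithmetic behind that, using a commonly cited ~200 ms ballpark for visual reaction time (a rough figure, not a measurement):

    REACTION_MS = 200              # rough ballpark for human visual reaction time
    frame_360 = 1000 / 360         # ~2.78 ms per frame at 360 Hz
    frame_480 = 1000 / 480         # ~2.08 ms per frame at 480 Hz
    saving = frame_360 - frame_480 # ~0.69 ms gained per frame
    print(f"360 -> 480 Hz shaves ~{saving:.2f} ms, "
          f"about {saving / REACTION_MS:.2%} of a {REACTION_MS} ms reaction")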
Re: (Score:2)