Nvidia's New G-Sync Pulsar Monitors Target Motion Blur at the Human Retina Level (arstechnica.com) 56
Nvidia's G-Sync Pulsar technology, first announced nearly two years ago as a solution to display motion blur caused by old images persisting on the viewer's retina, is finally arriving in consumer monitors this week. The first four Pulsar-equipped displays -- from Acer, AOC, Asus and MSI -- hit select retailers on Wednesday, all sharing the same core specs: 27-inch IPS panels running at 1440p resolution and up to 360 Hz refresh rates. Nvidia claims the technology delivers the "effective motion clarity of a theoretical 1,000 Hz monitor."
The system uses a rolling scan scheme that pulses the backlight for one-quarter of a frame just before pixels are overwritten, giving them time to fully transition between colors before illumination. The approach also reduces how long old pixels persist on the viewer's retina. Previous "Ultra Low Motion Blur" features on other monitors worked only at fixed refresh rates, but Pulsar syncs its pulses to G-Sync's variable refresh rate.
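As a rough back-of-the-envelope illustration (my numbers, not Nvidia's published math), the "effective clarity" claim follows from treating the backlight pulse width as the display's sample-and-hold time. A minimal sketch in Python, assuming the quarter-frame pulse described above:

    # Sketch: effective motion clarity from backlight pulse width.
    # Assumes the backlight is lit for one quarter of each frame, per the
    # description above; Nvidia's actual Pulsar timing may differ.
    def effective_clarity_hz(refresh_hz, duty=0.25):
        frame_ms = 1000.0 / refresh_hz   # frame duration in ms
        pulse_ms = frame_ms * duty       # backlight on-time per frame
        return 1000.0 / pulse_ms         # equivalent sample-and-hold rate

    print(effective_clarity_hz(250))  # 1000.0 -> the "1,000 Hz" figure
    print(effective_clarity_hz(360))  # 1440.0

Under these assumptions, a 1 ms pulse gives the same hold time as a theoretical 1,000 Hz sample-and-hold display.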
Early reviews are mixed. The Monitors Unboxed YouTube channel called it "clearly the best solution currently available" for limiting motion blur, while PC Magazine described the improvements as "minor in the grand scheme of things" and potentially hard for casual viewers to notice.
Same principle as Asus ELMB, but more Advanced (Score:1)
OLED exists (Score:2)
Literally the topic. OLED exists and offers around 0.03 ms response times.
If you're going to make a super expensive display for purposes of "high refresh rate with no ghosting", you may as well just go straight to OLED. No need to stick with LCD.
Re:OLED exists (Score:4, Informative)
The problem is exacerbated by LCDs because their actual pixel transition time is so utterly abysmal, but it exists for any display-and-hold technology with long hold times. OLED at 300+ Hz does get pretty close to CRT levels of motion clarity- but for the people who care about that, and want crisp motion at more reasonable frame rates, there is black frame injection. For LCDs, there is Pulsar, which improves upon backlight strobing.
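As a rough sketch of why black frame injection helps (toy numbers, not any vendor's implementation): showing each content frame for only part of its interval shortens the hold time your eye integrates.

    # Toy black-frame-injection (BFI) math: at 120 Hz, alternating each
    # 60 fps content frame with a black frame halves the perceived hold time.
    content_fps = 60
    hold_ms = 1000.0 / content_fps   # 16.67 ms full sample-and-hold
    bfi_hold_ms = hold_ms / 2.0      # 8.33 ms with 1:1 BFI at 120 Hz
    print(hold_ms, bfi_hold_ms)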
Re: (Score:2)
At 60Hz, you have a frame hold time of 16 2/3 ms.
A good gaming CRT has a phosphor persistence time of ~3 ms, which, converted into an equivalent hold-time frame rate, is about 333 FPS.
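For reference, the arithmetic behind those two figures (a sketch using the numbers above):

    # 60 Hz hold time, and the hold-time-equivalent rate of a ~3 ms phosphor.
    hold_60hz_ms = 1000.0 / 60                 # ~16.67 ms per frame
    crt_persistence_ms = 3.0                   # assumed persistence, per above
    equiv_fps = 1000.0 / crt_persistence_ms    # ~333 FPS equivalent
    print(hold_60hz_ms, equiv_fps)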
Smart move staying anonymous for that dumbfuck comment, AC.
Re: (Score:2)
A good gaming CRT
Presumably sold at the same place that also sells record players for cars.
Re: (Score:2)
I've got 16 of them in storage (CRTs, not kids) and I've been offered $500 a piece for my old ViewSonics.
Having seen the difference, though, I get where they're coming from.
Sure, the color and resolution of a CRT is nowhere near what a modern display is- but motion looks absurdly better. A 120Hz OLED looks flat out sloppy compared to
Re: (Score:2)
I know there's some demand from the retro gaming community, since most games from that era just don't look right on a modern LCD, but I've never heard of anyone willingly playing modern games on such a relic.
The main issues with CRTs is that the viewable display area is now considered too small by modern standards, and while some of the pricier models were capable of what would be considered high resolution, the resulting picture was nowhere near as sharp as fixed pixel technology. I'm also not sure why yo
Re: (Score:2)
I know there's some demand from the retro gaming community, since most games from that era just don't look right on a modern LCD, but I've never heard of anyone willingly playing modern games on such a relic.
The reason they don't look right is that they're a smudgy mess. That's because modern displays are display-and-hold for the entire duration of the frame. To your eye, that's motion blur.
Why would someone be willing to take the trade-offs? Because it looks better.
The main issues with CRTs is that the viewable display area is now considered too small by modern standards, and while some of the pricier models were capable of what would be considered high resolution, the resulting picture was nowhere near as sharp as fixed pixel technology. I'm also not sure why you're remembering 60Hz as anything other than the flickery eyestrain inducing mess that it was. Once CRTs became available that could run at higher refresh rates, nobody would tolerate running a refresh rate that low.
It's very true that the sharpness is higher on displays with actual pixels, but that's often considered a downside, even by non-retro enthusiasts, when they're forced to choose between visible fixed pixels vs. visible CRT "pixels" side-by-side.
I'm also not sure why you're remembering 60Hz as anything other than the flickery eyestrain inducing mess that it was.
60Hz loo
Re: (Score:2)
"but I've never heard of anyone willingly playing modern games on such a relic."
1600x1200 is just fine for a gaming resolution. A good 75Hz Trinitron will slap any 144Hz monitor at whatever native resolution it displays.
And the color gamut and contrast at high motion is practically untouchable.
The only reason I have an LCD now is because I have devices that actually do 4K (microscope) so the screen got an upgrade. This large workbench feels wide open without that 21" beast sitting there.
Re: (Score:2)
But yes- 75Hz (frankly, even 60Hz- though that's less pleasant at 1600x1200) will spank a 144Hz panel in clarity of motion, though BFI at 120Hz is pretty indistinguishable. That's a simple matter of frame ti
Re: (Score:2)
Convergence artifacts at the edges and imperfect geometry (the electron beam's geometry fighting the screen's shape) are the main issues with CRTs used with modern computers.
Re: (Score:2)
CRT phosphors don't stay lit very long. That's why you had to have high refresh rates - because 60Hz refresh often produced annoying levels of flicker (even worse in PAL countries where they have 50Hz). Good computer displays often refreshed at 72Hz or more to prevent flicker.
So they're great for fast changing content because they don't stay lit very long, but as a side effect, you have to refresh them more because otherwise you're subjected to flicker. And you then run into either monitor limits or RAMDAC
Re: (Score:2)
CRT phosphors don't stay lit very long.
No shit...
How did you not get that from my reply to the following dumbfuckery:
Are you serious? Did nobody tell you that CRTs have phosphors for persistence?
CRT phosphors are in fact lit for so little time that at any point, your CRT is between 90 and 94% black, depending on the refresh rate.
The difference between 60 and 100Hz is microscopic: it's the difference between 10% of your screen being lit at any time and 6%.
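The exact percentages depend on the phosphor decay time; a minimal sketch (the 1 ms decay below is an illustrative assumption, not a measurement):

    # Fraction of the frame any phosphor spot stays lit, for an assumed decay.
    def lit_fraction(refresh_hz, decay_ms=1.0):
        return decay_ms * refresh_hz / 1000.0

    print(lit_fraction(60))   # 0.06 -> screen ~94% dark at any instant
    print(lit_fraction(100))  # 0.10 -> ~90% dark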
because 60Hz refresh often produced annoying levels of flicker (even worse in PAL countries where they have 50Hz).
And yet the entire world had no problem with it...
Good computer displays often refreshed at 72Hz or more to prevent flicker.
Not really. High refresh rates did become important to prevent flicker, but on
Re: (Score:2)
The actual problem with CRT flicker is twofold:
1. How much of your peripheral vision is seeing the screen (affected by screen size and distance from screen).
2. Your specific susceptibility to CRT flicker.
For example for me, 85Hz was the lowest I could have without headache for even a small PC monitor. 75Hz and I'd get a headache within 30 minutes. 60Hz was borderline unusable, headache in minutes. My last monitor was a 100Hz unit, and I overclocked it to 120Hz as soon as I got it. First monitor where I need
Re: (Score:2)
However, there are critical problems with it.
How on *earth* do you survive going to the cinema?
For me, the more likely explanation is that, in the same way that "headache" is a listed side effect of almost literally everything under the sun, people are absolutely terrible at the kind of rigorous methodology needed to determine cause and effect. Sure, they can tell that a hammer caused that headache. But they'll also tell you that Tequila
Re: (Score:2)
>How on *earth* do you survive going to the cinema?
Went once. Never afterwards. Felt weird. Never actually connected the dots until you pointed this out due to insufficient personal experience.
Parents never went either to my knowledge. I should probably ask them if they have the same experience I did.
Finally, it's pretty easy to figure out how it works. You look at the CRT screen displaying a white image. Then you turn your head away so you see it in peripheral vision only.
You will see flicker up to some
Re: (Score:2)
For you- 60Hz displays are shite- agreed.
Re: (Score:2)
In my experience, for most people the line goes around 72-75Hz. People do get headaches from 60Hz monitors (even those that are fine with 50Hz TVs).
For me, the small monitor line was 85Hz. I remember talking to friends about it back in early Pentium era of computer games, and how several of my friends did notice headaches at default 60Hz, and that they went away when they set the monitors to 75Hz.
For me, that was still uncomfortable for extended period of time, even on small monitors.
Age may also affect thi
Re: (Score:2)
As I said earlier, even for me, flickering became *far* more apparent the higher the resolution went.
It makes sense to me. Features are smaller, and it fucks with your peripheral vision more than a large feature that's never really 100% blank.
At higher resolutions, I absolutely required higher refreshes for it to be comfortable. TVs were (usually) ma
Re: (Score:2)
No, that's because of the distance. Remember, light sensitivity.
When you're far away from something, you have a lot of stable, non-strobed background light. When you're near a monitor, you have much less non-strobed background light.
This is why I mentioned screen size and distance as factors in addition to refresh rate.
Re: (Score:2)
Of course screen size and distance are factors. But so is what is on the screen.
Re: (Score:2)
Honestly, I've never seen that make more than a minor difference. It really seems to be about strobing light exposure to certain parts of the eye, provided it's below a specific frequency threshold.
Re: (Score:2)
I suspect you're on the other side of the evolved feature set on this one if you're unbothered by 60Hz. I moved quite a few people from 60Hz to 75Hz (most 60Hz monitors in 1990s supported 75Hz at the same resolution), and I can't think of a single person who didn't thank me for "making headaches go down/away".
I.e. in my experience, sweet spot for most people is somewhere below 75Hz but well above 60Hz. Granted we're fairly genetically similar here in Finland, so it could be that other ethnic groups have a d
Re: (Score:2)
Notably 60Hz CRTs are horrible for gaming. Headaches from flicker are a norm and you only get 60 images a second even with nanosecond-grade display speed.
If you insist on having a CRT for gaming, consider getting the 100-120Hz refresh rate ones that came at the very tail end of the CRT manufacturing cycle. Granted, you'll have to lower the resolution further to get to those refresh rates on most such monitors, or overclock the monitor, leading to even lower MTBF and noise issues.
Re: (Score:1)
They really cause headaches.
You know what market fucking exploded? Gaming on shitty 50Hz and 60Hz display devices.
You know why? Because 60Hz CRTs are, in fact, not horrible for gaming at all.
Re: (Score:2)
VR headsets cause simulation sickness. Symptoms are different, as are causes.
Re: (Score:2)
I should elaborate. Simulation sickness is caused by a discrepancy between what the eyes observe and what the inner ear senses.
When our evolved biological systems encounter that discrepancy, they conclude we have likely ingested something psychoactive, and evacuate the contents of the stomach to prevent further poisoning.
CRT problems are caused by a discrepancy between what cones and rods sense. Cones mainly populate focal vision and have significantly lower light sensitivity (they are not meaningfully affected by CRT flicker). Rods are far more prevale
Re: (Score:2)
We do not seem to have any evolved reaction to this discrepancy, as there are no scenarios in our normal environment with this kind of flicker where it's a sign of danger.
Hence it seems to only result in a headache at worst, and sensitivity varies.
Re: (Score:2)
Yet no literature whatsoever can be dredged from the abyss to support a similar problem for the advent of 50/60Hz CRT gaming.
Physical differences in the phenomena were not the point. The point is- no evidence supports your position that 50/60Hz CRT gaming, TV use, or going to the movie theater, caused "headaches" for some appreciable amount
Re: (Score:2)
The interesting part is that I don't get motion sickness. At all. I was on cruises where people around me were all at least mildly sick, and I am always the one who's completely healthy and taking care of the sick ones.
No VR sickness either that I've experienced. And I've tried really early VR glasses in the late 90s and played Duke3D on them. I recall several people not being able to play for more than a minute before being blue-ish, and I gamed for as long as demo crew let me and I remember it being cool
Re: (Score:2)
It depends if you want smooth CRT like motion, or clarity that gives you a small but measurable advantage in games.
That said OLED is the only choice for me because of its superior colour and contrast.
Re: (Score:2)
That's why they like them.
This is because CRT displays are basically stop motion.
Only a very tiny fraction of them are lit at any time (you can check YouTube for a slow motion capture of a CRT if you want to know just how insanely little is lit).
This makes their motion crystal clear, while a modern flat panel displays an image and holds it until the next refresh. That makes your eyes interpret it as blur.
That being said- I, personally, agree 10
Re: (Score:2)
Oh I like CRT clarity too, but I'm not sure it's necessarily better than high frame rate OLED for any particular purpose. It just looks nice.
Re: (Score:2)
I was able to see what they were talking about- but it's just not worth it to me, particularly in modern games where there's less high-contrast motion going on. Scenes are rich and full of pretty low-contrast details these days in the search for photorealism.
Maybe I'm just too old to really see the awesomeness of speed running Super Mario 2, or I just fell too hard for immersive first person games that are an
Re: (Score:2)
It's worth noting that games like SM2 were specifically built with the CRT feature set in mind.
I.e. natural smoothing due to poor geometry handling, near instant display at specific low refresh rates, etc.
Modern games typically offer much higher refresh rates, and expect pixel perfect geometry.
Re: (Score:2)
Sure, but the problem people have with these is generally cost. OLED is the expensive high end, and its primary downside nowadays is durability. It's a monitor for a two-to-three-year period, and then you have to switch as UI burn-in starts to set in even with regeneration.
These strobing monitors are typically in the OLED price range, and when they first come out, they tend to be well into it.
That means there's no sense to buy one. Just buy an OLED if you have that much of a budget for a monitor.
Pointless. (Score:2)
A good question is, how many games will be able to keep up with rendering at 360 FPS? A better question is, how many humans can see the difference between 80Hz and 360Hz?
Re: (Score:2)
It would be harder for the average person to reliably tell the difference between 120Hz and 360Hz.
Actually pretty easy, if you know what to look for.
Things like Pulsar, BFI, and backlight strobing exist to change that, though.
At 120Hz, motion blur is still acutely visible. At 300, it looks nice and clean.
That's not a reasonable game framerate target, though, so that's why we have articles like this and the relevant mitigating technologies.
Re: (Score:1)
I don't understand, in the US lights turn on and off at a rate of 30Hz.
I think you mean 120Hz. Twice every cycle, not once every two cycles.
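For reference, the arithmetic (assuming US 60 Hz mains): light output tracks the magnitude of the AC waveform, which peaks twice per cycle.

    # US mains is 60 Hz AC; brightness peaks on both half-cycles.
    mains_hz = 60
    flicker_hz = 2 * mains_hz
    print(flicker_hz)  # 120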
Re: (Score:2)
I personally think this is pointless for most consumers, but there are plenty of people who buy that marketing. F
The power of imagination (Score:2)
I prefer text-based games and stories. I can do the motion blur in my head.
Re: (Score:2)
A lot of games add motion blur. I usually turn it off, but it's the default in most of them.