Page 1 of 5

So what refresh rate do I need? [Analysis] [very good one!]

Posted: 16 Feb 2014, 18:13
by ScepticMatt
First post, hi!

So, we have a display. I'll try to answer the question: “What minimum refresh rate would someone find acceptable?”

Motion perception: Motion perception is primarily a result of the phi phenomenon and beta movement. In phi, there are different images in a single place, whereas in beta the images are in different locations. To perceive both illusions we need a minimum frame rate of around 20 Hz; 24 Hz for film was chosen for this reason.


Source: ... &q&f=false

Flicker: To avoid eye tracking motion blur, we lower the persistence duration of our display, and use persistence of vision to 'hide' the blacking out of the display. But if the refresh rate is too low, the change in brightness becomes visible as flicker. The threshold frequency is called the critical flicker fusion frequency, and it depends primarily on the brightness, area and location of the illumination.


So 60Hz is enough to avoid visible flicker in any situation, and it can be lower in controlled environments (e.g. a movie theater or VR).

Source: ... esolution/

Eye strain and headaches: While flicker higher than 60Hz can't be detected consciously, it still has an effect on our visual cortex, especially over longer periods. Rhythmic potentials in the human ERG can be elicited by fluorescent lighting at frequencies as high as 147 Hz. Lower brightness, shorter duration and ambient light are mitigating factors. Refresh rates above 75Hz avoid most adverse effects, but going higher still shows improvements; 100 Hz is generally considered “flicker free” for displays.

Sources: ... x/abstract

Phantom array effect: (an example of a higher-order issue) Our eyes constantly look around, scanning the scene for interesting parts. These quick, jerky, simultaneous movements of both eyes are called saccades. The speed of a saccade depends on the angle traveled, with a maximum of around 900°/s for angles above 60°.

Persistence of vision causes the image on the retina to blur during saccades, but saccadic masking prevents low-frequency information from being transmitted via the optic nerve during, and slightly before the beginning of, the saccade.

Ultra low persistence (Oculus Crystal Cove, LightBoost 10%) can defeat saccadic masking, causing a judder-like effect.

Here's what Valve experienced:
The first factor is the interaction of low persistence with saccadic masking. It’s a widespread belief that the eye is blind while saccading, and while the eye actually does gather a variety of information during saccades, it is true that normally no sharp images can be collected because the image of the real world smears across the retina, and that saccadic masking raises detection thresholds, keeping those smeared images from reaching our conscious awareness. However, low-persistence images can defeat saccadic masking, perhaps because saccadic masking fails when mid-saccadic images are as clear as pre- and post-saccadic images in the absence of retinal smear. At saccadic eye velocities (several hundred degrees/second), strobing is exactly what would be expected if saccadic masking fails to suppress perception of the lines flashed during the saccade.

One other factor to consider is that the eye and brain need to have a frame of reference at all times in order to interpret incoming retinal data and fit it into a model of the world. It appears that when the eye prepares to saccade, it snapshots the frame of reference it’s saccading from, and prepares a new frame of reference for the location it’s saccading to. Then, while it’s moving, it normally suppresses the perception of retinal input, so no intermediate frames of reference are needed. However, as noted above, saccadic masking can fail when a low-persistence image is perceived during a saccade. In that case, neither of the frames of reference is correct, since the eye is between the two positions. There’s evidence that the brain uses a combination of an approximated eye position signal and either the pre- or post-saccadic frame of reference, but the result is less accurate than usual, so the image is mislocalized; that is, it’s perceived to be in the wrong location.

Not long ago, I wrote a simple prototype two-player VR game that was set in a virtual box room. For the walls, ceiling, and floor of the room, I used factory wall textures, which were okay, but didn’t add much to the experience. Then Aaron Nicholls suggested that it would be better if the room was more Tron-like, so I changed the texture to a grid of bright, thin green lines on black, as if the players were in a cage made of a glowing green coarse mesh.

For the most part, it looked fantastic. Both the other player and the grid on the walls were stable and clear under all conditions. Then Atman Binstock tried standing near a wall, looking down the wall into the corner it made with the adjacent wall and the floor, and shifting his gaze rapidly to look at the middle of the wall. What happened was that the whole room seemed to shift or turn by a very noticeable amount. When we mentally marked a location in the HMD and repeated the triggering action, it was clear that the room hadn’t actually moved, but everyone who tried it agreed that there was an unmistakable sense of movement, which caused a feeling that the world was unstable for a brief moment. Initially, we thought we had optics issues, but Aaron suspected persistence was the culprit, and when we went to full persistence, the instability vanished completely. In further testing, we were able to induce a similar effect in the real world via a strobe light.
Sources: ... ontext=tpr ... ng-judder/

Input lag: Input lag below 20 ms is generally considered imperceptible. The required refresh rate depends heavily on the other sources of latency in the chain. Counter-Strike: GO at 144 Hz comes close with ~25 ms, and this could be further reduced with alternative rendering methods, down to a theoretical minimum refresh rate of 50 Hz. ... trategies/
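As a rough illustration of how refresh rate interacts with the rest of the latency chain, here is a toy latency budget in Python. All the fixed costs (input sampling, simulation, pixel response) are hypothetical round numbers for illustration, not measurements from this post:

```python
def end_to_end_latency_ms(refresh_hz, input_sample_ms=1.0,
                          game_sim_ms=5.0, pixel_response_ms=2.0):
    """Toy end-to-end latency model (hypothetical budget):
    fixed input/simulation/display costs plus ~1.5 frames of
    queueing + scanout delay on average."""
    frame_ms = 1000.0 / refresh_hz
    return input_sample_ms + game_sim_ms + 1.5 * frame_ms + pixel_response_ms

print(round(end_to_end_latency_ms(144), 1))  # 18.4 ms
print(round(end_to_end_latency_ms(60), 1))   # 33.0 ms
```

Under this toy model, raising the refresh rate only shaves the frame-time term; the other latency sources are what determine the refresh rate you actually need.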

Conclusion/TLDR: A minimum of ~20Hz is required to perceive motion. For a low-persistence display, a minimum of 50-60 Hz is necessary to avoid flicker perception, but a refresh rate higher than 75 Hz might be necessary to avoid discomfort and/or reduce input lag. Ultra low persistence causes additional issues that may necessitate even higher frame rates or alternative mitigation strategies (e.g. low-latency eye tracking for saccadic masking).

Re: So what refresh rate do I need? [Analysis]

Posted: 16 Feb 2014, 19:40
by Chief Blur Buster
Excellent observations!
Your first post is definitely "Area 51" worthy, and also welcome to Blur Busters Forums!
For other readers reading this, I consider this post "Chief Blur Buster" approved -- the kind of stuff we like to talk about!

You certainly did your homework, and all your observations are pretty accurate (I should ask you to be a guest writer for Blur Busters).
There are some additional factors for which one may require well beyond 75Hz, and possibly far beyond (e.g. 1000Hz); these include the stroboscopic effect. (See Why We Need 1000fps @ 1000Hz This Century.)

Several of the sources you cited already explain this, but here's a good demonstration, which I've embedded below using the [testufo] tag.

Stare at a stationary point in the middle of the top edge of this moving animation.

1. Put your finger at the top edge of this animation. Keep the finger stationary.
2. Stare at your finger (or next to your finger). Keep your eyes and finger stationary.
3. As the antenna part of Eiffel tower scrolls under your finger, you will see multiple antennas appear
(the stroboscopic effect -- kind of like the reverse version of the phantom array effect -- stationary eye but a series of static images that represent moving object at a finite refresh rate)
4. This problem is most pronounced at 60Hz (e.g. antennas 16 pixels apart at 960 pixels/second)
5. This problem still exists at 120Hz (e.g. antennas 8 pixels apart at 960 pixels/second)
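The antenna spacing in steps 4-5 is just motion speed divided by refresh rate; a one-line Python sketch (the function name is mine):

```python
def phantom_array_spacing_px(speed_px_per_s, refresh_hz):
    """Gap between successive 'copies' of a moving object as seen by a
    stationary eye (stroboscopic / phantom array effect)."""
    return speed_px_per_s / refresh_hz

print(phantom_array_spacing_px(960, 60))    # 16.0 px apart at 60Hz
print(phantom_array_spacing_px(960, 120))   # 8.0 px apart at 120Hz
print(phantom_array_spacing_px(960, 1000))  # 0.96 px -> nearly continuous
```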

This will work even on the slowest laptop LCD panels, too.

The same kind of situation occurs when you spin your mouse pointer in a circle on a black background: it's not a continuous blur. Even a 1000Hz mouse will show only 120 'copies' of the cursor per second (at a 120Hz refresh rate) when you spin the pointer rapidly while staring at a stationary point in the middle of your monitor (this is also known as the 'mousedropping' effect).

The only way to eliminate all stroboscopic effects like this, without adding motion blur back, is flicker-free low persistence at ultrahigh frame rates (via interpolation, or preferably real frames), so that there's continuous motion rather than static frames that can cause stroboscopic interactions (phantom array, mousedropping effect, wagonwheel effect, etc).

You can add one frame's worth (1/fps seconds) of intentional/artificial motion blur to mask this effect, much like movies do (35mm film): to fix the strobing, filmmakers add intentional motion blur. For example, at 1000 pixels/second and a 16-pixel step per frame (60Hz), you could add 16 pixels of intentional GPU-effect motion blur to eliminate this stroboscopic effect.

However, adding motion blur is very bad when you want to simulate virtual reality (as both you and I already know, from John Carmack's talks, and Oculus). For this use case you want 100% of the motion blur to be natural, created inside the human brain, if possible -- no externally added motion blur as a band-aid. Also, motion blur is undesirable to a lot of readers of Blur Busters, who come to this very site in pursuit of the elimination of motion blur. So someday in the future, we'd want to attempt strobe-free low persistence. To do 1ms persistence without flicker/strobing/phosphor/etc., you need to fill all 1ms timeslots in a second, and that means 1000fps@1000Hz to achieve low persistence with no form of light modulation. That, as you can guess, is quite hard to do with today's technology, so strobing is a lot easier.

75Hz completely solves the motion blur problem by allowing low persistence above the flicker fusion threshold. However, it doesn't solve 100% of the problem of making virtual imagery completely indistinguishable from real life. Certainly, it's often "good enough", and it will have to be good enough for the next decade (or few), probably.

There is a Law of Persistence: 1ms of persistence (strobe length) translates to 1 pixel of motion blurring at 1000 pixels/second. Decay curves (e.g. phosphor) complicate the math, but strobe backlights such as LightBoost, ULMB and BENQ Blur Reduction are essentially near-squarewave and follow this law very accurately, to the point where I've begun to call it the "Blur Busters Law of Persistence". This does make some assumptions: no other weak links, stutter-free motion, frame rate matching refresh rate, perfectly smooth VSYNC ON motion such as stutter-free TestUFO motion, and motion speeds slow enough that random eye saccades are an insignificant factor.

I find I can track moving objects on screen accurately (i.e. I can count the eyes of the TestUFO alien, which are single pixels) up to approximately 3000 pixels/second from arm's length away from a 24" monitor. Different humans have different eye tracking speeds, but this kind of defines the bottom-end persistence we need, since 1ms of persistence at 3000 pixels/second blurs the alien's eyes to 3 pixels wide rather than 1 pixel. This is the reason why I told BENQ to support a 0.5ms strobe width in their new firmware (they listened; now we just have to wait for the fixed XL2720Z firmwares to ship), since I can apparently just barely detect the motion clarity difference between 0.5ms and 1.0ms persistence (strobe length). On a 24" 1080p monitor at arm's length, most people track reasonably accurately at 960 pixels/second; others track at up to 2000 pixels/second before eye tracking can't keep up, which is roughly where I cap out. During 3000 pixels/second TestUFO animations, this means the difference between 1.5 pixels of motion blurring (insignificant) versus 3.0 pixels of motion blurring (the alien's eyes, as well as the window frames of buildings, visibly blurred)...
I have this beta firmware installed on my XL2720Z, and it confirmed my findings: 1ms persistence is not the final frontier. So I recommend manufacturers start considering 0.5ms persistence, not stopping at 1.0ms. This will become even more demanding in the VR era, during fast head-turning pans, and on 4K screens (twice as many pixels to track across), so 0.25ms might actually produce a human-noticeable improvement over 0.5ms (e.g. 8000 pixels/second during slow head turning creates 2 pixels versus 4 pixels of motion blur at 0.25ms versus 0.5ms persistence). For now, 1ms persistence (LightBoost 10%) is sufficiently low to satisfy the majority of the population, as you lose a lot of brightness trying to achieve lower persistence, and compensating with brighter strobes gets expensive (e.g. custom backlights/edgelights). That said, you could still just do 75Hz with, say, 0.25ms persistence and call it a day, unless you were concerned about stroboscopic effects.
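To make the arithmetic above concrete, here's a tiny Python sketch of the Blur Busters Law of Persistence (the function name is mine; it bakes in the square-wave-strobe and perfect-eye-tracking assumptions stated above):

```python
def persistence_blur_px(persistence_ms, tracking_speed_px_per_s):
    """Blur Busters Law of Persistence: 1 ms of persistence (strobe
    length) yields 1 pixel of eye-tracking motion blur per 1000 px/s
    of motion. Assumes a near-squarewave strobe, accurate eye
    tracking, and frame rate matching refresh rate."""
    return persistence_ms * tracking_speed_px_per_s / 1000.0

# Figures from the post:
print(persistence_blur_px(1.0, 3000))   # 3.0 px (1 ms at 3000 px/s)
print(persistence_blur_px(0.5, 3000))   # 1.5 px (0.5 ms strobe width)
print(persistence_blur_px(0.25, 8000))  # 2.0 px (fast VR head turn)
```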

We have been stuck with stroboscopic effects ever since humankind invented the concept of frame rates / refresh rates with the zoetropes and kinetoscopes of the 19th century. We have never yet been able to successfully record and play back continuous motion naturally in a framerateless manner, so we have the artificial invention of the frame rate for now, since it's the easiest way to virtually represent motion.

The lighting industry has done several studies on human detection of stroboscopic effects from flickering light sources (it's a good reason why fluorescent ballasts have gone electronic and often run at >10KHz rather than strobing at 120Hz). The stroboscopic-effect detection threshold (phantom array detection) can be quite high, even 10,000Hz for a portion of the human population (see this lighting industry paper), so that roughly defines the refresh rate we need, although we could get by with just 1000fps@1000Hz plus 1ms of GPU-effect motion blur (fairly imperceptible, but enough to prevent the wagonwheel effect).


I totally agree with the individuals at Valve and Oculus about the elimination of the vast majority of artifacts with low persistence at >75Hz -- this is definitely the sweet spot, as you've described. That said, it doesn't completely eliminate all differences between virtual imagery and real-life imagery; we will still need >1000fps@1000Hz to pull off the "real life indistinguishability" feat, or some kind of future framerateless continuous-motion display, or even a display that refreshes faster only where the eye is staring, etc. By going to low persistence via strobing, we solve a large number of VR problems; it's just that low persistence using today's technology necessitates strobing, and that problem is unsolvable without going to ultrahigh frame rates (0.5ms persistence = 2000fps@2000Hz needed for flicker-free low persistence with zero strobing, zero light modulation).

Corollary/TLDR: As you said, low-persistence 75Hz+ is definitely the sweet spot that solves a lot of problems. However, 75Hz is still not enough to pass a theoretical Holodeck Turing Test ("Wow, I didn't know I was standing in a Holodeck. I thought I was standing in real life."), because there still remain side effects of finite frame rates that prevent motion from fully mimicking the completely step-free continuous motion of real life (no judder, no stutter, no wagonwheel artifact, no blur, no strobing, no visible harmonics between frame rate and refresh rate, no phantom array, no mousedropping effect). To do this via a finite refresh rate, we need ultrahigh frame rates synchronized to ultrahigh refresh rates -- four digits -- to completely eliminate all human-detectable side effects of a finite frame rate, achieving low persistence via continuous light output, without strobing/phosphor/light modulation, for the simultaneously step-free, strobe-free and blur-free motion necessary to mimic real life.

Very interesting talk though -- and we need more people like you, visiting this brand new forum which launched barely more than a month ago!

Also, here are photos of the Eiffel Tower Test. You stare at a stationary point on the screen while the Eiffel tower scrolls past. The same problem occurs on any finite-refresh-rate display (CRT, LCD, plasma, whatever).



The same problem occurs for CRT and LCD, strobed and non-strobed, flicker and flicker-free, phosphor and phosphorless.

UPDATE [2017/08/22]: New animations demo this better.

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 06:47
by RealNC
ScepticMatt wrote:While flicker higher than 60Hz can't be detected consciously
Oh yes, it can. Quite easily, even.

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 07:39
by ScepticMatt
I can't see strobing in your example, because the laptop monitor I'm viewing from is too terrible. Well, that is at full brightness, otherwise I can see PWM artifacts …
RealNC wrote:
ScepticMatt wrote:While flicker higher than 60Hz can't be detected consciously
Oh yes, it can. Quite easily, even.
To clarify: you can't detect black frames inserted between equally bright frames above 60Hz. You can detect bright frames between black frames at much higher frame rates. See this link for a more detailed explanation: ... esolution/

Anyway, there are a lot of 'higher order issues' remaining, especially with lower persistence. Here's another one:
Color phi phenomenon

View with one eye, from far enough away to strengthen the effect.
source: ... d-gap.html )

In other words, the eye perceives a color change before the new color is shown. In the Valve example above, the perceived shift of the green grid on the black background could partly be caused by the color phi phenomenon.

Combined with the brain's prediction of the future – similar to the whopping 100 ms of VR prediction mentioned in Carmack's blog – it is clear that low frame rates cause our brain to misinterpret reality in unexpected ways.

Now, we obviously won't render at 1000 Hz for a while, so I'll try to think up some mitigation strategies:

The simple way:

We need:
a 75-100 Hz signal
a 1000Hz full persistence display w/ FPGA/ASIC

Buffer one frame. Using the FPGA, compare the previous frame to the current frame. Generate a velocity buffer – a map that describes the direction and speed of motion of each pixel between the two frames. Use this map to upsample the signal to 1000Hz, using temporal reprojection with velocity weighting and temporal SMAA 1x.
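The reprojection step above can be sketched in a few lines of numpy (a toy gather-based nearest-neighbour warp; occlusion handling, velocity weighting and SMAA are omitted, and all names are mine — an FPGA/ASIC would implement the equivalent in hardware):

```python
import numpy as np

def reproject(frame, velocity, t):
    """Warp `frame` to intermediate time t (0 <= t < 1 of a source-frame
    interval) by sampling each output pixel from where it came from,
    according to the per-pixel velocity buffer.
    frame: (H, W) intensities; velocity: (H, W, 2) px/frame as (dy, dx)."""
    h, w = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(yy - t * velocity[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xx - t * velocity[..., 1]).astype(int), 0, w - 1)
    return frame[src_y, src_x]

def upsample(frame, velocity, factor):
    """Generate `factor` intermediate frames per source frame,
    e.g. factor=16 turns a ~60Hz signal into ~1000Hz."""
    return [reproject(frame, velocity, k / factor) for k in range(factor)]
```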

More info and source code:

The advanced way

We need:
the above
an infrared emitter
a high frame rate, low latency infrared camera

Generate a velocity buffer as before. Track eye movements and use prediction to lower latency. The most demanding values occur during saccades, with a typical acceleration of 10,000°/s² and angular velocity of up to 500°/s.
Use this info for prediction to reduce latency. Use the eye tracking information to create a velocity-difference buffer (the difference between eye movement and the velocity buffer) and add per-pixel motion blur – using the high display frame rate – as needed.
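A minimal sketch of the gaze-prediction step, using the saccade kinematics quoted above (constant-acceleration extrapolation capped at the saccadic peak velocity; the function name and latency figure are illustrative, not from any real eye tracker API):

```python
def predict_gaze_deg(pos_deg, vel_deg_s, accel_deg_s2,
                     latency_s, peak_vel_deg_s=500.0):
    """Extrapolate eye position over the pipeline latency, assuming
    constant acceleration with velocity capped at the saccadic peak."""
    v_end = min(vel_deg_s + accel_deg_s2 * latency_s, peak_vel_deg_s)
    v_avg = (vel_deg_s + v_end) / 2.0
    return pos_deg + v_avg * latency_s

# Mid-saccade (400 deg/s, 10,000 deg/s^2), with 10 ms of latency:
print(predict_gaze_deg(0.0, 400.0, 10000.0, 0.010))  # 4.5 deg ahead
```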

Sources: ... P_15-3.pdf

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 12:27
by Chief Blur Buster
Hey -- good reply -- but let's make sure you correctly do the test I described:
ScepticMatt wrote:I can't see strobing in your example
Which example? The Eiffel tower example?
ScepticMatt wrote:because the laptop monitor I'm viewing from is too terrible. Well, that is at full brightness, otherwise I can see PWM artifacts …
If you're talking about the Eiffel Tower:
PWM artifacts are only visible when your eyes are moving.
You don't see PWM artifacts when your eyes are stationary (if above flicker detection threshold).
You're supposed to keep your eyes stationary in the exercise that I describe (regarding the Eiffel Tower)
Keep your finger stationary, at the very center at the top edge of the moving Eiffel Tower animation.
And I'm not talking about "strobing" flicker, but a stationary-eye phantom array effect (stationary eye, moving object).
This one may look like a PWM-like artifact, but for this specific test it isn't a PWM artifact (the phantom array is still there at 100% brightness during this specific test of stationary-eye moving-object).

NOTE: Sometimes there is confusion in terminology. The movie industry sometimes swaps the term "strobing" for "judder", because low frame rates of very sharp moving objects often create a flickering-edge effect during motion (30fps @ 60Hz on a CRT does the same), which is then also called the "strobing" effect.
ScepticMatt wrote:To clarify: You can't detect black frames inserted between equally bright frames higher than 60Hz. You can detect bright frames between black frames at much higher frame rates. See this link for a more detailed explanation: ... esolution/
This is correct under normal circumstances, but the number "60" is actually elastic.
Here's a test where I was able to bump detection of a single dark frame, up to above "120":

An important consideration is the size of the black frame relative to bright frame; when looking at the test variables:
- Are you displaying the bright frame for the same duty cycle as the black frame? (film, LCD, sample-and-hold)
- Or are you displaying the bright frames at low persistence (brief flash per frame, CRT, strobe backlight) -- meaning much longer OFF duty cycle for a missed frame (almost 1/60sec long for a single dropped 120Hz low-persistence frame)
- Is identification necessary (e.g. identifying object in a dark frame). Or just to detect the dark flicker?
- Is it a static bright image versus static dark image?
- Is it moving images? (much harder to detect inserted dark frames)
- Are you counting frames, or just monitoring the dip in average light output (e.g. a dip of brightness down to 119/120ths -- easily detected -- during 120Hz, for a single dark frame inserted in a series of bright static frames)?

In extreme tests (a custom TestUFO animation I programmed) I am able to see the flicker caused by a single blank black frame randomly inserted among otherwise bright frames at 120Hz. High speed camera footage (Casio EX-ZR200) confirms it: 119 of 120 frames per second bright, 1 dark. I can give you the URL to this hidden test, if you have access to a 120Hz monitor.
-- However, it's true that I wouldn't be able to identify dim objects in that dark frame, as the surrounding frames would be too dominant.
-- I can definitely confirm it is MUCH easier to see 1 bright flash than 1 dim flicker.
-- However, I can definitely see a single 1/120sec dark frame (detected as a brief dip of brightness -- averaged over 1 sec, brightness drops to 119/120ths, i.e. a ~1% momentary dip) inserted as a single black frame at 120Hz, at least when staring at a static image that suddenly gets blacked out for 1/120sec.
-- I repeated the test at high-persistence 144Hz (non-strobed, same duty cycle for bright and dark frames), and I was still able to detect a single 1/144sec black frame. I think this is because of the momentary dip in average brightness (averaged over a 1/10sec period, a drop to 14.3/14.4ths; averaged over a 1 second period, a drop to 143/144ths of average image brightness) that a single black frame creates. However, I can say it's much easier to see a single bright frame than a single dark frame -- probably by several orders of magnitude.
-- It's quite possible 60 is the number I would hit when executing the science experiment that came up with that number. No question about it. But changing the variables moves the number "60" upwards or downwards -- variables as simple as repeating the test on a non-strobed LCD versus on a CRT (or a LightBoost/strobed display; so new that science papers haven't yet been written about modern high-efficiency strobe backlights).
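For what it's worth, the average-brightness arithmetic above is easy to check (sample-and-hold assumption with equal duty cycle per frame; the function name is mine):

```python
def brightness_dip_fraction(refresh_hz, dark_frames=1, window_s=1.0):
    """Fractional drop in average light output when `dark_frames` black
    frames are inserted into `window_s` seconds of otherwise bright
    frames (sample-and-hold, equal duty cycle per frame)."""
    return dark_frames / (refresh_hz * window_s)

print(brightness_dip_fraction(120))          # ~0.0083 -> ~1% dip over 1 s
print(brightness_dip_fraction(120, 1, 0.1))  # ~0.083 -> ~8% dip over 100 ms
print(brightness_dip_fraction(144))          # ~0.0069 at 144Hz
```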

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 12:34
by Chief Blur Buster
RealNC wrote:
ScepticMatt wrote:While flicker higher than 60Hz can't be detected consciously
Oh yes, it can. Quite easily, even.
ScepticMatt is correct in his intended point in that statement -- I read it to mean "staring directly at flicker, when the flicker is higher than a threshold (of about 60Hz, depending on variables), it can't be directly detected consciously" (probably his intended wording, mind you).

There's a difference between detecting flicker directly while eyes are stationary, and detecting flicker indirectly (via stroboscopic / phantom array effect) while eyes are moving relative to the object (or vice-versa).

Variables include:
- Stationary eye (direct detection) versus moving eye (stroboscopic detection)
- Direct viewing versus peripheral viewing. (peripheral is more sensitive)
- Duty cycle of flicker. (Amount of black period versus bright period)
- Softness of flicker. (Sine wave versus squarewave. Full return to black as with LED, partial return to black as with phosphor)
- Ambient lighting. (Flicker is harder to detect in a dark room. 48Hz looked fine in movie theaters for years)
- Human dependent factor. (Some humans are more sensitive than others)

Personally, I can't detect 40Hz flicker while my eyes are stationary in total darkness from a long-duty-cycle flickering LED (e.g. driven via an Arduino). Conversely, I can see >1000Hz flicker via the stroboscopic effect (e.g. rolling my eyes around in front of a flickering LED driven via an Arduino).

Now if you make stationary eye a restriction (limit only to direct detection of flicker) -- then my threshold ranges from ~30Hz (direct staring in total darkness, long duty cycle flicker) through 100Hz (peripheral vision, short-duty-cycle flicker).

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 13:45
by RealNC
I'm basing this on my CRT monitor. With 60Hz, the image looked "unstable" and not "clean." With 80Hz and above, the image started to look clean and stable, like looking at a painting.

Not sure if I used the right word. Not sure how to describe it. I can ABX it quite easily. No need to start a game even, can just tell by looking at the desktop where nothing is moving; just a stationary image.

This isn't monitor-specific, btw. I went through about 5-6 monitors throughout the CRT days, all had the same effect on me. I still remember that this was driving me crazy in the MS-DOS days and that I had to load a TSR utility that would set 85Hz on all VESA modes.

If what was meant is actually being able to count individual pulses, then of course no, not able to do that.

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 13:52
by Chief Blur Buster
RealNC wrote:I went through about 5-6 monitors throughout the CRT days, all had the same effect on me.
Most CRTs had similarly low persistence (<1ms), so there was an insignificant difference in motion blur or flicker between them.

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 14:46
by Chief Blur Buster
To recap the stroboscopic effect I've been illustrating: with the eye staring at a stationary point while objects move, you get a stroboscopic effect (the phantom array effect, which looks like a PWM-like artifact even at 100% brightness, even on the worst laptop LCDs). Here are photographs of the effect (a stationary camera as a stand-in for a stationary eye stare) on the TestUFO Moving Photo Test with the Eiffel Tower, testing the stationary-eye situation. I turned off LightBoost/ULMB when taking these photographs; this is non-strobe-backlight, PWM-free.



Again, this is strobe-free and PWM-free, monitor emitting steady light (verified).

In real life, moving objects blur naturally (in your human brain) as they move past your stationary eye. But that doesn't happen on finite-refresh-rate displays (of double-digit or triple-digit Hz) if no GPU motion blur effect is added to the frames. This even happens on the worst 33ms LCDs I have ever seen, so LCD limitations will never mask the phantom array effect in stationary-eye moving-object situations like this (the mousedropping effect is related). This form of stroboscopic effect is always detectable, no matter how slow the LCD GtG is (even if it's 33ms or 50ms).

This is a side effect of finite-refresh-rate displays that will remain with us for a very long time, only solvable by adding extra motion blur (a GPU effect, or longer camera exposures per frame, as for movies), or by adding more frames until the stroboscopic effect falls below the human detectability threshold (in the ballpark of >1KHz-10KHz, as in the lighting study diagram).

Movies and television can easily fix this by adjusting camera exposure to add motion blur to make it more natural looking by eliminating this effect. But as both you and I already know, that is not a proper solution for virtual reality use case (or the theoretical Holodeck use case). And often undesirable to many of us users of low-persistence monitors who enjoy zero-motion-blur gaming (LightBoost style).

This still-existing effect isn't nearly as important as simply eliminating motion blur, so simply having low persistence (at flicker rates above ~75Hz) solves a huge number of problems for virtual reality. However, for the use case of attempting to simulate real life, it is extremely challenging engineering-wise (requiring strobe-free, flicker-free, step-free motion abilities, such as via ultrahigh frame rates or some kind of as-yet-uninvented framerateless display that can represent motion as continuous motion rather than as a series of frames, completely free of side effects such as stroboscopic or wagonwheel effects).

(Michael Abrash of Valve Software described this too as well in the "Down The Rabbit Hole" article)

Re: So what refresh rate do I need? [Analysis] [very good on

Posted: 17 Feb 2014, 16:01
by Haste
ScepticMatt wrote:
RealNC wrote:
ScepticMatt wrote:While flicker higher than 60Hz can't be detected consciously
Oh yes, it can. Quite easily, even.
To clarify: You can't detect black frames inserted between equally bright frames higher than 60Hz. You can detect bright frames between black frames at much higher frame rates. See this link for a more detailed explanation: ... esolution/
I can detect consciously the flicker at 60Hz on my CRT monitor.

I'm looking at it in such a way that:
-I am not moving my eyes
-I am using the central part of my vision (not peripheral vision)
-The room in which I'm watching my screen is dark (10pm, lights off)

Not only can I detect it consciously, but I can't even look at it for more than a few seconds.
It is pure torture. Extremely straining on the eyes.

Is it due to the fact that the CRT pulses the image for only a brief moment? (less than 1/120th of a second in the 60Hz mode)

In that case, doesn't it make this theoretical 60Hz flicker fusion threshold irrelevant to the subject of low-persistence displays?

Thank you.
Great contribution btw. I'll make sure to read the links you provided.