Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Highest perceivable framerate? [good discussion]

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers. The masters on Blur Busters.

Highest perceivable framerate? [good discussion]

Postby Alcazar » 01 Jun 2014, 10:38

Since GSYNC will be offering us the ability to see higher framerates now, I'd like to offer up a bit of my own real-world testing of >60 Hz framerate perception.

This data comes from back in my pro gaming days, when I was using a lightning-quick CRT monitor capable of 120 Hz and *above*. Playing countless hours of Quake against the top echelon of professional competitive gamers on these CRT monitors meant that having less input lag (higher PS/2 and USB port sampling rates) and higher refresh rates (>60 Hz) sometimes decided your fate. So back then, before the LCD, gamers were tweaking systems to play at really high framerates like 90, 120, 140 fps. The game engine matched these framerates to the refresh rate to reduce tearing (though not completely), and with near-zero motion blur the result was silky smooth animation that resembled the look of REAL LIFE. Ahh, those were the days.

Anyhow, one day I sat down and decided to do some real-world tests at different framerates to see if the human brain and eyes (my human brain and eyeballs, to be precise) could perceive the differences between higher framerates, or gain any additional benefit from more frames. I did this by setting the monitor to progressively higher refresh rates, setting the game to match with higher framerates, and stopping to do a series of perception tests (waving hands in front of the monitor, swirling the mouse in concentric circles, etc.).

The results:
    60 FPS = Obvious stutter
    90 FPS = Some stutter
    100 FPS = Very little stutter
    110 FPS = No perceivable stutter
    120 FPS = No change
    140 FPS = No change

So, there you have it. Curious, I wonder if we will find some brains that perceive higher framerates though. I ended up taking this data and concluding that the ideal configuration was to match my display to my input sample rate (125 Hz), which I felt gave me the best marriage of lowest input lag and silky smooth performance.

Cheers
User avatar
Alcazar
 
Posts: 9
Joined: 31 May 2014, 10:10
Location: Silicon Valley, CA

Re: Highest perceivable framerate?

Postby RealNC » 01 Jun 2014, 11:21

Just move the mouse cursor around on the desktop as fast as you can. You can easily still see gaps. The lower the refresh rate, the larger the gaps (those gaps are what we call "stutter.") This means that the faster an animation is, the higher a refresh rate is needed for it not to stutter.

I can't tell a difference between 30FPS and 144FPS with slow animations. That's because there is none. If an animation moves at 30 pixels per second, 30FPS and 144FPS will show exactly the same thing on screen. As the animation gets faster, pixels are skipped if the refresh rate isn't high enough to make the animation move one pixel at a time; the pixels in-between are not shown. In other words, you lose motion resolution (= you get stutter.)
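This can be sketched numerically: the distance an animation moves between consecutive refreshes is speed divided by refresh rate, and once that step exceeds one pixel, in-between pixels are skipped. A minimal Python sketch (the function name is mine, for illustration):

```python
def step_px_per_refresh(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance an object moves between two consecutive refreshes."""
    return speed_px_per_s / refresh_hz

# A 30 px/s animation moves at most 1 px per refresh at either rate,
# so 30 FPS and 144 FPS light up the same sequence of pixels:
print(step_px_per_refresh(30, 30))    # 1.0 px
print(step_px_per_refresh(30, 144))   # ~0.21 px -- no pixels skipped

# A fast 2000 px/s pan skips pixels unless the refresh rate is very high:
print(step_px_per_refresh(2000, 60))    # ~33 px jumps -- heavy skipping
print(step_px_per_refresh(2000, 144))   # ~14 px jumps
print(step_px_per_refresh(2000, 2000))  # 1 px -- full motion resolution
```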

I don't know if there's an upper limit to this when it comes to perception. I suspect that if there is, it would be probably in the thousands of Hz rather than hundreds.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
User avatar
RealNC
 
Posts: 2792
Joined: 24 Dec 2013, 18:32

Re: Highest perceivable framerate?

Postby flood » 01 Jun 2014, 15:12

RealNC wrote:The lower the refresh rate, the larger the gaps (those gaps are what we call "stutter.")

I'd call that temporal aliasing. If the separation between gaps is not uniform, then you have stutter (e.g. a 125 Hz mouse on a 120 Hz monitor)
flood
 
Posts: 897
Joined: 21 Dec 2013, 01:25

Re: Highest perceivable framerate?

Postby spacediver » 01 Jun 2014, 15:55

Alcazar wrote:The results:
    60 FPS = Obvious stutter
    90 FPS = Some stutter
    100 FPS = Very little stutter
    110 FPS = No perceivable stutter
    120 FPS = No change
    140 FPS = No change


There is a potential confound here. The quake engine, as I understand it, can only render at 1000/n fps where n is an integer. So setting com_maxfps to 120 will result in 125 fps or 111.11 fps (you probably noticed this when you used cg_drawfps 1).

What this means is that if you had your refresh at 120hz, but your framerate at 125 fps, then there are some temporal artifacts due to the discrepancy.
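The 1000/n quantization can be sketched like this (an illustrative Python snippet; which of the two neighbouring rates the engine actually picks depends on its internal rounding, and the post says you get either 125 or 111.11):

```python
import math

def nearest_quake_fps(target_fps: float):
    """Reachable framerates bracketing a target, given that frames must
    last a whole number of milliseconds (fps = 1000/n for integer n)."""
    n_short = math.floor(1000 / target_fps)  # shorter frame -> higher fps
    n_long = math.ceil(1000 / target_fps)    # longer frame  -> lower fps
    return 1000 / n_short, 1000 / n_long

# com_maxfps 120 is unreachable: the engine lands on 125 or 111.11 fps.
print(nearest_quake_fps(120))  # (125.0, 111.11...)
print(nearest_quake_fps(125))  # (125.0, 125.0) -- exactly reachable (8 ms)
```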

btw, what name did you use - do you still play quakelive? If so, add me - I use the name Julios (from [xeno]).
spacediver
 
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Highest perceivable framerate?

Postby RealNC » 01 Jun 2014, 16:10

flood wrote:I'd call that temporal aliasing. If the separation between gaps is not uniform, then you have stutter (e.g. a 125 Hz mouse on a 120 Hz monitor)

You can have uniform separation at 5FPS. It's still stutter. The only case where you won't get stutter is very, very slow animation. These gaps are easily perceivable even at high refresh rates. Motion blur effects (in video as well as games) are used to hide those gaps at the expense of motion clarity.

Or maybe we shouldn't call this "stutter", actually. What's the term for "not smooth?" :P
User avatar
RealNC
 
Posts: 2792
Joined: 24 Dec 2013, 18:32

Re: Highest perceivable framerate?

Postby Chief Blur Buster » 01 Jun 2014, 19:09

Alcazar wrote:So, there you have it. Curious, I wonder if we will find some brains that perceive higher framerates though.

That's silly. Science papers have long disproven the notion of the human brain functioning on frame rates. The human brain doesn't work that way. "Brains that perceive higher framerates" is a non-sequitur, since there are many other variables that shift these numbers around. There are multiple thresholds (e.g. flicker threshold, motion blur detection, stroboscopic detection) that function independently (see bottom of this post), but let's cover some bases first.

When we eliminate flicker (no CRT flicker, no strobe backlights, no impulse driving), then "Houston, we have a problem": persistence. Even 0ms instant GtG transitions can create motion blur (Why Do Some OLEDs Have Motion Blur?). On sample-and-hold displays that creates lots of motion blur, and if an engineer says you're not allowed to insert any blackness between refresh cycles, then you're totally hamstrung by persistence, forcing you to raise framerates in order to reduce motion blur, because tracking-based motion blur (http://www.testufo.com/eyetracking) is always mathematically dictated by the frame visibility length.

Full-Persistence (Sample-And-Hold) Displays Derive Benefits From Higher Hz More Than Impulse Driven Displays
a.k.a. You get more motion blur reduction from raising Hz on a traditional non-strobed LCD than from raising Hz on a CRT
(This only happens on flickerfree displays, not applicable to CRT)
60fps@60Hz -- 16.7ms persistence
120fps@120Hz -- 8.3ms persistence -- 50% less motion blur
240fps@240Hz -- 4.2ms persistence -- 75% less motion blur
480fps@480Hz -- 2.1ms persistence -- 87.5% less motion blur
960fps@960Hz -- 1ms persistence -- 93.75% less motion blur
There are some 1-bit monochrome DLPs in the laboratory capable of true 1000fps@1000Hz. Even triple-digit refresh rates are still distinguishable from higher ones via certain persistence effects and stroboscopic effects.
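The table's arithmetic is simply persistence = 1/Hz on a sample-and-hold display, with eye-tracking blur proportional to persistence. A quick sketch (illustrative Python; function names are mine):

```python
def persistence_ms(hz: float) -> float:
    """Sample-and-hold persistence: each frame stays lit for 1/Hz."""
    return 1000.0 / hz

def blur_px(speed_px_per_s: float, hz: float) -> float:
    """Eye-tracking motion blur in pixels = speed * persistence."""
    return speed_px_per_s / hz

for hz in (60, 120, 240, 480, 960):
    reduction = 100 * (1 - 60 / hz)  # blur reduction relative to 60 Hz
    print(f"{hz}fps@{hz}Hz: {persistence_ms(hz):.1f} ms persistence, "
          f"{reduction:.2f}% less blur than 60 Hz, "
          f"{blur_px(960, hz):.1f} px blur at 960 px/s")
```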

There are definitely points of diminishing returns, but the final frontier certainly isn't 120fps@120Hz. Real life doesn't flicker, so someday we will have to permanently discontinue flicker and achieve low persistence in a flicker-free manner. And the only way to do low persistence without any impulse driving is raising the Hz.

Vision research, and tests on actual true-500Hz displays
a.k.a. Yes, science labs have true-500Hz/1000Hz displays, and yes, we can tell a difference.
Vision researchers have shown that, given the correct motion test and correct display, most humans easily tell apart 240fps@240Hz versus 480fps@480Hz, if you're not allowed to lower the persistence of the display via impulse-driving techniques. Places like Viewpixx.com sell true-500Hz scientific testing displays for vision researchers. The motion blur science is actually totally different from the old notion of a "human framerate limit". What is true is that points of diminishing returns occur, and impulse driving allows lower Hz (e.g. 60Hz or 120Hz) to be completely blur-free, but at the disadvantage of introducing flicker.

Don't Forget Stroboscopic Effects Play a Role
a.k.a. The Old Fashioned Wagon Wheel Effect or the Mouse Dropping Effect
If you dramatically go from a 100Hz CRT to a 200Hz CRT, the benefits start to be noticed again in reduced strobe effects (e.g. half-size steps between strobes, like waving a mouse cursor in circles on a dark background). Retest a CRT at 100Hz, then retest at 200Hz: the big step now becomes noticeable again in the improved stroboscopic effect. Usually, 10Hz differences aren't stroboscopically noticeable, but doubled strobe rates often are. Have you heard of the wagon wheel effect -- for example, an 8-spoke wheel spinning at 200 spins a second under a 1600Hz strobe backlight? The wheel still looks stationary; the human is still detecting stroboscopic differences. The stroboscopic effect (cousin of the wagonwheel effect) is still indirectly detectable to humans even at 10,000Hz (see page 6 of the lighting study paper). Also, I have done personal tests with an Arduino+LED where I was able to easily tell apart a 1500Hz strobe versus a 5000Hz strobe via the stroboscopic effect. However, it was much harder to tell apart a 1500Hz versus a 2000Hz strobe (the phantom array effect (stroboscopic trailing effect) is only 1.3x sparser instead of 3.3x sparser).

The familiar mouse-dropping demonstration shows that even a 1000fps@1000Hz display may not be the final frontier, either
a.k.a. Moving mouse very fast on a black background, and the "multiple cursors" effect
Likewise, the mouse-dropping effect is the same thing (stare at a stationary point, move the mouse quickly in a circle on a black background). The higher the Hz of your display, the smaller the gaps between the arrows. At 120Hz, the gaps between arrows are one-half as large as at 60Hz. At 240Hz, one-quarter as large. At 480Hz, one-eighth as large. At some point, the mouse looks like a continuous motion blur, but this only occurs if there is a refresh for each pixel position. For example, at 1000 pixels/second mouse movement (a medium-speed half-screen-width-per-second movement), you would need 1000fps@1000Hz to make the mouse arrow movement look like a continuous blur without any stroboscopic effect. And for faster mouse movements, the stroboscopic effect will come back anyway -- motion still won't look "Holodeck-realistic" unless you add an intentional GPU motion-blurring effect to eliminate the stroboscopic effect (and GPU-based motion blurring is often undesirable).

Rule of thumb when testing high framerates
a.k.a. You must bypass diminishing points of returns via larger steps upwards
You need to jump up in dramatic steps, e.g. don't step 110fps->120fps, but step 100fps@100Hz -> 200fps@200Hz, or larger steps upwards. Once you hit the point of diminishing returns, it requires much more dramatic steps upwards to start noticing differences again. However, what's already apparent is that differences continue to be noticed at far higher numbers (200, 500, 1000), so the point of diminishing returns does not stop at 120 even for CRT. That said, it's much easier to tell differences apart on non-flickering (sample-and-hold) displays, because the frame cycle is the persistence itself, and the only way to reduce persistence without strobing is a higher framerate. Reducing persistence is more easily done by adding black time between refreshes, but as we all know, not everyone likes flicker. So currently, due to technological limitations, we cannot have our cake and eat it too. (aka "Do you want blur, or do you want flicker?")

Most motion blur on modern LCDs is caused by persistence, and NOT by GtG
Or strobe backlights wouldn't work at all!
Many LCD monitor users think motion blur is caused by slow pixel response, when in fact that's not true for modern monitors: most motion blur on a modern LCD is caused by persistence, NOT by pixel response. Even 1ms-GtG LCD monitors still have 16.7ms of persistence at 60Hz; the only way to reduce persistence is to shorten the visibility of the refreshes themselves, either via a higher refresh rate or via more black gaps between refreshes (ala impulse driving). The silly pursuit of ever-faster GtG response neglects the fact that more motion blur is caused by persistence (pixel static state) than by GtG (pixel transition between colors), because of the eye-tracking blur effect explained at http://www.testufo.com/eyetracking

So oldtimers who like CRT clarity and hate LCD motion blur are often surprised that even a 1ms 120Hz LCD still isn't as clear as a CRT. Why does a 1ms GtG LCD have more motion blur than a 1ms persistence CRT? Because GtG (pixel transition) is not the same thing as persistence (pixel visibility state). Most 1-2ms GtG LCDs have a full frame cycle of persistence (e.g. 16.7ms or more at 60Hz). And people often can't fathom persistence until they walk into a scientific lab and get shown an LCD doing motion tests at 60fps@60Hz, 120fps@120Hz and 240fps@240Hz, and then they suddenly realize that motion blur is directly proportional to frame visibility time on a flicker-free display (eureka!), a continuum of motion blur reduction that continues well beyond 1000fps, despite the diminishing returns. Now it finally makes sense that the only way to get the same amount of motion blur as LightBoost (1.4ms strobe = ~1/700sec) without flicker is to use 700fps@700Hz (1.4ms frametimes).

Mouse Hz can be a limiting factor in seeing smoothness improvements on your monitor
a.k.a. 125Hz mice can prevent seeing microstutter improvements beyond ~125fps. You need a 500/1000Hz mouse to see fewer microstutters beyond 125fps
Also, in the old days, PS/2 mice and 125Hz USB mice limited the ability to see further motion fluidity improvements on CRTs. You need an ultra-high-Hz mouse (e.g. 500Hz-1000Hz) in order to see further motion fluidity improvements at ultrahigh frame rates / ultrahigh refresh rates. So this is another variable that hits a barrier, e.g. when using left/right mouse movement as a fluidity test. Mouse microstutter vibration amplitude equals motion speed divided by mouse Hz. Thus an old 125Hz USB mouse can microstutter a 2000 pixels/sec pan by 16 pixels (1/125th of 2000 is equal to 16). Switching to a 1000Hz mouse produces a huge fluidity upgrade, especially on impulsed displays (CRTs and LightBoost) and especially if you like to avoid software-based mouse smoothing, since 1/1000th of 2000 pixels/second (a pan of one screen width per second during medium-fast turns/strafing) creates only 2 pixels of mouse microstutter amplitude.
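The amplitude formula stated above (1/mouseHz of the pan speed) can be sketched directly (illustrative Python; the function name is mine):

```python
def microstutter_px(pan_speed_px_per_s: float, mouse_hz: float) -> float:
    """Worst-case positional error from stale mouse reports: the view can
    lag by up to one report interval, i.e. speed / report rate pixels."""
    return pan_speed_px_per_s / mouse_hz

print(microstutter_px(2000, 125))   # 16.0 px wobble with a 125 Hz mouse
print(microstutter_px(2000, 1000))  # 2.0 px wobble with a 1000 Hz mouse
```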

Flicker threshold is NOT the same thing as stroboscopic detection
a.k.a. You can still see a wagonwheel effect in a wagon wheel spinning at 400Hz under a 1600Hz xenon strobe
The flicker threshold is the common sweet spot. Lots of humans have it around 75Hz or 85Hz, and some up to 100-120Hz. So once you've reached this, you've mostly arrived at motion nirvana. However, it won't yet pass the theoretical Holodeck Turing Test ("Wow, I didn't know I was standing in a Holodeck!") if you're trying to emulate real life. Real life has no flicker. Real life doesn't force extra motion blur above and beyond your vision limitations. Real life has no static frames. There are always side effects visible. Someday we will need perfectly flicker-free low persistence, and the only way to pull that off is via ultrahigh framerates (e.g. 1000fps@1000Hz). For now, we're stuck with low persistence via impulsing (e.g. CRT/LightBoost), due to technological limitations preventing us from reaching ultrahigh refresh rates outside the laboratory (e.g. the ViewPixx scientific display).

The Side Effect of Using Finite Framerates To Represent Motion

Stop-Motion Effect Threshold: Motion starts becoming continuous-looking at approximately 24fps. This threshold varies (human-dependent, etc.)
Flicker Threshold: Flicker stops being visibly noticed at ~85Hz. Flicker thresholds vary (human to human, brightness, duty cycle, ambient light, etc.)
Persistence/Blur Threshold: Stops visibly improving at ~1/1000th to ~1/10,000th second persistence (give or take an order of magnitude) depending on resolution/motion speeds/other variables. CRTs have constant persistence at all refresh rates, while flicker-free LCD persistence varies with refresh rate (the sample-and-hold effect).
Stroboscopic Effect Threshold: No real threshold for the wagonwheel effect (a 1-megahertz strobe on a 1-megaspins/sec wheel will still look stationary), though real-life stroboscopic effects tend to practically disappear beyond ~10KHz (lighting study paper, see page 6)

All the above thresholds are different. Your tests only exercised stop-motion threshold and flicker threshold, but not the persistence/blur threshold nor the stroboscopic effect threshold. Also, it didn't matter for CRT because CRT persistence is independent of refresh rate. Most gamers don't notice the subtle side effects of persistence/blur.

And while we're on this topic, here are some of my favourite examples for educating people that vision science is not as black and white as "What's the highest perceivable framerate?":
Wagon Wheel Effect
- http://en.wikipedia.org/wiki/Wagon-wheel_effect
- http://www.michaelbach.de/ot/mot_wagonWheel/index.html
Example: An 8-spoke wagon wheel spinning clockwise 50 times per second under a 400 Hz stroboscope will appear to be stationary. However, if the stroboscope runs at 401 Hz, the wheel will spin slowly counter-clockwise. If the stroboscope runs at 399 Hz, the wheel will spin slowly clockwise.
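The example's arithmetic can be simulated: an N-spoke wheel looks stationary whenever it advances an exact multiple of 1/N revolution per flash, and the perceived spin rate is the leftover rotation per flash times the flash rate. A small sketch (illustrative Python; function name is mine):

```python
def perceived_rev_per_s(spin_rev_per_s: float, strobe_hz: float,
                        spokes: int) -> float:
    """Apparent rotation rate of a spoked wheel under a strobe light."""
    per_flash = spin_rev_per_s / strobe_hz  # revolutions per flash
    spoke = 1.0 / spokes                    # spoke symmetry period
    # Wrap the per-flash rotation to the nearest spoke alignment,
    # into the range [-spoke/2, +spoke/2):
    leftover = (per_flash + spoke / 2) % spoke - spoke / 2
    return leftover * strobe_hz

# 8-spoke wheel spinning clockwise at 50 rev/s:
print(perceived_rev_per_s(50, 400, 8))  # 0.0 -> appears stationary
print(perceived_rev_per_s(50, 401, 8))  # ~ -0.125 -> slow counter-clockwise
print(perceived_rev_per_s(50, 399, 8))  # ~ +0.125 -> slow clockwise
```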

Phantom Array Effect / Stroboscopic Effect
- http://opensiuc.lib.siu.edu/cgi/viewcon ... ontext=tpr (500 Hz detected)
- http://www.lrc.rpi.edu/programs/solidst ... licker.asp (300 Hz detected)
- http://www.lrc.rpi.edu/programs/solidst ... licker.pdf (10,000 Hz detected)
- http://cormusa.org/uploads/2012_2.10_Bu ... ffects.pdf
- http://people.ds.cam.ac.uk/ssb22/lighting.html
Synopsis: Humans can indirectly detect a 500 Hz flicker via the “phantom array” effect: A fast moving flickering light source in a dark room, appears as a dotted trail instead of a continuous blur. This can also occur when rapidly moving/rolling eyes in front of flickering lights in a darkened room (e.g. old LED alarm clocks, neon lights, unrectified LED decoration light strings). Flicker all the way up to 10,000 Hz was indirectly detectable in some studies, in certain situations.

Rainbow Artifacts (DLP projectors)
- http://www.projectorcentral.com/lcd_dlp ... -Artifacts
- http://www.ausmedia.com.au/DLP_Sensitive.htm
- http://en.wikipedia.org/wiki/Digital_Light_Processing
Synopsis: Related to the phantom array effect, rainbow artifacts are observed by some people watching images from a single-chip DLP projector, even with 4X or 6X color wheel (240Hz and 360Hz). Fast eye movement causes white objects on black background to have a rainbow-colored blur (red-green-blue trails) instead of a continuous white blur.


TL;DR: Human vision does not function on frame rates. Many indirect visual side effects (persistence motion blur, stroboscopic effects, etc.) still occur from a series of static frames trying to represent real-life continuous motion. That said, the highest scientifically confirmed distinguishable refresh rate (can I tell "X" Hz from "Y" Hz?) can go into the thousands of Hz via specially crafted tests that exercise stroboscopic effects / persistence effects.

In other words, ultrahigh framerates (1000fps@1000Hz) or framerateless (infinite-framerate) technology will be needed towards the end of this century for mimicking real life, ala Holodeck, to minimize all possible side effects of discrete refresh rates including stroboscopic effects.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors
User avatar
Chief Blur Buster
Site Admin
 
Posts: 6404
Joined: 05 Dec 2013, 15:44

Re: Highest perceivable framerate?

Postby flood » 01 Jun 2014, 19:52

well the "stutter" that vsync and gsync aim to eliminate is the one that I meant. I guess a term we agree on for this is microstutter.

The stutter you're talking about is due to the fact that each frame only samples from a single instant; this is analogous to the pixelation in a 3d game running without antialiasing.


my take on the framerate at which video becomes indistinguishable from infinite framerate video

1. eye stationary, no motion blur: several 1000. basically this depends on the speed of motion since there will be the gaps as discussed by realnc
2. eye stationary, motion blur: in the hundreds; I'd guess around 300 maybe. this is basically determined as the inverse of the width of the retina's step response.
3. eye tracking a single object on a plain background, no motion blur: 100hz refresh rate with submillisecond strobes should be enough for linear motion. for more complicated motion, well eye tracking isn't perfect and we're kind of back to case 1.
4. eye tracking a single object on a plain background, with motion blur: the object will be blurred due to the motion blur, and the amount of blur depends on the speed of the object. probably needs several 1000 hz.

from what I can tell, the only way for perfect (indistinguishable from infinite framerate) video without refresh rates in the 1000s is using eye-movement tracking and adding motion blur to whatever moves on the retina.
flood
 
Posts: 897
Joined: 21 Dec 2013, 01:25

Re: Highest perceivable framerate?

Postby Chief Blur Buster » 01 Jun 2014, 19:58

flood wrote:well the "stutter" that vsync and gsync aim to eliminate is the one that I meant. I guess a term we agree on for this is microstutter.

Good point. That would focus on:

Mouse Hz can be a limiting factor in seeing smoothness improvements on your monitor
a.k.a. 125Hz mice can prevent seeing microstutter improvements beyond ~125fps. You need a 500/1000Hz mouse to see fewer microstutters beyond 125fps
Also, in the old days, PS/2 mice and 125Hz USB mice limited the ability to see further motion fluidity improvements on CRTs. You need an ultra-high-Hz mouse (e.g. 500Hz-1000Hz) in order to see further motion fluidity improvements at ultrahigh frame rates / ultrahigh refresh rates. So this is another variable that hits a barrier, e.g. when using left/right mouse movement as a fluidity test. Mouse microstutter vibration amplitude equals motion speed divided by mouse Hz. Thus an old 125Hz USB mouse can microstutter a 2000 pixels/sec pan by 16 pixels (1/125th of 2000 is equal to 16). Switching to a 1000Hz mouse produces a huge fluidity upgrade, especially on impulsed displays (CRTs and LightBoost) and especially if you like to avoid software-based mouse smoothing, since 1/1000th of 2000 pixels/second (a pan of one screen width per second during medium-fast turns/strafing) creates only 2 pixels of mouse microstutter amplitude.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

User avatar
Chief Blur Buster
Site Admin
 
Posts: 6404
Joined: 05 Dec 2013, 15:44

Re: Highest perceivable framerate?

Postby flood » 01 Jun 2014, 20:05

flood wrote:the only way for perfect (indistinguishable from infinite framerate) video

actually another way would be to have hardware motion interpolation built into the display panel. sounds ridiculous, but hey our brains can do it :D
flood
 
Posts: 897
Joined: 21 Dec 2013, 01:25

Re: Highest perceivable framerate?

Postby Chief Blur Buster » 01 Jun 2014, 20:18

flood wrote:1. eye stationary, no motion blur: several 1000. basically this depends on the speed of motion since there will be the gaps as discussed by realnc
Yep, a good rough guesstimate. See the stroboscopic effect, the mouse-dropping effect, and the lighting study paper. Note: This is not a hard limit. It can go into the millions (e.g. extreme wagonwheel effect: a 1-million-Hz strobe on a wheel spinning at 1 million cycles per second will still look stationary), but real-life situations creating stroboscopic effects become rarer and rarer as you go into the thousands of Hz, so the practical limit seems to be in "the thousands of Hz", where strobe effects stop being noticeable.

flood wrote:2. eye stationary, motion blur: in the hundreds; I'd guess around 300 maybe. this is basically determined as the inverse of the width of the retina's step response.
This is beyond scope here, but keep in mind there are variables like viewing distance, resolution, etc. It's easier to see motion deficiencies on a bigger, higher-resolution display that covers more of your field of vision (imagine an 8192x8192 display covering your whole visual field). So the thresholds depend on these variables too.

flood wrote:3. eye tracking a single object on a plain background, no motion blur: 100hz refresh rate with submillisecond strobes should be enough for linear motion.
Yep, pretty much spot on: The flicker threshold, which may vary from under 60Hz to a bit above 100Hz, depending on human/environment/brightness/flicker duty cycle. Also, that's a single object on a black background (nothing else, no background textures to create stroboscopic side effects).

flood wrote:4. eye tracking a single object on a plain background, with motion blur: the object will be blurred due to the motion blur, and the amount of blur depends on the speed of the object. probably needs several 1000 hz.
Pretty close, but I'll go out on a limb and say: closer to the magnitude of 20,000Hz (guesstimate). Two Blur Busters users and myself did some tests where 0.16ms and 0.5ms strobes were distinguishable (via Service Menu tweaking of a BENQ Z-Series monitor) -- so blur from 1/6250sec persistence versus 1/2000sec persistence was just barely noticeable. Very, very barely (non-placebo). It is extremely hard to see and requires fast motion speeds (e.g. 3000 pixels/second). At 3000 pixels/second, 0.5ms persistence creates 1.5 pixels of motion blur in motion tests, while going down to 0.16ms reduces that to about 0.5 pixel. The fastest motion I can accurately eye-track at this monitor's dpi and regular viewing distance is 3000 pixels/second on moving objects in TestUFO. But 1080p is low-DPI, and the width of a 1080p display doesn't give me enough time to accurately eye-track and observe motion blur, since at 3000 pixels/second the motion is onscreen for less than a second. We'd need a wider display, or a dome display (a retina display filling the full visual field), or even a Holodeck, so I could turn 360 degrees while tracking motion. Head-turning may also allow tracking faster motion. Extrapolating this to a full visual field and higher DPI (e.g. a theoretical retina display covering 180-degree FOV), I'd say 0.05-0.1ms (1/20,000sec-1/10,000sec), which coincidentally is very consistent with the lighting studies, where detection of stroboscopic effects (wagonwheel-type artifacts) pretty much disappears in real-world use at ~10KHz-20KHz.
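The persistence-blur arithmetic in that test is just tracking speed times persistence; a quick check (illustrative Python, function name is mine):

```python
def blur_from_persistence_px(speed_px_per_s: float,
                             persistence_ms: float) -> float:
    """Eye-tracking blur in pixels = tracking speed * frame visibility time."""
    return speed_px_per_s * persistence_ms / 1000.0

# Tracking 3000 px/s motion:
print(blur_from_persistence_px(3000, 0.5))   # 1.5 px  (0.5 ms strobe)
print(blur_from_persistence_px(3000, 0.16))  # ~0.48 px (0.16 ms strobe)
```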
Regardless, whatever disagreement exists, experienced scientific researchers are unanimous at least on the minimum order of magnitude: it's in the "thousands" of Hz, minimum, before the visible effects of going higher Hz completely disappear.

flood wrote:from what I can tell, the only way for perfect (indistinguishable from infinite framerate) video without refresh rates in the 1000s is using eye-movement tracking and adding motion blur to whatever moves on the retina.
Very interesting concept. Sounds more-or-less right. Eye-movement tracking, alas, isn't going to be practical, since you need sub-millisecond latency for that. Suddenly flitting eyes back and forth will otherwise create strange motion artifacts, since even 1ms of added lag in eye tracking will create 1-pixel disjoints during 1000 pixels/sec complex motion. And it would preclude the ability to have multiple viewers on the same display -- unless it was designed as a virtual reality helmet meant to immerse you in a "Holodeck"-like environment. Even so, you would need sub-millisecond latency.

flood wrote:actually another way would be to have hardware motion interpolation built into the display panel. sounds ridiculous, but hey our brains can do it :D
This is another method, but that adds input lag. However, once the input refresh rate is high enough (e.g. 240Hz), then the lag of interpolation can be pretty low (e.g. 1/240sec latency to interpolate 240fps -> 2,400fps). So persistence could be lowered that way without using strobing.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

User avatar
Chief Blur Buster
Site Admin
 
Posts: 6404
Joined: 05 Dec 2013, 15:44
