Alcazar wrote:So, there you have it. Curious, I wonder if we will find some brains that perceive higher framerates though.
That's silly. Science papers have already disproven the notion that the human brain functions on frame rates. "Brains that perceive higher framerates" is a non-sequitur, since there are many other variables that shift these numbers around. There are multiple thresholds (e.g. flicker threshold, motion blur detection, stroboscopic detection) that function independently (see bottom of this post), but first, let's cover some bases.
When we eliminate flicker (no CRT flicker, no strobe backlights, no impulse driving), then "Houston, We Have a Problem": persistence. Even 0ms instant GtG transitions can create motion blur (Why Do Some OLEDs Have Motion Blur?). Sample-and-hold displays create lots of motion blur, and if an engineer says you're not allowed to insert any blackness between refresh cycles, then you're totally hamstrung by persistence, forced to raise framerates in order to reduce motion blur, because tracking-based motion blur (http://www.testufo.com/eyetracking) is always mathematically dictated by the frame visibility length.
Full-Persistence (Sample-And-Hold) Displays Derive Benefits From Higher Hz More Than Impulse Driven Displays
a.k.a. You get more motion blur reduction by raising Hz on a traditional non-strobed LCD than by raising Hz on a CRT
(This only happens on flickerfree displays, not applicable to CRT)
60fps@60Hz -- 16.7ms persistence
120fps@120Hz -- 8.3ms persistence -- 50% less motion blur
240fps@240Hz -- 4.2ms persistence -- 75% less motion blur
480fps@480Hz -- 2ms persistence -- 87.5% less motion blur
960fps@960Hz -- 1ms persistence -- 93.75% less motion blur
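The persistence arithmetic behind the table above can be sketched in a few lines of Python (my own illustration, not from any paper; the function names are made up for the example): on a flicker-free sample-and-hold display, persistence is simply the frame visibility time, 1000/Hz milliseconds, and eye-tracking motion blur shrinks in direct proportion.

```python
# Hypothetical sketch: sample-and-hold persistence math from the table above.

def persistence_ms(hz: float) -> float:
    """Frame visibility time on a flicker-free sample-and-hold display (ms)."""
    return 1000.0 / hz

def blur_reduction_vs_60hz(hz: float) -> float:
    """Fractional motion-blur reduction relative to a 60Hz baseline."""
    return 1.0 - persistence_ms(hz) / persistence_ms(60)

for hz in (60, 120, 240, 480, 960):
    print(f"{hz:4d}Hz: {persistence_ms(hz):5.2f}ms persistence, "
          f"{blur_reduction_vs_60hz(hz) * 100:.2f}% less blur than 60Hz")
```

Note how each doubling of Hz always halves the remaining sample-and-hold blur, no matter how high you already are.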
There are some 1-bit monochrome DLPs in the laboratory capable of true 1000fps@1000Hz. Triple-digit and even quadruple-digit refresh rates are still distinguishable via certain persistence effects and stroboscopic effects.
There are definitely points of diminishing returns, but the final frontier certainly isn't 120fps@120Hz. Real life doesn't flicker, so someday we have to permanently discontinue flicker and achieve low persistence in a flicker-free manner. And the only way to do low persistence without any impulse driving is raising the Hz.
Vision research, and tests on actual true-500Hz displays
a.k.a. Yes, science labs have true-500Hz/1000Hz displays, and yes, we can tell a difference.
Vision researchers have shown that, given the correct motion test and correct display, most humans easily tell apart 240fps@240Hz versus 480fps@480Hz, if you're not allowed to lower the persistence of the display via impulse-driving techniques. Places like Viewpixx.com sell true-500Hz scientific testing displays for vision researchers. The motion blur science is actually totally different from the old notion of a "human framerate limit". What is true is that points of diminishing returns occur, and impulse driving allows lower Hz (e.g. 60Hz or 120Hz) to be completely blur-free, but at the disadvantage of introducing flicker.
Don't Forget Stroboscopic Effects Play a Role
a.k.a. The Old Fashioned Wagon Wheel Effect or the Mouse Dropping Effect
If you go from a 100Hz CRT to a 200Hz CRT, the benefits start to be noticed again as reduced stroboscopic effects (e.g. half-size steps between strobes, like waving a mouse cursor in circles on a dark background). Usually a 10Hz difference isn't stroboscopically noticeable, but a doubled strobe rate often is. Have you heard of the wagon wheel effect -- for example, an 8-spoke wheel spinning at 200 spins a second under a 1600Hz strobe backlight? The wheel still looks stationary; the human is still detecting stroboscopic differences. The stroboscopic effect (cousin of the wagonwheel effect) is still indirectly detectable to humans even at 10,000Hz (see page 6 of the lighting study paper). Also, I have done personal tests with an Arduino+LED where I was able to easily tell apart a 1500Hz strobe versus a 5000Hz strobe via the stroboscopic effect. However, it was much harder to tell apart a 1500Hz versus a 2000Hz strobe (the phantom array effect (stroboscopic trailing effect) is only 1.3x sparser instead of 3.3x sparser).
The familiar mouse dropping effect demonstrates that even a 1000fps@1000Hz display may not be the final frontier, either
a.k.a. Moving mouse very fast on a black background, and the "multiple cursors" effect
Likewise, the mouse dropping effect is the same thing (hold your gaze stationary, move the mouse quickly in a circle on a black background). The higher the Hz of your display, the smaller the gaps between the arrows. At 120Hz, the gaps between arrows are one-half as large as at 60Hz. At 240Hz, one-quarter as large. At 480Hz, one-eighth as large. At some point, the mouse looks like a continuous motion blur, but this only occurs if there is a refresh for each pixel position. For example, at 1000 pixels/second mouse movement (a medium-speed, half-screen-width-per-second movement), you would need 1000fps@1000Hz to make the mouse arrow movement look like a continuous blur without any stroboscopic effect. Never mind faster mouse movements, where the stroboscopic effect will come back anyway -- motion still won't look "Holodeck-realistic" unless you add an intentional GPU motion-blurring effect to eliminate the stroboscopic effect (and GPU-based motion blurring is often undesirable).
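The gap arithmetic above can be sketched as follows (my own illustration; the 1-pixel-gap criterion for "looks like a continuous blur" is an assumption for the example, not a measured threshold): the spacing between successive cursor images is simply motion speed divided by refresh rate.

```python
# Hypothetical sketch: stroboscopic "mouse dropping" gap spacing.

def gap_px(speed_px_per_sec: float, hz: float) -> float:
    """Distance between successive cursor images while the eye is stationary."""
    return speed_px_per_sec / hz

def hz_for_continuous_blur(speed_px_per_sec: float) -> float:
    """Refresh rate needed for ~1 pixel gaps (assumed continuity criterion)."""
    return speed_px_per_sec  # gap_px(speed, hz) == 1 when hz == speed

speed = 1000.0  # the 1000 pixels/sec mouse movement from the example above
for hz in (60, 120, 240, 480, 1000):
    print(f"{hz:4d}Hz: {gap_px(speed, hz):6.2f} px between cursor images")
```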
Rule of thumb when testing high framerates
a.k.a. You must bypass diminishing points of returns via larger steps upwards
You need to jump up in dramatic steps. E.g. don't step 110fps->120fps; step 100fps@100Hz -> 200fps@200Hz, or larger steps upwards. Once you hit points of diminishing returns, it requires much more dramatic steps upwards to start to notice differences again. However, what's already apparent is that differences continue to be noticed at far higher numbers (200, 500, 1000), so the points of diminishing returns do not stop at 120, even for CRT. That said, it's much easier to tell apart on non-flickering (sample-and-hold) displays, because the frame cycle is the persistence itself, and the only way to reduce persistence without strobing is a higher framerate. Reducing persistence is more easily done by adding black time between refreshes, but as we all know, not everyone likes flicker. So currently, due to technological limitations, we cannot have our cake and eat it too. (aka "Do you want blur, or do you want flicker?")
Most motion blur on modern LCDs is caused by persistence, and NOT by GtG
Or strobe backlights wouldn't work at all!
Many LCD monitor users think the motion blur is caused by slow pixel response, when in fact that's not true for modern monitors (most motion blur on a modern LCD is caused by persistence, NOT by pixel response). Even 1ms-GtG LCD monitors still have 16.7ms of persistence at 60Hz; the only way to reduce persistence is to shorten the visibility of the refreshes themselves, either via a higher refresh rate or via black gaps between refreshes (a la impulse driving). The silly pursuit of ever-faster GtG response often neglects the fact that more motion blur is caused by persistence (pixel static state) than by GtG (pixel transition between colors), because of the eye-tracking blur effect explained at http://www.testufo.com/eyetracking
So oldtimers who like CRT clarity and hate LCD motion blur are often surprised that even a 1ms 120Hz LCD still isn't as clear as a CRT. Why does a 1ms GtG LCD have more motion blur than a 1ms persistence CRT? Because GtG (pixel transition) is not the same thing as persistence (pixel visibility time). Most 1-2ms GtG LCDs have a full frame cycle of persistence (e.g. 16.7ms or more at 60Hz). And people often can't fathom persistence until they walk into a scientific lab and get shown an LCD doing motion tests at 60fps@60Hz, 120fps@120Hz and 240fps@240Hz, and then they suddenly realize that motion blur is directly proportional to frame visibility time on a flicker-free display (eureka!), showing a continuum of motion blur reduction that continues for a long time, even well beyond 1000fps, despite points of diminishing returns. Now it finally makes sense that the only way to get the same amount of motion blur as LightBoost (1.4ms strobe = ~1/700sec) without flicker is to use 700fps@700Hz (1.4ms frametimes).
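That closing arithmetic can be checked directly (a sketch under the post's own assumption that, without strobing, the frame time itself must equal the desired persistence):

```python
# Hypothetical sketch: refresh rate needed to match a given persistence
# without strobing, since the frame time itself must equal the persistence.

def hz_to_match_persistence(target_persistence_ms: float) -> float:
    return 1000.0 / target_persistence_ms

print(hz_to_match_persistence(1.4))  # ~714Hz, i.e. roughly "700fps@700Hz"
```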
Mouse Hz can be a limiting factor in seeing smoothness improvements on your monitor
a.k.a. 125Hz mice can prevent seeing microstutter improvements beyond ~125fps. Need 500/1000Hz to see fewer microstutters at beyond 125fps
Also, in the old days, PS/2 mice and 125Hz USB mice limited the ability to see further motion fluidity improvements on CRTs. You need an ultra-high-Hz mouse (e.g. 500Hz-1000Hz) in order to see further motion fluidity improvements at ultrahigh frame rates / ultrahigh refresh rates. So this is another variable that hits a barrier, e.g. when using left/right mouse movement as a fluidity test. Mouse microstutter vibration amplitude equals (motion speed / mouse Hz). Thus an old 125Hz USB mouse can microstutter a 2000 pixels/sec pan by 16 pixels (1/125th of 2000 equals 16 pixel microstutter vibrations). Switching to a 1000Hz mouse produces a huge fluidity upgrade, especially on impulsed displays (CRT and LightBoost) and especially if you like to avoid software-based mouse smoothing, since 1/1000th of 2000 pixels/second (panning one screen width per second during medium-fast turns/strafes) creates only a 2 pixel amplitude of mouse microstutter.
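The microstutter arithmetic above, sketched in Python (my own illustration of the formula amplitude = pan speed / mouse report rate; the function name is made up):

```python
# Hypothetical sketch: mouse microstutter vibration amplitude in pixels.

def microstutter_px(pan_speed_px_per_sec: float, mouse_hz: float) -> float:
    """Jitter amplitude: pixels of motion per mouse report interval."""
    return pan_speed_px_per_sec / mouse_hz

print(microstutter_px(2000, 125))   # prints 16.0 (old 125Hz USB mouse)
print(microstutter_px(2000, 1000))  # prints 2.0  (1000Hz gaming mouse)
```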
Flicker threshold is NOT the same thing as stroboscopic detection
a.k.a. You can still see a wagonwheel effect in a wagon wheel spinning at 400Hz under a 1600Hz xenon strobe
Commonly, the flicker threshold is the common sweet spot. Lots of humans have it around 75Hz or 85Hz, and some up to 100-120Hz. So once you've reached this, you've mostly arrived at motion nirvana. However, it won't yet pass the theoretical Holodeck Turing Test ("Wow, I didn't know I was standing in a Holodeck!") if you're trying to emulate real life. Real life has no flicker. Real life doesn't force extra motion blur above-and-beyond your vision limitations. Real life has no static frames. There are always side effects visible. Someday we need perfectly flicker-free low persistence, and the only way to pull that off is via ultrahigh framerates (e.g. 1000fps@1000Hz). For now, we're stuck with low persistence via impulsing (e.g. CRT/LightBoost) due to technological limitations preventing us from reaching ultrahigh refresh rates except in the laboratory (e.g. ViewPixx scientific displays, etc).
The Side Effect of Using Finite Framerates To Represent Motion
Stop-Motion Effect Threshold:
Motion starts becoming continuous-looking at approximately 24fps. This threshold varies (human dependent, etc.)
Flicker Threshold:
Flicker stops visibly being noticed at ~85Hz. Flicker thresholds vary (human to human, brightness, duty cycle, ambient light, etc.)
Persistence (Motion Blur) Threshold:
Motion blur stops visibly improving at ~1/1000th to ~1/10,000th second persistence (give or take an order of magnitude) depending on resolution/motion speeds/other variables. CRTs have constant persistence at all refresh rates, while flicker-free LCD persistence varies with refresh rate (the sample-and-hold effect).
Stroboscopic Effect Threshold:
No real threshold for the wagonwheel effect (a 1-megahertz strobe on a wheel spinning at 1 megaspin/sec will still look stationary), though real-life stroboscopic effects tend to practically disappear beyond ~10kHz (lighting study paper, see page 6)
All the above thresholds are different. Your tests only exercised stop-motion threshold and flicker threshold, but not the persistence/blur threshold nor the stroboscopic effect threshold. Also, it didn't matter for CRT because CRT persistence is independent of refresh rate. Most gamers don't notice the subtle side effects of persistence/blur.
And while we're talking on this topic, one of my favourite quotes to educate people that vision science is not as black and white as "What's the highest perceivable framerate?":
TL;DR: Human vision does not function on frame rates. Many indirect visual side effects (persistence motion blur, stroboscopic effect, etc.) still occur from a series of static frames trying to represent real-life continuous motion. That said, the highest scientifically confirmed distinguishable refresh rate (can I tell "X" Hz from "Y" Hz?) can go to thousands of Hz via specially crafted tests that exercise stroboscopic effects / persistence effects.
In other words, ultrahigh framerates (1000fps@1000Hz) or framerateless (infinite-framerate) technology will be needed towards the end of this century for mimicking real life, ala Holodeck, to minimize all possible side effects of discrete refresh rates including stroboscopic effects.
Wagon Wheel Effect
Example: An 8-spoke wagon wheel spinning clockwise 50 times per second under a 400 Hz stroboscope will appear to be stationary. However, if the stroboscope runs at 401 Hz, the wheel will spin slowly counter-clockwise. If the stroboscope runs at 399 Hz, the wheel will spin slowly clockwise.
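The example above can be sanity-checked with a small simulation (my own sketch; `apparent_step_deg` is a made-up helper): fold the wheel's true rotation per strobe flash onto the nearest spoke position, and the leftover angle is the apparent drift per flash.

```python
import math

# Hypothetical sketch: apparent per-flash drift of a strobed spoked wheel.
# Negative values mean apparent counter-clockwise (reverse) rotation.

def apparent_step_deg(rev_per_sec: float, spokes: int, strobe_hz: float) -> float:
    spoke_deg = 360.0 / spokes                 # angle between adjacent spokes
    step = (rev_per_sec / strobe_hz) * 360.0   # true rotation per flash
    folded = math.fmod(step, spoke_deg)        # alias onto the spoke pattern
    if folded > spoke_deg / 2:
        folded -= spoke_deg                    # snap to the nearest spoke
    return folded

print(apparent_step_deg(50, 8, 400))  # prints 0.0 -> wheel looks stationary
print(apparent_step_deg(50, 8, 401))  # ~ -0.11 deg/flash -> slow counter-clockwise
print(apparent_step_deg(50, 8, 399))  # ~ +0.11 deg/flash -> slow clockwise
```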
Phantom Array Effect / Stroboscopic Effect
- http://opensiuc.lib.siu.edu/cgi/viewcon ... ontext=tpr (500 Hz detected)
- http://www.lrc.rpi.edu/programs/solidst ... licker.asp (300 Hz detected)
- http://www.lrc.rpi.edu/programs/solidst ... licker.pdf (10,000 Hz detected)
- http://cormusa.org/uploads/2012_2.10_Bu ... ffects.pdf
Synopsis: Humans can indirectly detect a 500 Hz flicker via the “phantom array” effect: a fast-moving flickering light source in a dark room appears as a dotted trail instead of a continuous blur. This can also occur when rapidly moving/rolling your eyes in front of flickering lights in a darkened room (e.g. old LED alarm clocks, neon lights, unrectified LED decoration light strings). Flicker all the way up to 10,000 Hz was indirectly detectable in some studies, in certain situations.
Rainbow Artifacts (DLP projectors)
- http://www.projectorcentral.com/lcd_dlp ... -Artifacts
Synopsis: Related to the phantom array effect, rainbow artifacts are observed by some people watching images from a single-chip DLP projector, even with 4X or 6X color wheel (240Hz and 360Hz). Fast eye movement causes white objects on black background to have a rainbow-colored blur (red-green-blue trails) instead of a continuous white blur.