Re: After Images On Display and Persistence of Vision [Long!]
Posted: 18 Dec 2013, 21:15
Mark Rejhon wrote:I agree with your assertion that 0.25ms strobes would be well into the flat part of the diminishing-returns curve, at least at about a 1:1 ratio of viewing distance to screen size (24" viewing distance from a 24" display). That is a very accurate guess. At 0.25ms of persistence, you will only have ~1 pixel of motion blur on a 4K display during fast 4000 pixels/sec motion.
But you never know, that might still not be the final frontier once we start talking about 4K virtual reality goggles strapped to your head. Imagine turning around 180 degrees quickly and generating 10,000 pixels/second motion. You'd still get 2.5 pixels of motion blur, which might actually be briefly noticeable as slightly blurred fine detail (assuming ultra-powerful graphics and "Retina"-quality displays covering the whole human field of vision). But at these levels, the stroboscopic effect is a bigger problem.
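The blur numbers above fall out of a one-line relationship: blur width equals strobe persistence multiplied by on-screen motion speed. A minimal sketch (the function name is mine, just for illustration):

```python
# Motion blur on a strobed display: blur width (px) is simply
# strobe persistence (seconds) multiplied by on-screen motion speed (px/s).
def motion_blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    return (persistence_ms / 1000.0) * speed_px_per_sec

# 0.25ms strobes during a fast 4000 px/s pan -> 1 px of blur
print(motion_blur_px(0.25, 4000))   # 1.0
# 180-degree VR head turn at 10,000 px/s -> 2.5 px of blur
print(motion_blur_px(0.25, 10000))  # 2.5
```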
-- Yes, 100Hz is often good enough...
BUT....It should never be the final frontier.
For example:
1. Go to http://www.testufo.com/photo#photo=eiffel.jpg
2. Put your fingertip in the middle of the top of the pan (where the eiffel tower antenna goes underneath)
3. Stare at your fingertip and keep your eyes stationary -- don't track the antenna with your eyes
4. Observe you see multiple copies of the Eiffel Tower antenna as it pans underneath your finger.
5. That is a side effect of a finite framerate display.
120Hz will have 1/2 the separation in this effect relative to 60Hz
240Hz will have 1/4 the separation in this effect relative to 60Hz
480Hz will have 1/8 the separation in this effect relative to 60Hz
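The halvings above come from the step separation between the "multiple copies": with fixated (non-tracking) eyes, consecutive frames of a moving object land speed/refresh-rate apart. A quick sketch (the 960 px/s pan speed is just an illustrative value I picked):

```python
# Stroboscopic "multiple copies" effect: with fixated eyes, consecutive
# frames of a moving object are separated by speed / refresh rate.
def step_separation_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    return speed_px_per_sec / refresh_hz

speed = 960  # assumed pan speed in px/s, for illustration only
for hz in (60, 120, 240, 480):
    print(hz, step_separation_px(speed, hz))
# Each doubling of the refresh rate halves the gap between the copies.
```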
So flicker isn't the only problem. You still get wagonwheel / strobing effects, and that still bothers people.
However, some people still hate flicker, and some still get headaches from 120Hz flicker. So 120Hz is still not the final frontier. There was a lighting-industry study showing that humans could detect stroboscopic effects all the way up to 10,000Hz. Here's the PDF paper: http://www.lrc.rpi.edu/programs/solidst ... licker.pdf ... I agree with this assertion, as I have personally done Arduino tests flickering an LED at 1KHz and 2KHz -- I could still detect that the LED was flickering at 2KHz simply by rolling my eyes around aggressively. This creates a dotted blur instead of a continuous motion blur.
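The "dotted blur" from the LED experiment can be ballparked the same way: during a fast eye movement, each flash paints one dot on the retina, and the dots are spaced eye-speed/flicker-frequency apart. A sketch, assuming a fast saccade of roughly 500 degrees/second (my assumption for illustration, not a measured figure):

```python
# Phantom-array ("dotted blur") spacing: during a fast eye movement, a
# flickering point light paints one dot per flash, and the dots are
# spaced eye speed / flicker frequency apart.
def dot_spacing_deg(eye_speed_deg_per_sec: float, flicker_hz: float) -> float:
    return eye_speed_deg_per_sec / flicker_hz

saccade = 500.0  # deg/s -- an assumed fast-saccade speed, for illustration
print(dot_spacing_deg(saccade, 1000))  # 0.5 deg between dots at 1KHz
print(dot_spacing_deg(saccade, 2000))  # 0.25 deg between dots at 2KHz
```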
So basically, some people aren't bothered by the flicker itself, but by the stroboscopic effect (like getting a headache from the wagonwheel effect because it confuses their brain). You are probably already familiar with the wagonwheel effect still being visible at hundreds and even thousands of Hertz. Even moving your eyes around in a room lit by a strobe light flickering ultra-fast at 1KHz (squarewave, e.g. phosphorless LED) causes immediate headaches for a small fraction of the population. Wagonwheel effects, strobe effects, and other side effects of non-continuous light still occur. The paper is 100% applicable to displays, and that's why someday I'd love to see a high-framerate flickerfree display (1000fps@1000Hz) that is finally simultaneously strobe-free and blur-free (mathematically impossible without going to quadruple-digit refresh rates, at least at the location the human eye is directly staring at).
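The "quadruple-digit" claim follows from sample-and-hold math: on a flicker-free display, each frame is lit for the full refresh period, so blur is speed divided by refresh rate. A sketch of the refresh rate needed to keep blur under a target (function names are mine):

```python
# On a flicker-free (sample-and-hold) display, each frame stays lit for a
# full refresh period, so blur (px) = speed (px/s) / refresh rate (Hz).
def sample_and_hold_blur_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    return speed_px_per_sec / refresh_hz

# Inverting it: the refresh rate needed to hold blur to a target width.
def refresh_needed_hz(speed_px_per_sec: float, max_blur_px: float) -> float:
    return speed_px_per_sec / max_blur_px

print(sample_and_hold_blur_px(1000, 120))  # ~8.3 px of blur at 120Hz
print(refresh_needed_hz(1000, 1.0))        # 1000Hz needed for 1 px at 1000 px/s
```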
It's hard to scientifically determine exactly what causes the headaches -- the flicker? the strange effects? the phantom-array effect? -- but in my experience some people get headaches from PWM dimming (360Hz, 432Hz), and that's why BENQ started the ZeroFlicker initiative. I know people who get headaches from PWM dimming at these frequencies. I also despise LED car taillights that use PWM below 5KHz, because it's so easy to see the phantom-array strobe effect (when moving my eyes around very fast) while driving at night behind these taillights.
It's hard to define a limit, but going from 120Hz flicker to 20,000Hz flicker appears to eliminate something like ~50% of headaches, according to the lighting-industry PDF paper. Today, in Canada, most fluorescent lights use 20KHz electronic ballasts instead of old-fashioned 120Hz ballasts.
Some LightBoost users report having more headaches (flicker headaches), while other LightBoost users report having fewer headaches (motion blur headaches). Scientists need to focus on studying these aspects. One such person is quoted in the LightBoost testimonials, and I have run across a dozen posts describing similar experiences (people who get headaches with LCDs but never with CRTs, and people who get headaches with CRTs but never with LCDs). There are other complications too, such as people getting headaches from the excess blue light of LED backlights. So there are many different causes of headaches that displays can give, and I believe every single cause is legitimate. Manufacturers like BENQ are finally focusing on this problem (low blue light, ZeroFlicker during non-LightBoost mode, etc).
Strobing is a compromise, because one strobe per unique frame is required for the best motion quality. That means lower-frequency flicker, which affects some people. Hopefully this is just a temporary stopgap on the way to a future display solution (ultra-high-update-rate displays of some kind, or framerateless continuous-motion technologies that never have static frames during display of moving images, even under ultra-high-speed cameras, etc).
Who knows what new tech may come later this century. I read somewhere of a very experimental camera that accurately timecodes each arriving photon -- I forget which one. If this could be done for the visible light spectrum, and a display created that could reliably play back those photons, we would theoretically have a framerateless display indistinguishable from real life, like a Star Trek Holodeck: no display-enforced motion blur, no sample-and-hold, no strobing, no wagonwheel effect -- everything natural to the human eye. Or it might all be impossible, and we're stuck with "frame rate" and "refresh rate" (manmade inventions ever since humans invented movies and television) and their attendant side effects -- strobing, wagonwheel effects, motion blur from persistence -- that degrade motion quality.
We wish there were more scientists studying this for peer-reviewed papers, but I can certainly confirm several human specimens who are affected by all of the above, in various different ways. And I agree with the existing research that numbers in this ballpark (e.g. 20KHz) are the magic threshold where the vast majority of the population stops noticing "Something's odd about this light source..." All of this is very nicely applicable to the display industry, not just the lighting industry (few scientists make the connection).
Sincerely,
Mark Rejhon