Ocular microtremors?

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers. The masters on Blur Busters.
cosmitz
Posts: 6
Joined: 31 Jul 2014, 11:56

Ocular microtremors?

Post by cosmitz » 07 Jan 2015, 14:05


Chief Blur Buster
Site Admin
Posts: 6509
Joined: 05 Dec 2013, 15:44

Re: Ocular microtremors?

Post by Chief Blur Buster » 07 Jan 2015, 14:54

Yep, I replied to Simon Cooke in the comments section.
Mark Rejhon wrote:Hello Simon!
You've written a fascinating article. There are so many unexpected visual phenomena associated with using a rapid series of static images to represent motion, whether judder, motion blur, wagon-wheel effects, or the effect you describe.
There is really no specific framerate limit for indirect human-eye detection, because any series of static images can still potentially produce wagon-wheel or motion blur artifacts, although the likelihood diminishes as the framerate goes up.

I run Blur Busters, a blog about 120Hz+ gaming monitors. Gaming involves close viewing distances, fast eye tracking, fast panning speeds, and very high-resolution graphics (especially since many users run graphics card setups costing over a grand). In this situation, the benefits of high refresh rates are amplified. However, for some of us, 120fps @ 120Hz still produces too much motion blur on flicker-free displays such as LCDs. The reason is that most motion blur on LCD displays is caused by the display's own persistence (16.7ms at 60Hz) rather than by its pixel response limitations (most 120Hz monitors have 1-2ms GtG). A great demonstration that persistence (sample-and-hold) is the bigger cause of motion blur is an animation I created at: www.testufo.com/eyetracking ...

From a display perspective, there are two main ways to reduce persistence, and thus motion blur: raise the framerate (and refresh rate), or add black periods between frames. Many HDTVs do this, by interpolation or by flickering their backlights (or both). Motion blur caused by eye tracking is directly proportional to the frame period. At 120Hz, each frame is displayed for 1/120sec on a typical flicker-free display, even if pixel transitions are instantaneous. That's why even OLEDs can have motion blur (www.blurbusters.com/faq/oled-motion-blur). 1/120sec is 8.3ms, and the perceived motion blur (www.testufo.com/eyetracking) is the same as with a camera shutter speed of 1/120 second -- it will blur during camera motion (across a static scene), and likewise it will blur during eye-tracking motion (across a static refresh). This makes it impossible for a regular 60Hz or 120Hz LCD display to pass the TestUFO Panning Map Readability Test at http://testufo.com/#photo=toronto-map.png&test=photo ... where you try to read the street name labels while the map is panning. On a scientifically perfect display, the map would be as readable as a similar-size paper map panned across your face at a similar speed. Yet, if you have a CRT display, or a high-efficiency strobe-backlight LCD (like NVIDIA's ULMB, LightBoost, EIZO's Turbo240, or BENQ's Blur Reduction) at www.blurbusters.com/faq/120hz-monitors, you can read the street name labels. For a faster motion speed, you need shorter persistence to keep things extremely sharp: there is 1 pixel of motion blur for every 1ms of persistence at 1000 pixels/second. During fast action video games, accurate eye tracking can go faster than one screen width per second.
Tests at varying motion speeds at www.testufo.com (selectable speeds at top) show that many gamers are able to track at approximately 2000 to 4000 pixels/sec, depending on screen size and viewing distance, on a typical 1920x1080 display, with a typical gaming view distance of approximately one screen width from your eyes. We also subsequently found that our eyes could tell apart 0.5ms persistence and 1.0ms persistence in the TestUFO panning map test at a motion speed of 3000 pixels/sec -- that is comparing 1.5 pixels of motion blur versus 3 pixels of motion blur, a big difference in how easily fast-panning tiny 6-point text is readable. On some models of strobe backlights, such as the ASUS PG278Q, which has a strobe duty cycle control, you can get the strobe flash as short as about 0.25 milliseconds.
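The persistence figures above all follow one rule of thumb: blur width in pixels equals persistence (in ms) times tracking speed (in pixels per ms). A small sketch of that arithmetic (the helper name is mine, purely illustrative):

```python
def motion_blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    """Approximate eye-tracking motion blur width, in pixels:
    persistence (ms) x speed (px per ms)."""
    return persistence_ms * speed_px_per_sec / 1000.0

# 60 Hz sample-and-hold (16.7 ms persistence) at 1000 px/sec:
print(motion_blur_px(16.7, 1000))  # ~16.7 px of blur
# The TestUFO map comparison: 0.5 ms vs 1.0 ms strobes at 3000 px/sec:
print(motion_blur_px(0.5, 3000))   # 1.5 px
print(motion_blur_px(1.0, 3000))   # 3.0 px
```

This reproduces the 1.5-pixel vs 3-pixel difference mentioned above.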

There are certainly major points of diminishing returns, but it's worth pointing out that the only way to achieve 1ms of persistence without flicker is to fill a whole second with unique 1ms-long frames -- 1000fps @ 1000Hz -- if you want to avoid flicker or black-frame insertion. And we haven't gone into input lag, which is another subject altogether, beyond the scope of this followup.
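The 1000fps @ 1000Hz figure falls straight out of the flicker-free math: without strobing, persistence equals the frame period, so the needed refresh rate is just the reciprocal of the target persistence. A minimal sketch (function name is mine):

```python
def flickerfree_refresh_hz(target_persistence_ms: float) -> float:
    """Refresh rate (with matching framerate) needed for a given
    persistence on a sample-and-hold display, i.e. no strobing."""
    return 1000.0 / target_persistence_ms

print(flickerfree_refresh_hz(1.0))  # 1000.0 -> 1000 Hz for 1 ms persistence
print(flickerfree_refresh_hz(8.3))  # ~120 Hz for 8.3 ms persistence
```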

It really shows that displays have a long way to go before they pass a theoretical Holodeck Turing Test, where VR looks like real life, as in "Wow, I didn't know I was wearing VR goggles instead of transparent ski goggles!"
Mark Rejhon wrote:Motion blur reduction strobe backlights, now commonly found in most newer 120Hz desktop gaming monitors, all easily pass the TestUFO panning map test (I have 6 of those monitors sitting here). They also make HFR 120fps video look crystal smooth. I occasionally blog about HFR too, at http://www.blurbusters.com/category/hfr/
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


blargg
Posts: 66
Joined: 20 Sep 2014, 03:59

Re: Ocular microtremors?

Post by blargg » 08 Jan 2015, 00:22

It would be interesting to do some informal tests of what he theorizes. E.g. an image flashed for 1/80 sec should look less clear than one flashed for 1/40 sec, which in turn should look just as clear as one flashed for 1/20 sec (or maybe you'd start at 1/40; either way the idea is the same). The effect might not be very pronounced, so perhaps you'd flash the image between two sets of fields at 40Hz with user adjustment, and see whether the user can produce slow "beats" of detailed appearance amid an otherwise pixellated look. That would be a pretty profound demonstration and confirmation.
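A rough sketch of the beat arithmetic behind that proposal: if ocular microtremor sits near 80 Hz (an assumed round figure; published estimates vary) and the user detunes the flash rate slightly, any interaction would pulse at the difference frequency. Everything here is an illustrative assumption, not a measurement:

```python
def beat_hz(flash_hz: float, tremor_hz: float = 80.0) -> float:
    """Difference frequency between the 2nd harmonic of a ~40 Hz flash
    and an assumed ~80 Hz microtremor. Purely illustrative numbers."""
    return abs(2.0 * flash_hz - tremor_hz)

print(beat_hz(40.0))  # 0.0 -> no beat when exactly locked
print(beat_hz(40.5))  # 1.0 -> one slow "beat" per second
```

With a user-adjustable flash rate, sweeping across 40 Hz and watching for a slow pulsing of apparent detail would be exactly the kind of beat blargg describes.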
