jlafarga wrote:
so what you mean by low persistence is that if you ran the display at say 120Hz and flashed the cursor for 1ms, your brain would interpret it as though the cursor was 1 millimeter behind your finger??? (if your finger was moving at 1 meter per second)

Yes. Assuming 1000Hz input reads (average input latency of 0.5ms) and a 1ms flash (the midpoint of the flash would be 0.5ms ago, aka the midpoint of the nearly-negligible motion blur). Since 0.5ms + 0.5ms equals 1ms, and 1ms of motion at 1 meter per second covers 1 millimeter, you're in the right ballpark.
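If you want to plug in your own numbers, here's a minimal back-of-envelope sketch of that arithmetic (Python; the function and parameter names are just my own illustration, not anything from a real library): perceived lag is roughly the average input-sampling latency plus half the flash length, and the positional offset is that lag times the finger speed.

```python
def perceived_offset_mm(poll_rate_hz: float, flash_ms: float, speed_m_per_s: float) -> float:
    """Rough estimate of how far behind the finger the flashed cursor appears, in millimeters."""
    avg_input_latency_ms = 0.5 * (1000.0 / poll_rate_hz)  # average wait for the next input read
    flash_midpoint_ms = 0.5 * flash_ms                     # midpoint of the brief flash
    lag_ms = avg_input_latency_ms + flash_midpoint_ms
    return speed_m_per_s * lag_ms  # 1 m/s equals 1 mm/ms, so mm = (m/s) x (ms)

# 1000Hz input reads, 1ms flash, finger moving at 1 m/s -> about 1mm behind the finger.
print(perceived_offset_mm(1000, 1.0, 1.0))  # 1.0
```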
Keep in mind there are multiple other variables that can fudge this around: other lag factors (e.g. software, display signal latency, etc.), human vision response, and whether or not the display adds latency in order to do strobing. It's fun to play with http://www.testufo.com/blackframes to study the latency-lowering effect of lower persistence.