
Lag Randomness from Refresh Rate Limit. Need 1000Hz display!

Posted: 30 Sep 2017, 10:30
by Chief Blur Buster
Most sites and reviewers don't realize that aiming accuracy is affected by lag randomness (min lag/max lag being very different), which is another error factor on top of absolute lag! It even occurs during apples-vs-apples comparisons (1000fps-vs-1000fps VSYNC OFF comparisons at multiple different refresh rates).

The improvement in lag jitter (min/max/avg) is especially dramatic when you compare 60Hz versus 240Hz. This is truly yet another reason to get 240Hz instead of 144Hz.


At 1000fps @ 60Hz, your min/max is 14ms/27ms (a 13ms random-lag range!). But at 1000fps @ 240Hz, your min/max is 12ms/16ms (a 4ms random-lag range!).
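These numbers follow from simple arithmetic: the random component of first-screen-reaction lag spans roughly one scanout, i.e. one refresh cycle. A minimal sketch of that relationship (this is my simplified model, assuming the lag range is approximately one refresh period):

```python
# Sketch: scanout-induced lag randomness vs refresh rate.
# Assumption: the random component of "first-screen-reaction" lag spans
# roughly one refresh cycle, since the scanout sweeps top-to-bottom once
# per cycle regardless of frame rate.

def refresh_period_ms(hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

for hz in (60, 144, 240, 480, 1000):
    print(f"{hz:>4} Hz -> up to ~{refresh_period_ms(hz):.2f} ms of scanout lag randomness")
```

At 60Hz that is ~16.7ms of possible randomness, shrinking to ~4.2ms at 240Hz and just 1ms at 1000Hz, which matches the measured 13ms-vs-4ms ranges above within measurement noise.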

This is based on a "first-screen-reaction" measurement metric, not "single-point-screen-reaction" -- like an eSports player reacting to a full-screen flash. It reveals random-lag effects that single-point-screen-reaction measurements (which are important too, e.g. at the crosshairs) do not, because consistent lag throughout the screen can also be important.

Different parts of the screen have amplified differences in lag at lower refresh rates. That close-range enemy (or explosion, or full-screen-height activity) you're about to react to in a competitive game -- will often be noticed somewhere else on the screen far away from the crosshairs, before the pixel refresh reaches the monitor's screen centre. Your peripheral vision will react before your vision centre, often creating reaction time errors caused by refresh rate granularity. No matter how high your frame rate is during VSYNC OFF.
Going to 240Hz instead of 144Hz reduces this lag-randomness error margin caused by scanout slowness.

During VSYNC OFF at any frame rate, the lag "randomization" error margin, caused by the refresh rate limitation, is up to one full refresh cycle. Aiming improves quite a lot when this refresh-rate-granularity-related forced lag jitter (lag randomness) is reduced by sheer insanity in refresh rates.

Absolute lag is important. But lag consistency is important too. High-Hz improves lag consistency. Many other mainstream sites do not understand this. But we do.

Manufacturers, bring on the 480Hz and 1000Hz monitors!

Re: Lag Randomness caused by Refresh Rate Limitation

Posted: 30 Sep 2017, 10:42
by Chief Blur Buster
To improve understanding of this topic, each frame-slice (during VSYNC OFF) is its own independent lag gradient:

240 frames per second at 120Hz


Here are some newer diagrams of screen scanouts (a random 4-refresh-cycle snapshot in a continuous stream of refresh cycles; frame numbers and refresh cycle numbers are relative):

288 frames per second at 144Hz


432 frames per second at 144Hz


1000 frames per second at 144Hz


The heights of frame slices are proportional to frame render times. The scanout always proceeds at a constant speed.

The top edge of a frame slice always has less input lag than the bottom edge. Using higher frame rates will reduce the input lag gradient of each frame slice (during VSYNC OFF or Fast Sync), but will never be able to do a global refresh of the screen simultaneously. That means you can still react too early/too late to events (e.g. an explosion flash or close-range enemy) that begin appearing for the first time near the top edge or bottom edge (instead of screen centre at crosshairs). Using a higher refresh rate gets you proportionally closer to a lagless global-refresh display.
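The frame-slice geometry in the diagrams above can be sketched numerically. Under an idealized model (my assumptions: constant frame render times, constant-speed scanout, one tearline per new frame), the slice count per refresh cycle and the lag gradient within each slice work out as:

```python
# Sketch of VSYNC OFF frame slicing under an idealized model:
# constant render times and constant-speed scanout are assumed.

def slice_stats(fps: float, hz: float):
    """Return (frame slices per refresh cycle, lag gradient per slice in ms).

    Each slice's top edge is freshest; its bottom edge is one frame time
    older, so the gradient within a slice equals the frame time.
    """
    slices_per_refresh = fps / hz   # e.g. 432fps @ 144Hz -> 3 slices per refresh
    gradient_ms = 1000.0 / fps      # top-to-bottom lag spread of one slice
    return slices_per_refresh, gradient_ms

for fps, hz in ((240, 120), (288, 144), (432, 144), (1000, 144)):
    n, g = slice_stats(fps, hz)
    print(f"{fps}fps @ {hz}Hz: ~{n:.1f} slices/refresh, {g:.2f} ms lag gradient per slice")
```

Note how raising the frame rate shrinks each slice's gradient, but only raising the refresh rate shrinks the total top-to-bottom scanout time.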

Using VSYNC ON fixes this random lag jitter, but adds too much absolute lag. But ultra-high-Hz helps to solve both VSYNC ON and VSYNC OFF simultaneously! VRR is absolutely fantastic and will be needed for a great many years to come, and will still likely be relevant at ever-higher Hz (e.g. 1000Hz VRR), since microstutters are still visible at 480Hz (from our tests).

While some displays scan-convert (e.g. plasma, DLP scans differently from the pixel delivery order on the cable)... most eSports LCDs are synchronous between cable scanout and panel scanout. Nowadays, they often use tiny rolling-window line buffers (instead of prebuffering a full refresh cycle) for low-lag overdrive processing, etc.

This is why we need 480Hz and 1000Hz displays. Several other websites stupidly say that 240Hz is not worth it, but according to our tests (we recently tested a 480Hz display) -- a 1000Hz display can, in theory, actually be more important lag-wise than 1000Hz mice! (Both used together will be very sweet indeed, though!)

There was homebrew 240Hz in 2013, before the manufacturers did it. Now there is literally a homebrew 480Hz, before manufacturers are even remotely thinking of 480Hz. The scientific/laboratory industry is already in quadruple-digit refresh rates (e.g. the ViewPixx true-1440Hz DLP projector).

Our predictions:
Mainstream 480Hz gaming monitors -- roughly ~2020
Mainstream 1000Hz gaming monitors -- sometime ~2025 once GPUs have frame rate amplification tech

I do think homebrew will beat manufacturers (yet again) to a true-1000Hz gaming monitor. I'm surprised that manufacturers don't realize just how important >240Hz will be for future eSports gaming -- it truly reduces lag-randomness error, as I've explained above in my previous post.

And, before anyone complains about GPU-limitation problems, don't forget Frame Rate Amplification Technologies will make 1000fps possible on mid-range GPUs by 2025.

Today, Oculus is doing it in software to convert 45fps to 90fps, without needing a more powerful GPU -- but eventually this will become far more artifact-free, more geometry-aware, and in silicon -- this is key to 1000fps with high detail levels without needing unobtainium GPUs!

The ability to convert 200fps->1000fps using 5:1 frame rate amplification technology (lagless geometry-aware interpolators with improved occlusion-reveal de-artifacting, from improved multi-layer Z-buffers and similar advanced artifact-free re-projection tricks) -- this is the breakthrough making blurless sample-and-hold possible in less than one human generation! Today, Oculus does 45fps->90fps. Tomorrow, 100fps->1000fps or 200fps->1000fps (and more artifact-free, too!). So GPUs can be a solved problem within a few years.
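The amplification arithmetic above is worth making explicit: at a given output frame rate and amplification ratio, the GPU only has to render the *true* frames, so its per-frame render budget grows with the ratio. A minimal sketch (function name is mine, purely illustrative):

```python
# Sketch of the frame rate amplification arithmetic.
# "True" frames are fully rendered; the remainder are re-projected/interpolated.

def render_budget_ms(output_fps: float, amplification: float) -> float:
    """Milliseconds the GPU has to render each fully-rendered (true) frame."""
    true_fps = output_fps / amplification
    return 1000.0 / true_fps

print(render_budget_ms(90, 2))    # Oculus-style 45fps -> 90fps: ~22.2 ms per true frame
print(render_budget_ms(1000, 5))  # 200fps -> 1000fps at 5:1: 5 ms per true frame
```

So 5:1 amplification turns a 1ms-per-frame requirement into a 5ms render budget -- roughly the territory of today's 200fps-capable GPUs -- while the display still refreshes at 1000Hz.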

If NVIDIA and AMD read this, they need to heed it -- consider adding dedicated silicon for frame rate amplification tech -- for the journey to the "real-life display" (strobeless blur reduction) -- as ultrahigh Hz is needed for "blurless sample-and-hold", "strobeless ULMB", or "flicker-less CRT" -- basically combining perfect motion clarity with sample-and-hold (with no blurring), as low persistence with no strobing needed.

So... Manufacturers, bring on 480Hz and 1000Hz+.
--> It reduces absolute lag
--> It reduces lag-randomness by being closer to a global-refresh display
--> It reduces motion blur without strobing
--> It's closer to real life (no strobing AND no blur)

(P.S. I have emailed several manufacturers this link)

Re: Lag Randomness from Refresh Rate Limit. Need 1000Hz disp

Posted: 30 Sep 2017, 11:20
by Chief Blur Buster
Also, for credentials, since this probably brings in some new engineer-lurkers:

I currently have a peer-reviewed conference paper with researchers from NOKIA and KELTEK.

So pay attention; we truly know what we're saying when we say we need 1000fps@1000Hz.


Many reviewers are now using my invention of photographing motion blur cheaply via a camera-rail-mounted pursuit camera (a tracking camera standing in for a tracking eyeball, for true WYSIWYG motion blur photography representative of persistence/MPRT) -- including SWEclockers, HDTV Poland, TechPorn Philippines, etc.

From this research, we have definitively confirmed that the only way to reduce persistence further (reduce motion blur without strobing) -- aka blurless sample-and-hold -- is to get closer to real life; real life has no strobing and no refresh rate. The only way to get mathematically closer to that is to keep increasing refresh rate.

Motion blur is good and artistic when used properly. BUT, you don't want extra motion blur forced upon you, above and beyond real life, if you're trying to emulate real life. And trying to mimic the motion clarity of a monitor with a Blur Reduction mode -- without using strobing (e.g. ULMB, DyAc, ELMB, LightBoost, etc.) -- currently requires seemingly-insane refresh rates.

Mathematically, 1ms persistence (1ms MPRT) translates to 1 pixel of motion blur during 1000 pixels/second motion. (This formula is most exact for squarewave persistence, but is still approximately near-exact for sample-and-hold displays when the majority of GtG pixel response is a tiny fraction of a refresh cycle.)
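That formula can be sketched directly -- blur in pixels equals persistence times tracking speed:

```python
# Sketch of the MPRT motion blur formula from the post:
#   blur (px) = persistence (ms) * tracking speed (px/s) / 1000
# Approximation valid when GtG is a tiny fraction of the refresh cycle.

def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Perceived eye-tracking motion blur, in pixels."""
    return persistence_ms * speed_px_per_s / 1000.0

print(motion_blur_px(1.0, 1000))  # 1ms MPRT at 1000 px/s -> 1 px of blur
print(motion_blur_px(1.0, 8000))  # 4K VR head-turn at 8000 px/s -> 8 px of blur
```

The second line is exactly the VR head-turn case below: even 1ms persistence leaves 8 pixels of blur at 8000 pixels/sec panning.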

This also means even 1000fps isn't the final frontier for humankind. In virtual reality, even 1ms strobed persistence still creates 8 pixels of motion blur during a moderate-speed head-turn on a 4K VR headset (8000 pixels/sec panning) -- that's still way too much.

However, 1000fps@1000Hz will be a good stepping stone for the 2020s.