The Amazing Human Visible Feats Of The Millisecond

The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 08 May 2020, 12:57

Crosspost from the other thread

____________________________

Advantages of high refresh rates?

Someone dares to ask that question on Blur Busters ;)
While the OP knows some of this already, most posters don't realize how many contexts the millisecond matters in.

<Big Rabbit Hole>
I am going to write a shorter version of one of those famous Blur Busters flavored pieces.
This post will be a bit of a boast, justifiably so, because we're the "Everything Better Than 60Hz" website.

There are many contexts where the humble millisecond is important. Sometimes it's not important; sometimes it's useless. But milliseconds are important in lots of display science -- motion clarity, strobe crosstalk, reaction times, refresh cycles, stutters, frametime differences, latency, etc. Sometimes you optimize a display to have 1ms less latency, and there are occasionally beneficial (internal engineering) side effects too, because so many display factors interact with each other.

Frametime Context / Refresh Rate Context
The most famous Milliseconds Matter example: 144fps vs 240fps is only a 2.78ms frametime difference, YET it is still human-visible as improved motion, including less motion blur on sample-and-hold displays and reduced stroboscopic effects. Also, 240Hz vs 360Hz is only a ~1.39ms difference, yet still human-visible too.
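For concreteness, here is the frametime arithmetic behind those numbers, as a minimal Python sketch:

```
# Frametime difference between two frame rates, in milliseconds.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

for a, b in [(144, 240), (240, 360)]:
    print(f"{a}fps vs {b}fps: {frametime_ms(a) - frametime_ms(b):.2f} ms apart")
# 144fps vs 240fps: 2.78 ms apart
# 240fps vs 360fps: 1.39 ms apart
```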

[Figure: motion blur comparison across refresh rates at 960 pixels/sec]

The above is simplified because it uses slower motion (960 pixels/sec). The differences become much more visible (and 1000Hz shows its limitations) at higher motion speeds like 3840 pixels/sec, rather than the 960 pixels/sec shown above.

That said, we ideally need to go up the curve geometrically (60Hz->120Hz->240Hz->480Hz->960Hz) rather than incrementally (144Hz->165Hz, 240Hz->280Hz). 60Hz-vs-144Hz is a 2.4x improvement in motion clarity (if GtG=0), while 144Hz-vs-240Hz is only a ~1.67x improvement (if GtG=0), whereas 144Hz-vs-360Hz is a 2.5x improvement in motion clarity (if GtG=0). As a result, the 60Hz->144Hz jump is more similar to a 144Hz->360Hz jump.
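Since sample-and-hold blur at GtG=0 scales with frame visibility time, the clarity ratios above are just ratios of refresh rates. A quick sketch:

```
# Sample-and-hold motion blur (assuming GtG = 0) is proportional to frame
# visibility time, so the clarity improvement is simply the ratio of rates.
for old, new in [(60, 144), (144, 240), (144, 360)]:
    print(f"{old}Hz -> {new}Hz: {new / old:.2f}x clearer motion (if GtG=0)")
# 60Hz -> 144Hz: 2.40x, 144Hz -> 240Hz: 1.67x, 144Hz -> 360Hz: 2.50x
```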

Stutter Context
Stutters are caused by gametime:photontime variances. Many causes exist, such as the game engine, sync technology, fluctuating frame rates, etc. Humans can still see frame rate fluctuations that are only a few milliseconds apart in frametime. 100fps vs 200fps is only a 5 millisecond difference in frametime, and it's definitely human-visible on 240Hz displays with fast GtG. Variable refresh such as G-SYNC and FreeSync can make stutter less visible by avoiding the fps-vs-Hz aliasing effect of the fixed refresh cycle schedule (animation of variable refresh rate benefits), but is not completely immune: gametime:photontime can still diverge for other reasons like engine inefficiencies, multi-millisecond-scale system freezes, dramatic rendertime differences between consecutive frames, etc. There is even a piece for game developers about how multi-millisecond issues can add stutters to VRR, and it's a simpler bug to fix than many developers realize. Recently Blur Busters helped a game developer fix stutters in VRR, with rave reviews from end users, precisely because of this.
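A tiny sketch of the gametime:photontime idea, using the 100fps-vs-200fps example above:

```
# Stutter = gametime:photontime divergence. A 100fps-paced frame sneaking into
# a 200fps stream is a 5 ms frametime spike (1000/100 - 1000/200 = 5 ms).
frametimes_ms = [5.0, 5.0, 10.0, 5.0]      # one slow frame amid smooth 200fps
deviation_ms = [t - 5.0 for t in frametimes_ms]
print(deviation_ms)  # [0.0, 0.0, 5.0, 0.0] -- a visible spike on fast-GtG 240Hz
```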

Input Lag Context
For input latency, you don't need to feel the milliseconds to win by the milliseconds. When you're earning $100,000 in esports, milliseconds matter when top champions are relatively well-matched, like Olympic sprinters at the starting line waiting for the starting pistol.
- The "Olympics finish line effect": Two racers pass the finish line milliseconds apart. Likewise, two esports players go around a corner in an alley or dungeon, see each other simultaneously, draw guns simultaneously, shoot simultaneously. The one with less lag is statistically more likely to win that frag.
- The "I'm suddenly missing my sniping shots" factor: Remember, 1ms equals 1 pixel for every 1000 pixels/sec of motion. Say 5ms at 2000 pixels/sec (about one screen width per second): that maths out to a 10-pixel offset relative to your trained aim (see the sketch below). The "Dammit, why does this display make me feel like I'm missing my shots" effect [ugh] [later discovers that display has more lag than the player's previous display].

Reaction Time Context
Also, Blur Busters commissioned a paid reaction-time study, called Human Reflex, and it has three sections with some rather interesting findings. There are many kinds of reaction stimuli (visual, aural, and many subtypes, such as sudden-appearance stimuli, motion-change stimuli, etc), with different reaction times, and this studied a different kind of stimulus that may apparently be faster (<100ms!) than a starting-pistol-type stimulus. More study is needed, but it shows how complex reaction-time stimuli are, and this has only barely scratched the surface.

Netcode Context
Yes, netcode lag and network jitter apply. But in the era of FTTH and LAN play, even with 128-tick servers, 4ms means you're roughly 50% more likely to get that earlier tick, and that frag too. 4ms is nearly one full 1/240sec refresh cycle! And, did you know... Battle(non)sense, the netcode-focused YouTube star, also wrote a guest article for Blur Busters.
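A minimal sketch of why 4ms matters at 128 ticks/sec, assuming inputs land uniformly at random within a tick window:

```
# At 128 ticks/sec the server samples inputs every 1000/128 = 7.8125 ms.
# Arriving 4 ms earlier crosses a tick boundary in 4/7.8125 of cases,
# assuming inputs land uniformly at random within a tick window.
tick_ms = 1000.0 / 128
advantage_ms = 4.0
print(f"tick interval: {tick_ms} ms")                        # 7.8125 ms
print(f"earlier-tick chance: {advantage_ms / tick_ms:.0%}")  # 51%
```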

MPRT Context
Now, milliseconds also matter in other contexts (motion quality), given that 0.25ms MPRT versus 0.5ms MPRT versus 1.0ms MPRT are now human-visible motion clarity differences in the refresh rate race to retina refresh rates -- especially at 4000 pixels/second. (Just adjust ULMB Pulse Width on an NVIDIA ULMB monitor while viewing TestUFO at 2000 through 4000 pixels/sec to witness the clarity differences of sub-millisecond MPRT.) This is thanks to the Vicious Cycle Effect, where bigger displays, higher resolutions, higher refresh rates, wider FOV, and faster motion all simultaneously combine to amplify the visibility of millisecond-scale flaws.
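MPRT converts directly into blur width, which is why those sub-millisecond differences become visible at high motion speeds. A quick sketch:

```
# Persistence converts directly to motion blur:
# blur (pixels) = MPRT (ms) * speed (pixels/ms).
def blur_px(mprt_ms: float, speed_px_per_sec: float) -> float:
    return mprt_ms * speed_px_per_sec / 1000.0

for mprt in (0.25, 0.5, 1.0):
    print(f"{mprt} ms MPRT @ 4000 px/s -> {blur_px(mprt, 4000):.0f} px of blur")
# 1, 2 and 4 pixels of blur -- human-visible differences at this motion speed
```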

Strobe Backlight Context
Also for improved strobe backlights -- GtG limitations are why ULMB was disabled above 144Hz. Faster GtG makes it easier to hide GtG in the VBI to reduce strobe crosstalk. 0.5ms GtG makes it easier to hide LCD pixel response limitations between 240Hz refresh cycles (1/240sec = ~4.17ms), because you have to flash between scanout sweeps (high speed video #1, high speed video #2). Even a 0.5ms mistiming can amplify strobe crosstalk by 2x to 10x, depending on whether it starts to encroach on a bad part of the GtG curve.
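Here's a rough timing-budget sketch; the 25% blanking-interval fraction is an illustrative assumption, not a measured spec:

```
# Rough strobe timing budget at 240 Hz: the GtG transition must finish inside
# the blanking interval between scanout sweeps, or crosstalk leaks through.
refresh_ms = 1000.0 / 240      # ~4.17 ms per refresh cycle
vbi_ms = refresh_ms * 0.25     # ASSUMED blanking window (~1.04 ms), illustrative
gtg_ms = 0.5                   # fast pixel response
print(f"slack for hiding GtG in the VBI: {vbi_ms - gtg_ms:.2f} ms")  # 0.54 ms
# A 0.5 ms mistiming eats that whole slack and pushes GtG into visible scanout,
# which is how tiny timing errors amplify strobe crosstalk so dramatically.
```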

Pixel Response FAQ, GtG vs MPRT
Needless to say, Blur Busters also has one of the world's best Pixel Response FAQs: GtG versus MPRT. While 1ms GtG is unimportant on 60Hz displays, it's a giant cliff of a problem at 360Hz, where GtG needs to become faster than 1ms. GtG needs to be a tiny fraction of a refresh cycle to prevent bottlenecking the Hz improvements. Also, strobeless blur reduction requires brute-force Hz. In the strobeless MPRT context, doubling Hz halves motion blur, and you need approximately 1000Hz to achieve ULMB-class clarity strobelessly & laglessly -- the brightness of full persistence simultaneously with the clarity of low persistence, with no black periods between short-persistence frames.
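The strobeless math is simple enough to sketch: persistence equals frametime, so MPRT (ms) = 1000 / Hz:

```
# For strobeless sample-and-hold, persistence equals frametime:
# MPRT (ms) = 1000 / Hz, so doubling Hz halves motion blur.
for hz in (120, 240, 480, 960):
    print(f"{hz} Hz -> {1000.0 / hz:.2f} ms MPRT")
# 960-1000 Hz reaches roughly the ~1 ms persistence of a ULMB-style strobe,
# strobelessly (and laglessly).
```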

Milliseconds Work With Manufacturers
We often work with vendors and manufacturers nowadays (we're more than a website) -- services.blurbusters.com .... We've also got the Blur Busters Strobe Utility, as well as the Blur Busters Approved programme.

Strobe Backlight Precision Context
Did you know 10 microseconds became human-visible in this case? I once helped a manufacturer debug an erratically-flickering strobe backlight. There's 1% more photons in a 1010 microsecond strobe flash versus a 1000 microsecond strobe flash. A 1% brightness change is almost 3 RGB shades apart (similar to greyscale value 252 versus greyscale value 255). If the backlight erratically jumps to 1010 microseconds for a few strobe flashes a second, it becomes visible as an erratic faint candlelight flicker when staring into a maximized Windows Notepad window or a bright game scene (e.g. an outdoor scene). Yup. 10 microseconds. 0.01 milliseconds. An annoyingly human-visible artifact.
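The arithmetic behind that, treating greyscale linearly as the comparison above does:

```
# Photon output of a strobe flash scales with its pulse width, so a 1010 µs
# flash emits 1% more light than a 1000 µs flash. Mapped linearly onto the
# 0-255 greyscale range (as the comparison above does), that is ~3 shades:
extra = 1010 / 1000 - 1
print(f"{extra:.1%} more light per flash")   # 1.0%
print(f"~{extra * 255:.1f} RGB shades")      # ~2.6 shades (252-vs-255 territory)
```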

Discovery of G-SYNC Frame Capping Trick
Oh, and in 2014 we were also the world's first website to discover how to measure the input lag of G-SYNC. This led to the discovery of the "Cap below max Hz" trick -- we're the first website to recommend that. Nowadays, "cap 3fps below max" is standard parroted advice for VRR displays.

Journey to 1000Hz Displays
And if you're enthralled by these articles, you should probably be aware of Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays, as well as other articles like Frame Rate Amplification Technology, which enables higher frame rates on less powerful GPUs (it's already happening with Oculus ASW 2.0 and NVIDIA DLSS 2.0, and will continue to progress until we get 5:1 or 10:1 frame rate amplification ratios). ASUS has already roadmapped 1000Hz displays in about ten years, thanks in part to a lot of Blur Busters advocacy, as ASUS PR has told us, PC Magazine, and other media.

Also, sometimes improving one millisecond context automatically improves a different millisecond context (lag <-> image quality), though there can also be interactions where improving one worsens the other.

Blur Busters exists because Milliseconds Matter. Blur Busters is all about milliseconds; motion blur is about milliseconds. It's right there in our name. We're paid to pay attention to the millisecond. :D

We know our milliseconds stuff!

</Big Rabbit Hole>


Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 31 May 2020, 13:45

The latency training context is super-important.

Metaphorically, it's like pre-shooting an archery arrow at a horizontally-moving archery target. That's the latency training effect.

Also known as "muscle memory" or "latency familiarity" or other terms, it's training towards a predictable lag.

Latency Aim Training Effect

Latency Training Context on Sudden Hardware Lag Changes
If you're slewing your aim at 8000 pixels per second, a one-millisecond (1ms) latency change creates an 8-pixel misaim. So you're aiming predictably, perhaps even trained to aim ahead of the target, like pre-shooting an archery arrow towards a moving target (also an artillery tactic, but it applies to simple FPS shooting too). You get familiar with the amount of time you need to pre-aim. But if latency suddenly changes (lower or higher), the amount of pre-aiming you need to do changes! So you miss (overshoot or undershoot) because of the latency change. Even a tiny sudden latency change in your setup sometimes creates a "Why am I not getting my hits?" or "I seem to be scoring crap on this monitor" moment.
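A minimal sketch of that trained-lead error; the 20ms baseline lag is an illustrative assumption:

```
# Pre-aim error when end-to-end lag shifts under a trained player: the player
# has internalized a lead time equal to their old lag, so a new lag makes that
# learned lead wrong by exactly the difference. (The 20 ms trained lag is an
# illustrative assumption.)
def preaim_error_px(trained_lag_ms: float, new_lag_ms: float,
                    slew_px_per_sec: float) -> float:
    return (new_lag_ms - trained_lag_ms) * slew_px_per_sec / 1000.0

print(preaim_error_px(20.0, 21.0, 8000))  # 8.0 px of misaim from just 1 ms
```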

Latency Training Context on Sudden Temporary Loss Of Scoring When Switching Monitors (Even Hz Upgrades)
Even settings changes (sync technology, refresh rate, etc) create a situation where latency-retraining is needed. The champion doesn't need to feel the millisecond directly; they just know "why am I scoring crap?" when latency suddenly changes a lot. Then their ability improves again if the latency stays consistent after that. Fixed-lag changes are common when changing or upgrading setups. Scoring problems happen more often during lag increases, but can also happen with lag decreases (a sudden temporary loss of scoring ability during a 240Hz upgrade), before players start scoring better than on their old 144Hz monitor after a week of familiarization. So you sometimes have to fight through a lag-training penalty when you upgrade your rig.
Scientific studies comparing Hz for professional elite players should, for longer-term results, allow extra time to compensate for the latency training effect. (BTW, yoohoo NVIDIA researchers -- want to generate more impressive Hz-vs-Hz graphs? Then compensate for this factor in a scientific way. Quite a few of you follow Blur Busters these days. ;) ...)

Latency Training Context on Network Jitter
Latency training also interacts badly with network jitter. Even a big change in latency jitter (e.g. during Internet peak hours), such as 5ms of TCP/UDP ping jitter, can create hitreg problems, especially on 128-tick servers (since latency jitter can bounce the hitreg between ticks) -- that's a whopping variable hitbox offset at fast motion speeds (see the sketch below). YouTubers such as Battle(non)sense cover a lot of this territory. It's easier to train to a predictable 30ms latency than to a random 10-25ms latency. Predictable latency is like pre-aiming an archery arrow ahead of a horizontally-moving archery target, to try to get a latency-compensated hitreg. Now if you shoot players and it doesn't hitreg, that's milliseconds fighting against you in the network (the human-visible lagged/accelerated enemy position is essentially out of sync with the actual invisible hitbox location), alas... I've mentioned this context in the earlier post, but it's a situation of a user having to continually re-train for changing latencies throughout the day, which perpetually keeps a player a bad player. To fight this problem, some professionals (A) upgrade to a business connection, (B) upgrade to a high-performance router and use direct Ethernet, (C) dedicate a connection just to the gaming computer (some pro players with big budgets get two FTTH connections: one only for the gaming PC, the rest for family/streaming/WiFi/etc), (D) switch to a gaming VPN to bypass a bad-jitter backbone, or even (E) all of the above.
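A quick sketch of why 5ms of jitter is so nasty on a 128-tick server:

```
# 5 ms of ping jitter spans well over half of a 128-tick interval, so the same
# shot can land in different server ticks from one moment to the next.
tick_ms = 1000.0 / 128     # 7.8125 ms between ticks
jitter_ms = 5.0
print(f"jitter covers {jitter_ms / tick_ms:.0%} of a tick interval")  # 64%
```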

Note: Other Error Margins and Variables
Different equipment feel, different tactility, and other variables can create additional training issues. Bear this in mind when building scientific tests that attempt to measure latency training effects, or when creating new blind tests using identical equipment (with different internals / firmwares / settings / etc).


Re: The Amazing Human Visible Feats Of The Millisecond

Post by Chief Blur Buster » 31 May 2020, 14:00

We're famous as raw material for new researcher studies, so I'm going to keep this thread up to date.

Different Human Reaction Time Responses To GtG Pixel Response
Those familiar with the Pixel Response FAQ: GtG versus MPRT, as well as LCD Overdrive Artifacts, will know that different humans have very different reaction-time behaviours in response to different levels of overdrive.
--> Some users prefer no overdrive at all (vision gets distracted by coronas, slowing them down)
--> Some users want super-excessive, blatant overdrive (BenQ AMA Premium) because it's like a tracer-bullet assist feature
--> Freezing homes (arctic) will slow pixel response, requiring slightly higher overdrive
--> Hot homes (tropics) will speed up pixel response, requiring slightly less overdrive
--> Some users prefer faster pixel response with slight coronas

So different humans have different reaction-time responses to different GtG/overdrive settings. An excessively fast 0.5ms GtG may actually slow a player down because the coronas distract them, but it may speed up other players because they're trained to treat coronas as a "highlight marker" for movement.

At the time of this writing, the website PROSETTINGS.NET shows that roughly 50% of esports players are using BenQ monitors (famous for AMA with exaggerated overdrive), and many of them are using the AMA feature as a "motion highlight assist" feature, similar to shadow boost and other esportsy features.

For a long time, Blur Busters has been disappointed by fixed overdrive (calibrated at 20C), which is why Blur Busters is an advocate of a 100-level Overdrive Gain slider (like Brightness/Contrast): it should never be locked, and should be a User Defined option in main monitor menus. The same goes for User Defined Overdrive Lookup Tables (since Blur Busters can generate better LUTs than many scaler/TCON vendors), because there are over 60,000 GtG numbers on an LCD panel (256 source levels x 256 destination levels = 65,536 transitions). See the sketch below.
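A minimal sketch of what a user-adjustable overdrive gain could look like; the LUT values and the 0-100 gain mapping are illustrative assumptions, not a real scaler/TCON implementation:

```
# A user-adjustable overdrive gain scaling a per-transition overdrive LUT
# (a full LUT has 256 x 256 = 65,536 GtG entries; one entry shown here).
# The LUT value and 0-100 gain mapping are illustrative assumptions.
def overdriven_level(prev: int, target: int, gain: int, od_lut) -> int:
    base_boost = od_lut[prev][target]       # factory-tuned drive offset (signed)
    boost = base_boost * (gain / 100.0)     # 100-level user gain slider
    return max(0, min(255, round(target + boost)))

# Example: a factory LUT entry saying "+12 drive" for a 0 -> 128 transition.
od_lut = {0: {128: 12}}
print(overdriven_level(0, 128, 50, od_lut))   # gain 50  -> drives to 134
print(overdriven_level(0, 128, 100, od_lut))  # gain 100 -> drives to 140
```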

In the past, manufacturers didn't want to add extra overdrive settings to monitor menus because it confuses laypeople. However, it should at least be a "User Defined" setting hidden in the same area as RGB adjustments or ULMB Pulse Width or other advanced adjustments (among other needed features such as 60Hz single-strobe, for MAME arcade-machine enthusiasts). Monitor manufacturers sadly limit flexibility to keep things easier for some users (but hurt the market for other users). Often it is just a one-line firmware change to exaggerate overdrive or to re-add features that expand the market sideways (even features that don't push the refresh rate race upwards).

Anyway, we've noticed how overdrive is an unexpected "esports assist", and why it's so popular on BenQ monitors. People don't believe Blur Busters until the researchers test these things out and grudgingly say "Blur Busters Is Right", years ahead of schedule: it appears that people react very differently (lagged reactions & accelerated reactions) to pixel response behaviours such as overdrive.

*(Thread replies can be added by VIP invitation -- reputable researchers/scientists, send me email mark [at] blurbusters.com)
