Interesting project about mouse/ gamepad latency

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
do0om
Posts: 34
Joined: 15 Jan 2018, 13:14

Re: Interesting project about mouse/ gamepad latency

Post by do0om » 07 Jul 2019, 11:29

The 150 ms results in their database are from cheating/guessing, there's no point in referencing them.

Given current home testing setups, average measurements of 140 ms - 160 ms are achievable from audio reaction times. For visual reaction times with current monitor/mouse input tech, 160 ms average measurements are probably possible, and 140 ms averages are not.
Do you have proof of this?
I can often get below 160ms on 5 measurements (best is 149ms on 5 measurements).
Here Dafran (top Overwatch player) is at 148 ms : https://clips.twitch.tv/GorgeousRichPlumImGlitch
Here Tseini (top Overwatch player) is at 144ms : https://twitter.com/TseiniOW/status/1135559607529680896
Here is ZP (Overwatch League caster) at 146ms: https://www.reddit.com/r/Competitiveove ... dium=web2x

I don't see them cheating at all.

ad8e
Posts: 62
Joined: 18 Sep 2018, 00:29

Re: Interesting project about mouse/ gamepad latency

Post by ad8e » 07 Jul 2019, 13:11

do0om wrote:
The 150 ms results in their database are from cheating/guessing, there's no point in referencing them.

Given current home testing setups, average measurements of 140 ms - 160 ms are achievable from audio reaction times. For visual reaction times with current monitor/mouse input tech, 160 ms average measurements are probably possible, and 140 ms averages are not.
Do you have proof of this?
I can often get below 160ms on 5 measurements (best is 149ms on 5 measurements).
Here Dafran (top Overwatch player) is at 148 ms : https://clips.twitch.tv/GorgeousRichPlumImGlitch
Here Tseini (top Overwatch player) is at 144ms : https://twitter.com/TseiniOW/status/1135559607529680896
Here is ZP (Overwatch League caster) at 146ms: https://www.reddit.com/r/Competitiveove ... dium=web2x

I don't see them cheating at all.
The results you linked are all real or close to real, and those players' average reaction times are within 0-10 ms of the measurements you showed. But the lower times in humanbenchmark's database are usually from fraud rather than from such people. Back when they showed a leaderboard, you could look at who was setting good times: https://web.archive.org/web/20170814021 ... ctiontime/. Click back and forth in time to see various leaderboards. There's a cluster at 100 ms (the fastest allowed by humanbenchmark), then sparse times from 110-140 ms, then a cluster after 150 ms with real results. I'm not saying that 150 ms isn't achievable, just that pulling times from humanbenchmark is not evidence that it is. Citing specific people, like you did, is much more reasonable.

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 13:11

Strictly speaking to doubts of how low button-to-pixels lag can become....

I've seen mousebutton-to-photons latency of less than 10ms in CS:GO in some setups. Older Quake can be even less, given their capability to go 1000fps+.

Private conversations with multiple parties such as NVIDIA confirmed this can happen on the system side. So the system side is now capable of becoming close to a rounding error. Even the older GSYNC 101 tests with uncapped VSYNC OFF got several 12ms button-to-photons latency results with the 1000fps high speed camera.
3CDA381A-4165-4F8F-97AF-1C2584336DA2.jpeg
The video files are accurate on disk and single-frame-steppable, but when uploaded to YouTube, frames can get decimated. IIRC it was 40 samples per test for this graph (over 200 high-speed video clip segments went into this specific chart!).

So the system part of the lag can be nearly a rounding error in some highly optimized situations.

Web response-time speed tests are invalid due to Windows VSYNC ON compositing and browser framebuffering, so ignore web-based response tests, which will have far higher reaction time numbers than the well-oiled pipeline of older Quake-engine/Source-engine games which are capable of achieving subrefresh latencies in "several frameslices per refresh cycle" situations.

Framerates far in excess of the refresh rate can lead to subrefresh latencies nowadays in these older engines, especially at lower refresh rates. One can use peripheral vision to notice stimuli like full-screen injury flashes that appear at other parts of the scanout. So single-dot measurements are invalid here, as pro players already respond to stimuli away from the gazepoint.

Remember, our video lag tests used "first-anywhere-on-screen" reaction, and not "first reaction at THIS pixel". So that lowers the numbers, and players have peripheral vision.

So thanks to VSYNC OFF, a seasoned player's reaction-time clock can trigger on an earlier random frameslice far away from the crosshairs (like the first flash of a full-screen explosion, the red flash of an injury -- or even perhaps the long vertical beam of a ray-emitting gun). So, there you go: a subrefresh latency factor!

For those unfamiliar with scanout lag, see http://www.blurbusters.com/scanout
Current esports TN panels are capable of syncing cable scanout to panel scanout, for subrefresh latency at the display side, using only a tiny rolling buffer (a few pixel rows) for processing and DisplayPort micropacket dejittering (microseconds league). And players' eyes begin receiving photons early in the GtG curve -- e.g. a 10ms GtG transition often has pixels visibly transitioned early in the curve, per the VESA response time computation. GtG does not need to be 100% complete before the human reacts to the pixel, so the reaction-time clock can be triggered before full GtG. The faster the GtG, the smaller this error margin becomes, but it's mentioned for completeness' sake.
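As a rough illustration of the GtG point -- assuming (purely for illustration) that a pixel transition follows an exponential curve with a made-up time constant -- the pixel crosses half of its swing well before the 90% endpoint that response-time specs quote:

```python
import math

# Toy GtG model (illustrative only): assume the pixel transition follows
# an exponential approach 1 - exp(-t/tau). The time constant is made up;
# real panels have overdrive and per-transition behaviour.

def time_to_fraction(fraction, tau_ms):
    """Time (ms) for the transition to reach `fraction` of its full swing."""
    return -tau_ms * math.log(1.0 - fraction)

tau = 4.0  # assumed time constant in ms (hypothetical panel)
print(f"50% of swing: {time_to_fraction(0.5, tau):.1f} ms")   # ~2.8 ms
print(f"90% of swing: {time_to_fraction(0.9, tau):.1f} ms")   # ~9.2 ms
```

So on this toy panel, a pixel is half-transitioned in roughly a third of the quoted response time -- plenty early for a bright flash to start the reaction-time clock.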

So, those are knowable error margins, and some games are now indeed confirmed capable of sub-refresh button-to-photons latency if you think carefully about all the above.

Now, the unknowable or less-knowable error margins...

...

Now, beyond the system side, there can be a series of unknown error margins -- Marwan is away, but I'll try to have him address the questions about error margins. The article did warn that there were potentially predictive factors. An addendum page could be useful for further analysis. Totally good questions!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

ad8e
Posts: 62
Joined: 18 Sep 2018, 00:29

Re: Interesting project about mouse/ gamepad latency

Post by ad8e » 07 Jul 2019, 13:47

Chief Blur Buster wrote:I've seen mousebutton-to-photons latency of less than 10ms in CS:GO in some setups. Older Quake can be even less, given their capability to go 1000fps+.
Yes, and rendering just-in-time vsync should cut frame latency by another 30-40% of the frame time, depending on jitter. For reaction tests, which can get by with just screen blanking, my Intel HD graphics card can run at 500 fps.
Chief Blur Buster wrote:Web response-time speed tests are invalid due to Windows VSYNC ON compositing and browser framebuffering, so ignore web-based response tests, which will have far higher reaction time numbers than the well-oiled pipeline of older Quake-engine/Source-engine games which are capable of achieving subrefresh latencies in "several frameslices per refresh cycle" situations.
Specifically, the browser lag is two frames on Chrome. https://www.reddit.com/r/firefox/commen ... _behavior/

On Linux, these two frames are fixed at 60 fps, but I think Chrome on Windows is able to run at your monitor refresh rate.

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 14:20

ad8e wrote:
Chief Blur Buster wrote:I've seen mousebutton-to-photons latency of less than 10ms in CS:GO in some setups. Older Quake can be even less, given their capability to go 1000fps+.
Yes, and rendering just-in-time vsync should cut frame latency by another 30-40% of the frame time, depending on jitter. For reaction tests, which can get by with just screen blanking, my Intel HD graphics card can run at 500 fps.
Even rendering just-in-time VSYNC (beam racing the VBI) will have a latency gradient of [0ms...(refresh cycle)] from top to bottom. Stimuli at the middle of the screen will have half a refresh cycle of lag added due to scanout, and stimuli at the bottom of the screen will have almost a full refresh cycle of lag.
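That gradient is simple arithmetic; here's a quick sketch (1080 rows at 60 Hz assumed, VBI ignored for simplicity):

```python
# Per-row scanout latency for a beam-raced (just-in-time) VSYNC ON frame.
# Simplification: the panel scans top-to-bottom over the whole refresh
# period; the VBI is ignored.

def scanout_latency_ms(row, total_rows, refresh_hz):
    """Added latency (ms) before the pixel at `row` gets scanned out."""
    return (row / total_rows) * (1000.0 / refresh_hz)

rows, hz = 1080, 60
print(f"top:    {scanout_latency_ms(0, rows, hz):.1f} ms")
print(f"middle: {scanout_latency_ms(rows // 2, rows, hz):.1f} ms")  # ~half a refresh
print(f"bottom: {scanout_latency_ms(rows - 1, rows, hz):.1f} ms")   # ~a full refresh
```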
ad8e wrote:
Chief Blur Buster wrote:Web response-time speed tests are invalid due to Windows VSYNC ON compositing and browser framebuffering, so ignore web-based response tests, which will have far higher reaction time numbers than the well-oiled pipeline of older Quake-engine/Source-engine games which are capable of achieving subrefresh latencies in "several frameslices per refresh cycle" situations.
Specifically, the browser lag is two frames on Chrome. https://www.reddit.com/r/firefox/commen ... _behavior/

On Linux, these two frames are fixed at 60 fps, but I think Chrome on Windows is able to run at your monitor refresh rate.
I've seen Windows compositing add/remove a frame (refresh cycle) of latency depending on settings. One thing I noticed is that using GSYNC windowed at the desktop causes the compositing manager to be a little shallower in latency -- since frame presentation is now on a timer (equalling max Hz), simulating fixed-Hz via max-framerate GSYNC.

I even see it in http://www.testufo.com/refreshrate when GSYNC is enabled (enable all 8 digits) -- the numbers jitter more after 30 minutes than if GSYNC is fully disabled. Simulating fixed-Hz via GSYNC is slightly more jittery (e.g. the 2nd or 3rd digit may jitter) than true fixed-Hz (e.g. the 4th or 5th digit may jitter); it's amazing I can see this in a web-based refresh rate averaging test. Just fresh-reboot and make the browser window the only window; you can see the GSYNC jitter after several minutes (up to 30 minutes) of running the browser only on that page.

Do one run with GSYNC at desktop, and one run without GSYNC at desktop, and make sure to keep the window windowed (not fullscreened, as sometimes Windows overrides compositing modes after a single app is fullscreened for a few seconds -- it's a new behaviour of Windows 10).

The Windows "chrome --disable-gpu-vsync --disable-frame-rate-limit" combination of two command line switches only automatically shows tearlines about 2 seconds after the window enters fullscreen mode, if you have configured NVIDIA Control Panel to run fullscreen applications in VSYNC OFF mode. That is the new desktop-based VSYNC OFF mode for borderless windowed fullscreen apps... Seeing this triggered with the Chrome browser means it accidentally triggers on non-game apps nowadays too. It often won't show in the mousearrow-vs-browser tests, as the mouse arrow sometimes runs as its own independent sprite (which may be stuck in VSYNC ON) even when the underlying framebuffers are true tearline-torn VSYNC OFF. So mousearrow frame-count determinations are not 100% accurate indicators of framebuffer lag. Regardless of the mousearrow, you can still easily generate proof that the lag of browser testers dramatically changes depending on whether Chrome is fullscreen or not -- AND depending on how the drivers are configured. It may still be VSYNC ON versus VSYNC ON, but with different buffering depths. The divergence becomes human-visible if you use those two command line options: now you've got non-tearing in windowed mode but tearing in fullscreen (F11) Chrome. So don't trust browser lag tests without knowing how Windows automatically configured the compositing workflow in realtime.

Now, back to the GSYNC refresh-cycle jitter test... launch http://www.testufo.com/refreshrate and enable all 8 decimal digits. Let it settle for 10-30 minutes. Eventually it'll permanently jitter at a specific digit and those less significant. This threshold is different for fixed-Hz versus GSYNC (check the monitor OSD). Even the Windows compositing lag is different in these GSYNC and non-GSYNC modes, on top of the jitter. Fresh-reboot between runs, or at least quit and rerun the browser after changing windowed GSYNC settings. Make sure the monitor OSD is telling you GSYNC is enabled or disabled. Meanwhile, the OS is pipelining the composited frames a bit shallower, so even the Windows compositor varies in latency depending on outside-browser factors, such as whether GSYNC is enabled during Windows desktop use. All these Windows composite-lag variances are additional error margin not found with VSYNC OFF, which streams out those frameslices in real time.

So, error margins I already know: Windows compositing and driver settings already interfere with browser lag tests. Even the Windows mouse pointer often lags differently from the Windows compositor, so even the Windows-mouse-pointer-versus-browser divergence (seen at vsynctester) is not a fully perfect test.

The only way to bypass this latency-error mess is VSYNC OFF, where Present() (and flushing it quickly) means the next scanline being output at the GPU is from the new frame. VSYNC OFF bypasses these error margins if you're trying to aim for the lowest possible absolute latencies and maximize peripheral-vision reaction-time opportunities.

Now, this is important when players react to full-screen stimuli like explosions, injury flashes (red flashes), large close movements, or long vertical stimuli (e.g. a ray gun beam extending more than half a screen tall), allowing the reaction-time clock to start at a random frameslice rather than at the start of the VBI. Games running at framerates far in excess of the refresh rate are "accidentally" beam racing in a random sense (VSYNC OFF tearlines are simply the raster splices of new framebuffers) -- a fresher frameslice means certain stimuli become visible to peripheral vision well before the centre. Single-point testers (Leo Bodnar) neglect this real-world factor, as do many lag test methodologies that don't account for the complex latency gradients of all the sync technologies, etc.

This kind of stuff is difficult for most readers to understand -- and it is easy for readers to be misinformed by inadequate explanations -- but as you (ad8e) understand beam racing (since you are christened a fellow Tearline Jedi :) ...) you already follow many of these concepts, and probably concede that the numbers can reduce slightly for real-world games thanks to the peripheral-vision factor that is only ever possible with VSYNC OFF (even with randomly-placed unsynchronized frameslices, thanks to the full-screen nature of some stimuli).

Even RTSS Scanline Sync (beam-raced sync) forces a delay between input read and framebuffer flip -- still adding a slight lag -- while VSYNC OFF means there is no intentional delay between input-read and Present(). And some stimuli might span only 2/3 or 3/4 of screen height without touching the top edge, so VSYNC OFF for mostly-full-screen stimuli can produce lower numbers than even the best-possible just-in-time, correctly-input-read-delayed, beam-raced "VSYNC ON" lookalike for fully-framebuffered 3D-rendered games.

Now, that said, certainly useful questions about the error margins beyond the system portion... (spacediver?).

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 17:53

BTW, on my system, I was able to run Tearline Jedi -- with a Direct3D-drawn rectangle under the cursor in VSYNC OFF mode. I moved the mouse arrow while running at 8000 frames per second on the buffer underneath the mouse.

The mouse arrow lag (relative to Direct3D-drawn rectangle underneath) varied massively depending on how high on the screen the mouse arrow was -- indicating that the Windows mouse arrow behaved as essentially a VSYNC ON sprite that updated position only once per refresh cycle, with the underlying graphics capable of a beamraced-like workflow (VSYNC OFF frame slicing with tearlines). The lag varied as much as 16 milliseconds!

At 8000fps VSYNC OFF, at the top edge the software draw was in perfect sync with the hardware sprite cursor.
At the bottom edge, the software-drawn cursor was moving AHEAD of the Windows mouse sprite cursor (indicating that the Direct3D draw was lower lag than the Windows mouse cursor!!) -- soft-cursor AHEAD of hardware-cursor by as much as a SHOCKING ~16ms at 60Hz (yoo hoo, lag scientists!).

And that is an AORUS GeForce GTX 1080 Ti Extreme Overclocked. Not the kind of card where you'd expect the hardware cursor to be laggier than VSYNC OFF.

Also, not all drivers / not all GPUs may use mouse-sprites. Your driver/system/etc mileage may vary.

An actual game runs in rawinput mode, so all 1000Hz of mouse data is available much more freshly to the game than to a browser. I've seen browsers process 1000Hz mice only once every 1/60sec, so a click can be much more lagged in a browser than in a game.

Even the new PointerEvents HTML5 API will often batch the 1000Hz samples in roughly 60 groups of approximately 16-17 events each, on a 60Hz system, so even those 1000Hz browser mouse events are somewhat lagged.
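The batching arithmetic is straightforward; a quick sketch of the nominal 1000 Hz mouse / 60 Hz display case described above:

```python
# Model of browser pointer-event batching: a 1000 Hz mouse delivered in
# one batch per 60 Hz frame means ~16-17 samples per batch, and the
# oldest sample in a batch can be up to a full frame stale.

mouse_hz, display_hz = 1000, 60

events_per_batch = mouse_hz / display_hz        # ~16.7 samples per batch
worst_case_staleness_ms = 1000.0 / display_hz   # ~16.7 ms for the oldest sample

print(f"{events_per_batch:.1f} events per batch")
print(f"up to {worst_case_staleness_ms:.1f} ms staleness for the oldest event")
```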

So... to zero out system latencies as much as possible:

(A) Direct3D or OpenGL VSYNC OFF
(B) Framerates well above Hz
(C) Mouse raw input mode at 1000Hz with atomic non-batched events
(D) Software cursor or software mouse response (lower lag than hardware cursor in "fps above Hz" VSYNC OFF situations).

Otherwise it's void, full stop, period.

That's what it takes to get subrefresh latencies AND avoid all the error-margin mess that browsers, mouse sprites, etc. introduce.

Besides, even the LinusTechTips video with the Slow Mo Guys was as low as ~130ms in some of the samples. I confirmed that their test meets my A+B+C+D criteria, so I believe it, based on the information that I know and my last few replies.

Certainly the territory outside of the system can be full of some unexpected error margins (e.g. network lag, human "predicting" behaviours, etc) -- and readers can easily misunderstand, misquote or misuse statistics -- which may have happened. Yes. That happens.

Olympic athletes can react faster at the starting line, AND sub-10ms button-to-photon latencies are already achievable in some real-world game engines (old Quake engines, old Source engines). The Olympics benchmark is a 100ms false-start threshold. Add 10ms to that, given that the known button-to-photons error band can manage to get that tight in some situations, and you've got 110ms. The ~130ms numbers of the streamlined system (VSYNC OFF + fps>Hz + software cursor/draw + rawinput, OR ELSE) are well within this window, given the existence of both visual and aural stimuli.

ad8e
Posts: 62
Joined: 18 Sep 2018, 00:29

Re: Interesting project about mouse/ gamepad latency

Post by ad8e » 07 Jul 2019, 19:11

Olympic athletes can't react faster than 100 ms under starting-line conditions, only with special equipment. No sprinter comes anywhere close to the 100 ms false-start threshold in a race unless they guess. As for the measured <100 ms reaction times from the research papers, those don't require Olympic-level athletes; the regular college athletes tested in Pain and Hibbs can achieve them.

A person could try emulating that setup with a piezo force sensor, giving up tactility. It's not clear to me that this will be able to achieve the same response times that the papers achieve, since Pain/Hibbs find the leading edge retroactively after seeing the full pulse. When finding the leading edge in real-time, more conservative thresholds may be needed to avoid false positives. Maybe it's possible to cut 20 ms off a regular mouse button this way while avoiding false clicks.
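To illustrate that distinction -- with an entirely made-up 1 kHz force trace and arbitrary thresholds, not figures from Pain/Hibbs -- a real-time trigger must wait for an unambiguous threshold crossing, while a retroactive pass can walk back from the peak to the true onset:

```python
# Hypothetical sketch: real-time vs retroactive onset detection on a
# force-sensor pulse. The trace, threshold, and peak fraction are all
# invented for illustration.

def realtime_trigger(samples, threshold):
    """Fire at the first sample >= threshold (no look-back allowed)."""
    for i, s in enumerate(samples):
        if s >= threshold:
            return i
    return None

def retroactive_onset(samples, peak_fraction=0.05):
    """After the full pulse is recorded, walk back from the peak to the
    first sample still above a small fraction of peak force."""
    peak = max(samples)
    idx = samples.index(peak)
    while idx > 0 and samples[idx - 1] >= peak * peak_fraction:
        idx -= 1
    return idx

pulse = [0, 0, 1, 3, 9, 25, 60, 100, 80, 30]   # fake 1 kHz force samples

print(realtime_trigger(pulse, threshold=30))   # fires later (conservative)
print(retroactive_onset(pulse))                # finds the earlier true onset
```

On this fake trace, the real-time trigger fires a couple of samples later than the retroactive onset -- the gap grows the more conservative the real-time threshold must be to avoid false positives.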

As described in viewtopic.php?f=10&t=5513#p42206, the software cursor racing ahead of the hardware cursor is only because it doesn't operate under the same constraints. The graphics card needs to fix the cursor at the beginning of scanout to avoid having multiple cursors, and it's not possible for software to do better than the graphics card, as far as I can tell. Being able to race ahead of the hardware cursor is because you're willing to give up on that constraint, or willing to have tearlines moving up and down the screen. (Maybe it's possible to beat the graphics card by 4% of frame time if the graphics card fixes the cursor at the beginning of the vblank interval.)

For other purposes, like aiming in an FPS game, the hardware/software cursor is no longer relevant and latency becomes a matter of where a player is willing to put tearlines. I like running a separate thread that polls input at 1000 Hz, although I find this causes some problems. Reflex Arena also did this.

I think all commonplace graphics cards on x86-64 systems have hardware cursors.

I didn't watch Linus's video in full, but his results can be found by skipping through and looking for a red and green number in the top left corner. Note that his times are in divisions of 60, so https://youtu.be/tV8P6T5tTYs?t=809 is a 148 ms time to move; that was the fastest I found.

If we want to speculate, here are my current guesses:
1. on the humanbenchmark test, the top Overwatch players that do0om cites can probably achieve about a 157±4 ms population average if allowed to throw away the 10% slowest results. Some people are probably faster than that by 7 ms or so; we shouldn't expect Overwatch players to have optimal reaction times.
2. with the right knowledge, one can shave 15 ms off the 155 ms reaction times with a better reaction test (black->white transition, knowledge of syncs, correctly-chosen commercially-available equipment). This is where it becomes very important to keep in mind the equipment that everyone has in common.
3. with the right knowledge, one can shave off a further 25 ms if allowed to build custom equipment, but still maintaining reasonable conditions (no false positives, no needles, visual reactions only but peripheral reactions allowed).
Last edited by ad8e on 07 Jul 2019, 19:28, edited 3 times in total.

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 19:18

ad8e wrote:Olympic athletes can't react faster than 100 ms under starting-line conditions, only with special equipment. No sprinter comes anywhere close to the 100 ms false-start threshold in a race unless they guess. As for the measured <100 ms reaction times from the research papers, those don't require Olympic-level athletes; the regular college athletes tested in Pain and Hibbs can achieve them.
The guessing factor can be a real error margin outside system latencies, and it's hard to measure.
ad8e wrote:and it's not possible for software to do better than the graphics card, as far as I can tell. Being able to race ahead of the hardware cursor is because you're willing to give up on that constraint, or willing to have tearlines moving up and down the screen.
Yes, that's the constraint -- you have to accept tearlines to get subframe reactions on full-framebuffer video games.

At 1000fps at 60Hz, you can have roughly 16 to 17 tearlines per refresh cycle (that is, 16 to 17 thin frameslices stacked on top of each other) -- fortunately the disjoints can be extremely tiny. At a 4000 pixels/sec mouseflick, that's only a 4-pixel disjoint for 1000fps Quake. So tearing approaches invisibility as the framerate approaches infinity (er, the horizontal scanrate), until it would produce a natural skew (like an anti-skew of http://www.testufo.com/scanskew) if the frame rate matched the horizontal scan rate.
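The arithmetic in that paragraph, as a sketch:

```python
# Tearline count per refresh and the pixel offset ("disjoint") at each
# tearline during a fast mouse flick, using the figures from the post.

def tearlines_per_refresh(fps, refresh_hz):
    # One tearline per extra frame spliced into the scanout.
    return fps / refresh_hz

def disjoint_px(flick_px_per_sec, fps):
    # How far the image shifts between consecutive frameslices.
    return flick_px_per_sec / fps

print(f"{tearlines_per_refresh(1000, 60):.1f} frameslices per 60 Hz refresh")
print(f"{disjoint_px(4000, 1000):.0f} px disjoint at a 4000 px/s flick, 1000 fps")
```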

Such massive full-render overkill, just to show less than 10% of each rendered frame at 1000fps at 60Hz -- but at least a portion of all 1000 frames of Quake is emitting photons. (Assuming none of the frameslices is thin enough to fully fit within the VBI, which is often only ~1% to ~5% of a refresh cycle.)

The random distribution of extremely-tiny-offset, briefly-visible tearlines means that at framerates well in excess of the refresh rate, with short tearline visibility periods (e.g. 1/240sec at 240Hz), tearing is hard to notice. So it's a worthy compromise for many.

Also, each VSYNC OFF frameslice often effectively has its own lag gradient of [+0ms...+frametime] due to scanout latency as the next frame renders. This is an overlapping "screen-area-specific" + "depth-into-frameslice" lag factor that can create a complex-looking lag heat map if you measure the per-pixel lag of a VSYNC OFF refresh cycle containing many frameslices. The lag gradient of VSYNC OFF can be a sawtooth, but with all peaks well below a refresh cycle of lag if frametimes are far shorter than refresh times. No fps=Hz situation can ever achieve lag as low as overkill frame rates far beyond Hz, even though lag consistency may be affected by the sawtoothed lag gradient.
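A toy model of that sawtooth (1080 rows, 1000fps at 60Hz assumed; uniform frametimes, no render cost, VBI ignored):

```python
# Per-scanline lag under VSYNC OFF: each frameslice starts fresh (~0 ms
# at its tearline) and ages by up to one frametime before the next slice
# splices in. Simplified: uniform frametimes, no render cost, no VBI.

def vsync_off_lag_profile(total_rows, fps, refresh_hz):
    row_time_ms = (1000.0 / refresh_hz) / total_rows
    frametime_ms = 1000.0 / fps
    lags, age = [], 0.0
    for _ in range(total_rows):
        lags.append(age)
        age += row_time_ms
        if age >= frametime_ms:   # a newer frame splices in: lag resets
            age -= frametime_ms
    return lags

lags = vsync_off_lag_profile(total_rows=1080, fps=1000, refresh_hz=60)
print(f"peak lag: {max(lags):.2f} ms")   # sawtooth peaks near one frametime (~1 ms)
print(f"vs a full 60 Hz refresh: {1000.0 / 60:.1f} ms")
```

The sawtooth never exceeds one frametime, so at 1000fps every scanline is under ~1 ms old, versus up to ~16.7 ms for fps=Hz.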

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 19:35

ad8e wrote:If we want to speculate, here are my current guesses:
1. on the humanbenchmark test, the top Overwatch players that do0om cites can probably achieve about a 157±4 ms population average if allowed to throw away the 10% slowest results. Some people are probably faster than that by 7 ms or so; we shouldn't expect Overwatch players to have optimal reaction times.
2. with the right knowledge, one can shave 15 ms off the 155 ms reaction times with a better reaction test (black->white transition, knowledge of syncs, correctly-chosen commercially-available equipment). This is where it becomes very important to keep in mind the equipment that everyone has in common.
3. with the right knowledge, one can shave off a further 25 ms if allowed to build custom equipment, but still maintaining reasonable conditions (no false positives, no needles, visual reactions only but peripheral reactions allowed).
Enough tests have been done to show that some game stimuli (especially bright full-screen flashes) are not necessarily identical to Olympic pistol-gunshot stimuli. There is likely a fuzzy zone of multiple tens of milliseconds that is poorly studied, because of amplified game-specific stimuli and the ease of pressing a mouse button less than a millimetre, versus a VHS-quality high-speed camera pointed at an athlete's foot.

We need more researchers on this, who understand to account for the “accidentally beamraced” A+B+C+D test situation.

Do you have ResearchGate credentials? Would like to see your published work.

Chief Blur Buster
Site Admin
Posts: 6857
Joined: 05 Dec 2013, 15:44

Re: Interesting project about mouse/ gamepad latency

Post by Chief Blur Buster » 07 Jul 2019, 19:53

Here's ezQuake doing 960fps at 144Hz generating approximately (960/144) = about 6-7 tearlines per refresh cycle. Lower the refresh rate to 60 Hz, and this turns into 16-17 tearlines per refresh cycle with even tinier offsets.

Image

That unfortunately means roughly 85% of the rendering is wasted in order to achieve guaranteed subrefresh latency, since full frames are rendered only to stream a thin frameslice out of the GPU on the spot, for an extremely brief moment -- but what you gain is true subrefresh button-to-photons latency on gaming monitors that do realtime cable-to-panel synchronous scanout.

If you properly enable A+B+C+D from my post -- and you use a monitor capable of subrefresh latency -- then it is already confirmed that the topmost frameslice's photons are hitting your eyes long before the bottommost frameslices have even begun to be rendered!

Even the bottommost frameslice's photons can be beginning to hit your eyes while the topmost frameslice of the next refresh cycle is beginning to be rendered. There can even be a wraparound frameslice (one frameslice that wraps around from the bottom edge back to the top edge).

The refresh cycle thus is a metaphorical rolling ring buffer of latency, with the frameslices streamed at whatever point the raster is currently at. As long as you're VSYNC OFF, and as long as the game is blasting frames faster than Hz, you're already getting subrefresh latency button-to-GPU-output.

Effectively, you're getting accidental beam-raced lag benefits simply from the sheer overkill framerate and sheer render wastage -- a "low-lag-at-raster" effect, since a fully rendered frame is always available, so there are low-lag scanlines ready to deliver at whatever raster point. Going even further, at 6000fps at 60Hz we're wasting 99% of each frame, but 1% of each of the 6000 frames (except those frameslices fitting entirely within the VBI) is made human-visible -- nearly 100 frameslices per refresh cycle. But we only need 300fps at 144Hz to begin to see accidental beam-raced lag-saving benefits that may not be factored in.
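The render-wastage figures quoted in this thread all fall out of one ratio (sketch; VBI ignored):

```python
# Fraction of each rendered frame that actually reaches the screen when
# only one frameslice per frame is scanned out (VBI ignored).

def visible_fraction(fps, refresh_hz):
    # Each frame contributes one slice covering refresh_hz/fps of the screen.
    return refresh_hz / fps

for fps, hz in [(960, 144), (1000, 60), (6000, 60)]:
    frac = visible_fraction(fps, hz)
    print(f"{fps} fps @ {hz} Hz: {frac:.0%} shown, {1 - frac:.0%} wasted")
```

This reproduces the ~85% wastage for 960fps at 144Hz and the 99% wastage for 6000fps at 60Hz mentioned above.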

Unfortunately, not many researchers understand this and factor all of it in. Researchers are often specialized experts in their respective areas -- e.g. the Olympic athlete -- and when they try to lag-measure screens, they don't understand the subrefresh nuances, lag gradients, and interactions between GPU and monitor. But a game running VSYNC OFF at framerates far in excess of the refresh rate is apparently getting accidental beam-raced lag-reducing benefits from the de facto streaming of frameslices in real time.

Assuming A, B, C, and D are all true, a game has always (historically) been found capable of sub-refresh latency from button to GPU output, no matter what the game is. A+B+C+D simply needs to be true in order to be confident of achieving subrefresh latencies.
(A) VSYNC OFF
(B) fps > Hz by significant amount
(C) Mouse raw input, atomic events, choose a known low-lag mouse (to avoid laggy mice)
(D) Software rendering (not hardware sprites, e.g. mouse cursor)

Obviously, you'll have to choose a display that has subrefresh latency capability (using the VSYNC OFF lag measurement metric, aka the first-anywhere metric, rather than the Leo Bodnar scanout-latency-affected metric). But choose an "esports mouse" and an "esports display"; they will usually (>50%-95% of the time) have low enough latencies not to be a human-reaction-time error margin. Even randomly selected 240Hz monitors and randomly selected 1000Hz mice now tend to affect lag-chain results by less than a 10ms amplitude, which is impressively tight. At least mouse lag is equipment-measurable, and display lag is equipment-measurable, so these can be controlled.

That said, I think we need more esports research that factors in:
- scanout, and understanding how VSYNC OFF becomes accidental beam racing
- first-anywhere vs first-single-point measurement
- stimuli differences (visual flash vs audio gunshot) and multi-stimuli that often leapfrog each other (flash, movement, audio, direct gazes, peripheral vision, etc) in ways that low-end scientific equipment does not always measure
- the hyper-adrenaline factor (esports gamers are often really pumped, like a skydiver at their peak)
- unexpected predictive factors (identifying unidentified earlier stimuli that may occur before the main stimulus, such as an onscreen ammo number decrementing before the gunshot flash)
Etc.
