Reduce Input Delay (Mouse & Keyboard Buffer Size)

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
andrelip
Posts: 87
Joined: 21 Mar 2014, 17:50

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by andrelip » 07 Feb 2021, 20:27

I asked him in the comment to open Perfmon to check the C-state in use, and I even mentioned that "disable idle" helps in cases where Windows ignores the "disable C-state" or "max C-state" flag from the BIOS. With Perfmon, we could check whether it was already in C0 from the BIOS; if that was the case, then OBVIOUSLY it wouldn't make a difference, and could even make things worse (especially with HT ON) because of the removal of Windows' Idle Thread.

Setting Promote and Demote Idle to 100 will force C1 as the max state, and Perfmon can confirm that for you. Compare the C1 and Idle graphs; both should be identical, as all idle time is spent in C1 (non-idle in C0, of course). Perfmon will also show the intermediate states when those settings are not at 100, gradually degrading the state, with the worst case at 0 (of course) for both settings. THIS IS ALL IN GRAPHS AND PERFECTLY ANALYZABLE. You can see the advantages in mouse polling by using MouseTester and comparing both extremes. Those settings are only relevant if you are not disabling idle entirely, because that overrides any other state setting. Bitsum, High Performance, and Ultimate don't have them at 100 and use all the other C-states. You can search for a table of each C-state's transition latency to C0 for your own processor.

You will find no good sources on what Disable Idle does in Windows, but you can look up the Linux implementation that does the same thing to keep the processor in C0; search for idle=poll in the Linux context. Some PCs can behave better with idle left enabled and DEMOTE and PROMOTE at 100, as the C1-to-C0 transition is already very fast. The advantage of being in C0 all the time can be obscured by the side effects of disabling the Idle Thread. The trade-off will depend on the processor's power-saving technologies, its C1-to-C0 latency, the amount of context switching, how interrupts are scheduled, driver latencies, bottlenecks, interrupts or threads blocking the processor, the type of load and background threads, and so on. If the C1->C0 transition is very fast, then keeping the re-scheduling and the other Idle Thread features will help. You can confirm in Perfmon that this is forcing C0, as all the C-state graphs will be at zero.
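To put rough numbers on that trade-off, here is a minimal Python sketch of expected wake latency under different idle-state residencies. The exit latencies are placeholders I invented for illustration, not measured values; the real per-state numbers come from your own processor's table.

```python
# Rough expected-wake-latency model for the C-state trade-off described above.
# The exit latencies are ILLUSTRATIVE placeholders, not measured values.
EXIT_LATENCY_US = {"C0": 0.0, "C1": 1.0, "C6": 100.0}  # hypothetical numbers

def expected_wake_latency(residency):
    """residency: dict state -> fraction of idle time spent there (sums to 1)."""
    return sum(frac * EXIT_LATENCY_US[state] for state, frac in residency.items())

# Promote/Demote Idle at 100 -> all idle time lands in C1:
print(expected_wake_latency({"C1": 1.0}))             # 1.0 us
# A default plan letting the core sit mostly in deep idle:
print(expected_wake_latency({"C1": 0.2, "C6": 0.8}))  # 80.2 us
```

The point of the comparison: whether the ~1 us figure beats forcing C0 outright depends on exactly the side effects listed above.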

He did shit videos for years, blindly following roach-like tutorials. Then, a few weeks ago, he decided to test things properly with input lag tools, which is an excellent path to follow, but he suddenly became a fucking arrogant prick who thinks he is the paladin of test methodology. He spread placebo to thousands of people, and ironically, most of it was refuted by his own tests. Here I'm talking about something easily measured by Microsoft's own tools, and he complained about it being a flat-earth tweak, the fucking asshole. I'm not even going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle the TAB, change the brightness, or use any other real-time event from the game.
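On the signal-versus-noise point: a quick simulation shows why one run per configuration cannot resolve a sub-millisecond tweak under typical frame-time noise, while repeated samples can. The latency and noise numbers here are invented for illustration.

```python
# Why one run per tweak can't separate signal from noise: two configs whose
# true latency differs by 0.5 ms, measured under 2 ms of Gaussian noise.
import random
import statistics

random.seed(42)

def measure(true_latency_ms, noise_sd_ms=2.0):
    return random.gauss(true_latency_ms, noise_sd_ms)

# Single runs: the ordering is basically a coin flip.
a_once, b_once = measure(20.0), measure(20.5)

# Many runs per config: the 0.5 ms difference emerges from the noise.
a_many = [measure(20.0) for _ in range(2000)]
b_many = [measure(20.5) for _ in range(2000)]
print(statistics.mean(a_many), statistics.mean(b_many))
```

With 2000 samples the standard error of each mean is about 0.045 ms, small enough to resolve the 0.5 ms gap; a single sample has 2 ms of uncertainty.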

deama
Posts: 243
Joined: 07 Aug 2019, 12:00

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by deama » 08 Feb 2021, 11:26

andrelip wrote:
07 Feb 2021, 20:27
I'm not even going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle the TAB, change the brightness, or use any other real-time event from the game.
The animation delay isn't a problem, as he uses the same characters/guns/areas in those situations, so as long as it's all the same, it won't matter.

matro
Posts: 1
Joined: 08 Feb 2021, 17:25

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by matro » 08 Feb 2021, 17:26

thizito wrote:
26 Jan 2021, 03:02

-> Using PMT (lower performance but way smoother; depends on the game.. old games are perfect)
what exactly is PMT and how do I enable it? I play games from 2005-2015, so DX9 and lower. I'm curious to try out "PMT"

andrelip
Posts: 87
Joined: 21 Mar 2014, 17:50

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by andrelip » 10 Feb 2021, 12:20

deama wrote:
08 Feb 2021, 11:26
andrelip wrote:
07 Feb 2021, 20:27
I'm not even going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle the TAB, change the brightness, or use any other real-time event from the game.
The animation delay isn't a problem, as he uses the same characters/guns/areas in those situations, so as long as it's all the same, it won't matter.
The animation delay is OKAY as long as it is not treated as the actual input lag value and is only used for relative comparison between measurements. It is also important to note that it adds a fixed amount of input lag and can introduce jitter. Many games have slight movement in the standing-still pose, or even different versions of the same animation type to add some dynamism. The interpolation can generate different intermediate frames between the last state and the first key-frame. If the Arduino's sensitivity is set precisely to capture only the gun-fire, this variation should mostly be mitigated.

Network events and Lag Compensation algorithms are a different story.

Disconnect the Internet and press mouse1 before the timeout. Does the game still trigger the fire animation? If yes, then it is an acceptable way of testing. Otherwise, it will jitter with the networking/tick rate and require many more samples to extract the signal from the noise.

About tick rate: if it is limiting the event, it is like benchmarking a function that has a sleep() at the end of the call that takes far longer than the execution time. Also, note that this affects even local servers.
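The analogy above, sketched in Python (the tick interval is a made-up example value):

```python
# Benchmarking a function whose runtime is dominated by a trailing sleep()
# measures the sleep (the tick interval), not the work you care about.
import time

def work_then_wait(tick_interval_s=0.015):  # ~64 Hz tick, hypothetical
    t0 = time.perf_counter()
    sum(range(1000))              # the actual "work": microseconds
    time.sleep(tick_interval_s)   # waiting for the next server tick
    return time.perf_counter() - t0

elapsed = work_then_wait()
print(f"measured: {elapsed * 1000:.1f} ms (almost all of it is the sleep)")
```

Whatever you change in the "work" part, the measurement barely moves, because the sleep dominates; that is what a tick-rate-bound test does to an input lag measurement.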

The safe real-time event in most games is the camera angle. It still works with no connectivity; the internet round trip will not block it, nor will the internal game server or lag compensation. Just use mouse1 to turn with a very high sensitivity so it performs a 180° in a single event, from a spot with 2 distinct wall colors.
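As a back-of-envelope check on that flick setup, here is the usual counts-per-degree arithmetic, assuming a Source-style yaw of 0.022 degrees per count (an illustrative constant; it varies per engine):

```python
# How many mouse counts a turn needs at a given in-game sensitivity,
# using degrees_turned = counts * yaw * sensitivity.
YAW_DEG_PER_COUNT = 0.022  # Source-engine-style default; an assumption here

def counts_for_turn(degrees, sensitivity):
    return degrees / (YAW_DEG_PER_COUNT * sensitivity)

# At a very high sensitivity the whole 180-degree turn fits into the counts
# of a brief flick, so the camera flip acts as a near-instant event marker:
print(counts_for_turn(180, 20))  # ~409 counts at sensitivity 20
```

The higher the sensitivity, the fewer counts (and thus fewer polling intervals) the 180° needs, which is why the event reads as effectively instantaneous on camera.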

thizito
Posts: 52
Joined: 14 Mar 2014, 09:46

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by thizito » 14 Feb 2021, 18:39

matro wrote:
08 Feb 2021, 17:26
thizito wrote:
26 Jan 2021, 03:02

-> Using PMT (lower performance but way smoother; depends on the game.. old games are perfect)
what exactly is PMT and how do I enable it? I play games from 2005-2015, so DX9 and lower. I'm curious to try out "PMT"
go to calypto.us and Ctrl+F "PMT"; he explains it better than I could here
edit: I had a positive experience on Windows 8; idk about 10 and 7, as I use 8 to play Valorant

empleat
Posts: 78
Joined: 28 Feb 2020, 21:06

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by empleat » 26 Feb 2021, 16:12

It is crazy how many bottlenecks exist! FYI: you also have to disable smoothing in the key Computer\HKEY_CURRENT_USER\Control Panel\Mouse by deleting the data for the SmoothMouseXCurve & SmoothMouseYCurve values. Each should then show a zero-length binary value. This disables smoothing and reduces input lag! And it should give a 1:1 mouse movement ratio!

I should already have a 1:1 mouse movement ratio...

1. Windows 6/11
2. Enhance pointer precision off
3. in Computer\HKEY_CURRENT_USER\Control Panel\Mouse I also deleted all data for SmoothMouseXCurve & SmoothMouseYCurve; no other setting that would cause acceleration is enabled there
4. my mouse sensor doesn't cause acceleration

Then I don't know why my aim improved and feels more consistent after switching to raw input! Even raw input didn't prevent the acceleration set by Windows, so I don't know why it even exists, besides that it should provide lower latency, because the game can get data straight from the mouse without going through Windows. The mouse feels more consistent with raw input on, but strangely, there is higher input lag when using it! In some cases this may be because some games also activate smoothing when you switch raw input on, like CS:GO.

I don't know why; this is as stupid as hardcoding negative acceleration into games... While I am complaining: I also have no idea why all games don't use the same sensitivity scale, like 1.00 for 1:1 mouse movement and no interpolation! Every game sets it to something else, like 10/20/50. It is a pain for gamers to find this value, as you don't want your sensitivity to differ per game, for muscle memory's sake! Also, some games write their config programmatically, so I'm not even sure you can translate the bits there into words to find which line is the sensitivity, but I've only run into this in 1 SP game, so I didn't bother to find out...

I don't feel like I have acceleration with raw input off, but it is not as consistent; no idea why! Maybe dynamic lag caused by something in Windows? I should already have a 1:1 mouse ratio; if so, then it must be something else causing the inconsistent feeling. And I don't like using raw input, because it causes huge lag for me in many games! E.g. CS:S was fine with raw input on, though, as I heard it didn't use smoothing with it.

Setting the mouse buffer size to 20, input lag drops drastically even with raw input on! But mouse movement becomes a little bit inconsistent; everyone has to find their own balance between consistency and input lag on their system...

It would be interesting to know what exactly happens to a mouse packet from the time it is sent by the mouse until it renders on the screen. My limited understanding:
1. (skipping the part where the mouse picks up movement)
2. the mouse sends a packet over USB
3. the USB controller polls for data (as it is fast and doesn't tax the CPU; up to 8 kHz on Intel mobos)
4. when it picks up data, it sends an interrupt to the CPU through the USB chipset; that's why there is some additional latency, e.g. on USB3, as it is complex
5. upon the CPU receiving the interrupt, the ISR is executed and then a DPC is scheduled
6. once the DPC is handled by the OS, there is another bottleneck, the timer resolution window, which allows data to be delivered only every 0.5 ms; if you miss this window, you have to wait for the next one before the program can receive data from the mouse
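Step 6 of the list above can be sketched numerically. Assuming delivery really does snap to a periodic 0.5 ms window (the figure quoted above), events arriving at uniformly random times pick up about 0.25 ms of added delay on average:

```python
# If delivery only happens on a periodic timer tick, an event landing just
# after a tick waits almost a full period; on average, half a period.
import math

TICK_MS = 0.5

def delivery_delay(event_time_ms):
    """Delay until the next tick boundary at or after the event."""
    return math.ceil(event_time_ms / TICK_MS) * TICK_MS - event_time_ms

# Sweep events uniformly across one tick period (0.000 .. 0.499 ms):
delays = [delivery_delay(0.001 * i) for i in range(500)]
print(sum(delays) / len(delays))  # close to 0.25 ms
```

The worst case is just under one full window (0.5 ms); only an event landing exactly on a tick gets through with zero added delay.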

I don't know where exactly in this procedure the mouse buffering occurs. But since the registry value you edit belongs to a driver, I would guess it happens somewhere at the level of the USB/OS part of the USB driver stack. This must be some relic from Windows XP or earlier, so that your crappy CPU/drivers won't lag out and mouse clicks still get registered, not to mention high DPC latency, which would cause sound pops/clicks!

I have no idea if buffering can reduce the number of interrupts. As I read it, its purpose is to retrieve data at an optimal pace until none is left in the buffer. So yeah, this affects consistency: if your crappy CPU were taxed, it wouldn't register clicks, or some shenanigans would happen and mouse movement would become inconsistent at best! I don't see why the CPU would buffer an interrupt after handling it, so I thought perhaps it buffers the data beforehand, to handle it at an optimal pace? But I have no idea how this works, so... it is just a guess.
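One toy model of that guess, assuming reports simply queue up while the consumer stalls. This is an illustration I wrote, not how the actual Windows mouse class driver is implemented:

```python
# Producer fills a bounded queue at the polling rate; if the consumer stalls,
# a big queue replays stale reports (extra latency), while a small queue
# overflows and drops the oldest ones (lost motion = inconsistency).
from collections import deque

def stall_latency_ms(queue_size, stall_ms, poll_hz=1000):
    """Age of the oldest report still queued after a consumer stall."""
    q = deque(maxlen=queue_size)       # bounded deque drops the OLDEST on overflow
    period_ms = 1000 / poll_hz
    for i in range(int(stall_ms / period_ms)):
        q.append(i * period_ms)        # timestamp of each polled report
    now = stall_ms
    return now - q[0] if q else 0.0    # oldest report's age = added lag

print(stall_latency_ms(100, 50))  # big queue: 50 ms of stale motion queued
print(stall_latency_ms(20, 50))   # small queue: only the last 20 ms kept
```

This reproduces the trade-off reported in the thread: a smaller queue bounds the worst-case staleness (lower lag) but throws motion away under load (less consistency).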

andrelip
Posts: 87
Joined: 21 Mar 2014, 17:50

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by andrelip » 26 Feb 2021, 17:15

Negative acceleration is an exciting subject.

Some considerations:

1: If FPS and Hz are not harmonically aligned, it will not feel like 1:1.

2: Does the game use an async thread to read mouse input?

3: The way the CPU is being limited may affect I/O.

3.1 Is the CPU 100% busy?
3.2 Does it have an FPS cap? How does the game implement its sleep() function?
3.3 Is there any sleep() happening in the present hooks (RTSS)?
3.4 Is the CPU being blocked by a Max Pre-Rendered Frames queue of 1 while waiting for the GPU?
3.5 Is V-Sync limiting the game?

4: Is there a race condition in mouse interrupt processing?
4.1 Is the system buffering and processing it in batches?

5: Is the processor going into deeper C-states?

These are just some of the things that can drop or jitter events and create the feeling of negative acceleration even with perfect hardware.
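Point 1 (the FPS/Hz harmonic relationship) can be demonstrated with a small VSYNC-style model: when the FPS divides evenly into the refresh rate, every frame occupies the same number of scanouts; otherwise the pattern alternates, and equal mouse steps land unevenly on screen. The model and the example rates are my own illustration:

```python
# When FPS does not divide evenly into the refresh rate, frames persist for an
# uneven number of scanouts, so equal mouse steps land unevenly on screen.
from fractions import Fraction

def scanouts_per_frame(fps, hz, n_frames=8):
    """How many refresh cycles each of the first n_frames occupies (VSYNC model)."""
    frame, refresh = Fraction(1, fps), Fraction(1, hz)
    counts = []
    for i in range(n_frames):
        start, end = i * frame, (i + 1) * frame
        # count the refresh boundaries that fall within this frame's lifetime
        counts.append(sum(1 for k in range(1, 2 * hz) if start < k * refresh <= end))
    return counts

print(scanouts_per_frame(72, 144))   # 144/72 = 2: every frame shown twice
print(scanouts_per_frame(100, 144))  # mix of 1s and 2s: uneven steps
```

The even [2, 2, 2, ...] pattern feels like 1:1; the mixed pattern is the harmonic mismatch described in point 1.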

empleat
Posts: 78
Joined: 28 Feb 2020, 21:06

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by empleat » 26 Feb 2021, 19:33

andrelip wrote:
26 Feb 2021, 17:15
Negative acceleration is an exciting subject.
I was talking in the middle part about negative acceleration in general, since I was touching on the negative aspects of mouse movement.

I don't feel like I have strong negative acceleration currently. I don't feel acceleration; the cursor moves pretty much the same distance. It is more like dynamic lag. I didn't even know you could cause negative acceleration through all those things you mentioned. But there are games with negative acceleration hardcoded into them, e.g. Arma 3 and Horizon Zero Dawn!

1. I use G-SYNC with FPS capped at 140; it doesn't even overshoot, it's constant and stable.
2. The game in question is BF4, or BF1; no idea what they use. Also, nothing I can do about that.

3.1 Probably not 100% busy, but BF is CPU-demanding. It is a 9600KF, so it should be more than fine for both games! I have also enabled all CPU cores, as they are not enabled by default.
3.2 Yep, FPS capped at 140. I read it is best to cap FPS in the game engine for lower input lag, but RivaTuner is also a good option; maybe I can try it too!
3.3 Not sure what that means; I didn't use RTSS much except for the FPS overlay, nor do I currently use it actively (they also ban in BF for ReShade etc.; not sure if you mean hooking an application into the game for some measurement) - I don't want to do that!
3.4 BF uses render queue -1; not sure how that works exactly, as my understanding is you need at least 1 pre-rendered frame from the CPU so the GPU has work to do. Maybe it's something like Nvidia Ultra Low Latency Mode, just implemented in the game to reduce the render queue. It is set to -1 by default and feels best; input lag is significant at 1!
3.5 No V-Sync.

4. Probably not. I have an average DPC latency of 100 us; lately I saw some spikes from ntoskrnl.exe to 200 us (interrupt-to-user-process latency), which is still OK. Wdf01000.sys has about a 33 us maximum execution time, tested outside the game for 30 minutes!
4.1 I don't know how to monitor buffering. I recently tried this tweak, and reducing it makes the mouse more inconsistent but lowers input lag. However, I felt the same before using it!

5. I use the Ultimate power profile + disable idle saver; there is no C1 time or idle time in Perfmon. C-states are disabled in the BIOS; only Intel Speed Shift and turbo are enabled.
Also no overheating, no thermal throttling, no OC.

mossfalt
Posts: 29
Joined: 23 Nov 2020, 08:43

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by mossfalt » 27 Feb 2021, 17:47

howiec wrote:
03 Feb 2021, 18:46
Oh and btw, MouseDataQueueSize does make a difference..
what kind of difference are you experiencing :idea:

howiec
Posts: 167
Joined: 17 Jun 2014, 15:36

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by howiec » 28 Feb 2021, 00:25

mossfalt wrote:
27 Feb 2021, 17:47
howiec wrote:
03 Feb 2021, 18:46
Oh and btw, MouseDataQueueSize does make a difference..
what kind of difference are you experiencing :idea:
What you'd expect from a smaller buffer: it feels more responsive, accurate, and precise. Too low a value can obviously lead to issues, though.

Post Reply