Reduce Input Delay (Mouse & Keyboard Buffer Size)

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Ferr0
Posts: 31
Joined: 26 Jan 2021, 10:40

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by Ferr0 » 03 Feb 2021, 03:14

Brainlet wrote:
03 Feb 2021, 01:29
DO NOT disable idle when using Hyperthreading. His test is meaningless since he only tested it with HT on... In general I'd take his "testing" with as much scepticism as e.g. r0ach's list. Both make claims without proof. No, writing down some numbers in a spreadsheet with 0 context or how they've been measured is not "proof". Just because he has a tool it doesn't mean his data or testing methodology is correct nor do his settings and general setup match everyone else's. As usual, test things on YOUR OWN system and be aware of dependencies like these.
I completely agree with you about testing on your own system and views on general skepticism regarding this stuff, and perhaps the way I phrased my initial comment regarding disabling idle was too matter-of-fact. It was meant to be taken as a possible alternative to what was proposed earlier in the thread. Ultimately what seems better to you on your own system is going to be best. I would like to point out though, he did actually disable HT in his tests. He was on a 10900k and had 10 cores in task manager. He has also provided context as to how he does his measurements, the videos are on his channel. Obviously no singular "experiment" should be taken as fact but it does offer something to take into possible consideration.

One last thing: character attacks aren't helpful in any manner. Things like "the tests are as usefull as fr33thy as human being" really only serve the purpose of being childish.

Brainlet
Posts: 100
Joined: 30 May 2020, 12:39
Contact:

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by Brainlet » 03 Feb 2021, 03:30

Ferr0 wrote:
03 Feb 2021, 03:14
Brainlet wrote:
03 Feb 2021, 01:29
DO NOT disable idle when using Hyperthreading. His test is meaningless since he only tested it with HT on... In general I'd take his "testing" with as much scepticism as e.g. r0ach's list. Both make claims without proof. No, writing down some numbers in a spreadsheet with 0 context or how they've been measured is not "proof". Just because he has a tool it doesn't mean his data or testing methodology is correct nor do his settings and general setup match everyone else's. As usual, test things on YOUR OWN system and be aware of dependencies like these.
I completely agree with you about testing on your own system, ultimately what feels better to you is going to be best. I would like to point out though, he did actually disable HT in his tests. He was on a 10900k and had 10 cores in task manager.
Got a link to where he tested idle with HT off? The single thread performance results and the fact that he specified "HT OFF" in #53 and #55 suggest he has been testing everything with HT on.
Starting point for beginners: PC Optimization Hub

Brainlet
Posts: 100
Joined: 30 May 2020, 12:39
Contact:

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by Brainlet » 03 Feb 2021, 05:43

Alright, I skimmed through the hour-long videos to find it. Let's break it down into a TLDR of the things I gathered:
- He's running 5.2 GHz which suggests he hasn't properly stress tested for stability (likely the same applies to RAM).
- He turned off HT in BIOS.
- He changed two more settings at once besides "disable idle".
- The CPU-Z single-threaded benchmark is, even according to him, "the same", proving that #19 in the spreadsheet was tested with HT on. Very misleading.
- Rebooting rerolls the "ideal processor" selection (check with Process Explorer) of threads which makes testing things in general very tiresome due to inconsistencies across reboots.

All in all, I'd be VERY careful with blindly following his advice (anyone's advice in general). If you're determined to tweak your system, be aware that it will be a long process of information gathering and testing on your own system. Guides show you WHAT can impact your system, even if it's not always clear HOW it will impact it.
Ferr0 wrote:
03 Feb 2021, 03:14
One last thing: character attacks aren't helpful in any manner. Things like "the tests are as usefull as fr33thy as human being" really only serve the purpose of being childish.
If you're referring to me, I didn't post this comment.
Starting point for beginners: PC Optimization Hub

Ferr0
Posts: 31
Joined: 26 Jan 2021, 10:40

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by Ferr0 » 03 Feb 2021, 10:18

Brainlet wrote:
03 Feb 2021, 05:43
Alright, I skimmed through the hour-long videos to find it. Let's break it down into a TLDR of the things I gathered:
- He's running 5.2 GHz which suggests he hasn't properly stress tested for stability (likely the same applies to RAM).
- He turned off HT in BIOS.
- He changed two more settings at once besides "disable idle".
- The CPU-Z single-threaded benchmark is, even according to him, "the same", proving that #19 in the spreadsheet was tested with HT on. Very misleading.
- Rebooting rerolls the "ideal processor" selection (check with Process Explorer) of threads which makes testing things in general very tiresome due to inconsistencies across reboots.

All in all, I'd be VERY careful with blindly following his advice (anyone's advice in general). If you're determined to tweak your system, be aware that it will be a long process of information gathering and testing on your own system. Guides show you WHAT can impact your system, even if it's not always clear HOW it will impact it.
Ferr0 wrote:
03 Feb 2021, 03:14
One last thing: character attacks aren't helpful in any manner. Things like "the tests are as usefull as fr33thy as human being" really only serve the purpose of being childish.
If you're referring to me, I didn't post this comment.
Again, I'm not saying anyone should blindly follow his advice. People should definitely take what he says with a big grain of salt, and that goes for any tweak or anything people read about.

It's very possible that he hasn't properly stress tested, but it's also possible he has; it's not impossible to hit a stable 5.2 GHz on a 10900k. That just reinforces the idea that people should take what they read with a lot of skepticism, but I don't think this should be flatly dismissed or ignored based on conjecture. It should hold as much weight as any other tweak until individuals test it on their system and decide what's best. I'm also not saying there aren't inherent problems with his testing methodology; I'm just saying he did disable Hyperthreading and his methodology is actually documented.

Again though, I'm not saying you should take his test results as fact. I'm saying there's a possibility that the advice from earlier in this thread is actually wrong and you should test the values yourself.

That last comment about childish behavior wasn't aimed at you, I should have quoted the guy that did.

howiec
Posts: 183
Joined: 17 Jun 2014, 15:36

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by howiec » 03 Feb 2021, 18:46

I wouldn't bash his suggestion for disabling CPU idle because in theory it can have merits.

I've been running with latency optimized BIOS and Windows power plan settings for a very long time now (C-states disabled in BIOS, disable core parking, adjusting various promote/demote thresholds, etc.).

He is technically correct that properly disabling CPU idle should put your CPU in C0 100% of the time and will significantly increase minimum CPU temps/power consumption even without actually loading the CPU with something like Prime95.

Now, in practice, the latency penalty of transitioning from C1 to C0 should be on the order of single-digit microseconds or lower on modern Intel CPUs. I'm not sure how Zen 3 fares, but I'm willing to bet they're similar now.
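To put that in perspective, here is a back-of-the-envelope sketch of how little a C1 exit adds per mouse event. The exit latency and polling rate below are illustrative assumptions, not measurements from any particular CPU:

```python
# Rough estimate of how much C1 exit latency can add per input event.
# Assumed numbers (illustrative only): a ~2 us C1 -> C0 exit latency,
# i.e. the single-digit-microsecond range mentioned above.
C1_EXIT_LATENCY_US = 2.0      # assumed worst-case C1 -> C0 transition
POLL_INTERVAL_MS = 1.0        # 1000 Hz mouse polling

# Worst case: the CPU is idle in C1 when every poll arrives, so each
# mouse report pays one exit latency before it can be processed.
added_latency_ms = C1_EXIT_LATENCY_US / 1000.0
fraction_of_poll = added_latency_ms / POLL_INTERVAL_MS

print(f"added latency per event: {added_latency_ms:.4f} ms")
print(f"fraction of one 1000 Hz poll interval: {fraction_of_poll:.1%}")
```

Even in this worst case the penalty is a fraction of a percent of one poll interval, which is why whether it is perceptible at all is an open question.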

Whether or not you will see a noticeable benefit depends on many factors including your sensitivities and your use-case, so you'll have to test it yourself. Some programs, if not written to handle the possibility of C1 being disabled, may mistakenly view the CPU as fully loaded and resource constrained, and thus respond inappropriately but I haven't run into this myself.

The obvious tradeoff here is a large jump in "idle" CPU temps and power draw for a relatively small latency improvement in most cases.

So, you could choose to selectively disable CPU idle during periods of specific program/gaming sessions assuming you see a latency/performance benefit at the expense of power/heat, instead of just leaving idle disabled all the time.
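A minimal sketch of how such per-session toggling could be scripted (Windows-only; it assumes `powercfg`'s `idledisable` setting alias under the `sub_processor` group, which you should verify on your own system with `powercfg /q` before relying on it):

```python
import platform
import subprocess

def idle_disable_cmds(disable: bool) -> list:
    """Build the powercfg commands that toggle 'processor idle disable'
    on the active power plan (1 = idle disabled, 0 = idle enabled)."""
    value = "1" if disable else "0"
    return [
        ["powercfg", "/setacvalueindex", "scheme_current",
         "sub_processor", "idledisable", value],
        ["powercfg", "/setactive", "scheme_current"],  # re-apply the plan
    ]

def set_idle_disabled(disable: bool) -> None:
    """Apply the setting; raises on non-Windows hosts."""
    if platform.system() != "Windows":
        raise RuntimeError("powercfg is Windows-only")
    for cmd in idle_disable_cmds(disable):
        subprocess.run(cmd, check=True)

# Example: set_idle_disabled(True) before a gaming session,
# set_idle_disabled(False) afterwards to get idle states back.
print(idle_disable_cmds(True)[0])
```

This only automates what the tweak already does by hand; the trade-off howiec describes (heat and power for a small latency gain) is unchanged.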

Oh and btw, MouseDataQueueSize does make a difference.

andrelip
Posts: 160
Joined: 21 Mar 2014, 17:50

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by andrelip » 07 Feb 2021, 20:27

I asked him in the comments to open Perfmon to check the C-state in use, and I even mentioned that "disable idle" helps in cases where Windows ignores the "disable C-State" or "max C-state" flag from the BIOS. With Perfmon, we could check if it was already in C0 from the BIOS; if that was the case, then OBVIOUSLY it wouldn't make a change, or would even make things worse (especially with HT ON) because of the removal of the Windows "Idle Thread".

Setting Promote and Demote Idle to 100 will force C1 as the max state, and Perfmon can confirm that for you: compare the C1 and Idle graphs, and both should be the same, as all idle time is in C1 (non-idle in C0, of course). Perfmon will also show the different states when those settings are not at 100, gradually degrading the state, with the worst case at 0 (of course) for both settings. THIS IS ALL IN THE GRAPHS AND PERFECTLY ANALYZABLE. You can see the advantages in mouse polling using MouseTester and comparing both extremes. Those settings are only relevant if you do not use "disable idle: disabled", because that will override any other state setting. Bitsum, High Performance, and Ultimate don't have them at 100 and use all the other C-states. You can search for a table of each C-state's transition latency to C0 for your own processor.

You will find no good sources on what Disable Idle does in Windows, but you can look at the Linux implementation that does the same thing to keep the processor in C0; search for idle=poll in the Linux context. Some PCs can behave better with "disable idle: enabled" and DEMOTE and PROMOTE at 100, as C1 to C0 is already very fast. The advantage of having C0 all the time can be obscured by the side effects of disabling the Idle Thread. The trade-off will depend on the processor's power-saving technologies, its C1-to-C0 latency, the amount of context switching, how the interrupts are being scheduled, driver latencies, bottlenecks, interrupts or threads blocking the processor, the type of load and background threads, and so on. If the C1->C0 transition is very fast, then keeping the rescheduling and other Idle Thread features will help. You can confirm it is forcing C0 in Perfmon, as all the C-states will be zero.

He did shit videos for years, blindly following r0ach-like tutorials. Then, a few weeks ago, he decided to test things properly with input lag tools, which is an excellent path to follow, but suddenly became a fucking arrogant prick who thinks he is the test-methodology paladin. He spread placebo to thousands of people, and ironically, most of the things were refuted by his own tests. Here I'm talking about something easily measured with Microsoft's own tools, and he complained about it being a flat-earth tweak, fucking asshole. I'm not going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle TAB, change the brightness, or use any other real-time event from the game.
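On the signal-vs-noise point: with one run per configuration, run-to-run variance easily swamps a real effect. A quick illustration of why (every number here is invented for the example):

```python
import math

# Suppose a tweak really saves 0.5 ms on average, but individual
# click-to-photon measurements have a standard deviation of 2 ms.
true_effect_ms = 0.5
noise_sd_ms = 2.0

def sem(sd: float, n: int) -> float:
    """Standard error of the mean for n samples."""
    return sd / math.sqrt(n)

# The standard error of a difference of two means (n samples each)
# is sqrt(2) * sem. With n=1 that is ~2.8 ms, far above the 0.5 ms
# effect; only with hundreds of samples does the effect stand out.
for n in (1, 30, 300):
    se_diff = math.sqrt(2) * sem(noise_sd_ms, n)
    verdict = "effect hidden" if se_diff > true_effect_ms else "effect visible"
    print(f"n={n:>3}: SE of difference = {se_diff:.2f} ms ({verdict})")
```

The same logic is why one large repeated test beats dozens of one-shot tests: the one-shot numbers are mostly noise.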

deama
Posts: 368
Joined: 07 Aug 2019, 12:00

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by deama » 08 Feb 2021, 11:26

andrelip wrote:
07 Feb 2021, 20:27
I'm not going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle TAB, change the brightness, or use any other real-time event from the game.
The animation-delay isn't a problem as he uses the same characters/guns/areas in those situations, so as long as it all is the same, it won't matter.

matro
Posts: 1
Joined: 08 Feb 2021, 17:25

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by matro » 08 Feb 2021, 17:26

thizito wrote:
26 Jan 2021, 03:02

-> Using PMT (gets lower performance and way smoother, depends the choice of the game.. old games are perfect)
what exactly is PMT and how do I enable it? I play games from 2005-2015, so DX9 and lower; I'm curious to try out "PMT"

andrelip
Posts: 160
Joined: 21 Mar 2014, 17:50

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by andrelip » 10 Feb 2021, 12:20

deama wrote:
08 Feb 2021, 11:26
andrelip wrote:
07 Feb 2021, 20:27
I'm not going to get into his test methodology, which prefers to run dozens of small tests one time each, totally blurring the distinction between signal and noise. Also, many of his input delay tests use actions that depend on animation delay or are tied to tick rate. Just press ESC, toggle TAB, change the brightness, or use any other real-time event from the game.
The animation-delay isn't a problem as he uses the same characters/guns/areas in those situations, so as long as it all is the same, it won't matter.
The animation delay is OKAY as long as it is not treated as the actual value of input lag and is used only for relative comparisons between measurements. It is also important to note that it adds a fixed amount of input lag and can introduce jitter: many games have slight movement in the standing-still pose, or even different versions of the same animation type to create some dynamics, and the interpolation can generate different intermediate frames between the last state and the first key-frame. If the Arduino sensitivity is set to capture only the gun-fire, this variation should mostly be mitigated.
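The claim that a truly fixed animation delay cancels out in relative comparisons can be shown directly (the sample values below are invented):

```python
# Two hypothetical measurement sets (ms), each inflated by the same
# fixed animation delay. The absolute numbers are wrong, but the
# difference between configurations is preserved.
ANIMATION_DELAY_MS = 12.0
true_a = [20.0, 21.0, 19.5]   # config A: true input lag samples
true_b = [22.0, 23.0, 21.5]   # config B

measured_a = [t + ANIMATION_DELAY_MS for t in true_a]
measured_b = [t + ANIMATION_DELAY_MS for t in true_b]

mean = lambda xs: sum(xs) / len(xs)
true_diff = mean(true_b) - mean(true_a)
measured_diff = mean(measured_b) - mean(measured_a)

print(f"true difference: {true_diff:.2f} ms, measured: {measured_diff:.2f} ms")
# The cancellation breaks down only when the offset is *not* fixed,
# e.g. animation variants or interpolation picking different frames.
```

Which is exactly why animation variants and interpolation jitter matter: they turn the fixed offset into a variable one.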

Network events and Lag Compensation algorithms are a different story.

Disconnect the Internet and press mouse1 before the timeout. Did the game trigger the fire animation? If yes, then it is an acceptable way of testing. Otherwise, it will jitter with the networking/tick rate and require many more samples to extract the signal from the noise.

About tick rate: if it is limiting the event, it will be like benchmarking a function that has a sleep() at the end of the call that is far longer than the execution time. Also, note that this will affect even local servers.

The safe real-time event in most games is the camera angle. It will work with no connectivity; the internet round trip will not block it, and neither will the internal game server or lag compensation. Just bind mouse1 to turn with a very high sensitivity so a single event performs a 180° turn, in a spot with 2 distinct wall colors.

empleat
Posts: 149
Joined: 28 Feb 2020, 21:06

Re: Reduce Input Delay (Mouse & Keyboard Buffer Size)

Post by empleat » 26 Feb 2021, 16:12

It is crazy how many bottlenecks exist! FYI: you also have to disable smoothing in the key Computer\HKEY_CURRENT_USER\Control Panel\Mouse by deleting the data for the SmoothMouseXCurve & SmoothMouseYCurve values! They should then show as zero-length binary values! This disables smoothing and reduces input lag! And it should give a 1:1 mouse movement ratio!

I should have already 1:1 mouse movement ratio...

1. Windows 6/11
2. enhanced mouse precision off
3. in Computer\HKEY_CURRENT_USER\Control Panel\Mouse also deleted all data for SmoothMouseXCurve & SmoothMouseYCurve, no other setting which would cause acceleration is on here
4. my mouse sensor doesn't cause acceleration

So I don't know why my aim improved and feels more consistent after switching to raw input! Even raw input didn't prevent acceleration set by Windows, so I don't know why it even exists, besides that it should provide lower latency, because the game can get data straight from the mouse and doesn't have to go through Windows! The mouse feels more consistent with raw input on, but strangely there is higher input lag when using raw input! This may be because some games also activate smoothing when you switch raw input on! Like CS:GO.

Don't know why; this is as stupid as hardcoding negative acceleration into games... While I'm complaining: I also have no idea why all games don't have the same sensitivity setting, like 1.00 for 1:1 mouse movement and no interpolation! Every game puts it at something else, like 10/20/50. It is a pain for gamers to find this value, as you don't want your sensitivity to differ per game, for muscle memory! Also, some games write to their config programmatically, so I'm not even sure you can translate the bits there into words to find out which line is sensitivity, but I only ran into this in 1 SP game, so I didn't bother to find out...

I do not feel like I have acceleration with raw input off, but it is not as consistent; no idea why! Maybe dynamic lag caused by something in Windows? I should have a 1:1 mouse ratio already; if so, then it must be something else causing the inconsistent feeling. And I don't like using raw input, because it causes huge lag for me in many games! CS:S was fine with raw input on though, as I heard it didn't use smoothing with it.

Setting the mouse buffer size to 20, input lag drops drastically even with raw input on! But mouse movement becomes a little bit inconsistent; everyone has to find their own setting to balance consistency and input lag on their system...
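One way to picture that trade-off is a toy queue model (this is purely illustrative, not how the actual mouse class driver is implemented): if the consumer stalls for a while, a big buffer keeps old reports alive so they arrive late, while a small buffer drops the oldest ones and caps the latency.

```python
from collections import deque

def simulate(stall_ms: int, queue_size: int, total_ms: int = 50):
    """Toy model: one mouse report per ms enters a bounded FIFO; the
    'OS' drains one report per ms except during a stall starting at
    t=10. Returns (worst_latency_ms, dropped_count)."""
    q = deque()
    worst, dropped = 0, 0
    for t in range(total_ms):
        if len(q) >= queue_size:
            q.popleft()          # oldest report lost: buffer overflow
            dropped += 1
        q.append(t)              # report generated at time t
        if not (10 <= t < 10 + stall_ms):   # CPU stalled in [10, 10+stall)
            if q:
                worst = max(worst, t - q.popleft())
    return worst, dropped

print(simulate(stall_ms=8, queue_size=100))  # big buffer: high latency, no drops
print(simulate(stall_ms=8, queue_size=4))    # small buffer: drops, capped latency
```

In the model the big queue never loses a report but carries the stall's latency forever, while the small queue loses a few reports and recovers; that mirrors the "lower lag but slightly inconsistent movement" feeling described above.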

It would be interesting to know what exactly happens to a mouse packet from the time it is sent by the mouse until it renders on the screen. My limited understanding:
1. (skipped: the part where the mouse picks up movement)
2. the mouse sends a packet over USB
3. the USB controller polls for data (this is fast and doesn't tax the CPU; up to 8 kHz on Intel mobos)
4. when it picks up data, it sends an interrupt to the CPU through the USB chipset; that's why there is some additional latency on e.g. USB3, as it is complex
5. upon the CPU receiving the interrupt, the ISR is executed and then a DPC is scheduled
6. once the DPC is handled by the OS, there is another bottleneck called the timer resolution window, which only allows data to be passed along every 0.5 ms; if you miss this window, you have to wait for the next one before the program can receive data from the mouse
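For step 6, the cost of missing a delivery window is easy to quantify: the event waits until the next window boundary, so with a 0.5 ms window the added delay is uniform between 0 and 0.5 ms, averaging about 0.25 ms. A sketch of that arithmetic (the 0.5 ms figure comes from the step above; everything else is generic):

```python
WINDOW_MS = 0.5  # the timer-resolution delivery window from step 6

def wait_for_next_window(event_time_ms: float) -> float:
    """Delay until the next window boundary after an event arrives."""
    return (WINDOW_MS - (event_time_ms % WINDOW_MS)) % WINDOW_MS

# An event landing exactly on a boundary waits 0 ms; one landing just
# after a boundary waits almost the full window.
print(wait_for_next_window(1.0))             # lands on a boundary
print(round(wait_for_next_window(1.1), 3))   # waits until the 1.5 ms boundary

# Average wait over uniformly distributed arrival times ~ WINDOW_MS / 2.
samples = [wait_for_next_window(t / 1000) for t in range(1000)]
print(f"average wait: {sum(samples) / len(samples):.3f} ms")
```

So on average this stage alone would contribute roughly a quarter of a millisecond, independent of everything earlier in the chain.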

I don't know where exactly in this chain the mouse buffering occurs. But since you edit a driver value in the registry, I would guess it happens somewhere at the USB/OS driver level. This must be some relic from Windows XP or earlier, so a crappy CPU or drivers won't lag out and mouse clicks still get registered, not to mention high DPC latency, which would cause sound pops/clicks!

I have no idea if buffering can reduce the number of interrupts. From what I read, its purpose is to retrieve data at an optimal speed until none is left in the buffer. So yeah, this affects consistency: if your crappy CPU was taxed, it wouldn't register clicks, or some shenanigans would happen and mouse movement would become inconsistent at best! I don't see why the CPU would buffer an interrupt after handling it, so I thought perhaps it buffers data beforehand, to handle it at an optimal speed? But I have no idea how this works, so... it is just a guess.
