Post
by empleat » 26 Feb 2021, 16:12
It is crazy how many bottlenecks exist! FYI: you also have to disable smoothing in the registry key Computer\HKEY_CURRENT_USER\Control Panel\Mouse by deleting the data for the values SmoothMouseXCurve and SmoothMouseYCurve. Each should then show as a zero-length binary value. This disables smoothing, reduces input lag, and should give a 1:1 mouse movement ratio!
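In .reg form, a sketch of the edit described above (importing this blanks both values to zero-length REG_BINARY data, which is the same end state as deleting their data by hand; log off and back on for it to take effect):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Control Panel\Mouse]
"SmoothMouseXCurve"=hex:
"SmoothMouseYCurve"=hex:
```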
I should already have a 1:1 mouse movement ratio...
1. Windows 6/11
2. "Enhance pointer precision" is off
3. in Computer\HKEY_CURRENT_USER\Control Panel\Mouse I also deleted all data for SmoothMouseXCurve and SmoothMouseYCurve; no other setting there that would cause acceleration is enabled
4. my mouse sensor doesn't cause acceleration
So I don't know why my aim improved and feels more consistent after switching to raw input! Raw input didn't even prevent the acceleration set in Windows, so I don't know why it exists at all, other than that it should provide lower latency, because the game gets data straight from the mouse and doesn't have to go through Windows pointer processing. The mouse feels more consistent with raw input on, but strangely there is higher input lag when using it! In some cases this may be because some games also activate smoothing when you switch raw input on, like CS:GO.
I don't know why; this is as stupid as hardcoding negative acceleration into games... While I am complaining: I also have no idea why all games don't use the same sensitivity scale, e.g. 1.00 for 1:1 mouse movement and no interpolation. Every game puts it on something else, like 10/20/50. It is a pain for gamers to find the right value, since you don't want your sensitivity to differ per game, for the sake of muscle memory! Some games also write their config programmatically, so I am not even sure you could translate the bytes there into words to find out which line is the sensitivity, but I've only run into this in one single-player game, so I didn't bother finding out...
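The usual workaround for the muscle-memory problem is to match cm/360 (centimeters of mouse travel for a full turn) across games instead of matching the raw sensitivity number. A small sketch of the arithmetic, assuming the common Source-engine convention of 0.022 degrees per mouse count (the yaw constant differs per engine, so treat it as an assumption):

```python
def cm_per_360(dpi: int, sensitivity: float, yaw: float = 0.022) -> float:
    """Centimeters of mouse travel for one full 360-degree turn.

    counts_per_360 = 360 / (yaw * sensitivity); divide by DPI to get
    inches of travel, multiply by 2.54 to get centimeters.
    """
    counts = 360.0 / (yaw * sensitivity)
    return counts / dpi * 2.54


def match_sensitivity(dpi: int, target_cm: float, yaw: float = 0.022) -> float:
    """In-game sensitivity that reproduces target_cm at this DPI and yaw."""
    return 360.0 * 2.54 / (yaw * target_cm * dpi)


# 800 DPI at sensitivity 2.0 comes out to roughly 26 cm per turn
print(round(cm_per_360(800, 2.0), 2))  # → 25.98
```

Carry the cm/360 number between games (converting through each game's own yaw constant) and your arm movement stays identical even though the on-screen sensitivity numbers differ.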
I do not feel like I have acceleration with raw input off, but it is not as consistent, and I have no idea why. Maybe dynamic lag caused by something in Windows? I should already have a 1:1 mouse ratio; if so, it must be something else causing the inconsistent feeling. And I don't like using raw input, because it causes huge lag for me in many games! CS:S was fine with raw input on though; from what I heard, it didn't apply smoothing with it.
Setting the mouse buffer size to 20, input lag drops drastically even with raw input on! But mouse movement becomes a little bit inconsistent; everyone has to find their own balance between consistency and input lag on their system...
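As far as I know, the buffer size being talked about here is the MouseDataQueueSize parameter of the mouse class driver (default 100). Assuming that is the one, a .reg fragment setting it to 20 (0x14 hex) would look like the following; back up the key first and reboot afterwards:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\mouclass\Parameters]
"MouseDataQueueSize"=dword:00000014
```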
It would be interesting to know exactly what happens to a mouse packet from the time it is sent by the mouse until the result is rendered on screen. My limited understanding:
1. (skipping the part where the mouse sensor picks up movement)
2. the mouse sends a packet over USB
3. the USB host controller polls for data (polling is used because it is fast and doesn't tax the CPU; up to 8 kHz on some Intel motherboards)
4. when it picks up data, it sends an interrupt to the CPU through the USB chipset; that's also why there is some additional latency on USB 3, as it is more complex
5. upon the CPU receiving the interrupt, an ISR is executed and then a DPC is scheduled
6. once the DPC is handled by the OS, there is another bottleneck, the timer resolution window, which allows data to be updated only every 0.5 ms; if you miss this window, you have to wait for the next one to occur before the program can receive the data from the mouse
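To put rough numbers on steps 3 and 6: with polling at rate r, an event can wait up to 1/r for the next poll, and with a 0.5 ms delivery window it can wait up to another 0.5 ms on top. A toy worst-case sum, assuming the 0.5 ms figure above and ignoring ISR/DPC processing time itself:

```python
def worst_case_ms(poll_hz: int, timer_window_ms: float = 0.5) -> float:
    """Worst-case queueing delay: one full polling interval plus one timer window."""
    return 1000.0 / poll_hz + timer_window_ms


# At 1000 Hz polling: up to 1.0 + 0.5 = 1.5 ms
# At 8000 Hz polling: up to 0.125 + 0.5 = 0.625 ms
print(worst_case_ms(1000), worst_case_ms(8000))
```

So even an 8 kHz poll rate can't get worst-case delivery under the timer window itself, which is why that 0.5 ms window counts as its own bottleneck.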
I don't know where exactly in this procedure mouse buffering occurs. But since the value you edit in the registry is a driver parameter, I would guess it happens somewhere at the USB/OS level of the USB driver stack. This must be some relic from Windows XP or earlier, so that a crappy CPU or drivers wouldn't lag out and mouse clicks would still be registered; not to mention high DPC latency, which would also cause sound pops/clicks!
I have no idea if buffering can reduce the number of interrupts. From what I read, its purpose is to retrieve data at an optimal rate until none is left in the buffer. So yes, this affects consistency: if a crappy CPU were taxed, without buffering clicks wouldn't register, or some shenanigans would happen and mouse movement would become inconsistent at best! I don't see why the CPU would buffer an interrupt after handling it, so perhaps it buffers the data beforehand, to handle it at an optimal rate? But I have no idea how this works, so it is just a guess.
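My mental model of that trade-off, as a toy simulation (this only illustrates the guess above, it is not how the Windows driver actually works): packets queue in a fixed-size buffer, and when the consumer stalls, a small buffer drops packets (lost clicks, inconsistent movement) while a big one loses nothing but keeps delivering late (input lag):

```python
from collections import deque


def simulate(packets, buffer_size, stall_at, stall_len):
    """Feed one packet per ms through a bounded driver-style queue.

    The consumer drains one packet per ms but stalls for stall_len ms
    starting at stall_at. Returns (delivered, dropped, final_delay_ms).
    """
    queue = deque()
    delivered = dropped = last_delay = 0
    for t in range(packets):
        if len(queue) >= buffer_size:
            dropped += 1                       # buffer full: packet lost
        else:
            queue.append(t)                    # packet arrives at time t
        if not (stall_at <= t < stall_at + stall_len) and queue:
            last_delay = t - queue.popleft()   # ms this packet waited
            delivered += 1
    return delivered, dropped, last_delay


# A 3-packet buffer drops packets during a 10 ms consumer stall but soon
# recovers to low delay; a 20-packet buffer loses nothing but is still
# delivering 10 ms late at the end of the run.
print(simulate(100, buffer_size=3, stall_at=50, stall_len=10))   # → (90, 8, 2)
print(simulate(100, buffer_size=20, stall_at=50, stall_len=10))  # → (90, 0, 10)
```

That matches the feeling described above: a small buffer keeps latency low at the cost of consistency under load, and a big one does the opposite, which is why everyone has to tune it for their own system.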