TTT wrote: ↑15 Apr 2021, 18:26
The reason people use lower DPI, usually 400 or 800, is that apparently you get less jitter from the sensor, or something along those lines. To be honest I'm not sure how true that even is, and if it was, mouse and sensor tech is much better now than it was when that advice came about.
When you say spin like crazy, do you mean the mouse control in game?
Just lower the ingame sensitivity if it is too fast, you don't adjust your mouse DPI depending on what a certain in game mouse sens is. Set a DPI and then change the different game settings to match each other.
I might be wrong here, but what I think you are saying is that you don't change the in-game default mouse sens, and instead change your mouse DPI? If I'm right about that, read the last paragraph.
I mean, you set your MegaUberMouse to 10K DPI, blow gently on it, and your character does a 1080-degree spin. Or your cursor goes from one corner of the screen to the other in 1 millisecond.
Of course I adjusted my sens in-game; I never adjusted DPI to match a particular game's sens.
Anyway, years ago it was recommended that you set your mouse sens in Windows to 6/11, since that gives you 1:1 mouse input.
Today, games usually handle input directly via raw input. Some apply interpolation/smoothing on top, but that's a separate issue and can usually be turned off.
Thing is, whether your DPI is 400, 1600, or 12000, you still have to adjust your in-game sensitivity, right? But doesn't game sensitivity work with the same kind of granularity the Windows slider does - there's one position where it's 1:1, and if you go below it some input gets dropped, while above it some counts get doubled or interpolated?
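Here's a toy sketch of that dropped/doubled behavior. The multipliers below are illustrative, not the actual Windows slider table; the point is just that if a scaler truncates each count to an integer without carrying fractional remainders, any multiplier below 1.0 silently eats small movements, and anything above 1.0 inflates them:

```python
# Naive pointer scaling: each hardware count is multiplied and truncated
# to an integer, with no fractional remainder carried over to the next count.
# (Multiplier values here are illustrative, not the real Windows slider table.)
def scale_counts(counts, multiplier):
    return [int(c * multiplier) for c in counts]

one_count_moves = [1, 1, 1, 1]              # four tiny 1-count nudges
print(scale_counts(one_count_moves, 0.5))   # below 1:1 -> every nudge dropped: [0, 0, 0, 0]
print(scale_counts(one_count_moves, 1.0))   # 1:1 -> passed through untouched: [1, 1, 1, 1]
print(scale_counts(one_count_moves, 2.0))   # above 1:1 -> counts doubled: [2, 2, 2, 2]
```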
Now what I'm getting at is that if you set your DPI high, say 3200, you probably have to set your game sens WAY lower. Which means that while your mouse might be tracking/reporting better, your game may actually be dropping inputs far more often.
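To put numbers on that trade-off: in Quake-derived engines, yaw is roughly counts * sensitivity * m_yaw (with m_yaw defaulting to 0.022), so overall turn speed works out to DPI * sens * 0.022 degrees per physical inch. A quick sketch, assuming that convention (other engines may scale differently):

```python
M_YAW = 0.022  # degrees of yaw per count at sensitivity 1 (Quake convention)

def deg_per_count(sens):
    """Rotation produced by a single mouse count."""
    return sens * M_YAW

def deg_per_inch(dpi, sens):
    """Rotation produced by moving the mouse one physical inch."""
    return dpi * deg_per_count(sens)

# Two setups with the same overall speed but very different step sizes:
print(deg_per_inch(400, 2.0))    # ~17.6 deg per inch
print(deg_per_inch(3200, 0.25))  # ~17.6 deg per inch, same speed
print(deg_per_count(2.0))        # ~0.044 deg per count (400 DPI setup)
print(deg_per_count(0.25))       # ~0.0055 deg per count (3200 DPI setup)
```

Note that if the engine does this math in floating point, a low sens doesn't literally drop counts; each count just produces a proportionally smaller rotation, so the 3200 DPI setup has finer angular steps at the same overall speed.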
So my question is: is that bump in DPI worth it? Is a more "precise" mouse worth it if your software then has to throw that precision away? Or am I just misinformed about how games handle sensitivity?