TIGEREXPERT wrote: ↑15 Sep 2020, 15:53
3. Oh, I never really understood why high sensitivity gamers preferred to have 1600DPI with 10% X&Y, vs 800 DPI with 20% X&Y.
Since the metric to be considered is eDPI (DPI x in-game sensitivity), and both scenarios are equal (eDPI = 160), how come having a higher or lower DPI at the same eDPI (since the X&Y sensitivities differ: 10% for 1600DPI vs. 20% for 800DPI) changes anything? I also thought that having a lower DPI means more precision when moving the mouse slightly.
If I switch to 1600DPI x 10% (maintaining the same EDPI of 160) will that change anything in aiming performance?
Wrong metric.
This is not the important metric when it comes to the reasons for high DPI. While what you say is true, there's an additional factor: jitteriness during mouse slowturns (not mouse fastturns).
When you have a really low mouse DPI and a really high in-game sensitivity (temporarily, test 400dpi + really high in-game sensitivity), a small mouse movement (1 count) can translate into a big pixel jump, like a 10-pixel on-screen movement.
So your mouse turns are steppy-steppy. If you move your mouse only 1/8th inch over the period of 1 second (common use case: a tactical slowscan through a gun scope, etc), there are only 50 mouse positions (1/8th of 400dpi), and your mouse slowturns will run at only 50 frames per second because there are no intermediate positions. So you sabotage the frame rate of your refresh rate race... Scan slower (1/16th inch per second) and you get 25 frames per second; scan even slower and it's as steppy as the sharp corners of 8-bit pixels, ouchie.
Different games will convert 1 count into different amounts of on-screen movement, depending on sensitivity settings, and not necessarily less than 1 pixel. But you really do want to oversample, because 3D games render things at subpixel positions. Even a 0.5-pixel movement is noticeable if you're using a sufficiently low-rez display (1080p or less).
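To make that math concrete, here's a minimal Python sketch of the slowturn arithmetic. The 0.022 degrees-per-count yaw is just a Source-engine-style assumption, and the pinhole projection is a simplification; real games vary:

```python
import math

# Slowturn math sketch. ASSUMPTIONS: Source-engine-style 0.022 deg/count
# yaw, and a simple pinhole projection at screen center. Real games vary.
def slowturn_stats(dpi, scan_speed_inch_per_s, in_game_sens,
                   screen_width_px=1920, hfov_deg=90.0,
                   yaw_deg_per_count=0.022):
    # Positions per second reported during the slow scan:
    counts_per_s = dpi * scan_speed_inch_per_s
    # Degrees the camera rotates per single mouse count:
    deg_per_count = in_game_sens * yaw_deg_per_count
    # Approximate on-screen pixels per count at screen center:
    focal_px = (screen_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    px_per_count = focal_px * math.tan(math.radians(deg_per_count))
    return counts_per_s, px_per_count

# 400dpi + high sensitivity: only 50 positions/sec, ~1.5-pixel jumps.
print(slowturn_stats(400, 1/8, in_game_sens=4.0))
# 1600dpi + quarter sensitivity: same turn speed, 200 positions/sec,
# and each jump is a quarter of the size.
print(slowturn_stats(1600, 1/8, in_game_sens=1.0))
```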
Result: Your mouse slowturns become steppy-steppy.
(Test it now: Configure to 400dpi, then slide in-game sensitivity really high, then mouseturn really slowly, at millimeters per second. Notice the steppy-steppy-steppy effect?)
Increasing refresh rates + increasing screen resolutions mean DPI needs to go up to keep up with the refresh rate race:
- Increasing refresh rate means the mouse positions-per-second more frequently falls below the frames-per-second/refreshes-per-second.
- Increasing screen resolution means the same physical inch of mouse travel covers more onscreen pixels, making DPI limitations more visible.
As the Vicious Cycle Effect continues, this pushes up mouse poll Hz AND mouse DPI simultaneously, to keep up.
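A back-of-the-envelope sketch of that (the 1/8th inch per second scan speed is just an illustrative assumption): for slowturns to stay smooth, the slowest scan you care about has to deliver at least one new mouse position per refresh cycle.

```python
# Minimum DPI so a slow scan still yields one new position per refresh.
# ASSUMPTION: a 1/8th inch/sec scan speed, purely illustrative.
def min_dpi_for_refresh(refresh_hz, slowest_scan_inch_per_s):
    # counts/sec = dpi * inches/sec, so we need dpi >= Hz / (inches/sec)
    return refresh_hz / slowest_scan_inch_per_s

for hz in (60, 144, 240, 360):
    print(f"{hz}Hz needs >= {min_dpi_for_refresh(hz, 1/8):.0f}dpi")
```

That prints 480dpi for 60Hz but 2880dpi for 360Hz, which is exactly why the refresh rate race drags DPI along with it.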
If you want your mouse slowturns to be silky ultrasmooth on a retina display at high refresh rates in this refresh rate race, then mathematically you definitely need to shovel on the DPI -- a good mouse will feel the same at 1600dpi + quarter sensitivity as at 400dpi + full sensitivity. Good mice will have fast flickturns of the same quality and speed, as long as you adjust sensitivity-versus-DPI. But your slowturns will skyrocket in quality.
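You can sanity-check the "flickturns feel the same" claim with 360-distance math (again assuming a Source-style 0.022 deg/count yaw; other games scale differently):

```python
# Physical swipe distance for a full 360-degree turn. If dpi * sensitivity
# stays constant, the distance is identical -- only slowturn granularity
# improves. ASSUMPTION: Source-style 0.022 deg/count yaw.
def inches_per_360(dpi, in_game_sens, yaw_deg_per_count=0.022):
    return 360.0 / (dpi * in_game_sens * yaw_deg_per_count)

print(inches_per_360(400, 1.0))    # ~40.9 inches
print(inches_per_360(1600, 0.25))  # ~40.9 inches -- same flickturn feel
```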
This is less important if you use low in-game sensitivity with low DPI and prefer to swipe your mouse many inches to move onscreen more slowly, but the quality degradation of mouse slowturns is getting above human visibility noisefloors as the refresh rate race continues. The benefit of 144Hz versus 360Hz becomes less visible when the computer mouse is the weak link... 400dpi was fine when we were playing 640x480 60Hz GLQuake, but it feels much more jittery at 2560x1440 240Hz, with resolution and refresh rate being much higher...
Depending on the game (not all of them), when you view scenery through a gun scope it's sometimes as if you jacked the sensitivity slider higher -- revealing the DPI weak link when you're scanning scenery slowly through the scope. That assumes mouse sensitivity and scope sensitivity are adjusted that way -- most games use a lower scope sensitivity, though, and sometimes the two are independently adjustable (a much lower scope sensitivity has been advantageous). However, regardless of your settings, various situations can pop up in games that suddenly need more DPI for a moment: activities that force slow mouse movements for a specific objective, where higher DPI makes them more precise.
The best DPI is one that translates to a tiny subpixel movement per count (to overcome Nyquist/aliasing factors where possible) even during your slowest mouse slowturn, with enough positions per second to keep things jitter-free (even 0.5-pixel jitter can be visible, thanks to the way 3D graphics are rendered), and with no degradation of your other (faster) mouse movements (thanks to the DPI-versus-sensitivity balance). Ideally, you want multiple mouse counts per onscreen pixel even for slowturns, though there are diminishing returns as pixels get tinier.
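As a rough sketch of that rule of thumb (the projection model and numbers are my assumptions, not any specific game's math):

```python
import math

# Minimum DPI so one mouse count moves at most a target fraction of a
# pixel at screen center, for a fixed cm/360 feel. Uses a small-angle
# pinhole approximation -- an assumption, not any specific game's math.
def min_dpi_for_subpixel(screen_width_px, hfov_deg, cm_per_360,
                         max_px_per_count=0.5):
    focal_px = (screen_width_px / 2) / math.tan(math.radians(hfov_deg / 2))
    deg_per_count_limit = math.degrees(max_px_per_count / focal_px)
    inches_per_360 = cm_per_360 / 2.54
    # Need: 360 / (dpi * inches_per_360) <= deg_per_count_limit
    return 360.0 / (deg_per_count_limit * inches_per_360)

# 2560-pixel-wide screen, 90-degree FOV, a ~40cm/360 feel:
print(round(min_dpi_for_subpixel(2560, 90.0, 40.0)))  # ~1021dpi
```

Even with those conservative numbers, a half-pixel-per-count target already lands above 1000dpi; tighter targets or bigger screens push it higher.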
The problem is that higher DPI is often interpolated on many sensors, so you want a really good recent sensor that can do 1600dpi (or sometimes 3200dpi) accurately, non-interpolated, since the DPI-interpolation behavior can add a tiny bit of latency and mousefeel issues that some esports players can feel. It's one old-fashioned reason why many said to use 400dpi or 800dpi, but unbeknownst to many, a lot of mice now do 1600dpi, and sometimes 3200dpi, just as accurately. Just don't crank to the max (8000dpi or 12000dpi or whatever), since that's often interpolated territory. What you want is a DPI that doesn't degrade during fast flickturns -- you don't want to feel the difference during normal fast flickturns.
It does mean your mouse pointer is really superfast when you exit the game, but you can use DPI buttons or mouse-profiles (steam.exe or game exe detection) to switch DPI when you launch/exit games.
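(If you want to roll your own switching, here's a hypothetical sketch of just the exe-detection part; the exe names are made up, and actual DPI switching requires your mouse vendor's software/SDK:)

```python
import time
import psutil  # third-party: pip install psutil

GAME_EXES = {"csgo.exe", "quake.exe"}  # hypothetical examples

def game_running():
    # Scan running processes for any known game executable.
    names = {p.info["name"] for p in psutil.process_iter(attrs=["name"])}
    return bool(GAME_EXES & names)

while True:
    # A real tool would call the mouse vendor's SDK here; this sketch
    # only reports which DPI profile should be active.
    print("high-DPI game profile" if game_running() else "desktop profile")
    time.sleep(5)
```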
This is off-topic from the original problem and has nothing to do with your FPS stability, but I wanted to explain another reason why high DPI is useful.