I think I need to write a piece educating game developers ("Future Proof Your Game Engine") on how to properly add sensitivity settings. They need at least three decimal digits (1.000 down to 0.001), and should always use double-precision mathematics (skip the float, skip the int, go straight to double). That's because gaming at 400dpi with "1.000 Sensitivity" creates this situation:

sharknice wrote: ↑22 Oct 2020, 19:07
Yeah a button on the mouse to switch DPI is very nice for that.
When I tested Valorant there was something wrong with the game at the very lowest sensitivity, and it wouldn't accurately pick up mouse movement. I can't remember the cutoff where it works accurately, but it still lets you get very low to take advantage of higher DPI. It runs like a dream when you get it tweaked properly. I tweaked my settings before I got to the target practice and got a perfect score on the first try.
0400dpi at 1.0000 sensitivity
0800dpi at 0.5000 sensitivity
1600dpi at 0.2500 sensitivity
3200dpi at 0.1250 sensitivity
6400dpi at 0.0625 sensitivity
So future games need a minimum of 4 digits after the decimal point in the sensitivity setting, and need to allow typing the sensitivity in via keyboard, since sliders are too steppy/granular. Stop using 2 decimal digits for sensitivity settings.
Yes, it's overkill, but it makes it much easier to convert 400dpi to 6400dpi with zero change in fast-flick turn feel. And when doing mathematics with other things (3D engine), floats can lose precision rather quickly over a long series of operations, so doubles are a good interim storage format between (int) and the final (float), since most GPUs still generate 3D graphics at single precision to keep framerates high. Inside a monitor motherboard, it is like 10-bit color processing accepting the 8-bit GPU framebuffer for display on an 8-bit panel: it still helps reduce color banding (rounding errors from color processing). Likewise, you sometimes need a higher-precision intermediate format to reduce rounding errors between source (mouse) and destination (3D render).
Some games are butter smooth on all monitors (50Hz through 360Hz) and all sync technologies (VSYNC ON, VSYNC OFF, VRR, G-SYNC, FreeSync) thanks to my help. For example, these Steam Release Notes credit me for helping refresh-rate future-proof their game:

Steam Announcement wrote:
We would also like to thank community member Mark Rejhon of Blur Busters for his assistance and comprehensive testing of the stuttering issues.

So, I've already helped developers improve variable refresh rate support (Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!) and schooled software developers on The Amazing Human Visible Feats Of The Millisecond. Although I can sometimes be wrong, I've been proven right so frequently in this refresh rate race to retina refresh rates that I've been cited in over 20 peer-reviewed science/research papers, most recently by Samsung while developing a new motion blur measurement method. My reputation now speaks for itself. Even my casual writings on these forums almost become textbook reading material for people who regret not reading something I was right about 5 years ago.
Thus, I think by the end of the year, I'll be writing a Future Proof Your Game Engine article that touches on many weak links:
- Any-VRR compatibility (See VRR developer HOWTO, don't stutter improperly on VRR)
- Any-Hz compatibility (See 1000Hz Journey, the holy grail to achieve lagless fullbrightness strobeless ULMB blurless sample-hold)
- Any-Hz mouse compatibility, including 8000Hz mice, with accurate between-frame button-trigger mathematics.
- 55fps 55Hz CRU trick to debug non-60fps engine stutter if you're stuck on garden-variety DELL 60Hz office monitors
- 4-digit decimal sensitivity settings.
- Separate sensitivity settings for inventory menus (versus the existing separate sensitivity settings for FPS view / for scope). These can default to "AUTO" to match the Windows mouse pointer speed (check system metrics), while still allowing you to configure in-game sensitivity AND scope sensitivity.
- Use doubles for mouse mathematics, not integers or floats. Cast all int coordinates to doubles as early as possible before doing math on them! This allows mouse dpi to scale FAR better.
- Developers must understand the Vicious Cycle Effect to understand why futureproofing is critical (higher resolution amplifies Hz limitations, higher Hz amplifies resolution limitations, etc).
- Developers must understand the Stutter-to-Blur continuum (like the VRR demo; identical vision physics to guitar strings: slow strings visibly vibrate, fast strings blur), and must whac-a-mole all single-pixel microstutter that can blend into 1 pixel of motion blur; even 1000Hz mouse jitter can do it too.
- Optimize asset loads as much as possible to be completely jitterless on normal SSDs. Small asset loads should not even cause a 1-pixel microjitter (invisible on a 60Hz LCD but visible during 240Hz ULMB 0.5ms MPRT).
- Don't stop there. Also hunt asset-load stutters by testing your game engine on a RAMdisk or Optane memory. A game company should have at least one test system decked out for this purpose. Any remaining asset-streaming stutter can then be assumed to be a CPU/GPU inefficiency rather than a disk bottleneck, which makes it much easier to debug and more futureproofed for future, faster SSDs.
- Never Make Assumptions about Human Visibility
- And MANY other Developer Best Practices for Refresh Rate Race to Retina Refresh Rates
I'm not even dropping microphones anymore; they're being golfed into holes-in-one. Let's golf that microphone 400 yards into a hole in one on the first try. I've lost count of the successful holes-in-one from my golfed microphones!