In addition to what Razer_TheFriend said:
axaro1 wrote: ↑10 Jan 2021, 05:28
Yes, error rate is related to CPI deviation; the Makalu67, for example, had 0%/0.6%/0.6%/1.6% deviation respectively for 400/800/1600/3200 DPI.
"Deviation"?
Is this really confirmed if no zero-calibration disclosure was posted?
Okay, time for me to open a Pandora's box.
<PandoraBox>
Who decides the zero calibration?
0% for 400dpi suggests that 400dpi was the zero-calibration reference.
The problem with 400dpi as the zero-calibration reference is that it might already be mathematically off internally, but the low resolution of 400dpi rounds it off to a very neat-looking zero calibration, when in reality there's a bigger error for 400dpi.
Who's to say that if 3200dpi or 1600dpi had been the 0% reference, the others wouldn't be the offsets?
I'd like to know more about the zero-calibration, including whether all data inside the mouse was reset to 0 (via firmware) before starting the 400dpi tests, because if you move a mouse 0.4/400th of an inch, the mouse "remembers" that fractional leftover all the way into the beginning of your benchmark, even though the reported coordinates are still (0,0)! So the 400dpi results may actually have more error (see the sketch below).
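To make that concrete, here's a minimal sketch of the fractional-remainder effect (the names and structure are my own illustration, not any vendor's actual firmware):

```c
#include <stdio.h>

/* Illustrative model: the sensor owes fractional counts, but a HID
 * report can only carry integers, so the leftover fraction persists
 * inside the firmware between reports. */
typedef struct {
    double remainder;  /* hidden sub-count state */
} SensorState;

/* Convert an analog displacement (in inches) at a given DPI into the
 * integer count delta a report would carry, keeping the leftover. */
static int report_counts(SensorState *s, double inches, int dpi)
{
    double exact = inches * dpi + s->remainder; /* counts owed so far  */
    int reported = (int)exact;                  /* truncate to integer */
    s->remainder = exact - reported;            /* leftover persists   */
    return reported;
}

int main(void)
{
    SensorState stale = { 0.0 }, fresh = { 0.0 };

    /* Pre-benchmark wiggle: 0.4 counts worth of motion at 400dpi.
     * Reports 0 -- coordinates still read (0,0) -- yet the firmware
     * now "remembers" 0.4 of a count. */
    report_counts(&stale, 0.4 / 400.0, 400);

    /* Identical physical motion (0.7 counts worth) at benchmark start: */
    printf("stale mouse reports: %d count(s)\n",
           report_counts(&stale, 0.7 / 400.0, 400)); /* 1 (0.4 + 0.7) */
    printf("fresh mouse reports: %d count(s)\n",
           report_counts(&fresh, 0.7 / 400.0, 400)); /* 0 */
    return 0;
}
```

Same physical motion, different reported counts, purely because of state left over from before the benchmark started.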
In theory, unplugging and replugging the mouse before moving it can do a fresh zero-calibration (so it forgets its leftover offset, like 0.4199468/400th or 0.33376/400th or whatever, from a previous mouse move). A previous mouse movement doesn't land your mouse at physically exact INTEGER/400th positions on the mousepad; real-world object positions are analog. So the internal mouse calculations may start your benchmark off by, say, 0.357/400th of an inch, even though it reads 0 (instead of 0.357) via the rawinput APIs. But then again, moving a mouse for the first time often automatically runs some calibration code, so a fresh-plug benchmark may not be the best measure either.
That's the tough thing -- there's no such thing as a "Zero Calibration Reset" API to reset all internal mouse registers back to 0 so that all DPI tests stay accurate. In theory, a zero-reset API would declare the current stationary mouse position to be an integer (0,0) reference, allowing a more accurate comparison against the physical real-world mouse. If one exists and was actually used as part of the benchmarking, this should be part of the disclosure.
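For clarity, here's what such an API could look like. This is purely hypothetical -- no such HID or vendor call exists today, and every name here (SensorState, sensor_is_moving, mouse_zero_calibration_reset) is invented for illustration:

```c
#include <stdbool.h>

/* Hypothetical only -- no such firmware API exists today. */
typedef struct { double remainder; } SensorState;

/* Stubbed for illustration; real firmware would query the sensor. */
static bool sensor_is_moving(void) { return false; }

typedef enum { ZCAL_OK, ZCAL_REFUSED_MOVING } ZCalResult;

/* Declare the current stationary position an exact integer (0,0)
 * reference by erasing every fractional accumulator, so a benchmark
 * starts from a known-clean state instead of inheriting, say, a
 * 0.4199468/400th leftover from a previous move. */
static ZCalResult mouse_zero_calibration_reset(SensorState *s)
{
    if (sensor_is_moving())
        return ZCAL_REFUSED_MOVING; /* zeroing mid-motion is meaningless */
    s->remainder = 0.0;
    return ZCAL_OK;
}
```

A benchmark harness would call this once, with the mouse stationary, immediately before each DPI pass.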
So, I ask: who decided the zero-calibration to be 400dpi, without accounting for this?
It's fine that the errors are relative to each other -- these benchmarks do measure relative accuracy -- but why was 0% assigned to 400dpi? Isn't that an artificial claim of 400dpi being the most accurate? The zero calibration might have been arbitrarily chosen (0% = 400dpi); see the re-basing example below for why the choice matters. Can someone publish the method of zero-calibration used during these earlier tests?
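To show how arbitrary the reference is, here's a tiny sketch re-basing the published Makalu67 percentages against 1600dpi instead of 400dpi. (One assumption on my part: the post only gives magnitudes, so I'm treating the deviations as all having the same sign.)

```c
#include <stdio.h>

int main(void)
{
    /* Published Makalu67 deviations, relative to 400dpi.
     * Assumed signed in the same direction for illustration. */
    const int    dpi[]    = { 400, 800, 1600, 3200 };
    const double dev400[] = { 0.000, 0.006, 0.006, 0.016 };

    /* Each DPI's true scale is (1 + deviation), so relative to a new
     * reference r, the deviation becomes (1 + d_i) / (1 + d_r) - 1. */
    const int ref = 2; /* index of 1600dpi */
    for (int i = 0; i < 4; i++) {
        double rebased = (1.0 + dev400[i]) / (1.0 + dev400[ref]) - 1.0;
        printf("%4d dpi: %+.2f%% relative to 1600dpi\n",
               dpi[i], rebased * 100.0);
    }
    return 0;
}
```

Same data, different zero: 400dpi now shows roughly -0.60% and 3200dpi roughly +0.99%. Nothing about the measurements changed, only the choice of reference -- which is exactly why that choice needs disclosure.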
These certainly are good accuracy tests (at least for comparing relative error), but I may disagree with the method of zero-calibration without knowing more details about how they zero-calibrated before the benchmark -- e.g. whether certain firmware registers were erased back to zero for a more accurate zero-calibration.
A zero-calibration disclosure is needed when claiming 3200dpi has an error relative to 400dpi, especially when we're on 3399s.
Example Disclosure
It could be as simple as:
"This specific test does not confirm which specific DPI is the most physically perfect relative to the actual mousepad in the real world. We have simply chosen 400dpi as the reference (0%); all other percentages are simply deviations relative to 400dpi. In actuality, other DPIs may be more physically accurate on the real-world mousepad. However, this is beyond the scope of this specific mouse benchmark, which simply measures the deviations of the DPIs relative to each other."
</PandoraBox>
ERROR 1001 at Line 1: Failed to close the <PandoraBox> tag
P.S. Some games will do the same thing: a previous mouse movement leaves some internal float/double away from an integer, which then gets rounded up/down to nearest, injecting more differences between 400/800/1600/3200. This is why the proposed High Definition Mouse API outputs floating-point mouse coordinates, to encourage maximal precision and help solve this problem. It can be the software's fault, rather than the mouse's fault, for creating errors too, e.g. not respecting zero-calibration rules during integer mouse benchmarking. Sometimes 3200dpi is really worse, while at other times 3200dpi is way more accurate -- depending on how properly the programming is done. The fact is, with proper programming (in both mouse and game), 3200dpi at 1/8 sensitivity can have less deviation than 400dpi, as the sketch below shows.
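Here's a small self-contained sketch of that game-side effect (my own construction, not the High Definition Mouse API itself). The same 2.4733-inch motion is delivered as 100 integer-count sensor reports; the game either floors each scaled delta (discarding remainders) or keeps a floating-point accumulator:

```c
#include <math.h>
#include <stdio.h>

/* Simulate one analog hand motion arriving as 100 integer-count
 * reports, processed two ways by the game:
 *  - "naive": floor each scaled delta, remainder discarded
 *  - "float": keep a floating-point accumulator */
static void simulate(int dpi, double sens, double inches)
{
    const int events = 100;
    double ideal = inches * 400.0;       /* target in 400dpi-equivalent units */
    double per_event = inches * dpi / events;
    double sensor_rem = 0.0, naive = 0.0, accum = 0.0;

    for (int i = 0; i < events; i++) {
        double exact = per_event + sensor_rem;
        int counts = (int)exact;         /* sensor only reports integers */
        sensor_rem = exact - counts;
        naive += floor(counts * sens);   /* per-event integer rounding   */
        accum += counts * sens;          /* remainder preserved          */
    }
    printf("%4ddpi, sens %.3f: naive error %+7.2f, float error %+.2f units\n",
           dpi, sens, naive - ideal, accum - ideal);
}

int main(void)
{
    const double inches = 2.4733;        /* deliberately non-integer motion */
    simulate(400,  1.0,   inches);       /* baseline                        */
    simulate(3200, 0.125, inches);       /* same effective sensitivity      */
    return 0;
}
```

With per-event flooring, 3200dpi at 1/8 sensitivity lands about 75 units short of the target, while with a float accumulator it ends up closer to the analog truth than 400dpi does, since its quantization step is 1/8th the size. Same mouse, same motion; the programming decides which DPI "wins".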