Sparky wrote: Yup, you can certainly learn a lot from statistical analysis, but it still leaves some important questions unanswered, like latency, repeatability, and linearity. Sort of like how FCAT exposes some problems that FRAPS misses.
Agreed. I'll leave that to other software (and our future inventions). What I need is a quick confidence test for 2000 Hz, to be sure that 2000 Hz is indeed truly useful.
Also, most poll testers focus on competitive gaming, but what about desktop use too? The Windows desktop is always VSYNC ON.
Color-coded graph visualizations will easily show microstuttering caused by beat-frequency effects between mouse Hz and display Hz (especially for VSYNC ON operation, like composited window managers during ULMB). Everyone has seen my photographic proof of the beat-frequency microstutter oscillations of the mouse cursor arrow...
Basically: take the mouse cursor position report closest to (right before) each refresh cycle's VBI, then compute the deltas between those reports across refresh cycles. If the deltas fluctuate too much, that's your refresh-rate beat-frequency mouse microstutter effect: bad for the desktop. Correspondingly, steadier deltas mean less microstuttery ULMB (e.g. solo RTS panning) when supersmooth scrolling/panning is your priority.
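The measurement above can be sketched in a few lines of Python. This is a minimal sketch, not my actual tool: the 144 Hz / 1000 Hz timestamps are synthetic examples, and a real test would use captured hardware timestamps instead.

```python
import bisect
import statistics

def vbi_poll_jitter(vbi_times, poll_times):
    """For each refresh VBI, pick the mouse poll that landed closest
    before it, then measure how much the poll-to-poll deltas fluctuate.
    Large fluctuation = beat-frequency mouse microstutter."""
    poll_times = sorted(poll_times)
    chosen = []
    for vbi in vbi_times:
        i = bisect.bisect_right(poll_times, vbi) - 1  # last poll at/before VBI
        if i >= 0:
            chosen.append(poll_times[i])
    deltas = [b - a for a, b in zip(chosen, chosen[1:])]
    return statistics.pstdev(deltas)  # jitter in seconds

# Synthetic example: 144 Hz display vs 1000 Hz mouse (non-integer ratio,
# so the chosen poll-to-poll deltas alternate between 6 ms and 7 ms)
vbis = [n / 144.0 for n in range(1, 145)]    # one second of refresh VBIs
polls = [n / 1000.0 for n in range(1001)]    # one second of mouse polls
jitter = vbi_poll_jitter(vbis, polls)
print(f"poll-delta jitter: {jitter * 1000:.3f} ms")
```

With perfectly even rates that divide evenly (e.g. a 2000 Hz mouse on a 500 Hz display), the jitter collapses to zero; mismatched rates produce the nonzero fluctuation described above.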
Also -- never measured by mouse manufacturers -- if you're using a mouse on VRR or ULMB, mouse microstutter that isn't humanly detectable still slightly increases motion blur. This is because ultra-tiny microstutter (e.g. microstutter from a 1000Hz mouse on a 240Hz VRR display) increases display motion blur slightly, even by just 1 or 2 pixels.
Just like a high-frequency guitar string looks blurry because it vibrates so fast -- High-Frequency Stutter Blends Into Motion Blur.
e.g. 1ms-error microstutter = 1 pixel of stutter amplitude at 1000 pixels/second = additive above-and-beyond display persistence. If a mouse has 1ms of ultra-rapid microstutter error, that can turn 4ms of persistence blur (240Hz) into 5ms of persistence blur. On a 4K low-persistence display panning at 8000 pixels/second, 1ms of ultra-high-frequency microstutter can create an additional 8 pixels of motion blur (stutter invisible to human eyes that shows up as a blurred edge rather than a vibrating edge).
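The arithmetic above boils down to one formula, assuming the simple additive model where microstutter error stacks on top of display persistence:

```python
def motion_blur_px(persistence_ms, stutter_ms, px_per_sec):
    # Effective persistence = display persistence + UHF microstutter error.
    # Blur trail length in pixels = effective persistence * panning speed.
    return (persistence_ms + stutter_ms) * px_per_sec / 1000.0

print(motion_blur_px(4, 1, 1000))  # 240Hz + 1ms stutter at 1000 px/s -> 5.0 px
print(motion_blur_px(0, 1, 8000))  # 1ms stutter alone at 8000 px/s -> 8.0 px
```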
120 microstutters per second on my experimental 480Hz display tests simply generated extra persistence-based motion blur. 120 stutters per second is so fast you can't see the stutter, but it is /definitely/ extra motion blur -- just like a plucked guitar string. So for those "2000Hz doesn't matter" manufacturers: shut up and release 2000Hz mice.
Yes, yes, yes -- if mice with 12K internal report rates do it well (internally averaging to a really accurate 1000Hz), then there'll be very few microstutter errors between position readouts. That's really good! Unfortunately, that doesn't solve temporal aliasing effects between the monitor refresh (or framebuffer flip on VRR monitors) and the poll time of the mouse. There's always up to +/-0.5ms of timestamp difference between the nanosecond-exact refresh timing and the nanosecond-exact mouse poll timing for a 1000Hz mouse. That's up to 0.5ms of extra motion blur (generated by ultra-high-frequency microstutter that blends into extra blur) on a low-persistence display.
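That +/-0.5ms bound is just half the poll period -- a refresh can land anywhere inside a poll interval. A quick sanity check of how the bound shrinks with report rate:

```python
def worst_case_poll_error_ms(poll_hz):
    # A refresh (or framebuffer flip) can land anywhere within a poll
    # interval, so the freshest mouse position can be off by up to half
    # a poll period in either direction: +/- 0.5 * (1000 / poll_hz) ms.
    return 0.5 * 1000.0 / poll_hz

print(worst_case_poll_error_ms(1000))  # 0.5 ms for a 1000Hz mouse
print(worst_case_poll_error_ms(2000))  # 0.25 ms for a 2000Hz mouse
```

Doubling the report rate halves the worst-case aliasing error -- which is the whole argument for 2000Hz.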
For example, on a 4K strobed monitor (coming this year) with 2-screen-widths-per-second FPS turning, 0.5ms of persistence (caused by any means, including UHF microstutters) translates to roughly 4 pixels of extra display motion blur -- lowering the horizontal motion resolution of a 4K display to the equivalent of a 960-pixel-wide display. Chrissakes, 1000Hz ain't enough anymore; we're not in Kansas anymore, Dorothy! How have mouse manufacturers taken so long to release 2000 Hz?
That's ultra-tiny, imperceptible microstutter. But when displays become low-persistence, it becomes a noticeable motion-blur error margin. At 240Hz, you're already down to ~4ms persistence non-strobed. And if you're strobing at 1ms persistence (e.g. some G-SYNC monitors with "ULMB" turned on), this becomes a big error margin: a 0.5ms ultra-high-frequency microstutter adds another 50% to persistence.
You can tell the difference between 0.5ms persistence and 1.0ms persistence (if you have an NVIDIA G-SYNC monitor with ULMB) by doing this test:
1. Go to the TestUFO Panning Map test at 3000 pixels per second.
2. Maximize the window.
3. Enable NVIDIA ULMB.
4. Adjust ULMB Pulse Width down to under 50% (this gives 0.5ms-persistence ULMB).
5. The street name labels on that fast-moving map magically become readable!
Remember, FPS mouse-flicks can pan even faster than this.
At 3000 pixels/second motion:
60Hz non-strobed = 16.7ms persistence = 50 pixels of motion blur
120Hz non-strobed = 8.3ms = 25 pixels of motion blur
240Hz non-strobed = 4.2ms = 12.5 pixels of motion blur
ULMB 2ms persistence = 6 pixels of motion blur
ULMB 1ms persistence = 3 pixels of motion blur
ULMB 0.5ms persistence = 1.5 pixels of motion blur
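The table above is just persistence times panning speed; a quick script to reproduce it (using the same 3000 px/s speed and the rounded persistence values shown):

```python
def blur_px(persistence_ms, px_per_sec=3000):
    # Motion blur trail length = persistence * panning speed
    return persistence_ms * px_per_sec / 1000.0

for label, ms in [("60Hz non-strobed ", 1000 / 60),
                  ("120Hz non-strobed", 1000 / 120),
                  ("240Hz non-strobed", 1000 / 240),
                  ("ULMB 2ms         ", 2.0),
                  ("ULMB 1ms         ", 1.0),
                  ("ULMB 0.5ms       ", 0.5)]:
    print(f"{label} = {blur_px(ms):4.1f} px of motion blur")
```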
So we're forcing this planet to live with a +/-0.5ms persistence penalty caused by UHF microstutters (the aliasing effect of the 1000Hz report-rate granularity in current 1000Hz mice). In the world of low persistence (e.g. ULMB) and high refresh rates (e.g. 240Hz and, soon, beyond), why are we still limiting mice to only 1000Hz?
As display resolutions go ever higher (retina) and refresh rates go ever higher, the pressure is on to lower that +/-0.5ms aliasing error between frame delivery and mouse poll delivery. Mouse engineers are smart, but many don't understand the science of a true-1000Hz refresh rate display (already in the laboratory). Goddamnit, why do some mouse manufacturers say "1000Hz doesn't matter" (one of them actually said that)? That's not true anymore with 240Hz monitors, and it will become even more of a limiting factor with strobeless ULMB (blurless sample-and-hold displays)!
And I'm a coauthor of a peer-reviewed conference paper on a display testing technique (Blur Busters coauthored with NIST.gov, NOKIA and Keltek), so I should know what I am talking about! This is not tinfoil-hattery.
Over the coming years and decades -- as displays reach 240Hz, 480Hz, even 1000Hz (expected by year 2025) -- even higher-Hz mice will eventually be needed to prevent mouse microstutters (even imperceptible ones) from increasing the effective persistence of the display. 1000Hz mice were definitely good enough for 60Hz displays, but now that we have 240Hz gaming monitors, the mouse microstutter error margin starts to become a double-digit percentage of the display's persistence: 0.5ms of extra persistence is 12.5% additional motion blur for 240fps@240Hz sample-and-hold.
It also affects VSYNC OFF and variable refresh rate gaming (e.g. mouse aliasing errors feeding into frametimes), but the beat-frequency artifacts are much more visually noticeable with synchronized fixed-Hz operation, and are very easily benchmarkable as proof that 2000Hz reduces mouse-versus-display-Hz beat-frequency microstutter.
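For the fixed-Hz case, the microstutter repeat rate is the classic beat frequency between the poll rate and the refresh rate: the distance from the poll rate to its nearest refresh-rate harmonic. A small sketch (the 144Hz display is just an illustrative example):

```python
def beat_frequency_hz(poll_hz, refresh_hz):
    # Distance from poll_hz to the nearest multiple of refresh_hz.
    # Zero means the rates divide evenly (no beat-frequency stutter);
    # nonzero means the stutter pattern repeats at this rate.
    r = poll_hz % refresh_hz
    return min(r, refresh_hz - r)

print(beat_frequency_hz(1000, 144))  # 8 Hz stutter beat on a 144Hz display
print(beat_frequency_hz(2000, 144))  # 16 Hz beat, but at half the amplitude
```

Note that raising the report rate doesn't necessarily lower the beat rate -- it shrinks the *amplitude* of each stutter (half the poll period), which is what blends the artifact below visibility.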
Most mouse manufacturers DO NOT understand the UHF microstutter effect that I have seen (beat-frequency microstutters too fast to be seen by eye = extra display motion blur = like a plucked guitar string). We are the world's first website to test a true-480Hz display.
Absolute lag is truly important too, and many other tests already measure it well. I want to see additional benchmarking metrics for the future of benchmarking computer mice in the world of "Better Than 60Hz" displays -- making sure 2000Hz mice actually help high-Hz displays.
I do give mucho kudos to Cougar for doing 2000Hz. I'm praying it's the Real McCoy. That's why I am writing a visualization test as we speak. I'll post it in this thread ASAP.
Nudge, nudge -- all mouse manufacturers. 2000Hz or bust.