schizobeyondpills wrote: ↑23 Jun 2020, 17:22
Whenever you or anyone else is ready to sponsor equipment I will prove all of my "schizo" placebo facts to be true.
Only the big companies can afford equipment sufficient to audit the entire Rube Goldberg contraption -- e.g. Intel, which expends a lot of effort trying to reduce various latencies.
There are already tools/equipment that measure tiny fractions of the chain -- including those mentioned in these forums -- but measuring how all those billions of transistors cascade nanoseconds into milliseconds takes some really serious time & effort. Currently, this is not within Blur Busters' specialty or budget to expend testing efforts on -- however, we encourage innovation by end users to find ways to reliably generate proof.
Now.... some good points. For example, I've seen how power management (at low frame rates) and thermal throttling (at high % utilization) generate random input lag: when a CPU/GPU was underutilized or overutilized, it downclocked to save power or to protect itself. I've even seen this leak into TestUFO frametime volatility analysis as web browsers power-managed themselves -- especially on laptops. Watching those sub-milliseconds cascade into TestUFO stutters in realtime with this colorcoded visualization is quite mesmerizing on roughly 1 in 5 systems with weird power management behaviours in certain web browsers, since realtime CPU upclocks/downclocks become hugely visible in this TestUFO chart -- and we already know frametime is one part of the lag chain. I am not at all surprised that this happens to game engines to a lesser extent, and that it is one of the many causes of VRR stutter leakage (gametime:photontime divergences). This testing area IS more of Blur Busters' purview -- watching the Rube Goldberg machination in these particular, more easily testable areas where I am able to innovate on various tests.
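For the technically curious, here is a minimal sketch of the concept behind that kind of frametime-volatility sampling (illustrative only -- this is not TestUFO's actual code, and the sample count and reporting format here are arbitrary choices of mine):

```typescript
// Minimal sketch of frametime-volatility sampling in a browser
// (illustrative only -- not TestUFO's actual implementation).
// requestAnimationFrame timestamps expose per-frame deltas; sudden
// swings in those deltas are the fingerprint of CPU/GPU clock changes.

const deltas: number[] = [];
let last: number | undefined;

function sample(now: DOMHighResTimeStamp): void {
  if (last !== undefined) {
    deltas.push(now - last); // frametime in milliseconds
  }
  last = now;
  if (deltas.length < 600) {
    requestAnimationFrame(sample); // ~10 seconds of samples at 60 Hz
  } else {
    report();
  }
}

function report(): void {
  const mean = deltas.reduce((a, b) => a + b, 0) / deltas.length;
  const variance =
    deltas.reduce((a, b) => a + (b - mean) ** 2, 0) / deltas.length;
  console.log(`mean frametime ${mean.toFixed(3)} ms, ` +
              `stdev ${Math.sqrt(variance).toFixed(3)} ms`);
}

requestAnimationFrame(sample);
```

On a well-behaved system the standard deviation stays tiny; on a system with aggressive power management, you see it swing as the clocks ramp up and down.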
Also, rather than reviewing displays directly (except for special editions like G-SYNC 101), Blur Busters is a display laboratory covering a lot of display temporal topics (latency, VRR, refresh rate, frame rate, GtG, MPRT, strobing, pixel response, etc) and inventing test methods for other display reviewers.
Hats off to those researchers (usually proprietary, hired by the big companies) who actually do this -- like Intel, or Sony (PS5), or others. One can use a simple tool to measure tiny glimpses and hints, but a complete audit trail of proof requires going thousands of times more detailed (scholar-worthy), which really takes a lot of time and effort. And those big-money endeavours are often proprietary research that is not typically shared on a reviewer/blogger site -- sometimes you have to fight a paywall just to access academic papers that don't even go into sufficient detail on these matters.
CPU/RAM latency testing simply isn't a Blur Busters speciality at this time, nor do we have the resources for it. However, this topic is forum-worthy because it's about computer hardware & latency, despite being understudied for direct esports corroboration (e.g. realtime RAM latency analysis mid-esports-game, to monitor how it all cascades up).
schizobeyondpills wrote: ↑23 Jun 2020, 17:22
- your 1000Hz mouse cannot be made 8000Hz because it takes at least 8 * 1000 writes/reads to RAM per second to get those packets to the game engine, and not even that 1000Hz is stable/consistent
However... indirectly, Blur Busters may help in some ways. For example, a TestUFO Mouse Benchmark is being built, thanks to new HTML5 APIs for full-pollrate raw input modes, which now make it possible for webpages to measure a gaming mouse (as long as the browser supports those APIs). That puts a mouse benchmark relevant to the refresh rate race to retina refresh rates just one click away (unlike those existing mouse benchmarkers that simply output numbers).
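For illustration, here is the general idea behind such a browser-based measurement, assuming a browser that supports PointerEvent.getCoalescedEvents() (a rough sketch, not the actual TestUFO Mouse Benchmark code; the one-second window is my own arbitrary choice):

```typescript
// Rough sketch of estimating mouse poll rate from a webpage
// (illustrative only -- the TestUFO Mouse Benchmark is separate work).
// getCoalescedEvents() exposes the raw input events a browser would
// otherwise merge into a single pointermove per animation frame.

const timestamps: number[] = [];

window.addEventListener('pointermove', (e: PointerEvent) => {
  // Fall back to the single merged event on browsers without the API.
  const raw = e.getCoalescedEvents ? e.getCoalescedEvents() : [e];
  for (const ev of raw) {
    timestamps.push(ev.timeStamp); // high-resolution ms timestamps
  }
});

// Estimate poll rate over the last second of mouse movement.
function estimatePollRate(): number {
  const cutoff = performance.now() - 1000;
  const recent = timestamps.filter((t) => t >= cutoff);
  return recent.length; // events per second ~ effective poll Hz
}

setInterval(() => {
  console.log(`~${estimatePollRate()} Hz observed while moving`);
}, 1000);
```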
We're big-time advocates of increasing mouse poll rate for the refresh rate race to retina refresh rates -- and we realize that timing/latency error margins like the ones you describe can Rube-Goldberg themselves into visible problems, including for improved poll rates. So at some point the universes may overlap, forcing manufacturers to improve hardware (CPU / mobo / RAM / etc) to make poll benchmarks look good. Browsers do have real-world limitations (they're not reliable reproductions of a game engine's influence on latencies), but the more public mouse data, the merrier.
In tomorrow's world, an example debate could theoretically revolve around "Why does 2000 Hz poll look so jittery/dirty?" benchmarks, where changing various system configuration parameters might suddenly improve or worsen the look of polling in realtime, down to real-world mouse microstutters and such. Those users could say, "Look at how my two TestUFO Mouse Poll Tests look before/after my tweaks! And it actually made my CS:GO game feel better too!" It may be a simple domino, but we'll try out this domino and see how it cascades into industry change (in theory).
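One hypothetical way to put a number on "jittery/dirty" polling (an assumed metric of my own, not an actual TestUFO measurement) is the standard deviation of the intervals between consecutive mouse reports -- a clean 2000 Hz mouse should report every ~0.5 ms with almost no spread:

```typescript
// Hypothetical jitter metric for before/after tweak comparisons
// (assumed design, not an actual TestUFO metric): the standard
// deviation of intervals between consecutive mouse reports.
// Power-management hiccups show up as outlier intervals.

function pollJitterMs(timestamps: number[]): number {
  const intervals: number[] = [];
  for (let i = 1; i < timestamps.length; i++) {
    intervals.push(timestamps[i] - timestamps[i - 1]);
  }
  const mean = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  const variance =
    intervals.reduce((a, b) => a + (b - mean) ** 2, 0) / intervals.length;
  return Math.sqrt(variance);
}

// Example: an ideal 2000 Hz report stream vs. one with a 5 ms stall.
const ideal = Array.from({ length: 100 }, (_, i) => i * 0.5);
const stalled = [...ideal.slice(0, 50), ...ideal.slice(50).map((t) => t + 5)];
console.log(pollJitterMs(ideal).toFixed(3));   // ~0.000 (perfectly clean)
console.log(pollJitterMs(stalled).toFixed(3)); // clearly larger
```

A single number like that makes the "before my tweaks / after my tweaks" comparison concrete instead of just eyeballing the chart.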
...And that's the typical pioneer role Blur Busters plays in lifting all boats in a trustworthy way. Manufacturers notice and they optimize -- just as they already do for displays, thanks to all the tests Blur Busters has invented...