Discussion about Legitimacy of HTML5 Mouse Testing


Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 07 Nov 2020, 22:11

Chief Blur Buster wrote: Notice About This Forum Thread
These are cherry-picked posts moved from the original thread, "I have the new Razer 8000 Hz prototype gaming mouse on my desk." They are mostly off-topic there, but they are useful discussion for the researcher community.

This thread is about a FUTURE, UPCOMING test.
IMPORTANT: The TestUFO Mouse Tester is NOT LAUNCHED YET.



Discussion about Legitimacy of HTML5 Mouse Testing

Post by schizobeyondpills » 07 Nov 2020, 22:20

That's fine, but your goals create tooling and research/articles which cause harm to the actual and only community which benefits from these high-performance devices, especially mice. Without warning labels about your >goals<, these in-browser tests harm those who fight for latency in their games, because people call them insane or silence them by linking to your articles/testing tools; and of course, without very deep knowledge of software engineering and mathematics/perception, they are incapable of defending their position, and they get silenced. So you are harming the actual and only beneficiaries of your own ideals (regarding mice; monitors require no interaction to use).
Chief Blur Buster wrote:
25 Oct 2020, 18:32
Perfect is the enemy of Good Enough.
Good Enough is the enemy of Evolution which is done by chasing Perfection.

its "good enough" temporarily because someone concluded without thinking ahead(or not ahead enough) that it is good enough, until evolution comes in and now that good enough is bad enough because it wasnt perfect, and all the research,findings, evaluations found using "good enough" tooling cause entire timeline from when that good enough tool was released to the point someone figured out its not perfect but "good enough" to be destroyed, it is not evolutionary progress, but compensated progress. see how that turns out in the long run?
Chief Blur Buster wrote:
25 Oct 2020, 18:32
Our goals are different. Browsers are now reasonably high performance animals under the hood, it's not like 1990s where you can't run bitcoin mining & emulators (Internet Archive).
Why would one compare a current fraction of technology against the 1990s and not against today's tech? How does a browser compare to a user-mode native (precompiled) real-time rendering application at 240 Hz/360 Hz? You cannot have enough access in a browser, compared to the native Windows API, to have the accuracy or precision needed to remove uncertainty and gain some resolution.
We are talking about high-performance real-time simulation here: 144 Hz+ displays, 1000 Hz+ mice, a 128-tick server, at least 144 FPS rendering of 10 players; not bitcoin mining or emulation, which are a paper plane in complexity and system load compared to the F-16 fighter jet that is an esports game.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
You have seen me write Tearline Jedi in C#. Check out this thread of microseconds being done in C#.
Space and time are unified and one trades off the other, meaning you can have nanoseconds in C# but only if your computation doesn't take much computation, i.e. raster chasing, compared to polling/processing input 1000-8000 times a second, processing a 128 tickrate for 10 players, rendering at 144+ FPS, etc. in CONSTANT, INTERACTIVE REAL TIME. (Did I mention all of this has to be jumped through managed code into the native world, and garbage collected afterwards? Per frame, of course. =) )
Chief Blur Buster wrote:
25 Oct 2020, 18:32
I program in C# about 10x faster than I can program C++, since it is a Rapid Application Development language to me.
So then your use case is not high-performance, low-latency scenarios, and that includes research/articles/findings derived from the tooling you created for the average/casual/mass audience. Why do you refuse to accept this? All we "elitists" ask for is a disclaimer that your findings don't apply to highly optimized, low-latency, user-interactive systems targeted at esports, but rather to "hardware lovers" and below.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
A program that exists is always better than no program at all. I can do C++ but I leave most of it to other people.
No, a program that exists is better than nothing only if users are aware of the limitations/gotchas of that program.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
I'm bringing mouse testing to the mainstream, whether you like it or not. :)
I don't like when people take pictures of their food/plate, because I feel sorry for those less fortunate who are starving or worse. That's my opinion. But a 1000 Hz-8000 Hz (or more in the future) measuring tool being flawed from the start, due to being made in a browser, is not my personal opinion but a fact (which I backed up with multiple objective arguments).
Chief Blur Buster wrote:
25 Oct 2020, 18:32
If you would like to see someone do C++, be my guest.
I only play with Legos at the speed of light, sorry (meaning x64 ASM and C, and only for six figures or more).
Chief Blur Buster wrote:
25 Oct 2020, 18:32
the error margins of GPU-accelerated web browsers are tight enough to still stay comfortably beyond the Vicious Cycle Effect, at least on a properly optimized computer, and I can still autodetect when the browser is too error-prone
Again, you miss the context of my argument. Components work differently under higher loads for longer periods of time. A browser cannot even schedule your code on the same level as a user-mode app; it cannot be optimized to such levels (even with WASM). A mouse/display (and the whole PC/system) performs far, far worse under actual in-game load than under semi-idle browser load. Not to mention the motherboard and CPU temperature rise that causes bigger and more impactful voltage ripple effects and harms signal integrity (and that includes all components combined, at runtime: USB, NIC, sound, display, PCH, VRMs, CPU performance, traces, cables behind the case, due to the exhausted case temperature of huge GPUs working to render a game in real time).
Chief Blur Buster wrote:
25 Oct 2020, 18:32
Others can build better mousetraps than I can do; but I love to invent the easier mousetraps. Puns intended.
Nothing wrong with that; all tools and programming languages have their purpose. It is only when users of languages/tools that are good enough for their use case refuse to be aware of this and become willingly ignorant that there is a problem.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
I will create a bigger reply later, assuming you keep this thread turn-based -- I need time to reply. Appreciated!
Chess, as the most praised turn-based game, is a flawed game in which one player hands the most valuable thing in all of reality (time) to his opponent with every turn he makes. I prefer the arena FPS genre, which tests real-time combat in the arena.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
I Already Do Microseconds in HTML5 and C#, Even Defuzzing Meltdown/Spectre
Microseconds rendering what? Under what load on the whole system? That's the problem: it's one thing to do microseconds at idle, another to do them in-game. To have accuracy and precision, with good resolution, in measuring how something performs under the real-life load it was meant for (esports), you need to go the extra mile of writing it in C++/C/ASM and optimizing it well, so that those third decimals (or deeper) reveal the actual statistics which happen under real-life load (in-game). It's that simple. You can have nanoseconds in a browser, but they fall apart once you observe their context.
Chief Blur Buster wrote:
25 Oct 2020, 18:32
TestUFO is completely handcoded, uses no third party frameworks (except Google Analytics), zero jquery, zero Angular, zero anything. Yes, yes, yes, C++ is nice and higher performing, but I'm all about one-click-awayness these days.
Yeah, in JavaScript, which is the main issue here. Also, C++ is a very bloated language; the highest performance gains come from real software engineering/architecture, aka assembly. A compiler cannot possibly describe and reason about the context of the program on the same level as a conscious, self-aware human who can consult manuals/tests/benchmarks. (C++-to-ASM optimization is 1 to 1 million in scale of difficulty and perf gains.)
Chief Blur Buster wrote:
25 Oct 2020, 18:32
fast enough that it even works on semi-older iPhones/Android/iPads, as long as it's at least a weakly GPU-accelerated browser.
Where it works means nothing without asking what it's doing. Is it rendering at 240 Hz at 1920x1080? Is it polling 1000 times a second for mouse and keyboard while processing 128-tick packets for 10 players? See, that's the difference: context. To compensate for the lack of actual computational and interactive real-time load, one must go deeper into accuracy and precision to acquire the necessary resolution.
You can only do that with base-level engineering, aka assembly/C (C++ comes close enough due to compiler handholding).


Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 07 Nov 2020, 23:22

schizobeyondpills wrote:
07 Nov 2020, 22:20
That's fine, but your goals create tooling and research/articles which cause harm to the actual and only community which benefits from these high-performance devices, especially mice. [...]
Actually, the way you write this, you're pre-judging a test you haven't seen yet.

From my read of your writing, it sounds like my TestUFO test is more different from MouseTester.exe than you think, because it creates a new method of mouse testing that is more user-agnostic and agreement-generating (fewer naysayers, more researchers who agree, etc.). It is more aligned with your goals than you think. Also, it does not preclude other kinds of mouse testing that benefit from C++/ASM/etc. The visualization displayed in TestUFO is over 100x easier for an average Joe User to interpret than MouseTester, and has already gotten compliments from researchers. I suggest throttling your dissertations, as replying to these wastes a lot of my time and does a disservice to research. Peer review doesn't happen without seeing the very thing being reviewed.

You are welcome to create your own separate mouse tester and contribute it to the research community.
schizobeyondpills wrote:
07 Nov 2020, 22:20
Good Enough is the enemy of Evolution which is done by chasing Perfection.
This is precisely why people disbelieved 30fps vs 60fps for longer than necessary. This paragraph is so ridiculous it does not deserve a reply, given that the pursuit of perfection can take 100 years instead of 1 year or 1 month. We have to do this more incrementally. Also, the TestUFO Mouse Tester isn't a monopoly. You're just wasting my time trying to assert something that isn't necessary to assert so hard.

Also, adjusting the level of patience is a strategic move that allows a researcher time to generate peer-reviewable content. I suggest you pause further critical commentary until reviewing the actual upcoming tests -- you're criticizing something that hasn't even been published.

This TestUFO mouse tester is not intended to emulate a real-world game, but it is far more representative of the real world than MouseTester is, because the internals of a browser have a lot more in common with a game engine than the MouseTester.exe executable does. I can even play WebXR VR content at a rock-steady 72fps or 90fps in my Quest 2 VR headset, and the PointerEvents + PointerLock HTML5 APIs already give 1000Hz raw input while allowing simultaneous graphics rendering (whether TestUFO graphics or full WebGL).
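To make the mechanism concrete, here is a minimal sketch (not TestUFO's actual code) of the PointerLock + PointerEvents pattern described above; the canvas id and the recordPoll() logger are placeholders invented for this example:

Code: Select all
// Request pointer lock on a canvas, then read the coalesced (batched,
// individually timestamped) pointer events the browser delivers.
const canvas = document.getElementById('testCanvas'); // hypothetical element id

canvas.addEventListener('click', () => {
  canvas.requestPointerLock(); // raw, unaccelerated deltas while locked
});

canvas.addEventListener('pointermove', (e) => {
  // getCoalescedEvents() returns every mouse poll the browser batched
  // since the last pointermove dispatch, each with its own timestamp.
  const polls = e.getCoalescedEvents ? e.getCoalescedEvents() : [e];
  for (const p of polls) {
    // p.timeStamp is in (fractional) milliseconds; movementX/Y are raw deltas.
    recordPoll(p.timeStamp, p.movementX, p.movementY);
  }
});

function recordPoll(t, dx, dy) {
  // Placeholder logger; a real tester would feed these samples into its graphs.
  console.log(t.toFixed(3), dx, dy);
}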
schizobeyondpills wrote:
07 Nov 2020, 22:20
Why would one compare a current fraction of technology against the 1990s and not against today's tech? How does a browser compare to a user-mode native (precompiled) real-time rendering application at 240 Hz/360 Hz? You cannot have enough access in a browser, compared to the native Windows API, to have the accuracy or precision needed to remove uncertainty and gain some resolution.
You're thinking about this from the wrong angle (Although you'd be surprised at what I can do in JavaScript nowadays). It's still intended to be just a (good) synthetic mouse tester like MouseTester.exe but far more real-world than MouseTester.exe because it pushes CPU/GPU/mouse simultaneously a bit harder than MouseTester.exe does. Most benchmarks are synthetic anyway and the more benchmarks the merrier. If you want to write a C/C++/ASM benchmark, please go ahead, but it's more synthetic unless you write the mouse benchmark in a game engine.

Also, we aren't only about esports here. High Hz is for everybody, even non-gamers. Many games I play are written in C# -- Unity Engine games. There are many angles to popularize: mainstreamization, improved precision in easy languages, etc. (Remember: even things like an upcoming future mouse API in 2025 that timestamps sensor reads, so that the game engine can keep sensor reads in sync with gametimes despite engine jitter, etc.) Even things like browser scrollbar scrolling need to become subpixel-capable in the future, where dragging the scrollbar looks as smooth as keyboard up/down, even for a large multipage scroll. An upcoming high definition mouse API solves a lot of legacy things, while allowing developers to continue programming their games in easy languages and still get perfect framepacing. You know, I have an Oculus Rift and an Oculus Quest 2 VR headset, and it's impressive how much better the framepacing optimization is in Oculus-based VR applications, as well as in top-class SteamVR apps (Half-Life: Alyx), and I want to see more of those practices filter down to non-VR applications.
schizobeyondpills wrote:
07 Nov 2020, 22:20
Space and time are unified and one trades off the other, meaning you can have nanoseconds in C# but only if your computation doesn't take much computation, i.e. raster chasing, compared to polling/processing input 1000-8000 times a second, processing a 128 tickrate for 10 players, rendering at 144+ FPS, etc. in CONSTANT, INTERACTIVE REAL TIME.
I know, but this still misses the point. Arguing the point further just wastes time, because we are more in tune than it appears; we're still stubbornly thinking of 2 different things, because you haven't seen the mouse tester, or are perhaps even unaware of other behind-the-scenes work, such as early talks for a high definition mouse extension API (ETA 2025) that involves timestamped pollreads and full sensor-rate / full dpi / etc. access (even at low poll rates, since the timestamped polls enable batched reads within USB limitations / jitter). Batched transmissions of mouse-side, microsecond-timestamped sensor reads vastly reduce the system load of 20KHz polling, without needing a 20KHz pollrate, while becoming more precise within the permitted jitter window (potentially configurable, in a forgivingness-vs-latency tradeoff).
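The batching idea can be illustrated with plain data. The sketch below assumes a purely hypothetical report format (an array of microsecond-timestamped deltas per USB transfer); it is not a real API, only a way to show how a receiver could reconstruct the effective sensor-read rate without a 20 kHz host poll rate:

Code: Select all
// Hypothetical batched report: one USB transfer carrying several
// mouse-side timestamped sensor reads (times in microseconds).
const batchedReport = [
  { tUs: 1000, dx: 3, dy: -1 },
  { tUs: 1125, dx: 4, dy: -2 },
  { tUs: 1250, dx: 4, dy: -1 },
  { tUs: 1375, dx: 5, dy: -2 },
];

// USB delivery jitter only shifts when the whole batch arrives; the spacing
// between reads is preserved by the mouse-side timestamps.
const intervalsUs = [];
for (let i = 1; i < batchedReport.length; i++) {
  intervalsUs.push(batchedReport[i].tUs - batchedReport[i - 1].tUs);
}
const meanUs = intervalsUs.reduce((a, b) => a + b, 0) / intervalsUs.length;
console.log(`mean read interval: ${meanUs} us`);      // 125 us
console.log(`implied read rate: ${1e6 / meanUs} Hz`); // 8000 Hz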

Systems will never be microsecond perfect, but this is fully compensatable via rethinking the mouse architecture in an industrywide standardization. Remember, I have experience working with industry standards (e.g. XMPP extension XEP-0301, because I am deaf). I have no particular preference for Razer in the 8000Hz move, except simply that Razer is the most willingly cooperative with my requests/asks and is sending me an 8000 Hz mouse -- so I reward companies that want to cooperate with me, but I also vociferously disagree with my vendors when they try to do things that are contrary to progress. Those who know me well respect my neutrality (if you've been paying attention to those threads).

A rising tide lifts all boats, and I'm happy to accept an AtomPalm (8000Hz) or a Cougar Mios X5 (2000Hz), but I have every intention of getting all mouse vendors to look at the High Definition Mouse Extensions Specification (ETA 2025). For that, I am going to create a new forum thread about the proposal of the High Definition Mouse Extensions API -- a rethink of mouse architecture. We are also going to try to get at least 3 or 4 game vendors aboard too. I spill the beans early only to shut you up... Wait a bit, THEN peer review what I do.
schizobeyondpills wrote:
07 Nov 2020, 22:20
So then your use case is not high-performance, low-latency scenarios, and that includes research/articles/findings derived from the tooling you created for the average/casual/mass audience. Why do you refuse to accept this? All we "elitists" ask for is a disclaimer that your findings don't apply to highly optimized, low-latency, user-interactive systems targeted at esports, but rather to "hardware lovers" and below.
Read above. What I am doing is already loved by the majority of hardware lovers.
schizobeyondpills wrote:
07 Nov 2020, 22:20
No, a program that exists is better than nothing only if users are aware of the limitations/gotchas of that program.
So? That's exactly what I'm doing. I already said I will be posting a disclaimer. Repeatedly and falsely claiming that I won't do something does not endear you to other researchers.
schizobeyondpills wrote:
07 Nov 2020, 22:20
measuring tool being flawed from the start due to being made in a browser is not my personal opinion but a fact
This creates a disagreement when there is none necessary.

All benchmarks are flawed to an extent, whether synthetic, limited, single-engine, etc. A browser-based benchmark's increase in timing imperfections is more than compensated by being more representative in other respects (browsers simulate the mechanics of a game engine better than MouseTester.exe does, AND expand access to test results). Besides, the timing imperfections are well below the noise floor of timing imperfections in a typical C# Unity Engine game. I have more precision than an average game engine, yet the test is concurrently more CPU/GPU/mouse loaded than MouseTester.exe, since it is drawing realtime, color-visualized, OpenGL-style graphs during mouse moves. It's absolutely beautiful to watch how the visualization changes in realtime, and it is highly screenshottable to show things like cyclic patterns, noise patterns, and precision patterns.

Also, seeing precise mouse timings despite a browser means things are being done correctly: if a system is so well-tuned that a browser mouse tester produces results more accurate than expected, it means a well-optimized system. The browser is simply a real-world engine/obstruction to mouse precision that a user can optimize for, and such optimized systems will usually perform better in a game. You're not thinking of all the real-world variables, and are too focused on an excessively synthetic concept of a mouse tester (e.g. an assembly-language mouse tester, when most games are written in C++/C#/etc.). Sure, there's utility to different angles of synthetic mouse tests. I don't have a monopoly on mouse testing. If my tester inspires other people to write a better open source ANSI C / C++ mouse tester app for people LIKE YOU, then I have succeeded with the TestUFO mouse tester. Hounding and wasting my time with your incessant nitpicking does the research community a disservice, not me.
schizobeyondpills wrote:
07 Nov 2020, 22:20
I only play with Legos at the speed of light, sorry (meaning x64 ASM and C, and only for six figures or more).
Perfect. Go work for SpaceX. I'm doing the refresh rate equivalent of turning billion-dollar AEGIS phased-array antennas into a $500 commoditized Starlink phased-array antenna. Many are surprised at what I can do with browser engines.

schizobeyondpills wrote:
07 Nov 2020, 22:20
Again, you miss the context of my argument. Components work differently under higher loads for longer periods of time; a browser cannot even schedule your code on the same level as a user-mode app.
You miss the context of my argument too. You have a good argument, and I have a good argument. Just because your argument is good doesn't mean my argument is bad. Read the above and you can then realize that multiple different kinds of mouse tests can coexist. There are many different tools in the same toolbox.

Besides, if the test looks amazingly accurate, then you've optimized the system brilliantly enough that things look so accurate in a browser. It's a failsafe, not a faildanger, in that sense. It also forces browser vendors to optimize too, when users complain (e.g. 1000Hz works but 8000Hz doesn't in the browser, because one constant in the Chromium engine truncates the PointerEvents array size unnecessarily; there are some silly bugs in Chromium where small changes massively improve accuracy). But even that is not the point.

The point is, it produces a new, easy, realtime mouse visualization that will push forward improvements from all angles (more people releasing more mouse testers, more people releasing high-Hz mice, more people interested in the High Definition Mouse Extensions API, etc.). Your narrow vision doesn't respect/acknowledge the huge benefits industrywide, to other researchers, to users, and to precision, in a vendor-agnostic way.
schizobeyondpills wrote:
07 Nov 2020, 22:20
it cannot be optimized to such levels (even with WASM).
As we go to 1000Hz, games are being written in higher- and higher-level languages, and we need to architect them in a way that futureproofs their accuracy with less jitter.

We may disagree on how tech progress goes, so I'll leave this here:

Image

Create your standard, I'll create mine, and we can see which one lights the fire under the seats of the industry faster... And we can merge standards to put the fires out. Ha. But it must be C# + HTML5 + etc. compatible (like timestamped mousepolls that can survive the jitter of USB and garbage collection). Anything that "survives" C# and HTML5 will be shockingly accurate with C++ and ASM.

Despite disagreements, our goals have more in common than you think: create a mouse method that survives the imprecisions of modern game development with spectacular latency and framepacing, and it becomes even more uncannily accurate in C/C++/ASM. See? Our goals aren't THAT different. So you are truly arguing unnecessarily against a test that will already have a disclaimer anyway. I would appreciate a bit more HTML5 respect from you as a result of this paragraph's explanation. Go work with the VxWorks RTOS on your favourite satellite, but we're doing C#/C++ games nowadays.

I have already done some amazing stuff helping a C# Unity developer reduce gametime:photontime sync errors by orders of magnitude -- the game is called CloudPunk (game developer credited). The divergences used to be up to 16ms stutters for multiple reasons; now they are greatly reduced to sub-1ms stutters most of the time, with many sustained moments of sub-250us gametime:photontime relative divergence. Hz-futureproofed, much lower lag at all mouse poll rates, better VRR compatibility, better VSYNC OFF compatibility, better odd-Hz VSYNC ON compatibility, fewer latency yo-yo effects. Over 50 users posted amazing acclaim on Discord several weeks ago for this release, replacing the complaints about the major stutter and mouse lag.



Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 08 Nov 2020, 01:07

schizobeyondpills wrote:
07 Nov 2020, 22:20
[...]
Linking relevant thread:

High Definition Mouse Extensions API

This had been in my mental works for more than a year, but now is a good time to mention it -- since Blur Busters' mission, as a temporal hobby-turned-business (GtG, MPRT, lag, VRR, framepacing, jitter, stutter, etc.), is to whack-a-mole a lot of weak links in the refresh rate race to retina refresh rates, in a vendor-agnostic way.

Disclaimer: Other than the ads/affiliate links, I derive no direct funds from this specific initiative (no financial contribution was made for the upcoming 8000 Hz tests and the increased mouse focus). The only financial disclaimer is that I received free samples (a Razer 8000Hz mouse) -- I welcome all vendors of >1000Hz mice to send theirs too, because we are fans of lifting all Hz boats. Razer simply is the most cooperative mouse vendor for Blur Busters, so this is an opportunity to kickstart things. However, I brought up the idea publicly first, and they liked it. I intend to shop the specification to all mouse vendors, since a standard only happens when multiple vendors agree. People see me frequently disagree with vendors despite my ads/affiliate links, and others are familiar with my neutrality if they've read the other threads, as I leave lots of money on the table to uphold Blur Busters' principles. However, I am always glad to work with vendors that work with me to lift all Hz boats for everybody, industry-wide. The more improvements industry-wide, the merrier.



Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by schizobeyondpills » 08 Nov 2020, 02:15

Chief Blur Buster wrote:
07 Nov 2020, 23:22
Actually, the way you write this, you're pre-judging a test you haven't seen yet.
No pre-judging; I block people who play games with a browser open. Chrome is a well-known bloated browser, to the point that people make memes out of it. The only thing more bloated and worse engineered for low latency/high performance than Chrome is Windows 10; Chrome comes in a pretty close second place.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
You are welcome to create your own separate mouse tester and contribute it to the research community.
I don't contribute to groups who write fairy tales, aka researchers; I roll with those who change the world.

schizobeyondpills wrote:
07 Nov 2020, 22:20
Good Enough is the enemy of Evolution which is done by chasing Perfection.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
This paragraph is so ridiculous it does not deserve a reply, given that the pursuit of perfection can take 100 years instead of 1 year or 1 month. You're just wasting my time trying to assert something that isn't necessary to assert so hard.
It does deserve a reply, because evolution takes blood, sweat, and tears to progress forward. It's painful to know how much you don't know, to go into the unknown, to fail a thousand times, to be on the brink of giving up, and then succeed. That's evolution. People fear it and run from it into "good enough" engineering, lying to themselves to feel good with whatever they have, just to not feel evolution. The moment that "good enough" engineering produces something, they start repeating the classic delusion "don't reinvent the wheel," just to further reinforce the delusion bubble made to shield them from the fear of evolution.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Also, adjusting the level of patience is a strategic move that allows a researcher time to generate peer-reviewable content. I suggest you pause further critical commentary until reviewing the actual upcoming tests -- you're criticizing something that hasn't even been published.
I don't have to wait if I can conclude that Chrome is bloated software and have evidence of it varying 10-30% per release in 5 different tests. Again, for your use case, to Joe it's a great tool, but for those who walk beyond the decimal separator it's nothing but a headache to look at/use.

Comparing 1000 Hz and 1000.0 Hz: 1000.0 Hz is 10 times harder to achieve/measure/observe, especially on a loaded system.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
This TestUFO mouse tester is not intended to emulate a real-world game, but it is far more representative of the real world than MouseTester is, because the internals of a browser have a lot more in common with a game engine than the MouseTester.exe executable does. I can even play WebXR VR content at a rock-steady 72fps or 90fps in my Quest 2 VR headset, and the PointerEvents + PointerLock HTML5 APIs already give 1000Hz raw input while allowing simultaneous graphics rendering (whether TestUFO graphics or full WebGL).
Did I mention I block people that send me MouseTester graphs? It's a joke of a tool for the 10 reasons I wrote before, and I can write 50 more, and all of them affect each other, so please don't use MouseTester as any kind of quality reference or argument. MouseTester is even more pointless than an in-browser measurement; that's how flawed it is.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
You're thinking about this from the wrong angle (Although you'd be surprised at what I can do in JavaScript nowadays). It's still intended to be just a (good) synthetic mouse tester like MouseTester.exe but far more real-world than MouseTester.exe because it pushes CPU/GPU/mouse simultaneously a bit harder than MouseTester.exe does. Most benchmarks are synthetic anyway and the more benchmarks the merrier. If you want to write a C/C++/ASM benchmark, please go ahead, but it's more synthetic unless you write the mouse benchmark in a game engine.
Good is to great what bad is to good. High-frequency devices/systems require great engineering; not good, and definitely NOT good enough.

Yes, benchmarks are synthetic, but this is not a benchmarking discussion; it's a testing discussion. Testing needs to be engineered for accuracy of real-world usage with minimum overhead. Is absorbing the sun (Chrome) a reflection of such engineering?
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Also, we aren't only about esports here. High Hz is for everybody, even non-gamers.
Unfortunately, no. High-frequency/low-latency components are only for a niche percentage of the human race with a high enough mental clockrate to even perceive them; 99% of the human race is very happy with 24 FPS movies/Netflix.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Many games I play are written in C# -- Unity Engine games. There are many angles to popularize: mainstreamization, improved precision in easy languages, etc. (Remember: even things like an upcoming future mouse API in 2025 that timestamps sensor reads, so that the game engine can keep sensor reads in sync with gametimes despite engine jitter, etc.)
Yeah, but why would you need to timestamp reads? Timestamped reads are nothing but compensation for latency, jitter, and desync, which should be the main problem to fix, reducing it past the point where you don't need timestamps.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Systems will never be microsecond perfect, but this is fully compensatable via rethinking the mouse architecture in an industrywide standardization. [...]
That depends on the system's hardware and software components. Again, there can be no compensation for something real-time.

https://cis.gvsu.edu/~wolffe/courses/cs ... n-CACM.pdf
There is an old network saying: "Bandwidth problems can be cured with money. Latency problems are harder because the speed of light is fixed — you can't bribe God."
Chief Blur Buster wrote:
07 Nov 2020, 23:22
So? That's exactly what I'm doing. I already said I will be posting a disclaimer. Repeatedly and falsely claiming that I won't do something does not endear you to other researchers.
Yes, but what kind of a disclaimer? An ultra-low-latency-sensitive disclaimer or a general one? Big difference.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
This creates a disagreement when there is none necessary.
There's no disagreement if one side has provided evidence of exactly that: the browser having 10-30% performance variations per version, on a benchmark-optimized, stripped OS/browser without any load like interrupts, JS extensions, multiple tabs, etc.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
All benchmarks are flawed to an extent, whether synthetic, limited, single-engine, etc. A browser-based benchmark's increase in timing imperfections is more than compensated by being more representative in other respects (browsers simulate the mechanics of a game engine better than MouseTester.exe does, AND expand access to test results).
Yes, benchmarks are useless if they are not engineered to reflect the real-world usage scenario with a 10x-100x level of accuracy/precision and resolution of measurement, so that you can observe that real-world usage and gain as much info as possible from it.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Also, seeing precise mouse timings despite a browser means things are being done correctly: if a system is so well-tuned that a browser mouse tester produces results more accurate than expected, it means a well-optimized system.
That's nice, but again, it's about >HOW< precise those mouse timings are.

I can turn the power switch of my PSU on and off and say my method of measuring CPU transistor switch time is precise, and then an Intel engineer would step in and tell me they measure in sub-picoseconds.
Intel's 15-nm transistor (gate-length) is a CMOS-based, 0.8-Volt device, said to handle switching speeds of 0.38-ps–or 2.63 trillion switches per second
Image


Precision is just timing; accuracy is measuring that timing. Browsers aren't accurate enough, due to bloat and high-level abstraction.

XXXX.YYYY (in Hz)
XXXX = precision
Everything past the decimal point is >ACCURACY<.
Accuracy is affected by latency/jitter/bloat/de-sync.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
The browser is simply a real-world engine/obstruction to mouse precision that a user can optimize for, and such optimized systems will usually perform better in a game. [...]
The browser is the average Joe's best tool, and as such, Joe has 50 tabs open, 10 toolbars installed, 30 extensions collecting his data, and a few malware/bloat programs running on his system.

I'm defending the accuracy of measuring mouse latency. Since your browser tool is made in HTML/JS/WASM for the browser, it doesn't have enough accuracy to be used for ultra-sensitive, low-latency scenarios such as optimized esports.

I don't care about researchers; I care about end users, who are affected when no one nitpicks and hounds (I don't know what it is with people using this word so often at me...) the researchers/vendors/companies exploiting them for more profit/compensation.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Perfect. Go work for SpaceX. I'm doing the refresh rate equivalent of turning billion-dollar AEGIS phased-array antennas into a $500 commoditized Starlink phased-array antenna. Many are surprised at what I can do with browser engines.
I don't work; at the moment I'm busy making sure your Starlink antenna works as intended.
Chief Blur Buster wrote:
07 Nov 2020, 23:22
You miss the context of my argument too. You have a good argument, and I have a good argument. Just because your argument is good doesn't mean my argument is bad. Read the above and you can then realize that multiple different kinds of mouse tests can coexist. There are many different tools in the same toolbox.
Truth between different perspectives can be observed with objective reasoning, so it comes down to 2 very simple lines:

Is a Chrome/HTML5/JS/OGL/WASM browser mouse testing tool any good?

[average Joe perspective]
It's great! See, 8000Hz! Sometimes 7950, but it's OK, I probably didn't move the mouse fast enough; I'm just a casual :shock:

[ultra-sensitive low latency enthusiast perspective]
No, the load is wrong, I can't change the thread affinity of raw input polling to match my USB controller's interrupt core affinity, I close my browser when in-game because it's bloated, the accuracy is in integers, where are the decimals, where's the source code to see what it's doing, wtf, is this made for Joe?
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Besides, if the test looks amazingly accurate, then you've optimized the system brilliantly enough that things look so accurate in a browser. It's a failsafe, not a faildanger, in that sense. [...]
Something looking amazingly accurate only looks "amazing" because the average Joe doesn't know what hides behind the curtains, simple as that. You haven't optimized, but compensated, which means the thing you are compensating for is still there, when you could just close it, as you should.

Again, the Chrome team can optimize for 8000 Hz, but for 8000.0 Hz never in this universe's life. 8000.00 Hz, only if Agner Fog steps in with unlimited resources.

Time is an N-dimensional temporal vector, meaning as you observe it (with numbers), the first part (the integer) represents frequency; after that, the decimal places represent consistency, jitter, skew, etc. 8000 Hz is just the "can it walk?" gatekeeping check.
XXXX.Y+++
XXXX = interval measure of time
Y+++ = consistency/jitter/skew/stretch, etc.

Chief Blur Buster wrote:
07 Nov 2020, 23:22
As we go to 1000Hz, games are being written in higher- and higher-level languages, and we need to architect them in a way that futureproofs their accuracy with less jitter. [...]
Yes, evolution makes it easier to create/use/develop the things of past cycles, but harder to push forward.
So yeah, there are Unity/C#/JS/Electron apps/games being made in higher-level languages due to the increase in processing power, but that doesn't mean those games are well-engineered, just "games." If you scale up all parts of the system, then you didn't evolve that system, because that evolution is now handling scaled-up stress, so you're back to the same thing. If someone creates a game in C#/Unity/JS carried on the back of the high processing power of CPUs, they end up with bloated/slow apps, because they derailed themselves in two opposite directions.

The moment evolution is traded for "progress" through compensation, it becomes a free fall that can't be stopped. "I can't go up, I'm falling"
is what games being made in C#/C++ with Unity/Unreal engines sound like. Yeah, no s**t it's hard to go up (into raw ASM/C without a game engine) when the industry is in free fall. We are supposed to be in the 10 kHz range by now, with nanosecond latency and 50 GHz+ computers; instead I have RGB using 10-15% of the RAM DIMM PCB for glowing lights, games with reflective floor tiles, and my monitor nerfing its own refresh rate to sync with all the bloated, copy-pasted engineering of modern developers whose games sweat to hit 60 fps.

Just like those who can ride a bike don't need training wheels, and removed that compensation through the evolution of learning to ride, the same needs to be done for protocols/drivers/software/hardware in general. There's no need for a USB mouse timestamp API compensating for latency/jitter/desync; we had PS/2 for mouse and keyboard, and we should have evolved that into dedicated mouse and keyboard ports with zero latency issues, not into USB and then, in 2025, into a USB timestamped latency-compensation API.

You might come back and say that evolution is exactly that, being able to compensate for jitter/latency problems, so I will write it ahead of time and say no: evolution means you are prepared before it even happens, so that it never does, setting that failure chance to 0% or 0.000000X%.

Instead of real-time timestamped compensation, there needs to be ahead-of-time preparation: engineering a proper protocol that is not USB and is made specifically for mouse data, all the way from mouse -> cable -> mouse-protocol motherboard slot -> proper circuitry to the CPU -> proper low-latency mouse protocol driver -> proper mouse protocol user-mode driver -> proper mouse protocol API.

This is equivalent to having proper roads and checking the weather so your car tires don't need chains in case of a winter storm; you just check ahead of time what the weather/road is like. And in this analogy, you know it's safer not to drive at all; the same goes for delivering mouse data over a jitter/latency-problematic, unstable path.

Again, a game is not an arena. It's just a game; an arena, however, is something where men die, both IRL and in arena FPS games. Both require an insane amount of skill and mental awareness; a game doesn't.

There's good enough and there's great. Big difference.

as for "passing" C#/HTML, no thats not how it works, i dont need training wheels in case i will crash going fast. they are just a bottleneck, same as for any compensation of data during real time transfer. thats supposed to be void and done ahead of time with proper engineering which is exactly what evolution does, it solves all current problems and sets them to 0(ie doesnt exist at run time).

If you have an uncertain wire, you use TCP; if you know no packet will get lost, you use UDP, and then you strip away all compensation like TCP timestamps (just like games use UDP rather than TCP).


Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 25 Nov 2020, 18:48

schizobeyondpills wrote:
08 Nov 2020, 02:15
No pre-judging; I block people who play games with a browser open. Chrome is a well-known bloated browser, to the point that people make memes out of it. The only thing more bloated and worse engineered for low latency/high performance than Chrome is Windows 10; Chrome comes in a pretty close second place.
Exactly. We're in 100% total agreement with this. If good benchmark/test/diagnostic data punches through the bloat, then I'm doing things successfully.

The best-mouse-results forum matches will also reveal the highest-quality browser installations and highest-quality browser implementations too. Low-quality benchmarks/tests/diagnostics will obviously be posted as well, and debate will occur over whether it's the mouse or the browser (that's to be expected, and I will obviously have that disclaimer).

W3C did something brilliantly correct: PointerEvents delivers a time-stamped array of mouse polls that is largely immune to JavaScript performance jitter. There are other issues, though, like a too-limited maximum array size that can be fixed by enlarging a constant in the Chromium source code, to allow it to buffer up enough 8000 Hz polls between the PointerEvent arrays relayed to the event handler -- but that's at least a fixable thing.

Also, noise caused by the browser doesn't really show in PointerEvents, because it already timestamps the mouse polls, queues them up in an array, and relays the mouse poll timestamps. I only need about 50 to 100 PointerEvents arrays per second to capture 1000Hz mouse data as reliably as MouseTester. So it's immune to JavaScript jitter for the purposes of refreshrate-vs-pollrate harmonics analyses (a bigger priority of the mouse benchmarks/tests/diagnostics, too).
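As an illustration of why the per-poll timestamps matter more than the dispatch rate, here is a rough sketch (illustrative names only, not TestUFO code) of turning coalesced poll timestamps into pollrate/jitter statistics:

Code: Select all
// Illustrative post-analysis of coalesced poll timestamps (milliseconds),
// e.g. collected via getCoalescedEvents() as in the earlier capture sketch.
function pollStats(pollTimesMs) {
  const intervals = [];
  for (let i = 1; i < pollTimesMs.length; i++) {
    intervals.push(pollTimesMs[i] - pollTimesMs[i - 1]);
  }
  const mean = intervals.reduce((a, b) => a + b, 0) / intervals.length;
  const variance = intervals.reduce((a, b) => a + (b - mean) ** 2, 0) / intervals.length;
  return {
    pollHz: 1000 / mean,           // average poll rate
    jitterMs: Math.sqrt(variance), // stddev of poll intervals
  };
}

// A clean 1000 Hz mouse yields ~1.0 ms intervals with low jitter, regardless
// of how often the browser dispatched the pointermove events themselves.
console.log(pollStats([0, 1.0, 2.1, 2.9, 4.0, 5.0]));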

It's not a benchmark/test/diagnostic designed for RTOS engineers, military simulator engineers, or highly funded latency-precision outfits; Right Tool for the Right Job. It's a tool for the mass-market consumer, to push pro territory further down into mainstream territory. Future advanced mouse tools will copycat it and be better than TestUFO.

On a blanket basis (no financial payments have ever been made to Blur Busters for anything mouse-related, by the way), I have an ulterior personal interest in legitimizing the high-Hz mouse first, which is still often dismissed as voodoo by many people less knowledgeable about the need for high-precision mice. A mass-market mouse testing/benchmark/diagnostic tool that gets popularized will simultaneously be testing both the browser AND the mouse (and it will disclose as much).

Moving formerly realtime analysis tasks onto something non-realtime is possible via creative moves like timestamped poll arrays, which is something W3C/WHATWG thankfully went with. As long as the poll loop is at the highest priority above other browser tasks, its data can be more useful than expected in post-analysis; deliciously clear patterns show behaviours that look patterned (ISR-style waveforms, CPU-spike-style waveforms, browser-performance-style waveforms).
schizobeyondpills wrote:
08 Nov 2020, 02:15
It does deserve a reply, because evolution takes blood, sweat, and tears to progress forward. It's painful to know how much you don't know, to go into the unknown, to fail a thousand times, to be on the brink of giving up, and then succeed. That's evolution. People fear it and run from it into "good enough" engineering, lying to themselves to feel good with whatever they have, just to not feel evolution. The moment that "good enough" engineering produces something, they start repeating the classic delusion "don't reinvent the wheel," just to further reinforce the delusion bubble made to shield them from the fear of evolution.
I generally agree with the gist of the paragraph so we're mutual on that, even if we have totally different approaches to this (in details, in popularization sequence & in levels of diplomacy).
schizobeyondpills wrote:
08 Nov 2020, 02:15
Did I mention I block people that send me MouseTester graphs? It's a joke of a tool for the 10 reasons I wrote before, and I can write 50 more, and all of them affect each other, so please don't use MouseTester as any kind of quality reference or argument. MouseTester is even more pointless than an in-browser measurement; that's how flawed it is.
Fine. I know MouseTester is a flawed tool, albeit not a completely useless one. It successfully determined that one of my systems couldn't handle 8000Hz at all without a Windows reinstall (the ISR on the bogged-down Windows install was too slow to do 8000 polls on one CPU core). But at least I used the right tool for the right job -- more of a blunt hammer than a surgical knife.

I will say this pre-emptively: you will probably continue blocking people who send you TestUFO Mouse Tester results, even when people read the "Disclaimer: TestUFO Mouse Tester tests both the browser AND the mouse. Results also represent browser performance on mouse handling."

However, by the power of suggestion, the color-coded visualization will reveal spikes suggestive of browser misperformance versus operating-system misperformance, so at least it's easy to reply with "That definitely looks like background performance spikes, fix those first before you send me any more screenshots." Whereas MouseTester rescales the graph axes in a way that makes it hard to see mouse jitter, because the one datapoint outlier forces the scaling of the axes, making the graph harder for advanced people (like me) to interpret. I have an ulterior motive to make sure graph screenshots are reasonably interpretable, including zooming the graph axes on the median rather than on the outliers, and color-coding the outliers instead (depending on how far above the top edge of the graph they are), just like my pre-existing animation-time graphs. In addition to this, mouse telemetry is embedded as text (speed, volatility / stddev, etc.) at the corner of the graph, so it's more useful for people like me.
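A small sketch (not the actual TestUFO implementation) of that scaling idea: set the graph's vertical range from the median interval rather than from the worst outlier, and report how far any outlier exceeds the top edge so it can be color-coded instead of crushing the rest of the graph:

Code: Select all
// Scale a poll-interval graph around the median; flag outliers separately.
function axisScale(intervalsMs, headroom = 3) {
  const sorted = [...intervalsMs].sort((a, b) => a - b);
  const median = sorted[Math.floor(sorted.length / 2)];
  const yMax = median * headroom;                   // top edge of the graph
  const outliers = intervalsMs
    .map((v, i) => ({ index: i, ratio: v / yMax })) // how far past the top edge
    .filter(o => o.ratio > 1);
  return { yMax, outliers };
}

// A single 25 ms spike no longer rescales a graph of ~1 ms intervals;
// it is returned separately so it can be drawn as a colored marker.
console.log(axisScale([1.0, 1.1, 0.9, 1.0, 25.0, 1.0]));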

Fundamentally, it's still mass-market drivel to you, but at least I will be able to adequately interpret the graphs, with a high likelihood of correctly assigning browser-issue blame (etc.) or background-software blame. There may also later (2021+) be a mode that also displays WebGL graphics and other activity, to ensure a bit more system loading during mouse testing, but that will not come in the first version.

Anyway, I'm not changing this plan, but I will update disclaimers to cover people like you.

You will probably still delete emails containing TestUFO Mouse Tester results.
schizobeyondpills wrote:
08 Nov 2020, 02:15
Chief Blur Buster wrote:
07 Nov 2020, 23:22
Also, we aren't only about esports here. High Hz is for everybody, even non-gamers.
Unfortunately, no. High-frequency/low-latency components are only for a niche percentage of the human race with a high enough mental clockrate to even perceive them; 99% of the human race is very happy with 24 FPS movies/Netflix.
Correct. Grandma can't tell DVD versus HDTV, yada, yada, yada... I've heard the story many times. Perspective! ;)

However,
Mainstream did not ask for 1080p
Mainstream did not ask for 4K
Mainstream did not ask for 8K

What is happening:
- Apple and Samsung are slowly commoditizing 120Hz
- New consoles now do 120Hz
- Almost all 4K HDTVs now support 120Hz
- DELL and HP are going to be adding 120Hz to office monitors as a standard feature later this decade

Yes, Apple postponed 120Hz to the next iPhone, but you see the trend that 120Hz becomes a freebie (slowly), much like 4K is an included feature of almost any television you buy today. Suuuuuure, many run 4K TVs at only 1080p, but that is not my point. And yes, many run 120Hz displays at only 60Hz, though increasingly when you plug in a computer to an LG CX OLED or the newest XBox into a recent Samsung, it more frequently autodetects 120Hz and jumps to it. Much like new Netflix players automatically use 4K now if plugged into a 4K TV.

But, guess what? 120Hz is now being considered as a standards addition to office monitors. Laptops and PCs plugged into them autodetect 120Hz, and people notice it scrolls smoothly, like a 120Hz Apple iPad. Also, upcoming Apple laptops/monitors are rumored to start including 120Hz in the coming years, until the entire Apple suite is universally 120Hz for user-experience considerations.

So, don't teach me that 120Hz is not being mainstreamed. Suuuure, we may disagree on when it becomes mainstream, but when I can buy a new Xbox and a new TV and it automatically goes straight to 120Hz, one starts to realize what I'm talking about.

By the end of this decade, it will be increasingly hard to buy a display that doesn't support 120Hz, much like it's hard to buy 720p today unless you go bottom-of-the-barrel. People start seeing the difference, much like when 4K was pushed onto people. Browser scrolling motion blur halves.

My parents even remarked on the impressive improvement in laptop scrolling when I showed them that their 4K TV supports 120Hz from the laptop.

120Hz is being commoditized slowly, whether you like it or not. Ever since we discontinued CRTs, the slow rediscovery of the need to increase refresh rates has yielded new discoveries and understandings (that only a few believed/knew about before then). Duh.

Also, there is the discovery of a nausea uncanny valley, where 48fps and 120fps HFR create more headaches than 1000fps UltraHFR. I'm a fan of 24fps Hollywood Movie Maker mode. I worked for RUNCO / Key Digital / dScaler / etc., and was the inventor of the world's first open source 3:2 pulldown deinterlacer algorithm in the year 2000 (twenty years ago) -- if you saw my Internet Archive link in my 3:2 pulldown thread.

Eventually, it outperformed a Faroudja line doubler back in the day, when John Adcock implemented my algorithm in MMX assembly language, allowing realtime 3:2 pulldown deinterlacing and 24fps->72Hz conversion in the year 2000 from an HTPC connected to my 72Hz-capable NEC XG135 CRT projector. Originally working on a Hauppauge TV card, I upgraded to an SDI-capable card, so it was then deinterlacing 720x480 near-flawlessly from an analog input signal from an SDI-capable DVD player.

So, yessiree, I have Hollywood Moviemaker snob cred. But my favourite frame rates are 24fps (classic) and 1000fps (holodeck) as a result. I'm a bit disdainful of 48fps and 120fps non-strobed HFR, though -- still too much motion blur, especially since source blur (camera) and destination blur (persistence) are additive.

But I am smart enough to know the HFR problems and soap opera effect problems too. The motion-sickness/nausea uncanny valley, where there's still a bit of motion blur but stutter is now gone, is sometimes a headache-creating combination for some. (It's similar for VR, but it surprisingly applies to discrete flat-panel displays too.) You might have seen my UltraHFR work at www.blurbusters.com/ultrahfr

Strobing is a great band-aid for this region of frame rates above the flicker fusion threshold but too low for blurless sample-and-hold. Many still get headaches from "smooth motion that still has motion blur" (aka 120fps HFR), but a subsegment gets sick from strobing. 60Hz vs 120Hz is an 8.3ms persistence improvement, and we need 120Hz->1000Hz for an almost equal improvement (7.3ms), as a dramatic spike up the diminishing curve of returns, avoiding the incrementalism of 240Hz and 480Hz. We're big fans of going straight to 1000Hz for UltraHFR, actually.
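To spell out that persistence arithmetic (a minimal sketch, assuming an ideal sample-and-hold display whose persistence equals one full refresh period):

  // Persistence (ms) of an ideal full-persistence sample-and-hold display.
  const persistenceMs = (hz) => 1000 / hz;
  console.log((persistenceMs(60) - persistenceMs(120)).toFixed(2));   // "8.33" ms gained going 60Hz -> 120Hz
  console.log((persistenceMs(120) - persistenceMs(1000)).toFixed(2)); // "7.33" ms gained going 120Hz -> 1000Hz

Same math both times, which is why the 120Hz->1000Hz leap buys nearly as much blur reduction as the entire 60Hz->120Hz jump did.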

The strobeless blur reduction route (1000fps 1000Hz) has far fewer headaches, due to no flicker-induced sickness and no blur-induced sickness. Everybody sees differently and gets bothered differently: some hate brightness, some hate flicker, some hate tearing. 12% are color blind. Everybody's glasses have different prescriptions. Everybody reacts differently. Some barf at the flicker. Others barf at the motion blur (motion blur sickness). Yet others barf at stutter (motion fluidity sickness). The way you feel about motion is not the same as the way another person feels about motion.

But the fact remains: jumping over the HFR uncanny valley is part of Blur Busters' goals too, to simultaneously solve flicker/blur considerations:
1. Fix source stroboscopic effect (camera): Must use a 360-degree camera shutter;
2. Fix destination stroboscopic effect (display): Must use a sample-and-hold display;
3. Fix source motion blur (camera): Must use a short camera exposure per frame;
4. Fix destination motion blur (display): Must use a short persistence per refresh cycle.
Therefore:
A. Ultra high frame rate with a 360-degree camera shutter also means short camera exposure per frame;
B. Ultra high refresh rate with a sample-and-hold display also means short persistence per refresh cycle.

Until recently, there was no technology to prove this. But now that there is (Phantom Flex 1000fps footage sped up on prototype 1000Hz displays), it just dropped more than a dozen jaws, including from many Hollywood Moviemaker snobs who were anti-SOE. All the blur-derived dizzy/nausea/sickness cases completely disappeared, and all the flicker-derived dizzy/nausea/sickness cases completely disappeared. It wasn't five-sigma comfort, but it added another nine (or two) to the discomfort reduction in early casual tests. It is the flat-panel version of VR-sickness elimination. As long as you're not doing too much vertigo-inducing stuff (so movie-producer technique for "IMAX simulator footage" style use cases will quickly optimize toward 1000fps 1000Hz comfort), it's miraculous in being less nauseating than semi-blurry SOE-feeling 120fps HFR. You either stay low (24fps) for comfort, or you go all the way (low-persistence sample-and-hold): "Go Big Or Go Home" when it comes to HFR -- go straight to UltraHFR.

Research papers will come out by the 2030s, but the early skunkworks just micdropped a lot of questions. It's an astounding amount of work leaping from 120Hz to 1000Hz, but fortunately it has now ceased to be technological unobtainium. There is now an engineering path to a gigantic 8K 1000Hz specialized display (a hyper-expensive simulator display) in less than a mere decade, involving full-bitdepth refresh cycles (no DLP temporal dithering shit). The GPU horsepower is almost there (in custom software written for 4-way SLI), generating ~240-400fps at 8K in tightly optimized SLI-only framepacing across all those cards -- with some difficulty. But that's less than an order of magnitude away from becoming a successful realtime source for the already-engineered 8K 1000Hz plan -- it will take only a few more years to push 8K 1000fps Unreal 5 quality for "cost no object" situations, assisted partially by frame rate amplification algorithms. There is obviously SLI frame-pipelining latency, but at least it's adequately realtime for simulator-priority scenarios. (It's still less than VR strobe lag.)

Even at 1000fps, there are still 8 pixels of motion blur per 8000 pixels/sec of motion, but the motion blur is so low that most footage doesn't even reveal any of it, with most pans being comfortable (no phantom arrays, no stroboscopics, no motion blur) no matter what your eye gaze is doing relative to the display (fixed gaze or tracking gaze); it looks closer to analog motion -- more like real life than any external direct-view display has ever been. The majority of motion blur limitations now shift to human brain limitations, instead of the display enforcing motion blur on you above and beyond source blur and display persistence. With more natural brain behavior, there is less nausea from the side effects of the humankind invention of discrete frame rates to emulate analog motion. Even 1000fps 1000Hz is not the final frontier, but it's such a magical punch of a jump up the diminishing curve of returns that it is no longer an Apollo mission or Star Trek unobtainium -- more technologically achievable, like a jet flight to altitude.
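For the blur arithmetic above (a minimal sketch; eye-tracked sample-and-hold blur per frame is simply pan speed divided by frame rate):

  // Eye-tracked sample-and-hold motion blur, in pixels per frame.
  const blurPx = (panSpeedPxPerSec, fps) => panSpeedPxPerSec / fps;
  console.log(blurPx(8000, 1000)); // 8 px of blur at 1000fps for an 8000 px/sec pan
  console.log(blurPx(8000, 120));  // ~66.7 px of blur at 120fps, for comparison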

By 2040, the boom of 1000Hz for esports/VR/reality-emulation/holodeck/simulator-rig/industrial/amusement-ride/etc use cases in the elite spheres will quickly make 120fps cinematic HFR an undesirably obsolete minor stepping stone -- plus 120Hz and 240Hz are engineerable into freebies (like retina and 4K), since the BOM and math check out to viable commoditization paths that I've been hearing about behind the scenes. 120Hz is following the path of retina screens as we speak, although the path to commoditization-as-included-freebie is approximately 2x slower.

We're more like the 1970s and 1980s people conceptualizing HDTV right now; you heard it here first: UltraHFR is the HFR Holy Grail, to the point where there will probably be only two still-popular new-content-creation framerates by year 2100: 24fps (or a low, below-flicker-fusion-threshold framerate), and >1000fps (far beyond other annoying thresholds). A small remaining subsegment of the population will still always have motion sickness that is only solvable by 24fps. But a lot of those 48fps & 120fps discomfort cases are fixed at 1000fps@1000Hz. All other frame rates eventually become niche or legacy (like viewing last century's 60Hz videos). That's the far-futurist in me speaking, but ultimately, high-Hz isn't expensive. After witnessing 1000fps 1000Hz, non-strobed 48fps and 120fps HFR video is just garbage incrementalism.

There will always be people who get sick from any motion, but fewer people get sick from 1000fps@1000Hz than from 48fps or 120fps HFR (strobed or nonstrobed).

We're strobing only because refresh rates are not high enough for low-persistence flicker-free operation (low-persistence sample-and-hold). Our namesake is Blur Busters, and while we're latency focussed, we are also hugely smoothness/blur focussed.

Anyway, offtopic.

The point is, it's so blatantly obvious that 120Hz is going to be freebie mainstream by ~2030 (you're living under a rock if you can't see the trojan-horsing), and 240Hz is going to be freebie mainstream by ~2040, due to ergonomic browser-scrolling motion blur considerations and simple mundane stuff like that. Like 4K is now a freebie in televisions, even all those $299 Walmart sales. (Some of those even now have 120Hz in their PnP DisplayID/E-EDID too!) High Hz fundamentally isn't an expensive technology anymore, because the performance required is getting cheaper and cheaper, and the migration to reality emulation requires a good big jump over the uncanny valley, all the way to strobeless CRT clarity, for the proper video wow effect.
schizobeyondpills wrote:
08 Nov 2020, 02:15
Yeah, but why would you need to timestamp reads? timestamped reads are nothing but compensation for latency bridge or jitter/desync, which should be the main fixing problem, reducing it past a certain point so that you dont need timestamps.
The tool's primary purpose isn't latency (for now), so that's not relevant here.

It will indirectly help that by pushing innovation in the low-CPU-utilization ultraprecise mice of the future (HD Mouse API) as well as other things, but my goal is to explode the visibility of mouse flaws, even if that utilizes somewhat-synthetic mouse benchmarks. Sometimes the extra noise produces massively useful data in the streams, especially if the data is much easier to interpret than MouseTester data (see above for explanation).

I welcome others to come up with mouse latency testing, but the testing tool's current goal is relative precision/smoothness/motion quality/blur reduction/etc. Mouse jitter adds motion blur, given the stutter-to-blur continuum (low-frequency jitter vibrates visibly like slow guitar strings, high-frequency jitter blends into blur like fast guitar strings), and I want to see mice become increasingly smoother in the refresh rate race to retina refresh rates. It's a whack-a-mole issue for blur busting, aka our namesake.
schizobeyondpills wrote:
08 Nov 2020, 02:15
Yes, but what kind of a disclaimer? ultra low latency sensitive disclaimer or a general one? big difference.
TestUFO Mouse Tester is not currently designed to be a mouse latency tester. Move along.
schizobeyondpills wrote:
08 Nov 2020, 02:15
Browser is average Joe's best tool and as such Joe has 50 tabs open, 10 toolbars installed, 30 extensions collecting his data and few malware/bloat programs running on his system.
Read earlier posts above. The visualization is designed in a way that makes it easy for me to interpret whether it's probably background processing, a mouse issue, etc. The smart, automatically formatted graph design is intended to be third-party-analysis friendly (including by me), and by corollary, useful to smoothing-priority research.

Some correlations will be subjective, but a lot of it is very convincing -- e.g. spikey versus general noise increase versus etc. -- and the graphs hone in on them and emphasize/colorcode them, unlike unformatted MouseTester graphs, with additional data beyond MouseTester, such as numeric statistics in the corner of the graph (stddev, etc.). Collect enough data, and if 99 out of 100 graphs of the same pattern occur because of browser tabs in the background, then I can easily blame the user for having browser tabs open and dismiss those graphs, allowing smart people to surgically hone in on the most interesting graphs that users post. The fact that it's one click away with no software ensures a lot more data can be accumulated, even if a lot of it reveals computer/browser flaws (which is useful data to me and the world). Browser vendors have often noticed "why is TestUFO Animation Time Graph so good on that system and not on this one?" and optimized their browsers too, in the various github/bugzilla items.

And in those cases where the browser is not the weak link (many graphs will clearly show that), it pushes the mouse vendors. I'm mouse-vendor agnostic, but I can acknowledge Razer is simply the most generous in being the first to supply 8000Hz mouse samples for general Blur Busters advocacy purposes. Kudos to them for being brave with the potential flaws of 8000Hz, and the revelation of the need to push the needle in the refresh rate race to retina refresh rates. It's fine to hate on commercialism and paid researchers (but a lot of research projects -- including this one -- have no payment associated with them, even if I indirectly benefit from increased site publicity and its ads/affiliate links).

Also, there is a feature to visualize based on poll imperfections, as well as poll-vs-Hz imperfections, to separately reveal flaws in mouse fluidity for the VSYNC OFF situation versus mouse fluidity for the VSYNC ON situation. I have enough data to colorcode them slightly differently (or show two concurrent graphs) based on which use case the user wants to optimize their mouse for. The synced visualization shows really interesting artifacts for 1000Hz mice on 360Hz monitors in a visually easier-to-identify manner than a mouse pointer stroboscopic photograph, but they beautifully corroborate each other!

Anyway...

Rising tides lift all boats. The first 1000 Hz mice were laughed at initially too, and had many performance problems that took years to improve. In 2021-2022 I will post major articles about the need to address the Vicious Cycle Effect much more seriously (precision, latency, etc.), in various rethinks of the entire architecture, but it depends on how much pull I have after popularizing the mouse tester, etc., which will help convince mouse vendors to adopt the HD Mouse API -- I will certainly be advertising the HD Mouse API as part of TestUFO Mouse Tester, as part of my ulterior goal of an open, industry-wide, vendor-agnostic initiative, even if Razer is the generous one providing the first >1000Hz mice to Blur Busters to starting-pistol this initiative...

Mouse latency testing is badly needed, but that can be another concurrent tool developed in the same toolchest (written in C++/asm). Not all tools need to be made of the same metal/plastic, of the same age, and of the same precision (steel vs titanium vs bakelite vs acrylic vs whatever), but I am introducing something that adds useful data to the world -- it just isn't the data you're wanting (latency).
schizobeyondpills wrote:
08 Nov 2020, 02:15
Image
Good graph, but I wrote my HD Mouse API thread before I even noticed this post, and you appear to (at least partially) agree with the theory of an HD Mouse API. I already know this. We have an overlapping Venn diagram, even if our agreement-overlap is subject to debate.

For this older post, you bring up good and valid points -- some which definitely need other tests/benchmarks/tools created for it -- but you're in the wrong forest. I'm looking at different trees in a different forest.

Consequently, since I am only writing a test tool focussed on relative precision (poll-to-poll precision), rather than the tape-delay element (absolute latency offset) -- this is thus a low-priority offtopic forum thread to me, even if the discussion is useful reading for other people who wish to create different, new mouse tests.

Higher poll rates help lower lag, and there are already lag tests for mice elsewhere; use them. The tool I am focussed on is more precision-related and benchmarks the browser+mouse chain as a whole (surprisingly well for the intended testing goal of convincing all manufacturers to tackle the problem).

Just because I agree that mouse latency testing and mouse smoothness testing are both needed doesn't mean TestUFO will test both. Sometimes both have to be tested at once, but that's a job for an executable (agreed).

But for relative precision (including the microstutter of a 1000Hz mouse on a 360Hz monitor), TestUFO is a spectacular reveal of that proof in a more blatant color-coded beat-frequency visualization (the difference between the theoretical perfect position and the closest poll to the currently composited VSYNC refresh cycle), which just literally micdrops the proof that 1000Hz isn't smooth enough for 360Hz monitors, and that some technology innovation needs to happen (higher poll rates, HD Mouse API, whatever -- manufacturers, dammit, fix the problems).
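As a rough illustration of that beat-frequency idea (a minimal sketch, not the TestUFO code; the pollHz/refreshHz/velocity numbers are just example values): sample a constant-velocity sweep at the poll rate, take the newest poll at each refresh, and compare against the theoretically perfect position.

  // Error between the ideal position and the newest-poll position at each refresh cycle.
  const pollHz = 1000, refreshHz = 360, velocityPxPerSec = 4000;
  for (let i = 0; i < 12; i++) {
    const tRefresh = i / refreshHz;                           // time of this refresh
    const tLastPoll = Math.floor(tRefresh * pollHz) / pollHz; // newest poll at or before it
    const errorPx = velocityPxPerSec * (tRefresh - tLastPoll);
    console.log(errorPx.toFixed(2)); // error oscillates at the 1000-vs-360 beat frequency = visible microstutter
  }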

Sometimes the blanket increase of good visual statistics data helps, especially if the graphs are easy for industry to interpret -- whether that's optimizing a motherboard for fewer spikes, optimizing the mouse to have less poll jitter, or fixing drivers/software to be smoother. All of these can have side effects interacting with each other (including improved latencies, by sheer side effect of higher sampling rates, better USB refinements, faster ISRs, etc.). It all Rube Goldbergs beautifully, even if haphazardly sometimes.

An EXE requires more time and effort, requires a download, and doesn't shake the market hard enough to get the manufacturers to do something about it. Metaphorically, it generates "too little data/statistics surface to attack the industry with" for the time available. A superior, 10x-more-refined HTML5 utility (for the programming time & money spent) versus a poorly made, poorly marketed executable. Rock, meet Hard Place. Thus, "Low Lying Apple" opportunities. Instead, as soon as HTML5 standards catch up on something beautifully good (already the case for 1000Hz mice), I seize it as part of popularizing a test. I gotta compromise, shake things up using HTML5 to increase the complaint surface, to get the blanket of manufacturers industry-wide to notice and finally do something about it.

The goals are achieved, just via different concurrent streams of pressure/market demand/etc. on manufacturers industry-wide. Others create superior power-user tools that measure other aspects (latency), but there's a huge need to at least incubate a mouse-testing need in this rare decadal upgrade supercycle (shattering the 1000Hz poll barrier).

Sometimes these are necessarily separate tools due to the capability constraints of specific environments (e.g. the HTML5 PointerEvents + PointerLock APIs are accurate enough for benchmarking relative precision and Hz beat-frequency effects, but bad for measuring latency). The disclaimer already mentions it does not test mouse lag, and the disclaimer will certainly be refined even further (browser screen space allowing) based on a mass of advanced-user feedback (including from you).
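For those wondering what the browser actually exposes: a minimal sketch of the kind of capture this relies on (assuming a browser that supports Pointer Lock and coalesced pointer events; this is not the actual TestUFO code).

  // Capture relative mouse deltas with per-event timestamps inside a pointer lock.
  const el = document.body;
  el.addEventListener('click', () => el.requestPointerLock());
  const samples = [];
  el.addEventListener('pointermove', (e) => {
    // Coalesced events recover the individual polls the browser batched into one event.
    const events = e.getCoalescedEvents ? e.getCoalescedEvents() : [e];
    for (const ev of events) {
      samples.push({ t: ev.timeStamp, dx: ev.movementX, dy: ev.movementY });
    }
  });
  // Inter-sample time deltas (samples[i].t - samples[i-1].t) then reveal poll
  // granularity and jitter -- relative precision, not absolute latency.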

Move along, TestUFO Mouse Tester isn't something you're interested in -- it does not test mouse lag at this time, although it will indirectly (Rube Goldberg) spin off into pushing the needle on those considerations. Nonetheless, I will continue to check this thread approximately weekly.

Cheers,
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 25 Nov 2020, 20:02

Schizobeyondpills, please allow me to micdrop my ability to identify certain system problems by spike-pattern analysis in an HTML5 screenshot:
https://forums.blurbusters.com/viewtopic.php?f=19&t=7806#p60310

I've been doing that analysis for seven years. I see so many TestUFO graphs that I sometimes even recognize SPECIFIC software from spike patterns.

Hell, maybe someone else could write heuristic AI in HTML5/JavaScript for this; one could probably identify specific software from specific spike patterns nowadays. (Though I don't do that for privacy reasons... and browsers can't collect lists of running processes anyway.) Who knows?
masterboss wrote:
25 Nov 2020, 02:37
Image
Then I immediately recognized it was probably one of the system tray programs running, possibly the RGB stuff -- it definitely didn't look like browser processing (I am familiar with those spike patterns) and it definitely didn't look like other common Windows patterns (Defrag, etc.), so I gave a surgical fix list based on glancing at the TestUFO graph.
Don't believe me? See the conversation history.

...Certainly the graphing algorithm is imperfect; but the algorithm is otherwise simple/raw and straight to the point, and it visually speaks of the system like music to my ears (even though I was born deaf).
Chief Blur Buster wrote:
masterboss wrote:
25 Nov 2020, 02:37
You were absolutely right the rgb software was the cause for those spikes. Is that normal though or happens to everybody? Should I just get rid of the RGB?
:D .... I was right just by looking at the graph!

Different brands of RGB software differ in overhead by several orders of magnitude. My Razer keyboard custom RGB (ripple effects from keypresses) has zero impact on TestUFO.

Disable all your RGB and test one RGB accessory at a time, measuring impacts on TestUFO. Find the RGB that kills your CPU and get rid of it. Purchase a different RGB brand/accessory that eats zero or near-zero CPU.
And that graph in that thread is only approximately 1/10th as revealing/fancy a visualization as the TestUFO Mouse Tester graph being designed.

Anyhoo... further repeated chicken-littling about the uselessness of browser-based tests is mostly wrong-forest spiel. Saying various forms of TestUFO are essentially worthless isn't productive around here. I don't usually use such harsh words, but you've been continuously pressing the claim that browser tests are useless. :roll: You're engineering-smart, but all your replies have largely missed the chief purposes/usefulnesses of mass-market browser-based mouse testing.

TL;DR: Goals of TestUFO Mouse Tester

  • Move the industry needle faster sooner with less time & effort;
  • Make support easier too by making graphs easier for smart people to interpret;
  • Improve mouse weak link to unlock further refresh rate advancements;
  • Enlarge advocacy attack surface to manufacturers to adopt HD Mouse API;
  • Produce useful comparable data between users (in relative smoothness criteria).

(This is not priority order nor an exhaustive list, it's just TL;DR)

Seeing graphs and identifying problems quickly is a goal. Even enabling/disabling sensor-side smoothing shows up in the data in a visually identifiable way, etc. I can see data reveal itself surprisingly well in HTML5 graphs. That is why I am intentionally designing the graphs to be easy to interpret by people like me, for a high troubleshooting success rate.

The smart people will automatically reply to those people who post flawed TestUFO Mouse Tester results, helping to discard the bad test samples by sheer public consensus (turning former "Calculus Chinese" or "Go To University To Understand" dismissive snobbishness into easy "30fps vs 60fps show and tell").

Blur Busters support is essentially unpaid work (forums = volunteer stuff, since the forum-only ads/revenues don't even pay minimum wage for the amount of time I spend writing replies on these forums), and it is my ulterior goal to make it easier for me to help people in a friendly Carl Sagan / Chris Hadfield / Neil DeGrasse Tyson manner.

You see -- most of Blur Busters' funds come from industry services, which in turn fund my Blur Busters passion work (a metaphorical equivalent of the Google 20%). The reality is that neither I nor Blur Busters has any mouse-related income, but it ties so tightly into the passions of the refresh rate race to retina refresh rates, as a new weak link revealed by shattering the 240Hz barrier.

So I want to make mouse smoothness troubleshooting personally easier for me and my users. (I'll still have to get the Chromium team to fix the 8000Hz PointerEvents array overflow, but after that, I'll be able to troubleshoot 8000 Hz better. The graphs even show those exact moments of array overflows, so the graphs are great browser debugging for the browser vendors too. The browser-fault graph shapes show up like a Christmas tree, so I already know when to blame the browser or not!)
  • I can also tell when someone is moving the mouse fast or slow
  • I can see graph texture differences of different mouse pad surfaces
  • I can see all of this in PointerEvents in HTML5, the data punches through to my eyes!
  • Even the data pollution (browser performance) doesn't erase all of this
  • I can see the multilayered distortions into the mouse test (browser factor, mouse performance factor, system performance factor, etc).
  • Sometimes the data is too polluted, but surprisingly a lot of signal punches through the noise floor. Metaphorically speaking, these mouse tester screenshots communicate more information to me than MouseTester.exe does.
My algorithms intentionally clip data (starts/ends) and intentionally scale closer to medians / standard deviations rather than to the outliers that frequently make MouseTester.exe graphs unidentifiable.

And I colorcode the outliers based on how far beyond the top/bottom edge of the graph they are. I can hone in on those micros and macros -- the same graph is interpretable in all octaves and all orders of magnitude; that is smart graph design, buddy. I intentionally design the automagicalness of the graphs to be useful to me (and other smart people similar to me). It's not intended for people like you who have a bigger need for other kinds of mouse statistics. It's almost steganographical how I design simple, barebones, colorcoded-visualized graphs to have simultaneous narrow-scaling and broad-scaling -- one graph instead of two -- based on the technique of colorcoding the magnitudes of the overshot outliers, while more usefully autofocussing/autozooming on the average/standard-deviation/etc. jitter. All in realtime while you move the mouse, too, so you can watch the graph vary in realtime (with an automatic screenshot history -- no more PrintScreen key -- that you can rewind in a movie filmreel ribbonbar at the bottom of the graphs), then right-click to save a PNG. See, I have put a lot of Steve Jobsian and Elon Muskian thought into the graph design, to make mouse smoothness troubleshooting 10x-100x easier for ME too.
(Disclaimer for people anxiously awaiting TestUFO Mouse Tester: Graph design subject to change and improve even further)
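A minimal sketch of that scaling-plus-outlier-colorcoding idea (illustrative only; the real graph logic is more elaborate, and the sigma threshold here is an assumption):

  // Auto-zoom the y-axis around the median +/- a few standard deviations so typical
  // jitter fills the graph, then flag outliers by how far they overshoot the edges.
  function scaleAndClassify(values, sigmas = 3) {
    const sorted = [...values].sort((a, b) => a - b);
    const median = sorted[Math.floor(sorted.length / 2)];
    const mean = values.reduce((s, v) => s + v, 0) / values.length;
    const stddev = Math.sqrt(values.reduce((s, v) => s + (v - mean) ** 2, 0) / values.length);
    const lo = median - sigmas * stddev, hi = median + sigmas * stddev;
    return values.map((v) => ({
      plotted: Math.min(hi, Math.max(lo, v)),  // clipped value that gets drawn
      overshootSigmas: (v < lo || v > hi) ? Math.abs(v - median) / stddev : 0, // drives outlier color
    }));
  }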

The graphlook already separates those out to my own personal eyes. It will be one of the most advanced TestUFO tests, with more work put into it than my custom made NVIDIA 360Hz DOTA test with VRR emulation built into it.

I love having fun helping people, and seeing other people help others -- even the kids or the advanced prosumers -- and TestUFO Mouse Tester will facilitate that goal too, in addition to my ulterior "rising tides lift all boats" goals, like an open HD Mouse API and all.

Mind you, the 2020 increase in forum traffic and admin/moderator workload has caused some difficulty in keeping up, which drives me to improve diagnostics on a big-bang-for-effort basis. A collateral side effect is industry improvement -- additional metaphorical birds hit with the same stone.

</MicrophoneSlamDunk>
(at electromagnetic cannon speeds)

...I'd close this thread now, but that would be just like the famous humorous tale of the Patent Office thinking of shutting down (circa 1899) because "all inventions have now been invented". Obviously that didn't happen. Even I can be line-item wrong sometimes, but my intentions are always genuine and visionary (and surprisingly reliably prescient). I can think like a kid, or I can think like a researcher, or I can think like a scholar. Few are able to merge all of that. Even if this thread is low priority personally to me, it is full of fascinating educational content for other readers who may desire to attack the other needed elements, like better mouse lag tests or additional innovation in the refresh rate race. So the discussion will probably continue perpetually. :D
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by Chief Blur Buster » 25 Nov 2020, 21:29

schizobeyondpills wrote:
07 Nov 2020, 22:20
so then your use case is not for high performance low latency scenarios and that includes research/articles/findings derived from your tooling that u created for average/casual/mass audience. why do you refuse to accept this? all we "elitists" ask for is a disclaimer your findings dont apply for highly optimized low latency user-interactive systems targeted for esports, but rather than for "hardware lovers" and below.
I circle back to this paragraph as being assumptive/presumptive about TestUFO Mouse Tester's purpose and capabilities, and the reason why I consider this "TestUFO putdown dismissal" to be engineer-snobbery spiel. Normally I am diplomatic with many people, but responses like this about TestUFO's uselessness for esports mouse testing continue to miss the point -- wrong forest.

I shall remind you, TestUFO Mouse Tester is not a mouse lag tester. I have not been dishonest about that. Have I ever said TestUFO Mouse Tester was a mouse latency tester?

Chief Blur Buster wrote:
25 Nov 2020, 20:02
So I want to make mouse smoothness troubleshooting personally easier for me and my users. (I'll still have to get the Chromium team to fix the 8000Hz PointerEvents array overflow, but after that, I'll be able to troubleshoot 8000 Hz better. The graphs even show those exact moments of array overflows, so the graphs are great browser debugging for the browser vendors too. The browser-fault graph shapes show up like a Christmas tree, so I already know when to blame the browser or not!)
  • I can also tell when someone is moving the mouse fast or slow
  • I can see graph texture differences of different mouse pad surfaces
  • I can see all of this in PointerEvents in HTML5, the data punches through to my eyes!
  • Even the data pollution (browser performance) doesn't erase all of this
  • I can see the multilayered distortions into the mouse test (browser factor, mouse performance factor, system performance factor, etc).
  • Sometimes the data is too polluted, but surprisingly a lot of signal punches through the noise floor. Metaphorically speaking, these mouse tester screenshots communicate more information to me than MouseTester.exe does.
My algorithms intentionally clip data (starts/ends) and intentionally scale closer to medians / standard deviations rather than to the outliers that frequently make MouseTester.exe graphs unidentifiable.

And I colorcode the outliers based on how far beyond the top/bottom edge of the graph they are. I can hone in on those micros and macros -- the same graph is interpretable in all octaves and all orders of magnitude; that is smart graph design, buddy. I intentionally design the automagicalness of the graphs to be useful to me (and other smart people similar to me). It's not intended for people like you who have a bigger need for other kinds of mouse statistics. It's almost steganographical how I design simple, barebones, colorcoded-visualized graphs to have simultaneous narrow-scaling and broad-scaling -- one graph instead of two -- based on the technique of colorcoding the magnitudes of the overshot outliers, while more usefully autofocussing/autozooming on the average/standard-deviation/etc. jitter. All in realtime while you move the mouse, too, so you can watch the graph vary in realtime (with an automatic screenshot history -- no more PrintScreen key -- that you can rewind in a movie filmreel ribbonbar at the bottom of the graphs), then right-click to save a PNG. See, I have put a lot of Steve Jobsian and Elon Muskian thought into the graph design while keeping it deceptively simple/barebones, although much bigger and much more elaborate than the Animation Time Graph. To make mouse smoothness troubleshooting 10x-100x easier for ME too (and, by corollary, other BlurBusters-think people).
(Disclaimer for people anxiously awaiting TestUFO Mouse Tester: Graph design subject to change and improve even further)

The graphlook already separates those out to my own personal eyes. It will be one of the most advanced TestUFO tests, with more work put into it than my custom made NVIDIA 360Hz DOTA test with VRR emulation built into it.
That's not it. One More Thing.

<Apple Music>

Eventually, saveable animated GIF/PNG.

Since with TestUFO Mouse Tester, you're actually watching an animated graph that updates at a few frames per second (as the graph repeats its left-to-right pass, like an advanced colorcoded multitrack oscilloscope for your mouse smoothness telemetry). And if you see interesting data spikes, you can rewind using the bottom filmreel bar and save the screenshot. Or download all of them as a single animated GIF/PNG, so I can see many seconds' worth of the user's polling in a continuously animated, visualized graph. Save-All won't be in the first version, but the test is being designed to allow saveable GIF/PNG/video in a subsequent version of the TestUFO Mouse Tester, and eventually unlimited recording of mouse telemetry -- it almost looks like a music video the way it animates in realtime, betraying the multiple system factors simultaneously in pattern exaggerations/highlightings that are 100x+ more system/software/mouse/OS-problem-storytelling than a MouseTester screenshot.

There's enough data crammed into the graphs (even in the overshoots beyond top/bottom) to reproduce the cursor movements locally, so I probably even have to include a disclaimer that full cursor tracking is embedded into the graphs for the duration of the pointer lock (the TestUFO Mouse Tester session). There are multiple overlapped data tracks (like a multitrack oscilloscope) for all the different varying telemetry (varying jitter, varying velocities, varying timestamp differentials, etc.) so I can quickly hone in on the data using my "Blur Busters Einstein" mind with just a quick graph glance. I can easily tell whether the user is testing circular cursor movements or linear horizontal/vertical back-and-forth sweeps -- it's 100x easier to interpret than a MouseTester graph to my own eyes too.

Again, it's a 10x more advanced version of the Animation Time Graph that crams more data into a single saveable asset in various clever ways -- making mouse testing actually fun rather than a chore, which means more mouse testing gets done.

Even the different images tell you the type of mouse movement that occurred, like linear or circular mouse movement, and at whatever speed, via text imprinted within the graph (various telemetry/data/statistics in the corner of every graph screenshot). So I can tell people to try again and move the mouse faster in circles. And I use PointerLock raw input, so even screen edges don't distort the data.

Even when the user moves slower, I can tell the Hz from poll granularities via an approximate gridfit algorithm (I only need a few samples to heuristically identify 125/500/1000/2000/4000/8000 granularities), so I can quickly display Hz without poll saturation ("Poll: 8000Hz (estimated)"), with the estimated marker only added if the test was never successfully saturated. Even the PointerEvent array overflow bug still doesn't erase useful poll data, since for brief surges it successfully captures striped segments of continuous 0.125ms polls into the PointerEvents array, so some other system issues can even be analyzed while ignoring the browser bug!
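A minimal sketch of that kind of gridfit estimate (illustrative only, not the actual TestUFO algorithm): take a handful of inter-event timestamp deltas and snap the median delta to the nearest standard polling interval.

  // Estimate poll Hz from a few event timestamps (in milliseconds).
  const standardHz = [125, 500, 1000, 2000, 4000, 8000];
  function estimatePollHz(timestampsMs) {
    const deltas = [];
    for (let i = 1; i < timestampsMs.length; i++) {
      const d = timestampsMs[i] - timestampsMs[i - 1];
      if (d > 0) deltas.push(d);
    }
    deltas.sort((a, b) => a - b);
    const medianMs = deltas[Math.floor(deltas.length / 2)];
    // Pick the standard rate whose period (1000/Hz ms) is closest to the median delta.
    return standardHz.reduce((best, hz) =>
      Math.abs(1000 / hz - medianMs) < Math.abs(1000 / best - medianMs) ? hz : best);
  }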

(Certainly this is more of a diagnostic tool than a benchmarker, but that's just the semantics of engineer arguments in a lunchroom/watercooler, so I'm going to put that aside. We already know this is never going to be a mouse latency testing tool, so further whining about it not being a mouse latency testing tool is pointless.)

Suurre, browser imperfections pollute the data, but I'm easily able to identify all those different patterns of graphs -- the coarse feel of those spikes, white noise, patterned noise, etc. -- that reliably, statistically correlate with various common culprits that are easily seen.

Story Examples Now Achievable (from internal tests)
  • "It's browser related, I see you're using ACMEv77, upgrade to v83"
  • "It's background software related"
  • "The mouse is creating that problem. Are you using sensor #####?"
  • "That looks like the fake 2000Hz overclock"
  • "Lower your poll Hz to 2000Hz. Your system is chopping up those 8000Hz polls with a repeatable 200ms of data gaps between 100ms of successful 0.125us polls, from browser overflow bug. Currently, 8000Hz is usually overkill for your 240Hz monitor until the software vendor upgrades more of the software."
  • "I see you're at 1000Hz poll for 360Hz, that's the cause of your bad mousefeel. Switch to 2000Hz at least and test again"
  • "That looks like OS is not keeping up with those random 0.125/0.250/0.375us dropouts during your full velocity circling. Try again at 2000Hz or 4000Hz and see if your mouse data starts to sing."
  • "That definitely looks like you have too many browser tabs, that spikepattern is consistent of having multiple browser tabs open"
  • "It looks like you have mouse smoothing enabled, is that what you wanted for your mouse? Usually that will add lag, though it can feel smoother."
  • "Try plugging your mouse into a less congested USB port, it looks like USB contention to me"
  • "That's too noisy for me to interpret, can you rerun rebooting without any system tray stuff, and only run one browser tab"
  • "That browser is probably adding Meltdown/Spectre timer fuzz to those PointerEvent timestamps, run again in Browser XXX instead."
  • "It looks like like very bad DPC noise or poll jitter. I'm not 100% sure, but other graphs looking like that was traced to that."
  • "Nice asynchronous compensation of your poll jitter; I even see the 6000Hz sensor behavior in your 1000Hz mouse. I don't often see that punch all the way to the HTML5 graph. May I guess you're using sensor XXXXXX?"
  • "I see some browser performance issues; but ignoring that, I also see the jitter punching through that noisefloor that reminds me of going without a mousepad on a smooth desk. Are you using a mousepad?"
  • "Great mouse performance! If you want to test the latency too, try [the other tool], to see how low latency your great smoothness is"
  • "Let me send that to the mouse manufacturer, that data definitely looks interesting."
  • "That's a lot of background processing that's definitely not caused by the browser. That will probably interfere with your games too."
  • "You didn't move your mouse fast enough during your test. Can you send another screenshot with you moving the mouse faster? What DPI are you currently at?"
  • "Interesting jitter difference between plastic mousepad, melamine desk, and cloth mousepad!"
  • "That's amazing mouse data even though slightly polluted by browser issues at erratic moments. Are your games smooth in mouse movements?"
  • "Your pollrate is too low for your refresh rate"
  • "You've got amazing smoothness compared to others. Great optimized test run. Oh! You were even already running that 3D benchmarker in an adjacent window in that desktop screenshot? I see only two asset-load spikes but otherwise virtually jitter-free relative to others. If so, that's great system performance. Even though TestUFO Mouse Tester doesn't always translates to mouse performance in games;' hopefully it translates well to your games."
  • etc.
  • etc.
  • etc.
TestUFO Mouse Tester helps me do all that. And as advanced users see me helping, they will recognize the patterns too (through learning over time). There will always be some graphs (only a minority!) that are too polluted to be useful, but most of them produce glanceable, identifiable patterns with higher probabilities of troubleshooting success. I have a good mental filter for browser-related problems too, via graph glances.

When the TestUFO mouse test shows great graph data, it already correlates with wonderfully smooth desktop performance (scrolling, dragging windows, etc.), and usually correlates with wonderfully smooth gaming performance in my experience. And if users complain about latency with a smooth mouse, a smart reviewer or mouse manufacturer will try to figure out how to achieve great smoothness at zero lag too, by using multiple tools (possibly concurrently) -- reducing the pick-your-poison tradeoff of smoothness vs lag. Etc.

Sure, system loading can cause divergences from "near-perfect mouse fluidity in TestUFO Mouse Tester", but so much delicious data makes it into the graphs (easteregg'd to the brim with cleverness like subtle pixel positions & colorcodes, alpha blends and textual overlays, etc.) that I can still see mouse problems despite certain categories of browser-bloat problems. Some features will be added in Version 2 after the Version 1 launch, but fundamentally, it will help make mouse smoothness performance troubleshooting easier.

TestUFO Mouse Tester is not a mouse lag tester. But it is a good mainstream mouse diagnostic tool that tries to pull advanced data down into mainstreamable territory. If any users try to lie about their data, smart users can see the data in the image (or in the animated GIF/PNG that looks like a music visualization powered by the mouse behaviors). The eye candy of the graph is crammed with useful, interpretable data targeted at troubleshooting weak links in the refresh rate race to retina refresh rates.

It is definitely useful to esports in conjunction with additional tests (like known mouse lag tests). There are mice with low lag but bad jitter. There are mice with high lag but perfect smoothness. TestUFO Mouse Tester has limitations, but the graph surprisingly tells a ton of the story. It is simply more of a handy visual diagnostic tool that also simultaneously helps the refresh rate race to retina refresh rates, which thus also helps esports too.

(I will keep going if you keep denying the usefulness of HTML5 tests for mice -- I have much more rebuttal ammo where that comes from. But I will only post surges weekly or bi-monthly, since I am focussing on projects that matter to Blur Busters :D )
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by 1000WATT » 25 Nov 2020, 21:39

Am I the only one who cannot find this test? :lol: TestUFO Mouse Tester
I often do not clearly state my thoughts. google translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.

1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: Discussion about Legitimacy of HTML5 Mouse Testing

Post by 1000WATT » 25 Nov 2020, 21:57

I have read this thread.
Schizobeyondpills' behavior is nothing new. He himself will not do anything -- "this is not a king's business."
schizobeyondpills wrote:
08 Nov 2020, 02:15
Chief Blur Buster wrote:
07 Nov 2020, 23:22
You're thinking about this from the wrong angle (Although you'd be surprised at what I can do in JavaScript nowadays). It's still intended to be just a (good) synthetic mouse tester like MouseTester.exe but far more real-world than MouseTester.exe because it pushes CPU/GPU/mouse simultaneously a bit harder than MouseTester.exe does. Most benchmarks are synthetic anyway and the more benchmarks the merrier. If you want to write a C/C++/ASM benchmark, please go ahead, but it's more synthetic unless you write the mouse benchmark in a game engine.
good is to great what bad is to good. high frequency devices/system require great engineering, not good, deff NOT good enough.

yes benchmarks are synthetic but this is not a benchmarking discussion, but testing discussion, testing needs to be such that you engineer for accuracy of real world usage with minimum overhead. is absorbing the sun (Chrome) a reflection of such engineering?
It's a vicious circle. There will be a schizobeyondpills version 2.0. And he will say: the overhead will distort what actually happens in the game. And accuracy that is not 0.000000x is not acceptable to him. And only 0.0000000x accuracy will show what is really happening in the game.
I often do not clearly state my thoughts. google translate is far from perfect. And in addition to the translator, I myself am mistaken. Do not take me seriously.
