schizobeyondpills wrote: ↑08 Nov 2020, 02:15
no pre-judging, I block people who play games with browser open. Chrome is well known bloated browser, to the point people make memes out of it. the only thing worse engineered for low latency/high performance and more bloated than Chrome is Windows 10. Chrome comes pretty close on 2nd place.
Exactly. We're in 100% total agreement with this. If good benchmark/test/diagnostic data punches through the bloat, then I'm doing things successfully.
The best-mouse-results forum matches will also reveal the highest-quality browser installations and browser implementations. Low-quality benchmarks/tests/diagnostics will obviously get posted too, and debate will occur over whether the mouse or the browser is at fault (that's to be expected, and I will obviously include that disclaimer).
W3C did something brilliantly correct: PointerEvents delivers a time-stamped array of mouse polls that is largely immune to JavaScript performance jitter. There are other issues, though, like a maximum array size that is too limited -- fixable by enlarging a constant in the Chromium source code to allow buffering enough 8000 Hz polls between the PointerEvent arrays relayed to the event handler. At least that's a fixable thing.
Also, noise caused by the browser doesn't really show up in PointerEvents, because the browser already timestamps the mouse polls, queues them in an array, and relays the per-poll timestamps. I only need about 50 to 100 PointerEvent arrays per second to capture 1000Hz mouse data as reliably as MouseTester. So it's immune to JavaScript jitter for the purposes of refresh-rate-vs-poll-rate harmonics analyses (a bigger priority of these mouse benchmarks/tests/diagnostics anyway).
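The timestamped-poll-array idea can be sketched in a few lines. This is an illustrative sketch, not TestUFO's actual code; the helper name `pollIntervalsMs` is mine, but `getCoalescedEvents()` and `timeStamp` are the standard PointerEvents API surface being described:

```javascript
// Unpack W3C coalesced pointer events into per-poll timestamps, then analyze
// poll-to-poll intervals offline. Each coalesced event carries its own
// high-resolution timeStamp, so jitter in *when* the JavaScript handler runs
// does not corrupt *when* the mouse actually polled.

function pollIntervalsMs(timestamps) {
  // Differences between consecutive poll timestamps (ideally ~1.0 ms at 1000 Hz).
  const intervals = [];
  for (let i = 1; i < timestamps.length; i++) {
    intervals.push(timestamps[i] - timestamps[i - 1]);
  }
  return intervals;
}

// Browser-side wiring (illustrative only):
// const allIntervals = [];
// window.addEventListener("pointermove", (e) => {
//   const polls = e.getCoalescedEvents();          // raw polls batched between dispatches
//   const stamps = polls.map((p) => p.timeStamp);  // per-poll timestamps
//   allIntervals.push(...pollIntervalsMs(stamps));
// });
```

Because the analysis consumes timestamps rather than event-arrival times, a late-firing handler shifts *when* you compute, not *what* you compute.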
It's not a benchmark/test/diagnostic designed for RTOS engineers, military simulator engineers, or well-funded latency-precision outfits -- Right Tool for Right Job. It's a tool for the mass-market consumer, to push pro-territory further down into mainstream-territory. Future advanced mouse tools will copycat and be better than TestUFO.
On a blanket basis (no financial payments have ever been made to Blur Busters about anything mouse-related, by the way) -- I have an ulterior personal interest in legitimizing high-Hz mice first, which are still often dismissed as voodoo by people less knowledgeable about the need for high-precision mice. A popularized mass-market mouse testing/benchmark/diagnostic tool will be simultaneously testing both the browser AND the mouse (and it will disclose as much).
Moving formerly realtime analysis tasks onto something non-realtime is possible through creative moves like timestamped poll arrays, which W3C/WHATWG thankfully went with. As long as the poll loop runs at a higher priority than other browser tasks, its data can be more useful than expected in post-analysis: deliciously clear patterns emerge (ISR-style waveforms, CPU-spike-style waveforms, browser-performance-style waveforms).
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
It does deserve a reply because evolution takes blood sweat and tears to progress forward. its painful to know how much you dont know, to go into the unknown, to fail a thousand times, be on brink of giving up, and then succeed. thats evolution, people fear it and run from it into "good enough" engineering and lying to themselves to feel good with whatever they have just to not feel evolution. the moment that "good enough" engineering produces something they start repeating the classic delusion "dont reinvent the wheel" just to further reinforce their delusion bubble made to shield them from the fear of the evolution.
I generally agree with the gist of that paragraph, so we agree on that much, even if we have totally different approaches (in details, in popularization sequence, and in levels of diplomacy).
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
did i mention i block people that send me MouseTester graphs? its a joke of a tool for 10 reasons I wrote before, and i can write 50 more, and all of them affect eachother, so please dont use MouseTester as anything of quality reference or argument. MouseTester is more pointless than in browser measures how flawed it is.
Fine. I know MouseTester is a flawed tool, albeit not completely useless. It successfully determined that one of my systems couldn't handle 8000Hz at all without a Windows reinstall (the ISR on the bogged-down Windows install was too slow to handle 8000 polls on one CPU core). But at least I used the right tool for the right job -- more a blunt hammer than a surgical knife.
I will say this pre-emptively: you will probably continue blocking people who send you TestUFO Mouse Tester results, even when those people have read the disclaimer: "TestUFO Mouse Tester tests both the browser AND the mouse. Results also represent the browser's mouse-handling performance."
However, by power of suggestion, the color-coded visualization will reveal spikes suggestive of browser misperformance versus operating-system misperformance, so at least it's easy to reply, "That definitely looks like background performance spikes; fix those first before you send me any more screenshots." MouseTester, by contrast, rescales its graph axes around the single worst outlier datapoint, making mouse jitter hard to see and the graph harder for advanced people (like me) to interpret. I have an ulterior motive to make sure graph screenshots are reasonably interpretable, including zooming the graph axes on the median rather than the outliers, and color-coding the outliers instead (depending on how far above the top edge of the graph they are), just like my pre-existing animation time graphs. In addition, mouse telemetry is embedded as text (speed, volatility/stddev, etc.) in the corner of the graph, so it's more useful for people like me.
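Median-anchored axis scaling can be sketched simply. A hypothetical helper (my own names and tier thresholds, not TestUFO's actual code): scale the y-axis on the median so one bad datapoint can't flatten the whole graph, and bucket anything above the axis top into severity tiers for color-coding:

```javascript
// Scale a graph's y-axis on the median of the data rather than the worst
// outlier; classify clipped points into tiers for color-coding.
function axisScale(values, headroom = 3) {
  const sorted = [...values].sort((a, b) => a - b);
  const median = sorted[Math.floor(sorted.length / 2)];
  const yMax = median * headroom;    // axis top zooms on the typical datapoint
  const tiers = values.map((v) => {
    if (v <= yMax) return 0;         // plotted normally inside the axes
    if (v <= yMax * 2) return 1;     // mild outlier (e.g. color it yellow)
    return 2;                        // severe outlier (e.g. color it red)
  });
  return { yMax, tiers };
}
```

With one 10ms spike among ~1ms poll intervals, the axis stays zoomed on the 1ms data and the spike gets flagged as a severe outlier instead of crushing the graph.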
Fundamentally, it's still mass-market drivel to you, but at least I will be able to adequately interpret the graphs, with a high likelihood of correctly assigning browser-issue blame or background-software blame. There may also later (2021+) be a mode that displays WebGL graphics and other activity, to ensure a bit more system load during mouse testing, but that will not come in the first version.
Anyway, I'm not changing this plan, but I will update disclaimers to cover people like you.
You will still delete emails with TestUFO Mouse Tester, probably.
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
Chief Blur Buster wrote: ↑07 Nov 2020, 23:22
Also, we aren't just only about esports here. High Hz is for everybody, even non-gamers.
unfortunately no, high frequency/low latency components are only for a niche % of human race with high enough mental clockrate to just be able to perceive them, 99% of human race is very happy with 24 FPS movies/Netflix.
Correct. Grandma can't tell DVD from HDTV, yada, yada, yada... I've heard the story many times. Perspective!
However,
Mainstream did not ask for 1080p
Mainstream did not ask for 4K
Mainstream did not ask for 8K
What is happening:
- Apple and Samsung are slowly commoditizing 120Hz
- New consoles now do 120Hz
- Almost all 4K HDTVs now support 120Hz
- DELL and HP are going to be adding 120Hz to office monitors as a standard feature later this decade
Yes, Apple postponed 120Hz to the next iPhone, but you can see the trend: 120Hz is slowly becoming a freebie, much like 4K is an included feature of almost any television you buy today. Suuuuuure, many run 4K TVs at only 1080p, but that is not my point. And yes, many run 120Hz displays at only 60Hz, though increasingly, when you plug a computer into an LG CX OLED, or the newest Xbox into a recent Samsung, it autodetects 120Hz and jumps to it -- much like new Netflix players automatically use 4K when plugged into a 4K TV.
But guess what? 120Hz is now being considered as a standards addition to office monitors. Laptops and PCs plugged into them autodetect 120Hz, and people notice scrolling as smooth as a 120Hz Apple iPad. Upcoming Apple laptops/monitors are also rumored to start including 120Hz in the coming years, until the entire Apple suite is universally 120Hz for user-experience reasons.
So, don't teach me that 120Hz is not being mainstreamed. Suuuure, we may disagree on when it becomes mainstream, but when I can buy a new Xbox and a new TV, and it already automatically goes straight to 120Hz, one starts to realize what I'm talking about.
By the end of this decade, it will be increasingly hard to buy a display that doesn't support 120Hz, much like it's hard to buy 720p unless you go bottom-barrel. People start seeing the difference, much like when 4K was pushed onto people. Browser-scrolling motion blur halves.
My parents even remarked on the impressive improvement in laptop scrolling when I showed them that their 4K TV supports 120Hz from the laptop.
120Hz is being commoditized slowly whether you like it or not. Ever since we discontinued CRTs, the slow rediscovery of the need for higher refresh rates has yielded new discoveries and understandings (which, before then, only a few believed or knew about). Duh.
Also, there is a discovered nausea uncanny valley where 48fps and 120fps HFR creates more headaches than 1000fps UltraHFR. I'm a fan of 24fps Hollywood Movie Maker mode. I worked with RUNCO / Key Digital / dScaler / etc. and invented the world's first open-source 3:2 pulldown deinterlacer algorithm in the year 2000 (twenty years ago) -- see the Internet Archive link in my 3:2 pulldown thread.
Eventually it outperformed a Faroudja line doubler back in the day, after John Adcock implemented my algorithm in MMX assembly language, allowing realtime 3:2 pulldown deinterlacing and 24fps->72Hz conversion in the year 2000 from an HTPC connected to my 72Hz-capable NEC XG135 CRT projector. Originally working on a Hauppauge TV card, I upgraded to an SDI-capable card, so it was deinterlacing 720x480 near-flawlessly from the signal of an SDI-capable DVD player.
So, yessiree, I have Hollywood Moviemaker snob cred. But my favourite frame rates are 24fps (classic) and 1000fps (holodeck) as a result. I'm a bit disdainful of 48fps and 120fps non-strobed HFR, though -- still too much motion blur, especially since source blur (camera exposure) and destination blur (display persistence) are additive.
But I am well aware of the HFR problems and soap-opera-effect problems too. The motion-sickness uncanny valley -- where there's still a bit of motion blur but stutter is now gone -- is sometimes a headache-creating combination for some. (It's similar for VR, but it also surprisingly applies to discrete flat-panel displays.) You might have seen my UltraHFR work at
www.blurbusters.com/ultrahfr
Strobing is a great band-aid for this region of frame rates -- above the flicker-fusion threshold, but too low for blurless sample-and-hold. Many still get headaches from "smooth motion that still has motion blur" (aka 120fps HFR), while a subsegment gets sick from strobing instead. 60Hz vs 120Hz is an 8.3ms persistence improvement, and we need 120Hz->1000Hz for an almost equal improvement (7.3ms) -- a dramatic leap up the diminishing curve of returns that avoids the incrementalism of 240Hz and 480Hz. We're big fans of going straight to 1000Hz for UltraHFR, actually.
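The 8.3ms and 7.3ms figures come from frame *periods*, not refresh rates, which is why 120Hz->1000Hz nearly matches 60Hz->120Hz despite the much bigger Hz jump. A one-line sketch (helper name is mine, just illustrating the arithmetic):

```javascript
// Persistence improvement between two sample-and-hold refresh rates, in ms:
// the difference of the frame periods (1000/Hz), not of the rates themselves.
function persistenceGainMs(fromHz, toHz) {
  return 1000 / fromHz - 1000 / toHz;
}

// persistenceGainMs(60, 120)   -> ~8.33 ms removed
// persistenceGainMs(120, 1000) -> ~7.33 ms removed
// persistenceGainMs(120, 240)  -> only ~4.17 ms (the "incrementalism" step)
```

This is the whole diminishing-returns argument in one formula: each doubling removes half the remaining persistence, so big multiplicative jumps are needed to keep the perceptual gains comparable.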
The strobeless blur-reduction route (1000fps at 1000Hz) causes far fewer headaches, due to no flicker-induced sickness and no blur-induced sickness. Everybody sees differently and is bothered differently: some hate low brightness, some hate flicker, some hate tearing. About 8% of men are color blind. Everybody's glasses have a different prescription. Everybody reacts differently. Some barf at flicker. Others barf at motion blur (motion-blur sickness). Yet others barf at stutter (motion-fluidity sickness). The way you feel about motion is not the same as the way another person feels about motion.
But the fact remains: jumping over the HFR uncanny valley is part of Blur Busters' goals too, to simultaneously solve the flicker/blur considerations.
1. Fix source stroboscopic effect (camera): Must use a 360-degree camera shutter;
2. Fix destination stroboscopic effect (display): Must use a sample-and-hold display;
3. Fix source motion blur (camera): Must use a short camera exposure per frame;
4. Fix destination motion blur (display): Must use a short persistence per refresh cycle.
Therefore:
A. An ultra-high frame rate with a 360-degree camera shutter also yields a short camera exposure per frame;
B. An ultra-high refresh rate on a sample-and-hold display also yields a short persistence per refresh cycle.
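The additivity claim behind points 3-4 can be sketched numerically. An illustrative helper (names and the simple linear model are my assumptions, not a Blur Busters formula): tracked-eye motion blur in pixels is roughly motion speed multiplied by the total sample window, with camera exposure and display persistence adding together:

```javascript
// Approximate eye-tracking motion blur (in pixels) for on-screen motion:
// source blur (camera exposure) and destination blur (display persistence)
// are additive sample windows smearing across the panning motion.
function trackedBlurPx(speedPxPerSec, cameraExposureSec, displayPersistenceSec) {
  return speedPxPerSec * (cameraExposureSec + displayPersistenceSec);
}

// At an 8000 px/sec pan with zero camera blur:
//   1 ms persistence (1000fps@1000Hz sample-and-hold) -> ~8 px of blur
//   8.3 ms persistence (120Hz sample-and-hold)        -> ~67 px of blur
```

The same formula shows why a sharp short-exposure camera can't rescue a high-persistence display, or vice versa: the two windows simply sum.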
Until recently, there was no technology to prove this. But now that there is (Phantom Flex 1000fps footage played on prototype 1000Hz displays), it has dropped more than a dozen jaws, including among many Hollywood Moviemaker snobs who were anti-SOE. The blur-derived dizziness/nausea/sickness cases completely disappeared, and so did the flicker-derived ones. It wasn't five-sigma comfort, but it added another nine (or two) to the discomfort reduction in early casual tests -- the flat-panel version of VR-sickness elimination. As long as you're not doing too much vertigo-inducing content (so movie-producer technique for "IMAX simulator footage" use cases will quickly optimize toward 1000fps 1000Hz comfort), it is miraculously less nauseating than semi-blurry, SOE-feeling 120fps HFR. Either you stay low (24fps) for comfort, or you go all the way (low-persistence sample-and-hold) -- "Go Big Or Go Home" when it comes to HFR: go straight to UltraHFR.
Research papers will come out by the 2030s, but the early skunkworks already mic-dropped a lot of questions. It's an astounding amount of work leaping from 120Hz to 1000Hz, but fortunately it has ceased to be technological unobtainium. There is now an engineering path to a gigantic 8K 1000Hz specialized display (a hyper-expensive simulator display) in less than a mere decade, using full-bitdepth refresh cycles (no DLP temporal-dithering shit). The GPU horsepower is almost there: custom software written for 4-way SLI generates ~240-400fps at 8K in tightly optimized SLI-only framepacing across all those cards -- with some difficulty. That's less than an order of magnitude away from becoming a successful realtime source for the already-engineered 8K 1000Hz plan; it will take only a few more years to push 8K 1000fps Unreal 5 quality for cost-no-object situations, assisted partly by frame rate amplification algorithms. There is obviously SLI frame-pipelining latency, but at least it's adequately realtime for simulator-priority scenarios. (It's still less than VR strobe lag.)
Even at 1000fps, there are still 8 pixels of motion blur per 8000 pixels/sec of motion, but the blur is so low that most footage doesn't even reveal it, with most pans comfortable (no phantom arrays, no stroboscopics, no motion blur) no matter what your eye gaze is doing relative to the display (fixed gaze or tracking gaze). It looks closer to analog motion -- more like real life than any external direct-view display has ever been. The majority of motion blur limitations now shift to human-brain limitations, instead of the display enforcing motion blur on you above and beyond source blur and display persistence. With the brain behaving naturally, there's less nausea from the side effects of humankind's invention of digital frame rates to emulate analog motion. Even 1000fps 1000Hz is not the final frontier, but it's such a magical jump up the diminishing curve of returns that it is no longer an Apollo mission or Star Trek unobtainium -- more technologically achievable, like a jet flight to altitude.
By 2040, the boom of 1000Hz for esports/VR/reality-emulation/holodeck/simulator-rig/industrial/amusement-ride/etc. use cases in the elite spheres will quickly make the 120fps cinematic HFR stepping stone undesirably obsolete -- while 120Hz and 240Hz are engineerable into freebies (like retina and 4K), since the BOM and the math check out to viable commoditization paths that I've been hearing about behind the scenes. 120Hz is following the path of retina screens as we speak, although its path to commoditization-as-included-freebie is roughly 2x slower.
We're like the 1970s and 1980s people conceptualizing HDTV right now; you heard it here first: UltraHFR is the HFR Holy Grail, to the point where there will probably be only two still-popular content-creation frame rates by the year 2100: 24fps (or some low, below-flicker-fusion-threshold frame rate) and >1000fps (far beyond the other annoying thresholds). A small remaining subsegment of the population will always have motion sickness that is only solvable by 24fps, but a lot of the 48fps and 120fps discomfort cases are fixed at 1000fps@1000Hz. All other frame rates eventually become niche or legacy (like viewing last century's 60Hz videos). That's the far-futurist in me speaking, but ultimately, high Hz isn't expensive. After witnessing 1000fps 1000Hz, non-strobed 48fps and 120fps HFR video is just garbage incrementalism.
There will always be people who get sick from any motion, but fewer people get sick from 1000fps@1000Hz than from 48fps or 120fps HFR (strobed or nonstrobed).
We're strobing only because refresh rates are not high enough for low-persistence flicker-free operation (low-persistence sample-and-hold). Our namesake is Blur Busters, and while we're also latency-focussed, we are hugely smoothness/blur-focussed too.
Anyway, offtopic.
The point is, it's blatantly obvious that 120Hz is going to be freebie-mainstream by ~2030 (you're living under a rock if you can't see the trojan-horsing), and 240Hz freebie-mainstream by ~2040, due to ergonomic browser-scrolling motion blur considerations and simple mundane stuff like that -- just like 4K is now a freebie in televisions, even in those $299 Walmart sales. (Some of those even have 120Hz in their PnP DisplayID/E-EDID already!) High Hz fundamentally isn't an expensive technology anymore, because the required performance is getting cheaper and cheaper, and the migration to reality emulation requires a good big jump over the uncanny valley, all the way to strobeless CRT clarity, for the proper video wow effect.
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
Yeah, but why would you need to timestamp reads? timestamped reads are nothing but compensation for latency bridge or jitter/desync, which should be the main fixing problem, reducing it past a certain point so that you dont need timestamps.
The tool's primary purpose isn't latency (for now), so that's not relevant here.
It will indirectly help by pushing innovation toward the low-CPU-utilization, ultraprecise mice of the future (HD Mouse API), as well as other things, but my goal is to explode awareness of mouse flaws, even if that utilizes somewhat-synthetic mouse benchmarks. Sometimes the extra noise produces massively useful data in the streams, especially if the data is far easier to interpret than MouseTester data (see above for the explanation).
I welcome others to come up with mouse latency testing, but this tool's current goal is relative precision / smoothness / motion quality / blur reduction / etc. Mouse jitter adds motion blur, given the stutter-to-blur continuum (low-frequency jitter vibrates visibly like a slow guitar string; high-frequency jitter blends into blur like a fast guitar string), and I want to see mice become increasingly smoother in the refresh rate race to retina refresh rates. It's a whac-a-mole issue for blur busting, aka our namesake.
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
Yes, but what kind of a disclaimer? ultra low latency sensitive disclaimer or a general one? big difference.
TestUFO Mouse Tester is not currently designed to be a mouse latency tester. Move along.
schizobeyondpills wrote: ↑08 Nov 2020, 02:15
Browser is average Joe's best tool and as such Joe has 50 tabs open, 10 toolbars installed, 30 extensions collecting his data and few malware/bloat programs running on his system.
Read the earlier posts above. The visualization is intended to be designed so that it's easy for me to interpret whether a result is probably background processing, a mouse issue, etc. The smart, automatically formatted graph design is third-party-analysis friendly (including for me), and by corollary, useful for smoothness-priority research.
Some correlations will be subjective, but a lot of it is very convincing -- e.g. spiky noise versus a general noise increase, etc. -- and the graphs hone in on these and emphasize/color-code them, unlike unformatted MouseTester graphs, with additional data beyond MouseTester, such as number statistics in the corner of the graph (stddev, etc.). Collect enough data, and if 99 out of 100 graphs of the same pattern occur because of browser tabs in the background, then I can easily blame the user for having browser tabs open and dismiss those graphs, allowing smart people to surgically hone in on the most interesting graphs users post. The fact that it's one click away with no software ensures a lot more data can be accumulated, even if much of it reveals computer/browser flaws (which is useful data to me and the world). Browser vendors have often noticed "why is TestUFO Animation Time Graph so good on that system and not this one?" and optimized their browsers too, in various github/bugzilla items.
And in the cases where the browser is not the weak link (many graphs will clearly show that), it pushes the mouse vendors. I'm mouse-vendor agnostic, but I can acknowledge that Razer has simply been the most generous, being first to supply 8000Hz mouse samples for general Blur Busters advocacy purposes. Kudos to them for being brave about the potential flaws of 8000Hz, and about the revelation of the need to push the needle in the refresh rate race to retina refresh rates. It's fine to hate on commercialism and paid researchers (but many research projects -- including this one -- have no payment attached, even if I indirectly benefit from increased site publicity and its ads/affiliate links).
Also, there is a feature to visualize based on poll imperfections, as well as poll-vs-Hz imperfections, to separately reveal mouse-fluidity flaws for the VSYNC OFF situation versus the VSYNC ON situation. I have enough data to color-code them slightly differently (or show two concurrent graphs), based on which use case the user wants to optimize their mouse for. The synced visualization shows really interesting artifacts for 1000Hz mice on 360Hz monitors, in a way that is visually easier to identify than a mouse-pointer stroboscopic photograph -- and they beautifully corroborate each other!
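The 1000Hz-mouse-on-360Hz-monitor artifact is a beat-frequency effect that can be simulated in a few lines. An illustrative sketch (my own simulation, not TestUFO's actual code): each refresh samples whichever poll landed nearest in time, and the poll-to-refresh phase offset drifts and wraps every frame, producing the microstutter beat pattern:

```javascript
// Simulate the phase error between mouse polls (pollHz) and display refreshes
// (refreshHz): the signed time offset (ms) from each refresh to its nearest
// poll. A non-integer Hz ratio makes this offset drift frame-to-frame and
// wrap around -- the visible beat-frequency microstutter.
function pollPhaseOffsetsMs(pollHz, refreshHz, frames) {
  const pollPeriod = 1000 / pollHz;        // ms between polls
  const refreshPeriod = 1000 / refreshHz;  // ms between refreshes
  const offsets = [];
  for (let n = 0; n < frames; n++) {
    const t = n * refreshPeriod;
    const nearestPoll = Math.round(t / pollPeriod) * pollPeriod;
    offsets.push(t - nearestPoll);         // bounded by ±pollPeriod/2
  }
  return offsets;
}

// pollPhaseOffsetsMs(1000, 360, 5) wanders between -0.5 and +0.5 ms,
// repeating every few frames -- the beat pattern the visualization color-codes.
```

Raising the poll rate shrinks the worst-case offset (±pollPeriod/2), which is exactly why >1000Hz polling smooths out high-refresh monitors.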
Anyway...
Rising tides lift all boats. The first 1000 Hz mice were laughed at initially too, and had many performance problems that took years to improve. In 2021-2022 I will post major articles about the need to address the Vicious Cycle Effect much more seriously (precision, latency, etc.), in various rethinks of the entire architecture, but it depends on how much pull I have after popularizing the mouse tester, which will help convince mouse vendors to adopt the HD Mouse API. I will certainly be advertising the HD Mouse API as part of TestUFO Mouse Tester, as part of my ulterior goal of an open, industry-wide, vendor-agnostic initiative, even if Razer is the generous one providing the first >1000Hz mice to Blur Busters to starting-pistol this initiative...
Mouse latency testing is badly needed, but that can be another concurrent tool developed in the same toolchest (written in C++/asm). Not all tools need to be made of the same metal/plastic, of the same age, and of the same precision (steel vs titanium vs bakelite vs acrylic vs whatever). I am introducing something that adds useful data to the world -- it just isn't the data you want (latency).
Good graph, but I wrote my HD Mouse API thread before I even noticed this post, and you appear to (at least partially) agree with the theory of an HD Mouse API. I already know this. We have an overlapping Venn diagram, even if our agreement-overlap is subject to debate.
For this older post, you bring up good and valid points -- some which definitely need other tests/benchmarks/tools created for it -- but you're in the wrong forest. I'm looking at different trees in a different forest.
Consequently, since I am only writing a test tool focussed on relative precision (poll-to-poll precision) rather than the tape-delay element (absolute latency offset), this is a low-priority, offtopic forum thread to me, even if the discussion is useful reading for other people who wish to create different, new mouse tests.
Higher poll rates help lower lag, and there are already lag tests for mice elsewhere -- use them. The tool I am focussed on is precision-related, benchmarking the browser+mouse chain together (surprisingly well for the intended testing goal of convincing all manufacturers to tackle the problem).
Just because I agree that mouse latency testing and mouse smoothness testing are both needed doesn't mean TestUFO will test both. Sometimes both have to be tested at once, but that's a job for an executable (agreed).
But for relative precision (including the microstutter of a 1000Hz mouse on a 360Hz monitor), TestUFO is a spectacular reveal of that proof, via a blatant color-coded beat-frequency visualization (the difference between the theoretically perfect position and the closest poll to the currently composited VSYNC refresh cycle), which literally mic-drops the proof that 1000Hz isn't smooth enough for 360Hz monitors, and that some technology innovation needs to happen (higher poll rates, HD Mouse API, whatever -- manufacturers, dammit, fix the problems).
Sometimes a blanket increase in good visual statistical data helps, especially if the graphs are easy for industry to interpret -- whether that means optimizing a motherboard for fewer spikes, optimizing the mouse for less poll jitter, or fixing drivers/software to be smoother. All of these can have side effects that interact with each other (including improved latencies, by sheer side effect of higher sampling rates, better USB refinements, faster ISRs, etc.). It all Rube Goldbergs beautifully, even if haphazardly sometimes.
An EXE requires more time and effort, requires a download, and doesn't shake the market hard enough to get the manufacturers to do something about it. Metaphorically, it generates "too little data/statistics surface to attack industry with" for the time available. A superior, 10x-more-refined HTML5 utility (for the programming time and money spent) beats a poorly made, poorly marketed executable. Rock, meet hard place. Thus, "Low-Lying Apple" opportunities: as soon as an HTML5 standard catches up on something beautifully good (already true for 1000Hz mice), I seize it as part of popularizing a test. I've got to compromise -- shake things up using HTML5 to increase the complaint surface, to get the blanket of manufacturers industry-wide to notice and finally do something about it.
Goals get achieved, just via different concurrent streams of pressure/market demand/etc. on manufacturers industry-wide. Others can create superior power-user tools that measure other aspects (latency), but there's a huge need to at least incubate mouse testing during this rare decadal upgrade supercycle (shattering the 1000Hz poll barrier).
Sometimes these are necessarily separate tools, given the constraints of specific environments (e.g. HTML5 PointerEvents + the Pointer Lock API are accurate enough for benchmarking relative precision and Hz-beat-frequency effects, but bad for measuring latency). The disclaimer already mentions that it does not test mouse lag, and the disclaimer will certainly be refined even further (browser screen space allowing) based on a mass of advanced-user feedback (including from you).
Move along; TestUFO Mouse Tester isn't something you're interested in -- it does not test mouse lag at this time, although it will indirectly (Rube Goldberg) spin off into pushing the needle on those considerations. Nonetheless, I will continue to check this thread approximately weekly.
Cheers,