Changing Processor State Greatly Affects Mouse Sensitivity/Lag


Post by Chief Blur Buster » 17 Aug 2022, 19:29

Kyouki wrote:
17 Aug 2022, 02:40
3333fps/hz :shock: damn.. :o
Yep. Alas, it illustrates that we're going to be stuck with strobing for a long time. Strobing is a band-aid for humankind, because we can't fully fix display motion blur at today's two-digit and three-digit refresh rates.

Motion blur is lovely sometimes (Hollywood Filmmaker 24fps Mode is great), but other times motion blur is absolutely, astoundingly evil (display motion blur causes massive nausea and headaches in virtual reality headsets, as discovered in early non-strobed VR headsets). So the type of application a display is used for can increase or decrease the importance of display motion blur...

I hate going off topic in these threads, but I love helping other researchers learn more about the refresh rate race, even if they are researching other things (e.g. camera shutters or optical illusions), because understanding the refresh rate race makes a researcher smarter about unanticipated effects that are often overlooked... So without further ado:

The easiest way to understand display motion blur is Blur Busters Law: 1ms of pixel visibility time translates to 1 pixel of motion blur per 1000 pixels/sec of motion. Blur Busters Law is simply an easy interpretation of MPRT(100%). It is essentially our "E=mc^2"-style simplification of the MPRT formula, which is complex only because of its 10%->90% cutoff. But if you assume GtG=0 and measure MPRT from 0% to 100%, then motion blur suddenly becomes elementary-school mathematics, like a camera shutter.
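
As a minimal sketch (the helper function is my own illustration, not anything from TestUFO), Blur Busters Law reduces to one multiplication:

  # Minimal sketch of Blur Busters Law (illustrative helper; names are mine).
  # blur in pixels = pixel visibility time (ms) x motion speed (px/sec) / 1000,
  # assuming GtG=0, framerate=Hz, and MPRT measured 0%->100%.
  def blur_px(visibility_ms: float, speed_px_per_sec: float) -> float:
      return visibility_ms * speed_px_per_sec / 1000.0

  print(blur_px(1000 / 240, 1000))  # nonstrobed 240fps=240Hz: ~4.17 px of blur
  print(blur_px(0.3, 1000))         # 0.3ms strobe pulse: 0.3 px of blur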

240fps at 240Hz with 0ms GtG (for blur-free source material that is panning in motion) has exactly the same motion blur as a 1/240sec SLR camera shutter photographing panning scenery (like lens-following a sports athlete), for the same physical angular motion speeds relative to human vision. In other words, humans can tell apart 1/240sec and 1/1000sec sports photographs more easily than 1/240 vs 1/480, or 1/480 vs 1/1000 -- the bigger blur differentials are easier for average laypeople to see. (It also applies to mundane things such as text scrolling in web browsers -- the raison d'etre of 120Hz iPads, and you can bet Apple is internally testing 240Hz iPads with Display Week's newly announced 240Hz OLEDs...)

So, by reframing the refresh rate race debate as a blur equivalence to camera shutters, it helps certain classes of researchers in some trades (e.g. those experienced with camera blur behaviors). It helps them uncap a specific Hz limit or Hz assumption, because Hz limitations create aggregate effects like blur (the sample-and-hold effect...). It also highlights the need to test blur differentials, and to understand the "geometric requirement" of the refresh rate race. Today's Hz incrementalism hides a lot of this, and often leads to mistaken assumptions.

A low-resolution photograph (e.g. an 8mm or 16mm film frame) makes motion blur differentials harder to see, because stationary images already have low resolution, so better motion resolution doesn't help as much as it does at very high resolutions. So 70mm film (the metaphorical equivalent of panning 8K scenery) makes shutter-speed differentials easier to see than 8mm film (the metaphorical equivalent of VHS scenery). E.g. a 1/240sec shutter vs a 1/1000sec shutter (in camera parlance), versus 240fps-at-240Hz vs 1000fps-at-1000Hz (in refresh rate race parlance), have identical persistence blur -- sample-and-hold blur mathematics maps exactly to camera shutter blur mathematics when GtG=0, jitter=0, and framerate=Hz. The blur mathematics suddenly become identical when those variables are set!! (Not all researchers realize this.)

It is just simply Blur Busters Law (our simplification to make display motion blur super easy to understand).

...Therefore for framerate=Hz, GtG=0, jitter=0;
Nonstrobed: Blur is exactly the same as frametime
Strobed: Blur is exactly the same as squarewave pulse width duration

...Therefore;
We know the Oculus Quest 2 has a typical strobe pulse width of 0.3ms, at least for early models at the default setting (let's ignore squarewave ripple on an oscilloscope for practicality's sake);

...Therefore;
To match 0.3ms blur, we need 0.3ms frametime;

...Therefore;
1000 milliseconds per second, divided by 0.3ms ~= 3333

...Therefore;
We need 3333 unique frame snapshots per second along the motion vector, to match the motion clarity of 0.3ms strobing.

(Note: An error margin applies; it could be a 0.32ms or a 0.28ms pulse width, and newer headsets may differ. The math still ends up at a multiple-thousands refresh rate requirement -- it still demonstrates the unobtainium without strobing.)
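
Putting the whole chain in one place -- a back-of-envelope sketch using the approximate pulse widths above:

  # Refresh rate needed for nonstrobed sample-and-hold to match a given
  # strobe pulse width (assumes framerate=Hz, GtG=0, perfect framepacing).
  def hz_to_match_strobe(pulse_ms: float) -> float:
      return 1000.0 / pulse_ms

  for pulse_ms in (0.28, 0.30, 0.32):  # error margin from the note above
      print(pulse_ms, "->", round(hz_to_match_strobe(pulse_ms)), "Hz")
  # 0.28 -> 3571 Hz, 0.30 -> 3333 Hz, 0.32 -> 3125 Hz: thousands either way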

For VRR, which ideally keeps framerate=Hz permanently within the VRR range, you can watch motion blur vary up and down as the framerate ramps up and down. It's a great demo of Blur Busters Law at www.testufo.com/vrr ... Low framerates vibrate, while high framerates blend into motion blur. Another great example of the "stutter=blur" continuum is www.testufo.com/eyetracking#speed=-1 -- not all researchers realize that stutter and MPRT blur are exactly the same thing, separated only by the flicker fusion threshold. GtG fuzzies that threshold line: slower-GtG displays have a lower flicker fusion threshold due to the softer flicker waveform, while faster-GtG displays have a higher flicker fusion threshold due to the more squarewave stutter-edge-flicker waveform. That's why 60fps is often smooth on LCD but still stuttery on OLED -- a "behavior" not all researchers realize is simply the flicker fusion threshold shifted by the different flicker waveforms (sinewave'd by slow GtG). That threshold can vary from 40-50fps (for LCD) to 70-80fps (for OLED) -- opposite sides of the century-old refresh rate standard (aka 60Hz, dating back to the dawn of television).

Erratic stutter/judder (e.g. 3:2 pulldown et al) on top of regular, perfectly-paced stutter is simply an extension of sample-and-hold (picture an erratic-height staircase rather than a perfectly regular one -- the tallest steps become the new motion blur weak link; stutter is a very simple mathematical worsening of perfectly framepaced, regular sample-and-hold). At low frequencies it's extra shakiness (jank), and at high frequencies it's just extra blur (more blur than perfect framepacing, such that stuttery 360fps can look blurrier than perfectly framepaced 144fps).
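
A quick sketch of that "tallest stairstep" idea (frametimes invented purely for illustration):

  # The longest hold time between frames sets the worst-case blur.
  frametimes_ms = [2.8, 2.7, 8.0, 2.8, 2.9]  # hypothetical ~360fps run with one hitch
  speed_px_per_sec = 2000
  print(max(frametimes_ms) * speed_px_per_sec / 1000.0)  # 16.0 px at the hitch
  print((1000 / 144) * speed_px_per_sec / 1000.0)        # ~13.9 px for perfectly paced 144fps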

I already teach training classes (I'm for hire; I fly over both the Atlantic and the Pacific, mainly to display-related companies) demonstrating various elements of the refresh rate race: I run a bunch of TestUFO PowerPoints on different displays, run a high-speed camera in real time in the classroom, and show off results on multiple displays.

You have a 360Hz display. These tests are INCREDIBLY educational to show off to other researchers.
In the vein of the saying "give a man a fish and he's fed for a day; teach him to fish and he's fed for life", I like giving researchers custom tests to show off to other researchers, because there is an almost infinite number of useful TestUFO tests when you consider all the possible useful combinations & permutations of the TestUFO parameters....

So here are some of my favourite custom-tweaked show-off TestUFO tests for 360Hz displays.
Bookmark them for showing off to other researchers, or at least to educate yourself.
  • TestUFO Stutter=Blur Continuum
    Watch the 2nd UFO for at least 30 seconds. Observe how stutter blends into blur, and blur blends back into stutter, as a function of framerate (relative to the flicker fusion threshold). Like a slow-vibrating music string (shaking) versus a fast-vibrating one (blurry). This is literally fundamental "Sample And Hold 101"; researchers are surprised to find they don't fully grasp sample-and-hold until they finally understand stutter=blur (regardless of whether it's regular sample-and-hold "stutter" or erratic GPU/engine/pulldown "stutter"). For some people, this specific TestUFO demonstration is the metaphorical "Unifying Theory" eureka moment for the sample-and-hold blur=stutter continuum. No textbook explains things this way, and TestUFO is worth a thousand words in one link -- one custom-configured TestUFO link sometimes teaches more than a 100-page textbook! (Incidentally, silencing debates is exactly why I invented TestUFO as the Internet's de facto micdrop factory.)
  • TestUFO DLP Color Wheel Simulator
    Wave your hand in front of it for rainbow effects. This TestUFO displays color-sequential monochrome to generate color temporally, simulating a DLP color wheel. 240Hz and up recommended for this test. Do not run this TestUFO below ~165Hz, or you will get epileptic flicker.
  • TestUFO Variable-Persistence Black Frame Insertion Demo for 360Hz, 6 UFOs at 60fps
    This is a great demonstration of Blur Busters Law. More pixel visibility time = more motion blur. And I even compare low-framerate strobed with high-framerate non-strobed (bonus UFO at bottom).
  • TestUFO Variable-Persistence Black Frame Insertion Demo for 360Hz, 8 UFOs at 45fps
    This is a great demonstration of Blur Busters Law. More pixel visibility time = more motion blur. And I even compare low-framerate strobed with high-framerate non-strobed (bonus UFO at bottom).
Now that you know about the refresh rate race problem (Hz incrementalism, jitter weak links, slow GtG, etc.), we have to try to upgrade refresh rates even more geometrically, at least for higher-end applications. I have developed several viable refresh-rate-parallelization techniques that can uncap the limitations of the refresh rate race with today's technology for some applications -- turning Hz into simply a budget consideration for the amount of off-the-shelf technology to purchase. (The main issue is that I need additional help from someone skilled with Windows Indirect Display Drivers, for an open-source refresh rate combiner module written from scratch -- PM within.)
Kyouki wrote:
17 Aug 2022, 02:40
The urge for more framerate is also still there, primarily because it is so enjoyable when it's on par with your refresh rate, and I do wish more games took better notice of other methods of achieving better framerates for their end users.
One thing I've noticed is that many developers don't always improve jitter when increasing frame rates -- it depends on the game. I've seen many games where framerate improvements don't seem to be very visible.

Have you ever tried a Quest 2 VR headset? Virtually all games on Quest 2 run perfect framerate=Hz VSYNC ON (ultra low latency). It's amazing how much more attention (100x) VR developers pay to jitter/blur. I have so much fun playing games like Star Wars: Tales From Galaxy Edge and other content that PC games (even Cyberpunk 2077 visuals) sometimes bore me. Yes, I even have more fun playing on the Quest 2's mobile GPU than on my personal NVIDIA RTX 3080, simply because the well-funded VR screen is light years beyond any gaming monitor in motion blur elimination technology.

The motion-quality improvements far outclass the rendering-quality loss, to the point that you actually see more detail (while turning) on a Quest 2 than at 360fps-360Hz on an RTX 3090. Or I can cable up to a PC (a higher-cost VR headset with a cable) to get even better visuals, at the loss of flexibility. I hate the dizzying rollercoaster demos, but the comfortable content is really lovely, whether it's the VR-ified equivalents of Sierra Quest style stuff (Down The Rabbit Hole) or the AAA-class solo adventure-story FPS stuff (Tales From Galaxy Edge / Red Matter / Walking Dead standalone / and for cabled PCVR, Half-Life: Alyx), depending on what you're into, as a translation of the PC gaming experience.

They all use the Vulkan API internally, and the developers have to maintain perfectly framepaced framerate=Hz, since the wide-FOV experience and the panning behavior (head turns = perpetual screen scrolling) amplify display motion blur to the hilt.

PC games are still far from these VR principles, but you can notice the massive improvement in Microsoft Flight Simulator's fluidity: they were trying to improve framepacing for VR, and it had the major side effect of becoming much better on high-Hz PC gaming monitors, because a lot of jitter/stutter was removed. Hybrid VR/non-VR games like Microsoft FS 2020 are a great textbook example of how programming improvements can translate very well to non-VR, improving the benefits of the refresh rate race on planar 2D displays.

But for now, the way even a lowly mobile GPU manages to preserve gametime:photontime to an accuracy of less than 100 microseconds is astounding -- they really whac-a-mole'd the jitter weak links, even with the smartphone-quality camera sensors used for room tracking (surviving light changes, light switches, different times of day, increased camera sensor noise at night, etc.). Without any external PC, without an external sensor: just put the headset on in a random room, let it do its near-photogrammetry RoomScale scan of the new room, and you're playing VR in 30 seconds. It's easier to set up than an iPad; I wish other VR headsets were that easy. Standing ovation to Carmack (but not Facebook) -- at least they removed the Facebook login requirement (finally).
Kyouki wrote:
17 Aug 2022, 02:40
The Vulkan API comes to mind, for instance: much better frame pacing and frame times, and overall resource use is much smoother and nicer. DOOM Eternal is one of those enjoyable examples. On the side, I also invest time in beta programs for games -- or am an internal tester for some games -- always trying to advertise or promote the use of better technologies for displays, rendering, graphics, etc. I'm trying to understand swapchain technology, MPO, and all the different display modes within game graphics APIs myself.
This is essentially a matter of keeping gametime:photontime as perfectly relative as possible, considering these error margins (see the sketch after this list):
- Accurate motion math in engine
- Rendertime variances between adjacent frames
- Mouse jitter
- Graphics driver jitter (e.g. amount of time that passes between Present() and the first pixel spewing out of graphics output)
- Sync technology interaction with various refresh technologies (strobed, nonstrobed, VRR, non-VRR) -- remember, not all pixels refresh at the same time, and turning strobing on/off can affect the jitter/latency mathematics weirdly. If you want that particular wall of text for your future reference, let me know; sometimes the competitive advantage swaps between VSYNC ON and VSYNC OFF with different display refreshing patterns, because of less violation of gametime:photontime for the different pixels across the 2D display plane.
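
As a minimal sketch of what "keeping gametime:photontime relative" means (the helper is my own framing, not any real API): a constant pipeline latency is invisible; only the deviation from it shows up as stutter/jitter.

  # Illustrative helper: gametime:photontime divergence. A constant offset
  # (fixed pipeline latency) is harmless; the visible error is how far each
  # frame deviates from that constant offset.
  def gametime_photontime_jitter_ms(gametimes_ms, photontimes_ms):
      offsets = [p - g for g, p in zip(gametimes_ms, photontimes_ms)]
      base = sum(offsets) / len(offsets)
      return max(abs(o - base) for o in offsets)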

Error margins become more visible when we're talking about MPRTs smaller than the stutter/jitter error margins, so things like turning on strobing can amplify the visibility of jitter, and switching from VSYNC ON to VSYNC OFF can produce much bigger jitter differences when strobed than nonstrobed. There are easy scientific explanations for these, though they're non-obvious to most researchers.

So as Hz goes up, gametime:photontime divergences need to remain smaller than MPRT, to prevent them from throttling MPRT (aka adding more blur from high-frequency stutter/jitter)... So the refresh rate race puts more demand on the accuracy of the entire gametime:photontime relativity pipeline.
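
That rule of thumb in code form, reusing the helper from the sketch above (numbers hypothetical):

  # Divergence must stay below MPRT, or it surfaces as extra blur/stutter.
  mprt_ms = 1000 / 360   # ~2.78ms persistence at nonstrobed 360fps=360Hz
  jitter_ms = gametime_photontime_jitter_ms(
      [0.0, 2.78, 5.56, 8.33],      # hypothetical ideal gametimes
      [10.0, 12.80, 15.52, 18.38],  # hypothetical measured photon times
  )
  print(jitter_ms > mprt_ms)  # False here: jitter still hidden under the blur floor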

Even stutters punching through VRR are often a side effect of things like rendertime variances (wild yo-yoing) or framebuffer-delay fluctuations. VRR tries to keep gametime:photontime constant, which is why, if you keep this variable accurate, even random framerate fluctuations look smooth: TestUFO Random Frametimes Look Amazingly Smooth in VRR. But this only happens if you're successfully maintaining gametime:photontime... mere rendertime fluctuations can add visible stutter, so if one frame takes much longer to render than the next (e.g. from texture streaming!!), or vice versa, stutters can punch through. This is why Optane was kind of impressive for 360Hz VRR, even though it only increased framerates by 1% -- it helped gametime:photontime quite a bit. Sadly, Intel discontinued Optane, so we'll have to rely on better stutter-resistant UE5 texture-streaming algorithms to prevent disk access and texture decompression overheads from jittering gametime:photontime relativity.
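
A hedged sketch of why that demo looks smooth (my paraphrase of the idea, not TestUFO's actual code): if the animation position is computed from the true presentation time, random frametimes don't bend the motion vector.

  # VRR scans out whenever a frame is ready; position from true time = smooth.
  import random

  speed_px_per_sec = 1000.0
  t_ms = 0.0
  for _ in range(8):
      t_ms += random.uniform(3.0, 12.0)      # wildly varying rendertimes
      x = speed_px_per_sec * t_ms / 1000.0   # position from actual time, not frame count
      print(round(t_ms, 2), round(x, 1))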

The Vulkan API is a gift from the sky for the refresh rate race; it is certainly easier to keep gametime:photontime consistent with it.
Kyouki wrote:
17 Aug 2022, 02:40
Speaking of tech implementations, one of the worst (personally) is TAA/DLSS and any kind of temporal effects. It truly feels like an extra smearing or blur technique that is just the absolute bane of existence for anything motion clarity.
It is a double-edged sword. It is a frame rate amplification technology of sorts, and you need to configure it so that the AI-based detail and framerate enhancements outweigh the blurring of imperfect AI-based frame rate amplification. (Classic interpolation is the same, but DLSS 2.0 is vastly superior at preserving detail.) And DLSS algorithms do sometimes add weird forms of frametime jitter (not always).

DLSS 2.0 does a much better job than DLSS 1.0, so when configured to prioritize quality, its benefits outweigh the cons in some games. It's a game of making the motion resolution increase (from more framerate) exceed the motion resolution decrease (from DLSS artifacts). It has a "sweet spot" behavior if you're prioritizing maximum motion resolution: max out DLSS for maximum framerate, and the DLSS blurring exceeds the motion resolution improvement from the framerate increase. So back off a bit, tweak, and you find the sweet spot of maximal motion resolution improvement. But for some games (especially super-jittery games) the hump is unfindable -- DLSS works better when you've got good content framepacing + a global sync tech (VSYNC ON) + a global refresh tech (strobing). (That's why all VR headsets are always VSYNC ON + strobed: it globalizes both the presentation and the refresh, for consistent gametime:photontime across all pixels of the display.) Removing time differentials between pixel refreshes sometimes creates unexpected outcomes.
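
As a toy illustration of that sweet-spot hunt (all numbers invented purely to show the shape of the tradeoff, not measured DLSS behavior):

  # Toy model: persistence blur shrinks with framerate gain, artifact blur grows
  # with DLSS aggressiveness; net clarity peaks mid-range, not at maximum DLSS.
  base_blur_px = 4.0  # hypothetical nonstrobed baseline blur
  settings = [("off", 1.0, 0.0), ("quality", 1.4, 0.1),
              ("balanced", 1.7, 0.25), ("performance", 2.2, 1.2)]
  for name, fps_gain, artifact_px in settings:
      print(name, round(base_blur_px / fps_gain + artifact_px, 2))
  # off 4.0, quality 2.96, balanced 2.6, performance 3.02 -> sweet spot is interior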

So better frame rate amplification AIs will do a better job as more time passes.

My vision of future framerate amplification technology is the combination of temporal (ASW-like) and spatial (DLSS-like) frame rate amplification, to create 4x-10x frame rate increases that finally, massively outweigh the side effects of DLSS: even more DLSS accuracy (DLSS 3.0 and up) combined with ASW-like algorithms (like the Oculus Rift uses), where 45fps is almost laglessly converted to 90fps via reprojection techniques. If you could combine DLSS+ASW, you could do 4x-5x frame rate increases pretty easily for tomorrow's 1000Hz screens, while using only 200-250fps worth of GPU.
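
The GPU budget implied by those numbers, written out:

  # 4x-5x frame rate amplification targeting 1000Hz implies 200-250fps of
  # real GPU rendering, matching the figures above.
  target_hz = 1000
  for factor in (4, 5):
      print(factor, "x ->", target_hz // factor, "fps of real GPU rendering")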

I feel we are still in the Wright Brothers era of frame rate amplification.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Post by Kyouki » 18 Aug 2022, 05:34

First of all, wow, thank you so much -- this back and forth is a lot of fun to read, and I wish we had a better way of scouting through your past posts (which you sometimes convert to new threads and/or articles! awesome).
Using search helps, but it won't always find the most recent or most up-to-date answers.

I've got a few questions, and when I get home today I'll go through the insanely detailed post you've made -- thank you so much for that.
Chief Blur Buster wrote:
17 Aug 2022, 19:29
Yep. Alas, it illustrates that we're going to be stuck with strobing for a long time. [...]
That does appear to be the unfortunate case indeed. I'm happy with the discovery though; I always wondered why I never found myself having tracking problems playing CS 1.6 on an old CRT, and I wanted to figure out why that was. Now it makes sense to me. Your topics, posts, and articles have all clarified this. I'm still learning a lot and trying to make sense of much of it, but I do feel like I'm totally past the imposter syndrome.

I've been trying to play with ULMB a lot, and it's my preferred way of playing racing/esports/shooter titles. This is the current setup:
- 240Hz strobing ULMB
I've read that this isn't a really properly tuned ULMB, just the G-SYNC module + ULMB with some presets for ULMB to use. Like one of your posts said: incremental changes across the refresh rate range. I found that some resolutions do not engage ULMB at all.

- 30% ULMB pulse width
Currently on 60%, but I intend to go lower and see if my eyes can adjust over time to the darker screen -- I started at 100%, as I didn't know what this did, and have now gotten to the 60% mark.

- FPS cap set to the decimal refresh rate measured from TestUFO.
- Fast Sync -- though if I revisit your article, this could also be done with VSYNC ON plus the low-latency trickery of an RTSS framerate cap, which is what I was doing above.

(This is for competitive play, as I see no use for ULMB in casual games. I use a display changer to quickly swap refresh rate modes with icons on the desktop.)

Are you familiar with this person? He also makes YouTube videos about sample-and-hold, among other techniques, and educates viewers. He also made a strobing test tool similar to TestUFO, but in a game/application format (UE4): https://www.aperturegrille.com/
---
G-SYNC is great in other scenarios if done properly and when it properly engages (not all applications, I found, were designed perfectly).
Under G-SYNC, does pixel response time change? Or does it remain a fixed interval per refresh cycle?
This thread piqued that interest.
viewtopic.php?t=10357&p=84123#p84123
---
Chief Blur Buster wrote:
17 Aug 2022, 19:29
I hate going off topic in these threads, but I love helping other researchers learn more about the refresh rate race [...] So here are some of my favourite custom-tweaked show-off TestUFO tests for 360Hz displays. [...]
I'll go through these with love and make notes to educate others, using this sample data to showcase the effects. Awesome, thank you so much.
Chief Blur Buster wrote:
17 Aug 2022, 19:29
Have you ever tried a Quest 2 VR headset? [...] If you want that particular wall of text for your future reference, let me know [...]
I haven't had the chance to try a VR headset other than the PlayStation headset -- one of the first ones they produced. I did feel motion-sick with that; I do remember that.

I'd love to see that wall of text; it gives me extreme joy and insight to understand it -- rather than echo-chambering 'oh well, it's just better'.
Chief Blur Buster wrote:
17 Aug 2022, 19:29
Error margins become more visible when we're talking about MPRTs smaller than the stutter/jitter error margins [...] The Vulkan API is a gift from the sky for the refresh rate race; it is certainly easier to keep gametime:photontime consistent with it.
I never knew how much Vulkan has been, or is being, used out in the wild; that is really awesome. I'm a huge fan of this API, so that's good news as well. It truly feels miles better than its DirectX counterpart in use and feel (resource use).
Chief Blur Buster wrote:
17 Aug 2022, 19:29
It is a double-edged sword. [...] If you could combine DLSS+ASW, you could do 4x-5x frame rate increases pretty easily for tomorrow's 1000Hz screens, while using only 200-250fps worth of GPU. [...] I feel we are still in the Wright Brothers era of frame rate amplification.
Oh..

That would be amazingly splendid. I was in the camp of ''we need more RAW performance'' and DLSS is bad because it smudges or produces much more blur -- or TAA for that matter. Any temporal stuff.
We're truly in that middle ground of transitioning toward better things, and I'm excited to witness it.

But this kind of shifts that mentality about DLSS, if we can indeed achieve a 1000Hz display from 200-250fps worth of GPU.
If that works, that would be awesome.
---
This is/was/will be awesome for me to dive into reading further. Other than the questions I have above, I'll leave the thread standing here for de-clutter's sake. I'd love to see more articles and condensed information pieces from you so I can link them to friends -- I've been doing that with the ULMB thread and the G-SYNC article. Spreading the word and knowledge of Blur Busters.

Thank you for all your hard work, for sharing your findings, and for taking the time to post it online for people to read. For me it's a literal eye-opening experience, and I LOVE understanding the bits and pieces underlying the tech rather than 'well, it's just better'.

Have an amazing day.
CPU: AMD R7 5800x3D ~ PBO2Tuner -30 ~ no C states
RAM: Gskill Bdie 2x16gb TridentZ Neo ~ CL16-16-16-36 1T ~ fine tuned latency
GPU: ASUS TUF 3080 10G OC Edition(v1/non-LHR) ~ disabled Pstates ~ max oced
OS: Fine tuned Windows 10 Pro, manual tuned.
Monitor: Alienware AW2521H ~ mix of ULMB/Gsync @ 240hz/360hz
More specs: https://kit.co/Kyouki/the-pc-that-stomps-you


Post by Vocaleyes » 19 Aug 2022, 02:05

Stumbled across this https://youtu.be/cLKHb7DwtEk

Seems quite similar to what I'm experiencing. With Synapse now on my gaming rig, I can still force the cursor to stutter with the app open, the instant DPI is changed. It's still resolved by changing the polling rate in the app, and then changing DPI again afterwards will cause stutters.
I'm going to test forcing the CPU to 50-99%, then playing with priorities while the cursor is affected by the CPU change, to observe any additional negative effects or improvements.

It may be worth noting I am also on Coffee Lake architecture and a Z390 chipset.


Post by Shade7 » 19 Aug 2022, 03:30

Vocaleyes wrote:
19 Aug 2022, 02:05
Stumbled across this https://youtu.be/cLKHb7DwtEk [...]
Very interesting :shock:

Could this work with Valorant??


Post by Eonds » 19 Aug 2022, 10:07

All of this is useless if your system is constantly micro-stalling.
[Attachment: Screenshot_7.png]


Post by Chief Blur Buster » 19 Aug 2022, 17:31

Well, it depends on which is the worse of the two. Systems can have different microstalling behaviors with different jitter/stutter error (in time).

At the end of the day, it's really just a game of weak links. At 60Hz, the 60Hz is the weak link. But with 500Hz gaming monitors and 8000Hz mice, the weak link can shift to the worst microstutter source.

Ignoring placebos... blurs and jitters of less than 1ms can actually be human-visible when display specs and settings are pushed to extremes, after all! It's all normally lost in the display motion blur of a non-strobed 60Hz display (16.7ms of blurring).

But with good strobing, 0.5ms at 4000 pixels/sec is 2 pixels. If the visual jitter error in milliseconds is bigger than the MPRT in milliseconds, then the jitter can potentially become visible -- if the screen's angular resolution is extreme enough that 4000 pixels/sec is easily eye-trackable (e.g. a strobed 1440p or 4K display).
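
That arithmetic written out, using Blur Busters Law again:

  # 0.5ms of persistence at 4000 px/sec of eye-tracking motion:
  print(4000 * 0.5 / 1000.0)  # 2.0 pixels of motion blur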


Post by TommioTTV » 27 Sep 2022, 03:48

Unreazz wrote:
01 Aug 2022, 18:28
Vocaleyes wrote:
01 Aug 2022, 03:48
As promised https://youtu.be/YuMluQ7Mq4w


To clarify, sens was set to 3.60 at the start; after applying 50%, I had to set it to 8.00 to correct (it still could have been higher), but then I set it back to 100% and it's out of control fast.

FPS lightly affected, but not the main issue.
I recently read a post on Discord about incorrect timings with Alienware monitors; maybe this will help you. It's not about this case now though, just in general to let you know, since I saw that you have an Alienware. I don't know if you have the same model.

[Images of the Discord post]
Hey look! that's me! :lol:
