useplatformtick and/or disabledynamictick on mouse input lag

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Rallaz
Posts: 45
Joined: 12 May 2020, 08:41

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Rallaz » 02 Mar 2022, 10:57

kriegor wrote:
02 Mar 2022, 00:50
Chief Blur Buster wrote:
14 Jan 2022, 15:02
The last two quotes start violating forum rules (towards each other) -- but the first original statement by kriegor about an off day is somewhat ambiguous because there's no voice tone associated -- so it may have been interpretable as a friendly tip or as something offensive
I appreciate this sentiment. Opinions should not be considered offensive.

Threads like these have become far too common over the years, with strikingly similar patterns.

A user may experience a downphase, a decline in performance that can have any number of causes, but most commonly: fading motivation, poor mindset, ageing, declining health, etc.
He then starts digging online and finds loose threads of people ambiguously talking about input lag and fixing said input lag, usually with a lot of hearsay, placebo, misconception and "I want to believe in it, therefore it is true" mixed in.
"Oh wow", he thinks to himself, "may this be what held me back all along? I think yes. This is it. After all, there are two users here who say this made their game feel a lot better, so it must be true."
So he thinks he has found his magic solution to all his problems and, after applying whatever tweak or fix he found, suddenly finds his skill "reignited". "Oh wow, this feels so good. This stuff is the real deal!" - confirmation bias at its finest.

Of course, subconsciously, he knows he may be wrong. He then proceeds to let everyone else know with a forum thread about how doing this one thing magically fixed all his problems and turned him into a god. Secretly, he hopes that more people will apply the same "fix" he applied and come up with the same conclusion: "Wow, this really did something!" to reaffirm his beliefs.
He needs to convince others to convince himself.
Otherwise, his newfound reality may shatter once more and he may realize that the problem was in his head all along. Should this happen, he may experience either mental rebirth or stagnation and decline once more.

Long story short, the human mind is an astonishingly powerful tool.
Much more powerful than changing things in your registry will ever be.
With all due respect to what you just said, "the mind is a powerful tool": it is, and it should not be underestimated at all when it comes to things like input lag.

But I personally didn't start my thread about input lag two years ago without relying on macro testing: running the macro after each adjustment in regedit, the NVIDIA settings and such, before coming to a conclusion about things.

I mean, sure, it's about a certain feeling and how you perform. But if something feels quite off, and you have recordings where a macro that used to reproduce a task or movement is no longer able to complete it because of the lag, it becomes more and more obvious that something real could be going on.

Which is why, when experimenting with EMI/EMF/RFI/power stuff, I personally run a macro and compare its performance against what the baseline "no input lag" recording of the macro was able to complete, to see how much worse it gets after I change something.

This is about the best we can usually do: record something when the system is at its best, and compare against it after changes have been made.

This method has gotten me much further than most of the unproven "fixes" I've been testing.
In my case it's mostly about power-related issues, but I think the broader point is about testing methods before you say "this is the fix".

So I ended up with more of a wide spreadsheet like this:
- Changing this makes the macro Overperform
values of random stuff
- Changing this makes the macro Underperform
values of random stuff
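A minimal Python sketch of this record-and-compare idea (the timing numbers and the noise margin below are hypothetical, purely for illustration -- a real comparison would want many more runs and a proper significance test):

```python
import statistics

def compare_macro_runs(baseline_ms, tweaked_ms, margin_ms=1.0):
    """Compare macro completion times before and after a tweak.

    Returns 'overperform', 'underperform', or 'inconclusive' depending
    on whether the mean completion time moved by more than `margin_ms`
    (a crude stand-in for a real statistical test).
    """
    base_mean = statistics.mean(baseline_ms)
    tweak_mean = statistics.mean(tweaked_ms)
    delta = tweak_mean - base_mean
    if delta < -margin_ms:
        return "overperform"   # macro finished faster after the tweak
    if delta > margin_ms:
        return "underperform"  # macro finished slower after the tweak
    return "inconclusive"      # within the noise margin

# Hypothetical timings (ms) from repeated runs of the same recorded macro:
baseline = [512.4, 511.9, 512.8, 512.1, 512.5]
after_tweak = [515.7, 516.2, 515.9, 516.4, 515.8]
print(compare_macro_runs(baseline, after_tweak))  # underperform
```

The point of the margin is that single-run differences smaller than the run-to-run noise tell you nothing; only a consistent shift across repeated runs counts as a result.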

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Chief Blur Buster » 02 Mar 2022, 16:41

Just want to say...

We like to encourage users to find solutions, and there are a lot of "I solved my problem after a few years" stories that are proven legitimate, so we often tamp down on unnecessary discouragement very fast -- it's a well-known red line on Blur Busters -- to the chagrin of those posters (then I invariably get two private messages thanking me for the moderating). For every poster who dislikes a specific moderating action, we have way more posters who are profusely thankful. It's a hard line to walk.

Blur Busters = bleeding edge of milliseconds = 'nuf said.

/micdrop
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Image
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

kriegor
Posts: 29
Joined: 12 May 2014, 23:17

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by kriegor » 02 Mar 2022, 17:13

Rallaz wrote:
02 Mar 2022, 10:57
With all due respect to what you just said, "the mind is a powerful tool": it is, and it should not be underestimated at all when it comes to things like input lag.

But I personally didn't start my thread about input lag two years ago without relying on macro testing: running the macro after each adjustment in regedit, the NVIDIA settings and such, before coming to a conclusion about things.

I mean, sure, it's about a certain feeling and how you perform. But if something feels quite off, and you have recordings where a macro that used to reproduce a task or movement is no longer able to complete it because of the lag, it becomes more and more obvious that something real could be going on.

Which is why, when experimenting with EMI/EMF/RFI/power stuff, I personally run a macro and compare its performance against what the baseline "no input lag" recording of the macro was able to complete, to see how much worse it gets after I change something.

This is about the best we can usually do: record something when the system is at its best, and compare against it after changes have been made.

This method has gotten me much further than most of the unproven "fixes" I've been testing.
In my case it's mostly about power-related issues, but I think the broader point is about testing methods before you say "this is the fix".

So I ended up with more of a wide spreadsheet like this:
- Changing this makes the macro Overperform
values of random stuff
- Changing this makes the macro Underperform
values of random stuff
That is nice, but what kind of macro are you using? What macro testing? I am most curious what you have come up with and how it functions.
But it has to be said: anything that does not produce proper scientific measurements isn't serious testing. It's guesstimating. It's a crutch. You cannot draw any conclusions from it, just assumptions. That means in most cases external tools are required, such as Nvidia's LDAT.
Which is why I am extra curious what macro you came up with that overperforms and underperforms in certain situations. What's the margin of error? What scenario? What game? Underperforms in what, and by how much?

And again, tweaking this or that, or having 2ms less input lag, is for the most part not going to make you better (and besides, aside from something like turning off anti-aliasing, almost no single change will make that much of a difference anyway). It's your belief that it will that makes you better. Which is fine, really. But it brings us back to my initial statement.
Chief Blur Buster wrote:
02 Mar 2022, 16:41
there are lot of "I solved my problem after a few years" stories that are proven legitimate
First of all: I never said people don't solve problems after years.
I said: no single registry value or driver will have held you down for years; that is just completely absurd.
Second: can you show me even a single case like that? Unless external measurement devices and the scientific method are involved, it is not legitimate, just saying. Using some sort of app, or "feeling", any claim remains just a claim.

Furthermore, this doesn't change anything about what I said above.

And it remains yet to be proven that any single registry change, or any tweak of that sort, can reduce your input lag in the millisecond range.
It also has to be pointed out that there is really no point throwing around phrases like that. If you and your community intend to be taken seriously, you would be better off not lending further credibility to baseless claims and anecdotal postings that reek of placebo.

Stay rational, always scientific, and never accept any claim or anecdote as truth unless the person can provide significant evidence in the form of scientific measurements.

I perceived Blur Busters as a community dedicated to, as the name suggests, motion blur reduction. A forum dedicated to anything monitor-related.
The whole "input lag" movement has only emerged somewhat recently these past few years, and it is riddled with snake oil and pseudoscience. It is like a religion: you either convert to Lagism and believe, or you are the devil.

So again, as someone with a positive agenda, you would be advised to stick to factual truth. Saying "we are about shaving off milliseconds, 'nuff said" after I presented a rational approach makes you come across as an overenthusiastic high-schooler and someone who entertains the idea of Lagism.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Chief Blur Buster » 02 Mar 2022, 19:00

kriegor wrote:
02 Mar 2022, 17:13
Stay rational, always scientific, and never accept any claim or anecdote as truth unless the person can provide significant evidence in the form of scientific measurements.
Welcome to Blur Busters. It sounds like you're new to Blur Busters. Allow me to cross-post the following:

(Acknowledgement: The "welcome...new" part is meant in good-natured humor -- you are knowledgeable in other topics that I agree with (the 360Hz G-SYNC example), but I need to explain why we allow these "uncertain placebo, uncertain proven" posts in the Forums part of Blur Busters...)
Chief Blur Buster wrote:
02 Mar 2022, 18:40
I am bumping this thread up again to prove my point that existing tests are often lacking.

For example, latency testing is not the same as motion blur testing.

And testing flicker for human limits doesn't acknowledge other effects (motion blur, phantom arrays, etc). You flicker-test someone, they can't see beyond a specific Hz, and you falsely conclude that humans can't benefit from a display above "X Hz".

Our forums are full of users who post "probably placebo, but potentially legit" posts that we sometimes try to invent new tests for, to figure out whether it's placebo or legit. Many researchers did this -- the 2000 Hz mouse paper came about because of a Twitter debate between me and a Korean researcher, Sunjun Kim, who tested 1000Hz vs 2000Hz vs 4000Hz vs 8000Hz poll rates. He initially said "I doubt it", but after doing the scientific research paper from my prodding, he now agrees that going over 1000Hz with mice has benefits -- and the screenshot of the red/white/green squares from mouse jitter is a direct copy-and-paste from that paper. I *saw* a difference with my eyes but was too busy to do the tests myself, so I convinced another researcher.

Even though I was not cited in the paper, it was very clear that it was my Twitter thread and egging-on that convinced this Korean researcher to invent tests to try to prove himself wrong. He was unable to, and acknowledged that I was right. This has happened dozens of times already.

There are over a hundred papers that are clearly incubated indirectly from a Blur Busters forum idea, while others are more directly cited (20+ of them).

I don't have time to be involved in all of them, but we love to incubate areas that need tests, by egging other researchers on to invent tests for things where the existing tests are too narrow in scope to push the confirmed limits.

That is what Blur Busters does -- we incubate new tests (directly and indirectly) to cover what was seen with human-visible observations & we convince others to invent new tests.

I see things with my eyes, and instantly recognize that the current tests are often flawed (they don't correctly test for what I saw), to the disbelief of many disbelieving researchers, and then I either invent new tests for them -- or they invent new tests for it. And then we're invariably proven correct.

It is true that for every 100 "assumed placebo" effects, researchers later find out 10 of them are actually not placebos. But that's the nature of our forums as a crowdsourced incubator of future research ideas. This is the birth of many "Better Than 60Hz" research. So we're a Great Defender of posts about placebos around here, so we can dissect and rip them apart.

Forum users can reply "Did you do test X?" but we delete the "Haha, that's a placebo" posts here -- we're very serious about the bleeding edge of temporal display sciences around here.

Importantly, if you are researcher-minded, or a high school / university graduate eager to school people around here: whoa there, buddy. Big mistake around this area. Before replying to posts that you think are placebo effects, it is best to do homework at www.blurbusters.com/area51 as well as forums.blurbusters.com/area51 (both places). Nobody shall do instant evidence-free dismissals of what only the human eye saw.

This is a forum. Yes, we need research. Yes, we need papers. Yes, we need tests. But we also need people telling us about human-observed effects. This is why too many researchers said "Humans do not benefit from a display above X Hz" (85Hz, 120Hz, 255Hz): because they didn't test all the effects of a finite frame rate, but instead only tested flicker, or momentary brief-flash object identification (like an old fighter pilot test), or other narrow-scope effects. This was frequently spun over the years as a Hz limit of the human eye.

____

We've grown a great reputation because of this enforced open discussion about "potentially placebo, potentially legitimate" effects that we need to invent new tests for.

Sometimes we are directly cited (examples on Google Scholar where researchers cite me, my articles, my papers, or one of my www.blurbusters.com/area51 articles, etc) and sometimes I'm egging on other researchers to invent tests to cover things that are untestable using the tests they did.

TL;DR1: Tests often miss many things that the human eye/brain legitimately saw, which were only proven when additional tests were invented to cover things overlooked in earlier, too-limited-scope tests.

TL;DR2: This is why we allow discussion about things that might be "unconfirmed placebo, unconfirmed legitimate". Lots of such stuff was laughed at until it became part of the Blur Busters textbook. We are an incubator (ourselves or via other researchers) of new tests for hard-to-test stuff. We are also the site that convinced many people that >60Hz monitors were worthwhile (and technologies such as VRR or strobe backlights).
Chief Blur Buster wrote:
02 Mar 2022, 19:39
Hello,

Often, human observations are reported and other people think it's placebo. That's the problem -- sometimes tests have not yet been invented for something we saw with our eyes.

Most gaming mice are 1000Hz, but I was a long-time proponent of 2000Hz+ mice for years. No major mouse manufacturer bothered to manufacture a proper true-2000Hz gaming mouse until the Razer 8KHz came out.

Sometime in 2020, Razer released an 8KHz mouse to my enthusiastic reception, and the difference was very clearly human-visible to my eyes. But we needed tests. I was in the middle of developing tests for it, but Sunjun Kim in Korea beat me to it. The important thing is that Blur Busters was the one who encouraged them to create a research paper about the benefits of a mouse going above 1000 Hz.

October 2020

From that tweet, I pounced on the thread with a lot of my replies.
Image
You can read my tweet replies; I eventually convinced the researcher this was worth testing -- that there was a benefit to having the mouse Hz massively super-sample the display Hz to reduce mouse jitter.

June 2021

He then developed tests to verify what I had described, and announced the paper on Twitter. He also confirmed there was a benefit -- proving I was right that >1000Hz gaming mice such as the Razer 8KHz have a human-visible benefit.
Image
September 2021

The paper became visible open-access on ACM -- "Do We Need a Faster Mouse? Empirical Evaluation of Asynchronicity-Induced Jitter"
https://dl.acm.org/doi/10.1145/3472749.3474783
And its open-access PDF (publicly visible, no paywall)

Here are some key screenshots from some pages of this paper, showing the jitter (between display Hz and mouse Hz), as well as human blind testing:
Image

Image
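The asynchronicity the paper measures can be sketched numerically. Here is a minimal Python illustration (the poll and refresh rates are just example values): it counts how many mouse polls land inside each display frame; when the poll rate is not an integer multiple of the refresh rate, the count alternates, and that alternation is the jitter.

```python
def samples_per_frame(poll_hz, refresh_hz, frames=1000):
    """Exact count of mouse polls that land inside each display frame.

    Uses integer arithmetic so the count is exact: frame f spans
    [f/refresh_hz, (f+1)/refresh_hz), and poll k fires at k/poll_hz.
    """
    return [((f + 1) * poll_hz) // refresh_hz - (f * poll_hz) // refresh_hz
            for f in range(frames)]

# 1000 Hz mouse on a 240 Hz display: 1000/240 ~ 4.17 polls per frame,
# so frames alternate between 4 and 5 samples -> visible sampling jitter.
print(sorted(set(samples_per_frame(1000, 240))))   # [4, 5]

# 8000 Hz mouse: 33 or 34 samples per frame -> the same +-1 sample of
# jitter is a much smaller fraction of the per-frame total.
print(sorted(set(samples_per_frame(8000, 240))))   # [33, 34]
```

The +-1 sample wobble is the same in both cases, but at 8000Hz it is a far smaller fraction of the motion delivered per frame, which is the "super-sampling" benefit described above.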
In October 2021, I created a Twitter thread with commentary, including suggestions on additional frontiers to test (if future researchers want to), such as how changes to DPI and MPRT potentially affect the human visibility of >1000Hz mice even more powerfully than in the existing paper. The original researcher "Liked" that Twitter thread.

While I was not cited in that particular paper, it was clearly spawned from the Twitter discussion that I myself started.

Currently, there are over 100 research papers indirectly spawned by ideas influenced by Blur Busters / social media / forums / etc.
Of those with confirmed credit, I'm cited in more than 20 of them (Google Scholar search for TestUFO or Blur Busters being mentioned).
(...Brownie point: Your newer post in the 360Hz GSYNC thread is closer to the correct spirit of Blur Busters Forums...)

MT_
Posts: 113
Joined: 17 Jan 2017, 15:39

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by MT_ » 23 Mar 2022, 09:37

forii wrote:
23 Jun 2021, 17:40
If you are on 20H2 (Win 10) or later, do not touch any bcdedit commands from the Internet. All this stuff used to help on older Windows versions.

There is also no need to turn off fullscreen optimisations for a specific game/application anymore.


No need to force an even number like 0.50ms timer resolution; there is no difference between ~1ms and ~0.5ms. There can be between 16ms and 1ms, but mine was 0.997ms by default.

This is how it should look for you (cmd -> bcdedit).
I disabled HPET in Device Manager, though.

Image

If you want a better response time from your mouse/keyboard, don't plug them into USB ports right next to each other. Use separate ones for mouse and keyboard.
MatrixQW wrote:
24 Jun 2021, 16:29
forii wrote:
23 Jun 2021, 17:40
No need to force an even number like 0.50ms timer resolution; there is no difference between ~1ms and ~0.5ms. There can be between 16ms and 1ms, but mine was 0.997ms by default.
Windows' default is 15.625ms.
It's not necessary to change the timer because games (and multimedia apps) will do it and set it to 1ms.
Between 0.5 and 1ms there is indeed no difference, as you can see in the screenshots; just 0.0µs on the std. dev for the more demanding. :)
I'd like to debunk these claims that timer resolution does nothing.

First of all, Windows automatically sets the timer resolution depending on its requirements. You can use powercfg -energy (for example) to see which applications etc. request the timer resolution to be lowered (or raised).

Traditionally, the timer resolution dropped to either 1.0ms or ~0.5ms (with some variation; it also depends on the Windows build (or even 7) and whether HPET or iTSC was used). And 0.496 or other odd values could actually make the timer drift upwards or downwards in certain workloads.

Personally, my desktop is always at 2.000ms by default, as the sound card drivers I use request such a low timer. Games usually (if not always) request a minimum of 1.000ms (also partly due to MMCSS) because of latency requirements. Some older Windows 10 versions, iirc, even set 0.5ms by default thanks to MMCSS.

Just going by some logical thinking: the timer resolution can go up and down, and Windows usually decides how low to go. That doesn't mean that going lower can't yield any benefits, but obviously it can also cause other effects.

In terms of games, the timer resolution being 1 or 0.5ms HAS an effect on many internal engine framerate limiters (loads of CryEngine-based games, CS:GO, just to name a few). It does, however, depend on the implementation of the limiter in said engine; if a game used a mechanism similar to RTSS or Nvidia's latest iteration, I doubt it does much. Also, depending on the game, and barring any fps limiter, it can still make frametimes a lot smoother; not to mention certain games actually perform better in CPU-bound situations with a lowered timer (assuming you prefer playing uncapped).

Basically this means: it depends on your PERSONAL requirements. I need (and want) to run, e.g., CS:GO at a limit of, let's say, 240 fps with absolutely minimal tearing on a 120Hz monitor (basically because I consider this the optimal tradeoff between input lag and smoothness) using the internal fps limiter. I have no choice but to run the timer at 0.5ms; otherwise the tearing is atrocious and the 240fps cap and frametime target are barely ever reached.

From my own testing (with a 960 fps camera and a modified mouse with an LED attached to the left mouse button), I've also noticed that input lag slightly increases when using the 0.5ms timer, but for my purposes the lowered timer still outweighs the slight input lag increase.

It looks to be a double-edged sword. Another interesting scenario is, for instance, using G-SYNC with a built-in in-game fps limiter slightly below your refresh rate; for the sake of input latency you'd probably want to stick to 1.0ms (assuming the engine is decent and doesn't have so much frametime fluctuation that it hits the V-sync ceiling; otherwise you might want to experiment with a lowered timer resolution as well).

Either way, it looks to be a tradeoff between precision and potentially slightly increased latency. Going from 1.0 to 0.5ms might sound like an input latency reduction, but in practice it can end up increasing latency in certain situations. Lower doesn't always mean lower input lag, but it does mean higher accuracy. There is no definitive answer to what's best; it depends on the situation, your requirements, and the end result.
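As a rough illustration of the fps-limiter interaction described above, here is a simplified Python model. It assumes a naive sleep-based limiter that always wakes on the next OS timer tick, and ignores busy-wait hybrid limiters like RTSS, so the exact numbers are illustrative only:

```python
from math import ceil

def achieved_fps(target_fps, timer_res_ms, render_ms=0.0):
    """Model a naive sleep-based fps limiter.

    The limiter sleeps for the remaining frame time, but the OS can only
    wake it on a timer tick, so the sleep rounds UP to the next multiple
    of the timer resolution. Returns the frame rate actually achieved.
    """
    target_ms = 1000.0 / target_fps
    sleep_ms = max(target_ms - render_ms, 0.0)
    # Sleep completes on the first timer tick at or after the request:
    actual_sleep_ms = ceil(sleep_ms / timer_res_ms) * timer_res_ms
    frame_ms = render_ms + actual_sleep_ms
    return 1000.0 / frame_ms

# 240 fps target = 4.1667 ms frames. With a 1.0 ms timer the sleep
# rounds up to 5.0 ms -> only ~200 fps achieved:
print(round(achieved_fps(240, 1.0)))  # 200

# With a 0.5 ms timer it rounds up to 4.5 ms -> ~222 fps, much closer:
print(round(achieved_fps(240, 0.5)))  # 222
```

This is why a 240 fps cap can miss its target badly at 1.0ms timer resolution but get much closer at 0.5ms, even though neither value changes raw input lag by itself.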
LTSC 21H2 Post-install Script
https://github.com/Marctraider/LiveScript-LTSC-21H2

System: MSI Z390 MEG Ace - 2080 Super (300W mod) - 9900K 5GHz Fixed Core (De-lid) - 32GB DDR3-3733-CL18 - Xonar Essence STX II

Raphaeangelo
Posts: 8
Joined: 01 Nov 2021, 21:54

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Raphaeangelo » 14 Feb 2024, 11:20

akylen wrote:
11 Jan 2022, 07:27
For me, the problem is that this is just no fun when I play. I shoot more bullets, shoot first, shoot the guy in the back, and I always end up dying. If you're wondering about my level, I'm Global Elite and level 10 on FACEIT. This is just unplayable; my shots didn't register. It's like I have to spend 30+ bullets in Warzone to remove someone's armor, and then on the killcam the guy two-shots me with a pistol.

I don't know what is causing this, but it is not playable at all, in any game.
I 100% agree with you! Don't let anyone gaslight you. I suspect people like you and I are hyper-aware of micro changes/delays and others aren't, so they'll say "it's placebo" or "get better". I am a Masters Apex player and I 100% agree with what you are saying. If I leave HPET ON in Device Manager, inputs feel delayed, and the worst part is I feel like I have to use 30 more bullets to knock someone. Every fight is 20% harder (I'm not missing shots or having a "bad day"). As soon as I disable HPET in Device Manager and set

Code: Select all

bcdedit /set useplatformclock no
Apex feels smoother, more responsive, and people go down easier with the "correct" number of bullets. Feels less sweaty. I'm 100% convinced HPET ON causes the issues you're describing; I just want to know why. I don't necessarily like endlessly "tweaking" my system just so it plays well. Anyone know why this is happening?

Noirpirate85
Posts: 17
Joined: 06 Jan 2024, 16:33

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Noirpirate85 » 15 Feb 2024, 09:55

I can't even use these commands; I don't have any of these entries in my bootloader, so I never know whether it would make a difference or not.
Can anyone tell me why? I never disabled anything.

Lev1n
Posts: 26
Joined: 17 Apr 2023, 05:40

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Lev1n » 16 Feb 2024, 11:28

I don't know what's wrong with my PC. When I enable those tick settings and the default timer, every game I play is hella smooth, but the mouse is not responsive; it feels like it weighs 1kg, I can't track enemies, can't snap, and it's like playing at 120 ping. When I disable them and use a 0.5 or 1ms timer, games and Windows are hella responsive, but games are not smooth, especially when shooting. It's been like this across 3 different rigs (9900K, 10850K, and now a 12900KS); the only things that haven't changed are my room and my RAM sticks.

Any suggestions to fix this thing?

Slender
Posts: 590
Joined: 25 Jan 2020, 17:55

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Slender » 16 Feb 2024, 17:03

Lev1n wrote:
16 Feb 2024, 11:28
I don't know what's wrong with my PC. When I enable those tick settings and the default timer, every game I play is hella smooth, but the mouse is not responsive; it feels like it weighs 1kg, I can't track enemies, can't snap, and it's like playing at 120 ping. When I disable them and use a 0.5 or 1ms timer, games and Windows are hella responsive, but games are not smooth, especially when shooting. It's been like this across 3 different rigs (9900K, 10850K, and now a 12900KS); the only things that haven't changed are my room and my RAM sticks.

Any suggestions to fix this thing?
First of all, what Windows version do you use?
Win10 uses TSC + a TSC tick.
Win11 uses TSC + an RTC tick.
disabledynamictick only has an effect on Win10.
useplatformtick is always on in Win11 (disabledynamictick is not used there, because of that RTC tick).
And yes, if you disable HPET in the BIOS, you get worse performance with useplatformtick (on Win10; it's the default behavior on Win11). What settings did you change?

Lev1n
Posts: 26
Joined: 17 Apr 2023, 05:40

Re: useplatformtick and/or disabledynamictick on mouse input lag

Post by Lev1n » 17 Feb 2024, 10:09

Slender wrote:
16 Feb 2024, 17:03
Lev1n wrote:
16 Feb 2024, 11:28
I don't know what's wrong with my PC. When I enable those tick settings and the default timer, every game I play is hella smooth, but the mouse is not responsive; it feels like it weighs 1kg, I can't track enemies, can't snap, and it's like playing at 120 ping. When I disable them and use a 0.5 or 1ms timer, games and Windows are hella responsive, but games are not smooth, especially when shooting. It's been like this across 3 different rigs (9900K, 10850K, and now a 12900KS); the only things that haven't changed are my room and my RAM sticks.

Any suggestions to fix this thing?
First of all, what Windows version do you use?
Win10 uses TSC + a TSC tick.
Win11 uses TSC + an RTC tick.
disabledynamictick only has an effect on Win10.
useplatformtick is always on in Win11 (disabledynamictick is not used there, because of that RTC tick).
And yes, if you disable HPET in the BIOS, you get worse performance with useplatformtick (on Win10; it's the default behavior on Win11). What settings did you change?
It's 21H2 Win10, with Defender and updates disabled through NTLite; everything else is default. The CPU is OC'd to 5.3GHz at 1.3V. RAM is XMP on Gear 1 (4000 CL16). I didn't touch anything else in the BIOS.
Can you send the commands for the tweaks you suggest?
