Optimization Hub for beginners

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
User avatar
xeos
Posts: 42
Joined: 12 Jul 2018, 14:54
Contact:

Re: Optimization Hub for beginners

Post by xeos » 26 Dec 2020, 21:55

Brainlet wrote:
26 Dec 2020, 16:40
A little bit of prejudice. I've seen way too many people claim "it doesn't matter" or "margin of error" when they see a sub 0.5ms difference while in reality it can completely alter motion perception since min/max is often impacted as well.
So you'd rather use your intuition instead? I can't imagine that's what you're suggesting, but I'm not clear on what alternative you're proposing that actually exists currently.
Measure display input lag the cheap way or the best way (IMHO, but I'm biased).

Brainlet
Posts: 69
Joined: 30 May 2020, 12:39
Contact:

Re: Optimization Hub for beginners

Post by Brainlet » 26 Dec 2020, 23:34

xeos wrote:
26 Dec 2020, 21:55
Brainlet wrote:
26 Dec 2020, 16:40
A little bit of prejudice. I've seen way too many people claim "it doesn't matter" or "margin of error" when they see a sub 0.5ms difference while in reality it can completely alter motion perception since min/max is often impacted as well.
So you'd rather use your intuition instead? I can't imagine that's what you're suggesting, but I'm not clear on what alternative you're proposing that actually exists currently.
In an ideal scenario you'd want to make use of both human feel and click-to-photon tests (until we have something better to measure with). What I really dislike, though, is the way people digest them. Instead of progressing towards "oh I see, it actually matters", it's regressing towards "see, I told you it doesn't matter at all, humans can't feel milliseconds", which ultimately leads to things like Windows being extremely bloated with a massive impact on input lag (without any backlash from the user base) and endless graphical advancements at the cost of latency (most of the time forced, not optional).

It seems to me as if these tests are just further reinforcing the general consensus of "it doesn't matter, you're insane". It feels like "humans can't notice 60+ fps" all over again, and I'm honestly not even sure what kind of test results would change the hive mind (short of them experiencing it for themselves). Maybe I'm overestimating the impact of full latency graphs on people's opinions about this. In the end they will probably just say the same old "it doesn't matter".
Starting point for beginners: PC Optimization Hub

User avatar
xeos
Posts: 42
Joined: 12 Jul 2018, 14:54
Contact:

Re: Optimization Hub for beginners

Post by xeos » 27 Dec 2020, 11:10

Human "feel" is way too open to bias. You need objective data to back it up. And, as you've said, click-to-photon tests are too laborious to get decent data. At least with a photodiode test you can be sure your monitor is performing at its very best.
Measure display input lag the cheap way or the best way (IMHO, but I'm biased).

diakou
Posts: 76
Joined: 09 Aug 2020, 11:28

Re: Optimization Hub for beginners

Post by diakou » 27 Dec 2020, 21:27

xeos wrote:
27 Dec 2020, 11:10
Human "feel" is way too open to bias. You need objective data to back it up. And, as you've said, click-to-photon tests are too laborious to get decent data. At least with a photodiode test you can be sure your monitor is performing at its very best.
Having a Reflex Latency Analyzer monitor, I can tell you that I am still having more success debugging and identifying issues through feel than through the numbers I'm getting over 100+ samples (although it's so goddamn frustratingly long to do this with RLA, as I don't have a CSV file and have to record the samples one by one manually).

Though, having both combined is very, very nice. It's weird: if you're really in tune with a game of choice for your testing scenarios, it's ridiculous how accurate your feel truly is. I mean, just going by input lag testing samples, a 2ms change at times feels wildly different, wildly different (but this is also most likely not due to the raw input lag change alone, but a combination of other things that are not seen with the RLA). I wouldn't say I'm able to notice 2ms at all if it were actually just 2ms.

Long-ass TL;DR: I think RLA and the like will be good overall for latency progression and more, but Brainlet's concerns are very valid. It would be frustrating to see a world where these numbers are used as a false debunker, which they're very capable of becoming given their accuracy for the exact thing they try to test. Don't underestimate feel: even if there has been so much placebo and the like for years, until consumers have much better tools, feel is still too important an asset. I bet the Chief has more often than not troubleshot monitor problems purely through perception over the years, and then eventually supplemented with tools and data if needed to further assist in the troubleshooting.

User avatar
xeos
Posts: 42
Joined: 12 Jul 2018, 14:54
Contact:

Re: Optimization Hub for beginners

Post by xeos » 28 Dec 2020, 10:56

diakou wrote:
27 Dec 2020, 21:27
Having a Reflex Latency Analyzer monitor, I can tell you that I am still having more success debugging and identifying issues through feel than through the numbers I'm getting over 100+ samples (although it's so goddamn frustratingly long to do this with RLA, as I don't have a CSV file and have to record the samples one by one manually).
Hopefully you aren't suggesting that you can feel lag that the hard numbers don't show...

Though the Reflex Latency Analyzer monitor doesn't measure actual pixel changes, so if it's the monitor's fault, I suppose there could be something to that.

Meanwhile, what is really needed here to prove OR disprove the idea that pro gamers can feel 2ms of lag is a simple experiment where lag is added artificially and the ability of said pros to pick out when there's extra lag is tested. I haven't seen that published anywhere, have you?
Measure display input lag the cheap way or the best way (IMHO, but I'm biased).

User avatar
MaxTendency
Posts: 59
Joined: 22 Jun 2020, 01:47

Re: Optimization Hub for beginners

Post by MaxTendency » 28 Dec 2020, 13:13

xeos wrote:
28 Dec 2020, 10:56
Hopefully you aren't suggesting that you can feel lag that the hard numbers don't show...
Except click-to-photon tests aren't hard numbers; they're quite incomplete, as they cannot measure latency continuously.
There's a more detailed post explaining this https://forums.blurbusters.com/viewtopi ... 482#p58482
xeos wrote:
28 Dec 2020, 10:56
Meanwhile, what is really needed here to prove OR disprove the idea that pro gamers can feel 2ms of lag is a simple experiment where lag is added artificially and the ability of said pros to pick out when there's extra lag is tested.
This is another reason why click-to-photon tests aren't an end-all be-all. Take 1kHz vs 8kHz mice, for example: it's merely a 0.875ms difference on paper, not even a millisecond. So someone might artificially induce 0.875ms of lag and, after seeing people couldn't notice it, conclude that 8kHz is useless and unnoticeable.

That conclusion would be incorrect. This is because 8khz polling doesn't ONLY reduce latency, it also adds smoothness that click to photon tests will not be able to detect.
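To put rough numbers on that (my own back-of-the-envelope sketch, not from any official spec), here is where the 0.875ms figure comes from, plus the updates-per-frame gap that shows up as smoothness rather than as a click-to-photon number:

```python
# Polling interval = 1 / polling rate. The on-paper latency gap between
# 1000 Hz and 8000 Hz is simply the difference of their intervals.

def poll_interval_ms(rate_hz: float) -> float:
    """Time between mouse position updates, in milliseconds."""
    return 1000.0 / rate_hz

gap_ms = poll_interval_ms(1000) - poll_interval_ms(8000)
print(f"1000 Hz interval: {poll_interval_ms(1000):.3f} ms")  # 1.000 ms
print(f"8000 Hz interval: {poll_interval_ms(8000):.3f} ms")  # 0.125 ms
print(f"on-paper latency gap: {gap_ms:.3f} ms")              # 0.875 ms

# Smoothness angle: at 360 Hz refresh (~2.78 ms per frame), a 1000 Hz
# mouse delivers ~2-3 updates per frame while an 8000 Hz mouse delivers
# ~22, so the cursor path within each frame is far less stair-stepped --
# something a single click-to-photon timestamp never sees.
frame_ms = 1000.0 / 360
print(f"updates per frame @ 1 kHz: {frame_ms / poll_interval_ms(1000):.1f}")
print(f"updates per frame @ 8 kHz: {frame_ms / poll_interval_ms(8000):.1f}")
```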
Chief Blur Buster wrote:
01 Oct 2020, 10:20

1000Hz mouse at 360 Hz: [image]

8000Hz mouse at 360 Hz: [image]
TL;DR: Yes, it's good to have some objective measurements, and click-to-photon is better than nothing. But it has serious shortcomings and should not be treated as hard fact.

User avatar
jorimt
Posts: 1498
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Optimization Hub for beginners

Post by jorimt » 28 Dec 2020, 13:27

MaxTendency wrote:
28 Dec 2020, 13:13
TL;DR: Yes, it's good to have some objective measurements, and click-to-photon is better than nothing. But it has serious shortcomings and should not be treated as hard fact.
When performed and interpreted properly, click-to-photon test results can and should be treated as hard fact where the absolute spread of min/avg/max input lag values is concerned in the given test scenario, but they are indeed not the complete picture where the distribution of said values over a period of uninterrupted frames is concerned.
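To illustrate the spread-vs-distribution point with invented numbers (a toy sketch of mine, not real measurements): two test runs can share the exact same min/avg/max summary yet behave very differently frame to frame.

```python
# Two hypothetical click-to-photon runs (values in ms, made up for
# illustration). Both have min=14, avg=15, max=16 -- identical "spread" --
# but one is steady while the other clusters its slow frames together.
from statistics import mean, pstdev

steady = [14, 15, 16, 15, 14, 16, 15, 15]  # evenly mixed
spiky  = [14, 14, 14, 14, 16, 16, 16, 16]  # same spread, bunched up

def summary(samples):
    """The min/avg/max triple a typical click-to-photon report gives you."""
    return (min(samples), mean(samples), max(samples))

print("steady:", summary(steady), "stdev:", round(pstdev(steady), 2))
print("spiky :", summary(spiky),  "stdev:", round(pstdev(spiky), 2))
# Identical summaries; only looking at the per-frame sequence (or at
# least the deviation) reveals the difference in consistency.
```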

Anyone criticizing the shortcomings of highspeed/photodiode methods is barking up the wrong tree, as no method currently exists to test for the differences you are suggesting, and those methods aren't intended to.

See this related post I just made (on happenstance) for supplement:
https://forums.blurbusters.com/viewtopi ... =10#p61945
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: Acer Predator XB271HU / LG 48CX OS: Windows 10 MB: ASUS ROG Maximus X Hero CPU: i7-8700k GPU: EVGA RTX 3080 FTW3 UG RAM: 32GB G.SKILL TridentZ @3200MHz

Brainlet
Posts: 69
Joined: 30 May 2020, 12:39
Contact:

Re: Optimization Hub for beginners

Post by Brainlet » 28 Dec 2020, 17:47

I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease. Do you see what I'm getting at now? Data without extensive latency graphs like this can easily be abused as debunking "proof".
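For what it's worth, a quick sketch of the on-paper math (mine, and simplified: it only models polling delay, nothing downstream) shows why an average-based number lands near that 0.4ms while hiding a bigger worst-case change:

```python
# A click lands at a random point within the polling interval, so the
# AVERAGE wait for the next poll is half the interval, while the WORST
# CASE is the full interval.

def avg_poll_delay_ms(rate_hz: float) -> float:
    """Expected added delay from polling alone, in milliseconds."""
    return (1000.0 / rate_hz) / 2

avg_gain = avg_poll_delay_ms(500) - avg_poll_delay_ms(1000)
worst_gain = (1000.0 / 500) - (1000.0 / 1000)
print(f"average reduction, 500 -> 1000 Hz:    {avg_gain:.2f} ms")   # 0.50 ms
print(f"worst-case reduction, 500 -> 1000 Hz: {worst_gain:.2f} ms") # 1.00 ms
# An averaged click-to-photon result (~0.4-0.5 ms here) understates the
# full-interval tail that shows up in min/max and in moment-to-moment feel.
```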

Also, regarding piLagTester:
The lag testing software runs on Linux
The pi draws a black background, and then roughly once a second displays a set of target rectangles (top/middle/bottom). You place the sensor over the desired target, and the piLagTester measures the monitor's response starting from the moment the frame of video data is sent over the Pi's HDMI port.
I don't see how this helps measure optimizations affecting the pipeline between mouse and GPU output in competitive shooters on Windows.
Starting point for beginners: PC Optimization Hub

User avatar
jorimt
Posts: 1498
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Optimization Hub for beginners

Post by jorimt » 28 Dec 2020, 19:22

Brainlet wrote:
28 Dec 2020, 17:47
I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease.
What's the source article? Is Nvidia seriously attempting to test differences in mouse tracking via a click-to-photon test? If so, of course that's not accurate. Mouse polling isn't even directly related to average input lag levels in that context.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: Acer Predator XB271HU / LG 48CX OS: Windows 10 MB: ASUS ROG Maximus X Hero CPU: i7-8700k GPU: EVGA RTX 3080 FTW3 UG RAM: 32GB G.SKILL TridentZ @3200MHz

Brainlet
Posts: 69
Joined: 30 May 2020, 12:39
Contact:

Re: Optimization Hub for beginners

Post by Brainlet » 28 Dec 2020, 19:29

jorimt wrote:
28 Dec 2020, 19:22
Brainlet wrote:
28 Dec 2020, 17:47
I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease.
What's the source article? Is Nvidia seriously attempting to test differences in mouse tracking via a click-to-photon test? If so, of course that's not accurate. Mouse polling isn't even directly related to average input lag levels in that context.
Exactly why I'm concerned in general. People WILL produce all kinds of data to mislead the majority of consumers when it comes to input lag impacting user experience.

Here is the source article:
https://www.nvidia.com/en-us/geforce/gu ... ion-guide/
Starting point for beginners: PC Optimization Hub
