So you'd rather use your intuition instead? I can't imagine that's what you're suggesting, but I'm not clear on what your proposed alternative is, or whether it actually exists currently.
Optimization Hub for beginners
Re: Optimization Hub for beginners
Measure display input lag the cheap way or the best way (IMHO, but I'm biased).
Re: Optimization Hub for beginners
In an ideal scenario you'd want to make use of both human feel and click-to-photon tests (until we have something better to measure with). What I really dislike, though, is the way people digest them. Instead of progressing towards "oh I see, it actually matters", it's regressing towards "see, I told you it doesn't matter at all, humans can't feel milliseconds", which ultimately leads to things like Windows being extremely bloated with a massive impact on input lag (without any backlash from the user base) and endless graphical advancements at the cost of latency (most of the time forced, not optional).

It just seems to me as if these tests are further reinforcing the general consensus of "it doesn't matter, you're insane". It feels like "humans can't notice 60+ fps" all over again, and I'm honestly not even sure what kind of test results would change the hive mind (short of them experiencing it for themselves). Maybe I'm overestimating the impact of full latency graphs on people's opinions about this. In the end they will probably just say the same old "it doesn't matter".
Starting point for beginners: PC Optimization Hub
Re: Optimization Hub for beginners
Human "feel" is way too open to bias. You need objective data to back it up. And, as you've said, click-to-photon tests are too laborious to get decent data from. At least with a photodiode test you can be sure your monitor is performing at its very best.
Re: Optimization Hub for beginners
Having a Reflex Latency Analyzer monitor, I can tell you that I am still having more success with debugging and identifying issues through feel than through the numbers I'm getting over 100+ samples (although it's so goddamn frustratingly long to do this with RLA, as I don't have a CSV file and have to record each sample one by one manually).
Though, having both combined is very, very nice. It's weird: if you're really in tune with your game of choice for your testing scenarios, it's ridiculous how accurate your feel truly is. Just going by the input lag samples, a 2ms change at times feels wildly different (but this is also most likely not due to the raw input lag change alone, rather a combination of other things that are not seen with the RLA). I wouldn't say I'm able to notice 2ms at all if it was actually just 2ms.
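For what it's worth, once the samples are typed in by hand, summarizing them is trivial; a quick sketch (the numbers below are made up, real RLA samples would go in the list):

```python
import statistics

def summarize(samples_ms):
    """Summarize a list of manually recorded end-to-end latency samples (ms)."""
    s = sorted(samples_ms)
    return {
        "min": s[0],
        "mean": round(statistics.mean(s), 2),
        "median": statistics.median(s),
        "p95": s[int(0.95 * (len(s) - 1))],  # crude percentile, fine for 100+ samples
        "max": s[-1],
    }

# Made-up example numbers:
print(summarize([14.2, 15.1, 13.8, 16.4, 14.9, 15.5, 22.3, 14.1]))
```

Even this much makes it easier to spot a fat tail (one big max) versus a genuine shift in the average.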
Long-ass TL;DR: I think RLA and the like will be overall good for latency progression and more, but Brainlet's concerns are very valid. It would be frustrating to see a world where these numbers are used as a false debunker, which they're very capable of becoming considering their accuracy for the exact thing they try to test. Don't underestimate feel; even if there has been so much placebo and the like over the years, until consumers have much better tools, feel is still too important an asset. I bet that Chief has more often than not troubleshot monitor problems purely through perception over the years, and then eventually supplemented with tools and data if needed to further assist in the troubleshooting.
Re: Optimization Hub for beginners
diakou wrote: ↑27 Dec 2020, 21:27
Having a reflex latency analyzer monitor, I can tell you that I am still having more success with debugging and identifying issues through feel and not the numbers I'm getting over 100+ samples (although it's so goddamn frustratingly long to do this with RLA as I don't have CSV file and have to individually sample them one by one manually)

Hopefully you aren't suggesting that you can feel lag that the hard numbers don't show...
Though the reflex latency analyzer monitor doesn't measure actual pixel changes so if it's the monitor's fault I suppose there could be something to that.
Meanwhile, what is really needed here to prove OR disprove the idea that pro gamers can feel 2ms of lag is a simple experiment where lag is added artificially and the ability of said pros to pick out when there's extra lag is tested. I haven't seen that published anywhere, have you?
- MaxTendency
- Posts: 59
- Joined: 22 Jun 2020, 01:47
Re: Optimization Hub for beginners
Except click-to-photon tests aren't hard numbers; they're quite incomplete, as they cannot measure latency continuously.
There's a more detailed post explaining this: https://forums.blurbusters.com/viewtopi ... 482#p58482
This is another reason why click-to-photon tests aren't an end-all be-all. Take 1 kHz vs 8 kHz mice, for example: it's merely a 0.875ms difference on paper, not even a millisecond. So someone might artificially induce 0.875ms of lag, and after seeing people couldn't notice it, come to the conclusion that 8 kHz is useless and unnoticeable.
That conclusion would be incorrect, because 8 kHz polling doesn't ONLY reduce latency; it also adds smoothness that click-to-photon tests will not be able to detect.
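For a rough sense of the numbers (a back-of-envelope sketch, not a measurement): the average added latency is about half the polling interval, while the sample density per rendered frame is what the smoothness argument is actually about.

```python
def polling_stats(poll_hz, fps):
    """Rough polling-rate arithmetic: latency cost and sample density."""
    interval_ms = 1000 / poll_hz
    return {
        "avg_added_latency_ms": interval_ms / 2,  # input waits ~half an interval on average
        "worst_case_ms": interval_ms,             # input just missed the last poll
        "samples_per_frame": poll_hz / fps,       # more samples = smoother cursor path
    }

# Assuming a 240 fps game for illustration:
for hz in (1000, 8000):
    print(hz, "Hz:", polling_stats(hz, fps=240))
```

The worst-case delta is the 0.875ms "on paper" figure from above, yet the 8 kHz mouse delivers 8x as many position samples per frame, which a click-to-photon number simply never sees.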
TL;DR Yes, it's good to have some objective measurements, and click-to-photon is better than nothing. But it has serious shortcomings and should not be treated as hard fact.
Re: Optimization Hub for beginners
MaxTendency wrote: ↑28 Dec 2020, 13:13
TL;DR Yes, it's good to have some objective measurements, and click to photon is better than nothing. But it has serious shortcomings and should not be treated as hard fact.

When performed and interpreted properly, click-to-photon test results can and should be treated as hard fact where the absolute spread of min/avg/max input lag values is concerned in the given test scenario, but they are indeed not the complete picture where the distribution of said values over a period of uninterrupted frames is concerned.
Anyone criticizing the shortcomings of high-speed/photodiode methods is barking up the wrong tree, as no method currently exists to test for the differences you are suggesting, and these methods aren't intended to.
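To illustrate the spread-vs-distribution distinction: two runs can share identical min/avg/max and still behave very differently frame to frame. A small sketch with made-up sample sets:

```python
import statistics

# Two hypothetical click-to-photon runs with identical min/avg/max (ms):
run_a = [10, 15, 15, 15, 15, 15, 20]  # tight around the mean, one-off outliers
run_b = [10, 11, 12, 15, 18, 19, 20]  # constantly wandering

for name, run in (("A", run_a), ("B", run_b)):
    print(name, "min/avg/max:", min(run), statistics.mean(run), max(run),
          "stdev:", round(statistics.stdev(run), 2))
```

A min/avg/max summary reports both runs as the same display, while run B's frame-to-frame variation is exactly the kind of thing a full latency graph would expose.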
See this related post I just made (on happenstance) for supplement:
https://forums.blurbusters.com/viewtopi ... =10#p61945
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series
Displays: ASUS PG27AQN, LG 48C4 VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)
Re: Optimization Hub for beginners
I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease. Do you see what I'm getting at now? Data without extensive latency graphs like this can easily be abused as debunking "proof".
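Back-of-envelope check (assuming a report arrives, on average, half a polling interval after the physical event): halving the interval from 500 Hz to 1000 Hz predicts about 0.5ms average reduction, the same ballpark as the measured figure.

```python
# Average added latency ≈ half the polling interval.
avg_500hz = (1000 / 500) / 2    # 1.0 ms average wait at 500 Hz
avg_1000hz = (1000 / 1000) / 2  # 0.5 ms average wait at 1000 Hz

print(avg_500hz - avg_1000hz, "ms expected average reduction")
```

So the tiny measured number is exactly what averaging predicts; it says nothing about the tracking-smoothness side of the upgrade.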
Also, regarding piLagTester:

The lag testing software runs on Linux
The Pi draws a black background, and then roughly once a second displays a set of target rectangles (top/middle/bottom). You place the sensor over the desired target, and the piLagTester measures the monitor's response starting from the moment the frame of video data is sent over the Pi's HDMI port.

I don't see how this helps with measuring optimizations affecting the pipeline between mouse and GPU output in competitive shooters on Windows.
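Not a substitute for the real thing, but the measurement principle described above can be sketched roughly like this (the `fake_*` hardware stand-ins below are made up so the sketch actually runs; a real rig would flip the frame over HDMI and read a photodiode):

```python
import time

def measure_lag_ms(flip_frame, read_sensor, threshold=0.5, timeout_s=0.5):
    """Timestamp the frame flip, then poll the light sensor until the
    target rectangle actually lights up on the panel."""
    t0 = time.monotonic()
    flip_frame()  # the moment the frame of video data goes out the port
    while time.monotonic() - t0 < timeout_s:
        if read_sensor() >= threshold:
            return (time.monotonic() - t0) * 1000.0
    return None  # sensor never saw the change

# Simulated hardware so the sketch is self-contained: a "display"
# whose pixels light up 20 ms after the flip.
lit_at = []
def fake_flip(): lit_at.append(time.monotonic() + 0.020)
def fake_sensor(): return 1.0 if lit_at and time.monotonic() >= lit_at[0] else 0.0

print(round(measure_lag_ms(fake_flip, fake_sensor), 1), "ms")  # ≈ 20 ms
```

Note the clock starts at the flip, not at a mouse click, which is exactly why this isolates the display and says nothing about the mouse-to-GPU pipeline.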
Re: Optimization Hub for beginners
Brainlet wrote: ↑28 Dec 2020, 17:47
I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease.

What's the source article? Is NVIDIA seriously attempting to test differences in mouse tracking via a click-to-photon test? If so, of course that's not accurate. Mouse polling isn't even directly related to average input lag levels in that context.
(jorimt: /jor-uhm-tee/)
Re: Optimization Hub for beginners
Brainlet wrote: ↑28 Dec 2020, 17:47
I assume we can all agree that the jump from 500 Hz to 1000 Hz mouse polling rate is significant, yet NVIDIA research measured a measly 0.4ms latency decrease.

jorimt wrote: ↑28 Dec 2020, 19:22
What's the source article? Is NVIDIA seriously attempting to test differences in mouse tracking via a click-to-photon test? If so, of course that's not accurate. Mouse polling isn't even directly related to average input lag levels in that context.

Exactly why I'm concerned in general. People WILL produce all kinds of data to mislead the majority of consumers when it comes to input lag impacting user experience.
Here is the source article:
https://www.nvidia.com/en-us/geforce/gu ... ion-guide/