axaro1 wrote: ↑23 Jun 2020, 16:48
Imagine unironically believing that a team of professional testers who reviewed 110 monitors over the past years have no idea how to properly test for input lag.
IIRC RLCScontender previously said something about ELMB-Sync having 7-11ms of input lag, so I guess RTings got the entire review wrong; I may as well write an email telling them to stop testing with their shitty equipment and their incompetent staff.
ItwasLuck wrote: ↑23 Jun 2020, 15:56
I love how you skimmed through the main input lag chart @240Hz, which was done with an oscilloscope! And you do what? Jump on the fact that I added data from a Human Benchmark done @60Hz. Lol, your ignorance really baffles the mind.
I tend to agree that an oscilloscope is only a limited way to test real-world latency. My preference for latency tests is a Present()-to-photons stopwatch with full parametric disclosure (most sites don't even properly disclose their lag-test stopwatching methodology), so that it also accommodates GPU output-level latency-cooperative behaviours.
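In skeleton form, a Present()-to-photons stopwatch looks like the sketch below. This is my own hypothetical illustration, not any site's actual rig: the `present` and `photon_detected` callables are stand-ins for the real frame-flip call and photodiode hardware trigger.

```python
import time

def present_to_photons_ms(present, photon_detected, timeout_s=0.5):
    """Measure Present()-call-to-light-emission latency in milliseconds.

    `present` flips the test pattern; `photon_detected` polls the light
    sensor and returns True once the pixel change becomes visible.
    Both are injected so this sketch stays hardware-agnostic.
    """
    t0 = time.perf_counter()      # stopwatch START: the Present() call
    present()
    while not photon_detected():  # stopwatch END: photons at the sensor
        if time.perf_counter() - t0 > timeout_s:
            raise TimeoutError("no pixel change detected")
    return (time.perf_counter() - t0) * 1000.0
```

The point of starting the stopwatch at Present() rather than at the VBI is that it captures GPU-side queuing and sync behaviour inside the measured chain, instead of silently excluding it.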
There's really no way to silo display latency separately from the GPU, because many sync technologies have latency-cooperative behaviours where a "Display A better than Display B" result can invert to "B better than A" with a single GPU-level setting change.
Things that can flip an "A-better-than-B" result into "B-better-than-A":
1. Changing sensor location, or switching between photodiode vs camera (pixel-vs-pixel versus first-anywhere).
2. Changing a display setting (VRR on/off, strobe on/off).
3. Changing refresh rate (60Hz vs 240Hz).
4. Changing sync setting (VSYNC ON, VSYNC OFF, VRR, etc).
5. Changing the lag-stopwatch start parameter (VBI stopwatch start, Present() stopwatch start).
6. Changing the lag-stopwatch end parameter (GtG % setting, pixel-for-pixel change, first-anywhere change, etc).
7. Testing a longer or shorter part of the chain (one end of the cable, both ends of the cable, transceiver/ramdac or bypass, etc).
8. Testing different colors (since GtG transition times differ between colors).
9. Testing towards actual human vision, versus testing towards an abstract signal level irrelevant to human reaction times.
Changing any ONE of the above can dramatically change lag numbers (sometimes by well over 10ms!). There might be no change on display A, but a huge change (>10ms) on display B.
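Every item in that list is a parameter that belongs in a test's disclosure. As a minimal sketch (the field names here are my own invention, not any site's standard), the whole list can be captured as a record, with comparisons allowed only when every parameter matches:

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class LagTestDisclosure:
    """Every knob that can flip an A-better-than-B verdict."""
    sensor: str            # "photodiode" or "high-speed camera"
    sensor_location: str   # "top", "center", "bottom"
    refresh_hz: int        # e.g. 60 or 240
    sync_mode: str         # "VSYNC ON", "VSYNC OFF", "VRR"
    strobe: bool           # strobe backlight on/off
    stopwatch_start: str   # "VBI" or "Present()"
    stopwatch_end: str     # "GtG 50%", "first-anywhere change", etc
    chain_tested: str      # e.g. "cable input to photons"
    test_color: str        # e.g. "black-to-white", "yellow-to-blue"

def comparable(a: LagTestDisclosure, b: LagTestDisclosure) -> bool:
    """Two lag numbers are only comparable if every parameter matches."""
    return asdict(a) == asdict(b)
```

Two sites with identical equipment but different `stopwatch_end` values produce numbers that `comparable()` correctly refuses to rank against each other, which is exactly the cross-site comparison problem.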
Over the last 5 years, I've already discussed the cesspool of latency tests in the Lag Testing forum. Single-pixel lag tests can differ hugely from reaction-time testing, because of peripheral vision, and because different pixels have different lags / lag gradients / lag volatility depending on parameters (refresh rate, scan-converting TCONs, strobe on/off, VRR on/off, pixel location, VSYNC ON/OFF, etc). I've seen SO MANY displays leapfrog each other (B-better-than-A and A-better-than-B).
I am just going to put a generic disclaimer.
Lag Testing Disclaimer
1. Latency Tests Can Be Like Snowflakes.
2. No Two Snowflakes Are Alike.
3. You Can't Compare Lag Tests Across Sites.
4. Do Not Arbitrarily Bash Lag Tests Of Site A Over Site B.
5. Instead, Please Complain About Lack Of Test Disclosure.
6. Lag Is Never A Single Number.
7. Lag Is A Complex Topic.
A huge problem is that lag-test disclosure is lacking on many websites. Leo Bodnar is a 60fps VSYNC ON 0 MPRF lag tester that uses a VBI stopwatch start and an unknown GtG stopwatch end with a black-to-white pixel change. SMTT 2.0 is a 1000fps VSYNC OFF two-display-differential latency tester that uses a yellow-to-blue pixel change.
And RLCSContender, I now suggest you stop saying any website's lag test is wrong. They are all simply using different lag stopwatches. Just complain, "Websites don't show full latency-testing disclosure." (And I will also complain here: your latency-testing disclosure is incomplete too.) I will be starting work on improved latency-testing standardization.
Also, the lowest-lag pixel is the first pixel below a tearline during VSYNC OFF, so even tearline location can vary display lag, and VSYNC OFF bypasses scanout latency. On some displays, TOP=CENTER=BOTTOM for VSYNC OFF (where BOTTOM has the same 3ms as TOP), but this totally changes to TOP<CENTER<BOTTOM or TOP>CENTER>BOTTOM with other setting changes (sync setting, strobe setting, etc).
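The scanout effect is simple arithmetic. A sketch, assuming an idealized top-to-bottom sequential scanout with no TCON buffering:

```python
def scanout_delay_ms(pixel_row, total_rows, refresh_hz):
    """Extra latency for a pixel row under VSYNC ON, sequential scanout.

    Row 0 lights up first; the bottom row lights up almost one full
    refresh period later. (VSYNC OFF with a tearline just above the
    sensor bypasses most of this wait, which is why TOP/CENTER/BOTTOM
    can equalize under VSYNC OFF.)
    """
    refresh_period_ms = 1000.0 / refresh_hz
    return (pixel_row / total_rows) * refresh_period_ms
```

At 60Hz, a bottom-of-screen sensor adds nearly a full 16.7ms versus the top; at 240Hz, the same sensor position adds only about 4.2ms. This is one concrete reason sensor location alone can flip A-better-than-B.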
Now, also, GtG10%-GtG50% is roughly the approximate point where pixels become visible to human eyes (where reaction-time clocks start). Sometimes it's very delayed from the GtG0% start, and sometimes it's almost instantaneous. If a site measures only one lag number, it's best to carbon-copy human vision as closely as possible and choose a human-visible GtG number (not darn near GtG0%, not darn near GtG100%). How you tune your GtG stopwatch stop (within the noise margins of your oscilloscope, which are totally different for different brands) can mean dramatically different lag for real-world human reaction.

Moreover, peripheral vision exists. An enemy that shows up in a different part of the screen can have different lag than an enemy that shows up at the crosshairs, and a full-screen explosion flash can become visible far before screen center.
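How much the GtG stopwatch-end threshold matters can be shown with a synthetic photodiode trace. The exponential curve and its 5ms time constant below are assumptions for illustration only, not measurements of any real panel:

```python
import math

def gtg_level(t_ms, tau_ms=5.0):
    """Synthetic photodiode trace: exponential black-to-white GtG rise."""
    return 1.0 - math.exp(-t_ms / tau_ms)

def crossing_time_ms(threshold, tau_ms=5.0, step_ms=0.01):
    """First time the trace crosses `threshold` (0..1), by time-stepping."""
    t = 0.0
    while gtg_level(t, tau_ms) < threshold:
        t += step_ms
    return round(t, 2)

# Same pixel transition, three different "lag" numbers depending purely
# on where the stopwatch is stopped:
lags = {pct: crossing_time_ms(pct) for pct in (0.10, 0.50, 0.90)}
```

On this trace, GtG 10% fires at roughly 0.5ms but GtG 90% not until roughly 11.5ms, so the stopwatch-end threshold alone swings the reported "lag" by more than 10ms, on the same transition of the same pixel.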
Lag Tests Are Like Snowflakes. No Two Are Alike
Ideally, only compare lag numbers between the same reviewer/same site/same person.
So, the websites aren't being dishonest. It's simply a lag-test-parameter disclosure problem. I'm smart enough to acknowledge that.