hmukos wrote: ↑20 May 2020, 18:20
I've put 12ms as an approximate baseline that includes those factors (you gave this approximate number in earlier answer). Frame render in this case is just 0.5ms.
That 12ms was an approximation of the additional input lag the mouse/display add on top of the single 27ms reading I mentioned, though.
hmukos wrote: ↑20 May 2020, 18:20
Aren't those factors the same for 60hz and 240hz case?
This is where we keep getting caught. Yes, but the scanout is not the same. At 240Hz, the scanout occurs 180 more times per second, AND each individual cycle completes 12.4ms faster when directly compared to 60Hz. This affects all the other factors.
hmukos wrote: ↑20 May 2020, 18:20
So we clicked at A, waited for delay in B, frame was rendered in 0.5ms. Shouldn't we see the tearline right away? (I didn't quite understand the C to be honest. The scanout position should be the sum of A and B and the tearline should appear right after this scanout position, no?)
The display doesn't register the input first, the system does. The display doesn't "register" input until it appears via a tearline.

We have the point the existing scanout cycle was at when you clicked, then the number of scanout cycles that occur between the time you clicked and any additional time the input devices/system/display processing add to it (which, again, can be randomized within a certain range per input read), and then the point at which the input reflects in the current scanout, which may be just starting, half finished, or nearly finished.
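To make that chain concrete, here is a minimal Python sketch of the idea (the function name and the 12ms delay figure are illustrative assumptions, not measurements): the click lands at some offset into the current scanout cycle, the randomized input/system/display delay is added on top, and the tearline appears wherever the scanout happens to be when the new frame arrives.

```python
REFRESH_HZ = 240
SCANOUT_MS = 1000 / REFRESH_HZ  # one top-to-bottom scan takes ~4.17 ms at 240Hz

def tearline_position(click_offset_ms, extra_delay_ms, render_ms=0.5):
    """Return the fraction (0.0 to 1.0) of the way down the screen where
    the new frame first appears (the tearline), given:
      - click_offset_ms: how far into the current scanout the click landed
      - extra_delay_ms: randomized input device/system/display delay
      - render_ms: frame render time (0.5 ms in the example above)
    """
    total_ms = click_offset_ms + extra_delay_ms + render_ms
    # Whole scanout cycles elapse during the delay; the remainder is
    # where in the *current* cycle the frame flips in.
    return (total_ms % SCANOUT_MS) / SCANOUT_MS

# Example: click lands 1 ms into a scanout, with ~12 ms of added delay:
pos = tearline_position(1.0, 12.0)
print(f"Tearline appears {pos:.0%} of the way down the screen")
```

The key point the sketch illustrates: the frame does not appear "right away" at the top; it slots into wherever the in-flight scanout has reached by the time it is ready.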
--------
I also have another reply before you edited your post...
hmukos wrote: ↑20 May 2020, 17:44
then why does the gap between Min and Max tighten for higher refresh rates?
Because the individual scanout cycles are faster and occur more frequently per second at refresh rates above 60Hz, the scanout itself becomes less and less of a contributing factor to accumulative input lag, and you're progressively left with the remaining factors in the latency chain.
E.g. we're reducing the maximum contributing variance of the scanout factor from a range of 0 - 16.6ms (60Hz) to 0 - 10ms (100Hz), or 0 - 6.9ms (144Hz), or 0 - 4.2ms (240Hz), and so forth.
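Those ranges fall straight out of the scanout cycle length: one full top-to-bottom scan takes 1000/Hz milliseconds, and that is the worst-case extra wait the scanout can add. A quick Python sketch (note it rounds rather than truncates, so 60Hz prints as 16.7 ms where the text says 16.6 ms):

```python
def max_scanout_variance_ms(refresh_hz):
    # One full scanout cycle is the most the scanout can delay an input.
    return 1000 / refresh_hz

for hz in (60, 100, 144, 240):
    print(f"{hz}Hz: scanout contributes 0 - {max_scanout_variance_ms(hz):.1f} ms")
```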
If you look at the "V-SYNC off + 0 FPS (2000+ FPS)" in each of the graphs here:
https://blurbusters.com/gsync/gsync101- ... ettings/9/
You'll see the min/max range is relatively proportionate to just under a single refresh cycle across the respective refresh rates:
- 60Hz min/max range = 13ms (of 16.6ms scanout)
- 100Hz min/max range = 8ms (of 10ms scanout)
- 120Hz min/max range = 6ms (of 8.3ms scanout)
- 144Hz min/max range = 5ms (of 6.9ms scanout)
- 200Hz min/max range = 3ms (of 5ms scanout)
- 240Hz min/max range = 3ms (of 4.2ms scanout)
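As a quick sanity check on the list above (the range values are read off the linked graphs, so treat them as approximate), each measured min/max range should fit just under one scanout cycle:

```python
# Approximate min/max ranges (ms) read from the linked graphs, keyed by Hz:
ranges_ms = {60: 13, 100: 8, 120: 6, 144: 5, 200: 3, 240: 3}

for hz, rng in ranges_ms.items():
    cycle_ms = 1000 / hz          # length of one scanout cycle at this rate
    assert rng < cycle_ms         # each range stays under one refresh cycle
    print(f"{hz}Hz: {rng} ms range vs {cycle_ms:.1f} ms scanout")
```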