
Is there any difference in input lag between refresh rates?

Posted: 28 May 2017, 05:04
by drmcninja
I know FPS correlates to input lag, but what about monitor refresh rate?

For example, is input lag different @ 60Hz monitor refresh rate than it is at 144Hz?

Re: Is there any difference in input lag between refresh rates?

Posted: 28 May 2017, 10:05
by Sparky
drmcninja wrote:I know FPS correlates to input lag, but what about monitor refresh rate?

For example, is input lag different @ 60Hz monitor refresh rate than it is at 144Hz?
Yes and no, depending on how you count, and on your v-sync or VRR settings.

The situation where it doesn't impact latency is when you have vsync off and you're only counting the age of the newest information on the screen. In this case the type and engineering of the monitor matter much more.

When you finish a new frame, refresh rate determines how long anything from the old frame stays on the screen. With VRR (G-Sync or FreeSync), a higher maximum refresh rate lets you run higher framerates without introducing extra input lag.

With ordinary uncapped v-sync there's a very large difference in input lag when you change refresh rate. With v-sync off, refresh rate determines how long a tear remains on screen, and if framerate isn't equal to refresh rate, a higher refresh rate means less judder.
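
To put rough numbers on that, here's a minimal sketch (the assumption that a torn or stale region persists for at most one refresh period is mine, not from the post above):

```python
# A minimal sketch: with VSYNC OFF, a tear line or stale region persists until
# the next scanout pass overwrites it, so its worst-case lifetime on screen is
# one refresh period (this simple model is an assumption, not from the post).

def refresh_period_ms(hz: float) -> float:
    """Duration of one full scanout pass, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 144, 240):
    print(f"{hz:3d} Hz: a tear/stale region can persist up to "
          f"{refresh_period_ms(hz):.2f} ms before the next scanout overwrites it")
```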

Re: Is there any difference in input lag between refresh rates?

Posted: 28 May 2017, 15:05
by spacediver
Here's one useful way that I think about it (someone correct me if I'm wrong).

Assuming vsync off, and no gsync or freesync:

Suppose you had a monitor that had 1000 rows of pixels.

You're playing a game and running at 1000 fps.

Suppose you run the monitor at 1 Hz (1 refresh per second).

This essentially means that at any given moment, there will be 1000 frames rendered on the monitor simultaneously -- one row's worth of pixels for each frame.

Now suppose at time (t) = 0 ms, the top row of pixels starts being scanned onto display. At that same time, in the server game world, an enemy appears at the bottom of the screen, and stays there. To make things simple, we'll assume the enemy consists of a single green line with a height of one row of your monitor's pixels.

Because your refresh rate is 1 Hz, you will only see the enemy start to become visible at 999 ms, when the last row is drawn on your monitor (you'll only actually see the enemy if they're still there a second after they appeared).

So there was a lag of almost a whole second due to your low refresh rate, despite your very high frame rate.
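
Here's a minimal simulation of that thought experiment (the rows, rates, and enemy position come from the post; the uniform top-to-bottom scanout timing is an assumed model):

```python
# A minimal simulation of the thought experiment: a 1000-row panel refreshed
# at 1 Hz while the game renders 1000 fps, with an enemy appearing on the
# bottom row at t = 0 ms, just as the scanout starts at the top row.

ROWS = 1000        # rows of pixels on the hypothetical monitor
REFRESH_HZ = 1     # one full scanout per second
FPS = 1000         # the game renders 1000 frames per second

row_time_ms = 1000.0 / REFRESH_HZ / ROWS  # scanout time per row: 1 ms
enemy_row = ROWS - 1                      # bottom row (0-indexed)

# The scanout reaches row r at r * row_time_ms after the refresh starts.
visible_at_ms = enemy_row * row_time_ms
print(f"Enemy first drawn at t = {visible_at_ms:.0f} ms")  # 999 ms

# Each row shows whichever rendered frame is newest when the scanout reaches
# it, so one refresh pass samples ~1000 distinct frames -- one per row.
frames = {int(r * row_time_ms * FPS / 1000) for r in range(ROWS)}
print(f"Distinct frames visible in one refresh: {len(frames)}")  # 1000
```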

Re: Is there any difference in input lag between refresh rates?

Posted: 29 May 2017, 18:49
by sharknice
To simplify it, you end up getting 10 ms less input lag on average with 144Hz.
Typically there aren't any real differences aside from the raw refresh time.

144Hz refresh rate: 1 second / 144 ≈ 7 ms
60Hz refresh rate: 1 second / 60 ≈ 17 ms


There are a lot of other factors that add to the input lag chain, but a higher refresh rate will always have lower input lag.

Re: Is there any difference in input lag between refresh rates?

Posted: 29 May 2017, 19:00
by RealNC
sharknice wrote:To simplify it, you end up getting 10 ms less input lag on average with 144Hz.
That sounds like the best-case, not the average?

At 60Hz, latency is between (0 .. 16.67] with an average of 8.33. At 144Hz, it's between (0 .. 6.94] with an average of 3.47. So on average, wouldn't 144Hz give you 4.86ms less latency?

(This is without vsync, obviously. The latency difference with vsync is huge between 60 and 144.)
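
For anyone who wants to check the arithmetic, a quick sketch (assuming, as above, that the event lands at a uniformly random time within the refresh interval):

```python
# A sketch of the scanout-latency arithmetic above, assuming the event arrives
# at a uniformly random time within the refresh interval (so the average added
# latency is half a refresh period, and the worst case is a full period).

def scanout_latency_ms(hz: float) -> tuple[float, float]:
    """Return (average, worst-case) added latency in ms at a given refresh rate."""
    period = 1000.0 / hz
    return period / 2, period

avg60, max60 = scanout_latency_ms(60)     # ~8.33 ms avg, ~16.67 ms worst case
avg144, max144 = scanout_latency_ms(144)  # ~3.47 ms avg,  ~6.94 ms worst case

print(f"Average saving at 144Hz:    {avg60 - avg144:.2f} ms")  # ~4.86 ms
print(f"Worst-case saving at 144Hz: {max60 - max144:.2f} ms")  # ~9.72 ms
```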

Re: Is there any difference in input lag between refresh rates?

Posted: 29 May 2017, 21:59
by sharknice
RealNC wrote:
sharknice wrote:To simplify it, you end up getting 10 ms less input lag on average with 144Hz.
That sounds like the best-case, not the average?

At 60Hz, latency is between (0 .. 16.67] with an average of 8.33. At 144Hz, it's between (0 .. 6.94] with an average of 3.47. So on average, wouldn't 144Hz give you 4.86ms less latency?

(This is without vsync, obviously. The latency difference with vsync is huge between 60 and 144.)
Yeah, you're right -- that would be best case, not average.

Re: Is there any difference in input lag between refresh rates?

Posted: 07 Jun 2017, 12:50
by Chief Blur Buster
Yes, scanout lag is often unchanged.

On most 144Hz monitors today with "Instant Mode" (lagless scanout), with 1000fps VSYNC OFF the top edge and bottom edge of the screen have exactly the same input lag (within an error margin of 1/framerate = 1ms) at both 60Hz and 144Hz. This is assuming LCD GtG is the same at both refresh rates. Displays are refreshed top-to-bottom, as seen in high speed videos of an LCD refreshing. It's the same sequential pixel refresh order as a CRT (except it looks like a wipe effect rather than a flicker, due to LCD being sample-and-hold).

The only difference is that each pixel is updated less often at 60Hz than at 144Hz, so at 60Hz you have more "effective" average lag because of missed samples.

A random event can have a lag modifier of (+0ms to +1/144sec) at 144Hz, but a lag modifier of (+0ms to +1/60sec) at 60Hz. The lag modifier varies depending on when the event occurs relative to the scanout. Events happening just before the scanout reaches their screen position (just below the scanout) will have less lag than events happening just after the scanout has passed.

Thus, best-case lag is identical at 60Hz and 144Hz, but worst-case lag is always worse at 60Hz than 144Hz.
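
A quick Monte Carlo sketch of that lag modifier (the uniformly random event timing is an assumption):

```python
# A Monte Carlo sketch of the lag modifier described above, assuming events
# land at uniformly random times relative to the scanout: each event picks up
# between +0 ms and +1/Hz of extra lag depending on whether the scanout has
# already passed its screen position.

import random

def simulate(hz: int, trials: int = 1_000_000) -> tuple[float, float, float]:
    period_ms = 1000.0 / hz
    lags = [random.uniform(0.0, period_ms) for _ in range(trials)]
    return min(lags), sum(lags) / trials, max(lags)

for hz in (60, 144):
    lo, avg, hi = simulate(hz)
    print(f"{hz:3d} Hz: min ~{lo:.3f} ms, avg ~{avg:.2f} ms, max ~{hi:.2f} ms")

# Best-case lag approaches 0 ms at either refresh rate, while worst-case lag
# at 60 Hz (~16.7 ms) is far worse than at 144 Hz (~6.9 ms), matching the post.
```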

Re: Is there any difference in input lag between refresh rates?

Posted: 11 Jun 2017, 06:43
by Q83Ia7ta
From my research:

Setup:
- Nikon 1 J1, 320x120 @ 1200fps
- Logitech G100s with LED
- Asus GTX 950 Strix, latest available drivers
- Win10 32-bit LTSB

Method ioquake3: local listen server with sv_fps 1000, client at 1000 fps (com_maxfps 0). Transition: huge white crosshair from white to off (reddish wall) on mouse button press.
Method M01: custom d3d9 app at ~6000 fps which changes color from black to white on mouse button press.
Method M01 gives more accurate results because of its higher frame rate (FPS).
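
For readers who want to reproduce something like M01, here's a rough sketch in Python/pygame (the original is a custom d3d9 app at ~6000 fps; this version runs far slower and only illustrates the idea):

```python
# A rough sketch of the M01 idea (hypothetical stand-in for the original
# custom d3d9 app; pygame runs far below ~6000 fps, so this only illustrates
# the method): flip the screen from black to white on mouse button press, and
# let a high-speed camera time the click-to-photon transition.

import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
color = (0, 0, 0)  # start black

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.MOUSEBUTTONDOWN:
            color = (255, 255, 255)  # white on press: the camera-visible change
        elif event.type == pygame.MOUSEBUTTONUP:
            color = (0, 0, 0)        # reset so the test can be repeated

    screen.fill(color)
    pygame.display.flip()  # present as fast as the loop allows (uncapped)

pygame.quit()
```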

Re: Is there any difference in input lag between refresh rates?

Posted: 12 Jun 2017, 16:34
by Chief Blur Buster
Very nice testing, Q83Ia7ta!

Comparing 60Hz versus 240Hz becomes quite easy, but error factors (e.g. game engine jitter, operating system jitter, etc.) add enough statistical noise to muddy min/max/avg comparisons for smaller refresh rate differentials on many panel technologies.

You should try a run with these:
- Same 240Hz monitor model, same graphics driver, same system, same settings
- The only variable becomes the refresh rate, in large steps: 60Hz, 120Hz, 180Hz and 240Hz.
- Do multiple runs to clear up as much statistical noise as possible

That will more clearly show that min lag remains the same (at any Hz) while avg/max lag changes (as Hz changes). It's useful for the longstanding debate, "does Hz affect input lag?"
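
Summarizing such runs could look something like this sketch (the numbers below are hypothetical placeholders, not measurements):

```python
# A sketch of summarizing multiple runs per refresh rate (all numbers here are
# hypothetical placeholders, not real measurements): min lag should stay about
# the same across Hz, while avg/max should shrink as Hz rises.

from statistics import mean

runs_ms = {  # refresh rate -> measured click-to-photon latencies in ms
    60:  [3.1, 12.4, 15.9, 7.2, 10.8],
    120: [3.0, 7.1, 8.9, 4.4, 6.2],
    240: [3.2, 4.0, 5.8, 3.6, 4.9],
}

for hz, samples in sorted(runs_ms.items()):
    print(f"{hz:3d} Hz: min {min(samples):.1f} ms, "
          f"avg {mean(samples):.1f} ms, max {max(samples):.1f} ms")
```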

Re: Is there any difference in input lag between refresh rates?

Posted: 13 Jun 2017, 21:54
by sharknice
It's also worth noting that if you're moving your mouse, you're constantly experiencing both the minimum and the maximum input lag.

Just looking around in a first-person shooter makes it very easy to "feel" the input lag. A 5 ms average difference may be hard to tell, though.