
Does VRR Increase or Decrease Input Lag?

Posted: 09 Mar 2025, 08:38
by Enigma
So, I saw a guy on YouTube who measured input lag and said that disabling VRR on a PS5 with his LG C4 resulted in 20ms lower input lag. A friend of mine said this can't be, because VRR is supposed to reduce input lag (he says, for example, that when your GPU produces 60 FPS and your monitor is 120Hz, VRRing the monitor down to 60Hz results in less input lag, because it won't have to wait for frames. Is this even true?)

Anyway, I went down the rabbit hole. I found a bunch of people on YouTube who report increased input lag (anywhere from 1ms to much higher), while places like RTINGS say any additional input lag is insignificant enough that it's not worth mentioning. Then I read the G-SYNC article on Blur Busters, but I believe it's from 2017. My own reasoning is that any additional computing (whether by the separate G-SYNC chip or by the GPU itself, in the case of HDMI VRR or AMD FreeSync) will add SOME extra latency, even if only a little.

Could anybody please shed light?

Thank you so much,
Stoked to have found this community. I love analytical minds ☺️

Re: Does VRR Increase or Decrease Input Lag?

Posted: 09 Mar 2025, 10:18
by jorimt
Enigma wrote:
09 Mar 2025, 08:38
(he says, for example, that when your GPU produces 60 FPS and your monitor is 120Hz, VRRing the monitor down to 60Hz results in less input lag, because it won't have to wait for frames. Is this even true?)
Yes, because each frame then scans in at the speed of the current physical refresh rate of the display (120Hz = 8.3ms scanout) instead of at the frametime of the lower framerate (60 FPS = 16.6ms frametime):
https://blurbusters.com/gsync/gsync101- ... ttings/13/

This is technically achievable with no sync in the same scenario; VRR just does it without tearing. It's essentially an automated form of QFT (Quick Frame Transport).
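As a back-of-the-envelope check, the scan-in arithmetic above can be sketched like this (illustrative numbers, not tied to any particular display):

```python
# Minimal sketch of the scan-in arithmetic above (illustrative numbers only).

def frame_interval_ms(fps: float) -> float:
    """Game-side time between frames at a given framerate."""
    return 1000.0 / fps

def scanin_ms(refresh_hz: float) -> float:
    """With VRR, each frame scans in at the panel's max physical refresh speed."""
    return 1000.0 / refresh_hz

print(f"60 FPS frame interval: {frame_interval_ms(60):.1f} ms")  # 16.7 ms
print(f"120 Hz VRR scan-in:    {scanin_ms(120):.1f} ms")         # 8.3 ms
```

The frame still only arrives every 16.7ms at 60 FPS, but each one is delivered to the panel in 8.3ms on a 120Hz display.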
Enigma wrote:
09 Mar 2025, 08:38
Then I read the G-SYNC article on Blur Busters, but I believe it's from 2017
Nothing involving VRR latency has changed since then; the simplest answer is that VRR prevents sync latency, but does not add to or reduce any other form of latency. I.e., it's essentially neutral.

Further, as described below, any "latency" that can be attributed to VRR when compared directly to no sync is due to its lack of tearing, but that obviously doesn't count, since tearing prevention is the entire purpose of VRR in the first place:
https://blurbusters.com/gsync/gsync101- ... ettings/6/
To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC has an increase in latency over the other two methods. However, the delivery of a single, complete frame with G-SYNC + V-SYNC is actually the lowest possible, or neutral speed, and the advantage seen with V-SYNC OFF is the negative reduction in delivery speed, due to its ability to defeat the scanout.

Bottom-line, within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display the fastest the scanout allows; any faster, and tearing would be introduced.
Now, since VRR is limited to functioning within the refresh rate, the lowest frametime that can be achieved with it is set by the maximum physical refresh rate of the display, whereas no sync can achieve even lower frametimes, since the framerate can exceed the refresh rate (by multiple times in many cases). But then, of course, there's tearing and more potential for unmitigated frametime jitter due to the uncapped framerate:
https://blurbusters.com/gsync/gsync101- ... ettings/9/
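To put rough numbers on that (a hypothetical sketch; the 240Hz panel and 1000 FPS uncapped framerate are assumed figures for illustration):

```python
# Lowest achievable display-side frametime: VRR (capped at refresh) vs. no sync.
refresh_hz = 240                      # assumed panel refresh rate
vrr_min_ms = 1000.0 / refresh_hz      # VRR can't go below the refresh interval
uncapped_fps = 1000                   # hypothetical uncapped framerate with no sync
nosync_min_ms = 1000.0 / uncapped_fps
print(f"VRR floor:     {vrr_min_ms:.2f} ms")     # 4.17 ms
print(f"No-sync floor: {nosync_min_ms:.2f} ms")  # 1.00 ms
```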

That said, the lowest achievable frametime isn't traditionally considered a reduction in latency, but rather an increase in average framerate, and the frametime alone does not determine whether the appearance of user input is further delayed.

Hence, if having VRR enabled is adding any latency to the chain, the cause would be the given display model/console, its firmware, and/or a (PC GPU) driver quirk in that particular case, or a misunderstanding by the tester of what is and isn't an increase in actual input latency; it would not be the fundamental behavior of VRR itself.

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 10:09
by roro13200
Hi, I wanted to know: what is the point of faster scanout? When I play at 60 FPS on PC, even with G-SYNC, I see that the frametime is always 16.67ms.

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 10:56
by RealNC
roro13200 wrote:
16 Mar 2025, 10:09
Hi, I wanted to know: what is the point of faster scanout? When I play at 60 FPS on PC, even with G-SYNC, I see that the frametime is always 16.67ms.
That's the game-side frame time. The display-side frame time is lower with a faster scanout: at 240Hz it's always 4.2ms, at 120Hz it's 8.3ms, etc.

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 11:33
by roro13200
Does that mean the same image is displayed several times?

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 13:05
by RealNC
roro13200 wrote:
16 Mar 2025, 11:33
Does that mean the same image is displayed several times?
No.

At 60Hz, the image is scanned out in 16.7ms, and then the next one starts almost immediately. Latency at the top of the screen is near 0, but at the bottom of the screen it's 16.7ms.

At 240Hz, the image is scanned out in 4.2ms, and then the monitor waits and does nothing for 12.5ms. The whole cycle is 16.7ms. But each frame is drawn much faster. Latency at the top of the screen is near 0, but at the bottom of the screen it's 4.2ms. So the worst-case scanout latency was reduced by 12.5ms.
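The numbers above can be reproduced with a small sketch (assuming a fixed 60 FPS source and the panel refresh rates as stated):

```python
# Worst-case (bottom-of-screen) scanout latency at a fixed 60 FPS source.
cycle_ms = 1000.0 / 60  # 16.7 ms between frames at 60 FPS

for refresh_hz in (60, 240):
    scanout = 1000.0 / refresh_hz  # time to draw one full frame, top to bottom
    idle = cycle_ms - scanout      # time the panel waits before the next frame
    print(f"{refresh_hz} Hz: scanout {scanout:.1f} ms, idle {idle:.1f} ms")
# 60 Hz: scanout 16.7 ms, idle 0.0 ms
# 240 Hz: scanout 4.2 ms, idle 12.5 ms
```

The 12.5ms difference between the two scanout times is exactly the worst-case latency reduction at the bottom of the screen.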

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 13:44
by roro13200
Thank you for your clear explanations, but what is the difference between VRR and VSync off? At 240Hz and 60fps the tearing is practically invisible.

Re: Does VRR Increase or Decrease Input Lag?

Posted: 16 Mar 2025, 13:51
by RealNC
roro13200 wrote:
16 Mar 2025, 13:44
Thank you for your clear explanations, but what is the difference between VRR and VSync off? At 240Hz and 60fps the tearing is practically invisible.
I can see the tearing just fine, so it's not invisible :P

Re: Does VRR Increase or Decrease Input Lag?

Posted: 04 Apr 2025, 07:58
by netborg
My rule of thumb: I use VRR as long as I can't consistently output at least double the FPS my monitor can display; beyond that point, the latency reduction from torn rendering becomes apparent in fast turns.

VRR has the big benefit of visual smoothness with consistent latency. If you rendered the same number of FPS with tearing, relative to the monitor's Hz, you'd have pretty much the same average latency as with VRR, but with more latency variation and less smoothness.

Re: Does VRR Increase or Decrease Input Lag?

Posted: 04 Apr 2025, 08:05
by netborg
roro13200 wrote:
16 Mar 2025, 13:44
Thank you for your clear explanations, but what is the difference with VRR and VSync off? Since at 240Hz and 60fps the tearing is practically invisible.
Tearing is always visible, even at high refresh rates. It just becomes harder to pinpoint without a slow-mo camera.