Enigma wrote: ↑09 Mar 2025, 08:38
(he says because, for example, when your GPU produces 60fps and your monitor is 120hz, VRRing the monitor into 60fps will result in less input lag because it won't have to wait for frames. Is this even true?)
Yes, because each frame then scans in at the speed of the display's current physical refresh rate (120Hz = 8.3ms frametime) instead of at the frametime of the lower framerate (60 FPS = 16.6ms frametime):
https://blurbusters.com/gsync/gsync101- ... ttings/13/
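To make that arithmetic concrete, here's a minimal sketch in Python (my own illustration, not code from the linked article; the 120Hz and 60 FPS values are just the example figures above):

def ms(rate_hz: float) -> float:
    """Convert a rate in Hz/FPS to a per-frame time in milliseconds."""
    return 1000.0 / rate_hz

refresh_hz = 120  # display's physical refresh rate
framerate = 60    # GPU framerate inside the VRR range

print(f"Render frametime @ {framerate} FPS: {ms(framerate):.1f} ms")  # ~16.7 ms
print(f"Scanout time @ {refresh_hz} Hz: {ms(refresh_hz):.1f} ms")     # ~8.3 ms
# With VRR, each completed frame still scans out top-to-bottom in ~8.3 ms,
# even though a new frame only arrives every ~16.7 ms.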
This is technically achievable with no sync in the same scenario; VRR just does it without tearing. It's an automated form of QFT (Quick Frame Transport).
Enigma wrote: ↑09 Mar 2025, 08:38
Then I read the article about GSYNC on Blur Busters, but it's from 2017 I believe
Nothing involving VRR latency has changed since then; the simplest answer is that VRR prevents sync latency, but does not add to or reduce any other form of latency. I.e., it's essentially neutral.
Further, as described below, any "latency" that can be attributed to VRR when compared directly to no sync comes from its lack of tearing, but that obviously doesn't count, since tearing prevention is the entire purpose of VRR in the first place:
https://blurbusters.com/gsync/gsync101- ... ettings/6/
To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC has an increase in latency over the other two methods. However, the delivery of a single, complete frame with G-SYNC + V-SYNC is actually the lowest possible, or neutral speed, and the advantage seen with V-SYNC OFF is the negative reduction in delivery speed, due to its ability to defeat the scanout.
Bottom-line, within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display the fastest the scanout allows; any faster, and tearing would be introduced.
Now, since VRR is limited to functioning within the refresh rate, the lowest frametime it can achieve is set by the maximum physical refresh rate of the display, whereas no sync can achieve even lower frametimes because the framerate can exceed the refresh rate (by multiple times in many cases); but then, of course, there's tearing and more potential for unmitigated frametime jitter due to the uncapped framerate:
https://blurbusters.com/gsync/gsync101- ... ettings/9/
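As a rough sketch of that trade-off (the 300 FPS value is just an illustrative uncapped framerate I picked, not a figure from the article):

refresh_hz = 120     # display's maximum physical refresh rate
uncapped_fps = 300   # illustrative no-sync framerate above the refresh rate

scanout_ms = 1000.0 / refresh_hz      # ~8.3 ms: the floor VRR operates at
frametime_ms = 1000.0 / uncapped_fps  # ~3.3 ms: only reachable outside the VRR range
frames_per_scanout = uncapped_fps / refresh_hz  # ~2.5 new frames per scanout -> tear lines

print(f"Scanout: {scanout_ms:.1f} ms, frametime: {frametime_ms:.1f} ms, "
      f"frames per scanout: {frames_per_scanout:.1f}")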
That, and the lowest achievable frametime isn't traditionally considered a reduction in latency, but rather an increase in average framerate, and however low or high the frametime is does not by itself determine whether the appearance of user input is further delayed.
Hence, if having VRR enabled is adding any latency to the chain in a particular case, the cause would be the given display model/console, its firmware and/or a (PC GPU) driver quirk, or a misunderstanding by the tester of what is and isn't an increase in actual input latency, not the fundamental behavior of VRR itself.