Understanding VRR

Posted: 29 Jan 2023, 05:22
by BlauparK
If I'm running G-SYNC with adaptive sync on, instead of a fixed refresh rate, to prevent tearing in more graphically demanding competitive games where frames drop below 120 for example, will the input lag I get at any moment be the one given for the monitor's refresh rate at that FPS count?

Let me explain: I have been offered an ASUS VG259QM at a real bargain price for competitive FPS games. According to RTINGS, this monitor has awful input lag at 60Hz. If I play with G-SYNC / adaptive sync on in a fairly graphically demanding multiplayer game and my FPS drops to 60 or less at some point, will I be getting that 35ms input lag the ASUS monitor has? If so, setting a fixed refresh rate plus the highest available preferred refresh rate in the NVIDIA 3D control panel would prevent it... But having a G-SYNC / FreeSync monitor and not using that feature where it's most needed, at lower FPS counts, is a pity. I'd prefer to skip it and get another monitor if so.

Thank you, I'm still learning a lot of things and thrilled to know more. Very informative forums you have here :)

Re: Understanding VRR

Posted: 29 Jan 2023, 06:29
by RealNC
BlauparK wrote:
29 Jan 2023, 05:22
Let me explain: I have been offered an ASUS VG259QM at a real bargain price for competitive FPS games. According to RTINGS, this monitor has awful input lag at 60Hz. If I play with G-SYNC / adaptive sync on in a fairly graphically demanding multiplayer game and my FPS drops to 60 or less at some point, will I be getting that 35ms input lag the ASUS monitor has?
No. That lag measurement only applies when running the monitor in 60Hz mode. With VRR, you never do that. You always run it in 240Hz mode (or 280Hz overclocked), which does not have the issue. With VRR, even if FPS drops to 60, your monitor's scanout speed is still that of 240Hz. You might have heard of this as "quick frame transport" (QFT). See:

viewtopic.php?t=4064#p32651

Except that QFT is a name the HDMI spec came up with. You get it over DisplayPort as well if you use VRR, automatically. You don't need to tweak or configure anything. Just choose max Hz in the GPU driver and/or game, enable G-SYNC, and you'll be getting QFT.
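To put numbers on this, here's a quick back-of-the-envelope sketch (the Hz values just mirror the modes discussed above): the time a frame takes to scan out depends only on the mode's max refresh rate, not on the game's frame rate.

```python
# Scanout duration is set by the signal's max refresh rate,
# not by the game's FPS. With VRR, the monitor always runs at max Hz.
def scanout_ms(max_hz: float) -> float:
    """Time to deliver/scan one frame top-to-bottom, in milliseconds."""
    return 1000.0 / max_hz

# Fixed 60Hz mode: each frame takes the full refresh interval to scan out.
print(round(scanout_ms(60), 2))   # ~16.67 ms per frame
# VRR on a 240Hz panel: even a 60 FPS frame is scanned out in 1/240 sec.
print(round(scanout_ms(240), 2))  # ~4.17 ms per frame
```

So the 60Hz lag figure in a review measures the slow 60Hz scanout, which VRR at max Hz simply never uses.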

Re: Understanding VRR

Posted: 29 Jan 2023, 19:19
by Chief Blur Buster
BlauparK wrote:
29 Jan 2023, 05:22
If I'm running G-SYNC with adaptive sync on, instead of a fixed refresh rate, to prevent tearing in more graphically demanding competitive games where frames drop below 120 for example, will the input lag I get at any moment be the one given for the monitor's refresh rate at that FPS count?

Let me explain: I have been offered an ASUS VG259QM at a real bargain price for competitive FPS games. According to RTINGS, this monitor has awful input lag at 60Hz. If I play with G-SYNC / adaptive sync on in a fairly graphically demanding multiplayer game and my FPS drops to 60 or less at some point, will I be getting that 35ms input lag the ASUS monitor has? If so, setting a fixed refresh rate plus the highest available preferred refresh rate in the NVIDIA 3D control panel would prevent it... But having a G-SYNC / FreeSync monitor and not using that feature where it's most needed, at lower FPS counts, is a pity. I'd prefer to skip it and get another monitor if so.

Thank you, I'm still learning a lot of things and thrilled to know more. Very informative forums you have here :)
No.

You get ultra-low latency at 60fps on a 280Hz VRR display.

VRR has QFT built in, so if you're using VRR you are already in perpetual QFT (Quick Frame Transport) mode. QFT is rarely used outside of VRR (it requires a hack to use QFT without VRR), but VRR has de facto QFT built in.

The key is that it must be connected to a PC (not a console), with the entire VRR range unlocked. All frames are blasted over the cable in (1/MaxHz)th of a second, and scanned out onto the panel in (1/MaxHz)th of a second.

If you turn on VRR, then your "60fps" frames are:
(A) Blasted over the cable in 1/280sec;
(B) Spared the scan conversion that normally happens with the slow delivery of 60Hz frames;
(C) Scanned out by the panel from top to bottom in 1/280sec; and
(D) Steps (A) and (C) happen simultaneously, with the signal streamed almost straight to the panel, with only a small rolling window (for picture processing, DP/HDMI micropacket dejittering, audio demultiplexing, etc.).

So you're getting the same ~0.5/280sec halftime display latency (roughly 1.8ms) for a 60fps frame as for a 250fps frame.
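The halftime figure is easy to check yourself; this is just the arithmetic from the post above (average mid-screen latency is half the scanout time, since the top of the screen arrives immediately and the bottom arrives a full scanout later):

```python
# Average ("halftime", mid-screen) display latency is half the scanout time,
# because scanout latency varies linearly from ~0 (top) to 1/MaxHz (bottom).
def halftime_latency_ms(max_hz: float) -> float:
    return 0.5 * 1000.0 / max_hz

# On a 280Hz VRR panel: same figure whether the frame came from 60fps or 250fps.
print(round(halftime_latency_ms(280), 2))  # ~1.79 ms
# What the monitor would add if it actually ran in a fixed 60Hz mode:
print(round(halftime_latency_ms(60), 2))   # ~8.33 ms
```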

Not all pixels refresh at the same time; see www.blurbusters.com/scanout

Re: Understanding VRR

Posted: 29 Jan 2023, 20:06
by BlauparK
Thank you for the detailed explanations, regards