Adaptive VSYNC VS VSYNC input lag


Re: Adaptive VSYNC VS VSYNC input lag

Post by jorimt » 12 Nov 2018, 08:48

petrakeas wrote:Using just 2 buffers instead of 3 (as most modern games do in regular Vsync) could result in less input lag.
Most modern games (in exclusive fullscreen mode) actually use double buffer V-SYNC in the vast majority of cases.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series



Re: Adaptive VSYNC VS VSYNC input lag

Post by petrakeas » 12 Nov 2018, 09:55

jorimt wrote:
petrakeas wrote:Using just 2 buffers instead of 3 (as most modern games do in regular Vsync) could result in less input lag.
Most modern games (in exclusive fullscreen mode) actually use double buffer V-SYNC in the vast majority of cases.
If double buffer V-SYNC were used (for example, on a 60Hz monitor), then a small delay in rendering one frame would result in dropping to 30 FPS. However, most modern games can run at any framerate up to the screen's refresh rate when V-SYNC is used. This can only be achieved with 3 buffers.


Re: Adaptive VSYNC VS VSYNC input lag

Post by jorimt » 12 Nov 2018, 10:06

That depends on whether you're running the game in borderless / windowed mode or exclusive fullscreen mode.

For instance, if the game has built-in double buffer V-SYNC, and you're playing in borderless / windowed mode, the operating system's DWM will override in-game V-SYNC and use its own form of triple buffering.

Whereas if the game has built-in double buffer V-SYNC, and you're in true exclusive fullscreen mode, double buffer will still be in effect, and half-refresh locks will occur (e.g., sustained drops from 60 to 30 FPS on a 60Hz display).

It's even a little more complicated now with Windows 10's Game Mode and "fullscreen optimization" settings, where many games now run in a hybrid borderless/exclusive fullscreen mode by default, and who knows how that affects each game in regards to double/triple buffer V-SYNC.

Anyway, as far as I'm aware, if Adaptive V-SYNC is enabled, it will use double buffer above the refresh rate if the game uses double buffer, and it will use triple buffer above the refresh rate if the game uses triple buffer.
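
In code terms, that behavior reduces to something like the following minimal C++ sketch of the heuristic as I understand it. This is illustrative only -- the real logic lives in the NVIDIA driver, and the swapChain->Present() call in the comment is just the DXGI present the game would already be making:

#include <chrono>

bool vsyncEnabled = true;

// Called once per frame with how long the last frame took to render.
void updateAdaptiveVsync(std::chrono::duration<double> lastFrameTime,
                         double refreshHz)
{
    const double refreshPeriod = 1.0 / refreshHz;
    // Rendering slower than one refresh period means the framerate is
    // below the refresh rate, so V-SYNC is dropped (tearing allowed)
    // to avoid a half-refresh lock; above it, V-SYNC stays on with
    // whatever buffering (double or triple) the game already uses.
    vsyncEnabled = lastFrameTime.count() <= refreshPeriod;
    // The game's existing present call then becomes, e.g.:
    // swapChain->Present(vsyncEnabled ? 1 : 0, 0);
}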

Ultimately, I have the means (high speed camera + mouse w/LED) to test the input lag between Adaptive V-SYNC and standalone V-SYNC, but said testing is very time-consuming and involved, and there really isn't enough in your scant, perception-based observations to justify it for me, especially with what I already know about these syncing methods.


Re: Adaptive VSYNC VS VSYNC input lag

Post by petrakeas » 12 Nov 2018, 10:31

I am referring to exclusive fullscreen mode and VSYNC. I have disabled Windows 10's hybrid exclusive mode by disabling the Game Bar. Before disabling it, alt-tabbing was almost instant, so I think the change is working.

Anyway, I'll experiment a bit more with 24 and 60 Hz and come back with my findings. I can record up to 720p@240 Hz. It will be enough for something more solid than my observations :D .

If double buffer VSYNC is used in most games, is there any benefit in this method (frame limiting slightly below refresh rate to avoid filling in the third buffer and simulating double buffer)?


Re: Adaptive VSYNC VS VSYNC input lag

Post by jorimt » 12 Nov 2018, 10:41

petrakeas wrote:I am referring to exclusive fullscreen mode and VSYNC. I have disabled Windows 10's hybrid exclusive mode by disabling the Game Bar. Before disabling it, alt-tabbing was almost instant, so I think the change is working.
You can also try disabling "fullscreen optimizations" per game .exe to be safe, as I've sometimes found disabling the Game Bar still doesn't disable the hybrid mode for some games.
petrakeas wrote:Anyway, I'll experiment a bit more with 24 and 60 Hz and come back with my findings. I can record up to 720p@240 Hz. It will be enough for something more solid than my observations :D .
You'll get a lot of variance per sample at only 240Hz capture (each capture frame spans ~4.2ms, a high margin of error relative to the differences being measured), but as long as you do it right (easier said than done), it may give you enough to roughly tell whether there is really an input lag difference between the two modes.

My educated guess and existing knowledge say "no," but feel free to give it a shot.
petrakeas wrote:If double buffer vsync is used in most games, is there any benefit in this method?
Whether Adaptive V-SYNC is using double buffer or triple buffer above the refresh rate, the method you linked (using RTSS to set a limit a fraction below the refresh rate) should reduce input lag in standard V-SYNC mode more than uncapped Adaptive V-SYNC either way, as it prevents frames from over-queuing in both double and triple buffer modes.
petrakeas wrote:(frame limiting slightly below refresh rate to avoid filling in the third buffer and simulating double buffer)?
To be clear, the RTSS method does not make triple buffer into double buffer, or "remove" a buffer. Standard triple buffer only has more input lag above the refresh rate than double buffer above the refresh rate because it has an extra buffer to be overfilled. But in that scenario, both double and triple buffer get overfilled, which is why you need the RTSS limit either way to prevent it for both.
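
For anyone wondering what such a limiter is actually doing, here's a minimal C++ sketch of an RTSS-style cap. The renderFrame()/presentFrame() calls are hypothetical stand-ins for the game's work (RTSS itself hooks the present call rather than wrapping the loop like this):

#include <chrono>

void renderFrame()  { /* hypothetical: sample input + draw the frame */ }
void presentFrame() { /* hypothetical: Present()/swapBuffers() */ }

// Cap a fraction below the refresh rate (e.g. 59.920 FPS at 60Hz):
// keeping the render rate just under the scanout rate drains the
// buffer queue, so V-SYNC never blocks on an over-filled queue.
void runCappedLoop(double capFps) // e.g. 59.920
{
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::duration_cast<clock::duration>(
        std::chrono::duration<double>(1.0 / capFps));
    auto deadline = clock::now();

    for (;;) {
        deadline += frameBudget;
        renderFrame();
        presentFrame();
        // Busy-wait to the deadline; RTSS spins for accuracy, since
        // sleeping would drift by the OS scheduler quantum.
        while (clock::now() < deadline) { /* spin */ }
    }
}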


Re: Adaptive VSYNC VS VSYNC input lag

Post by petrakeas » 12 Nov 2018, 12:29

jorimt wrote:Whether Adaptive V-SYNC is using double buffer or triple buffer above the refresh rate, the method you linked (using RTSS to set a limit a fraction below the refresh rate) should reduce input lag in standard V-SYNC mode more than uncapped Adaptive V-SYNC either way, as it prevents frames from over-queuing in both double and triple buffer modes.
I did the following experiments, all of them in exclusive fullscreen mode. When I say RTSS limiter, I mean limiting the framerate to 59.920 FPS, which results in reduced input lag with regular VSYNC.

I used Adaptive VSYNC + RTSS limiter in Just Cause 3. The result was that VSYNC stayed enabled. I didn't notice any input lag improvement. From time to time, a frame was repeated, resulting in stutter.

I used Adaptive VSYNC + RTSS limiter + disabled Triple Buffering (from the game's options) in Overwatch. The result was that VSYNC got disabled and tearing was visible.

I used Adaptive VSYNC + RTSS limiter + enabled Triple Buffering (from the game's options) in Overwatch. VSYNC was enabled and the input lag felt similar to double buffer VSYNC. In other words, it was similar to VSYNC + RTSS limiter, or just Adaptive VSYNC.

So, frame limiting didn't improve input lag when used with Adaptive VSYNC. The reason is that there is no extra buffer to be queued, which is the case where frame limiting would help.
jorimt wrote:it prevents frames from over-queuing in both double and triple buffer modes.
jorimt wrote:both double and triple buffer get overfilled, which is why you need the RTSS limit either way to prevent it for both
I don't think I follow you here regarding double buffered VSYNC. In double buffered VSYNC, the screen reads from one buffer and the GPU renders to the second buffer. If it finishes rendering fast enough, it will stall. How can the GPU queue more frames? Are you referring to "maximum pre-rendered frames"?


Re: Adaptive VSYNC VS VSYNC input lag

Post by jorimt » 12 Nov 2018, 12:43

petrakeas wrote:So, frame limiting didn't improve input lag when used with Adaptive VSYNC. The reason is that there is no extra buffer to be queued, which is the case where frame limiting would help.
Correct, in that scenario, Adaptive VSYNC is just plain-old V-SYNC OFF, and V-SYNC OFF doesn't need an FPS limit to reduce input lag, as it doesn't have buffers to over-queue.

The FPS limiting technique with RTSS only works with standard double buffer or triple buffer (not counting DWM or Fast Sync triple buffer-type) V-SYNC.
petrakeas wrote:I don't think I follow you here regarding double buffered VSYNC. In double buffered VSYNC, the screen reads from one buffer and the GPU renders to the second buffer. If it finishes rendering fast enough, it will stall. How can the GPU queue more frames? Are you referring to "maximum pre-rendered frames"?
Read this section of my article here; it should answer your question more clearly:
https://www.blurbusters.com/gsync/gsync ... ettings/7/

EDIT:
petrakeas wrote:How can the GPU queue more frames?
V-SYNC does not stop the system from rendering more frames; it only stops the display from showing more rendered frames than its own max refresh rate can display per second, and thus prevents tearing (which is literally parts of multiple rendered frames displaying in a single "scanout" [the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen], as opposed to displaying one complete rendered frame [V-SYNC] in a single scanout).

So for instance, at 60Hz, if the game is running at 100 FPS (or really any framerate above the display's max refresh rate) with double buffer V-SYNC, all 100 of those frames are being rendered per second, but only 60 of them can actually be displayed on the 60Hz monitor per second, which means when you finally see your gun fire, you're watching your input from (at least) two or more frames ago, since the frames are continuously rendering faster than the monitor can display to you in real-time.

Only an FPS limit below the refresh rate can make the game render no more than the screen's refresh rate in frames per second, which prevents most of the delivery delay added (see my previous article link further above in this post) when using V-SYNC.

TLDR: V-SYNC prevents the monitor from displaying more than x frames per second equal to the max refresh rate of the monitor (no tearing), while FPS limiters prevent the system from rendering more than x frames per second.
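
To put rough numbers on that queuing delay, here's a back-of-envelope sketch for the 60Hz example above (an upper bound only; the exact delay depends on where in the refresh cycle each frame lands):

#include <cstdio>

int main()
{
    // Each occupied V-SYNC queue slot can hold a finished frame for
    // up to one full refresh before it is scanned out at 60Hz.
    const double refreshMs = 1000.0 / 60.0;   // ~16.7ms per refresh
    for (int slots : {1, 2})                  // double / triple buffer
        std::printf("%d queued frame(s): up to ~%.1f ms extra delay\n",
                    slots, slots * refreshMs);
}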


Re: Adaptive VSYNC VS VSYNC input lag

Post by petrakeas » 13 Nov 2018, 06:05

jorimt wrote:So for instance, at 60Hz, if the game is running at 100 FPS (or really any framerate above the display's max refresh rate) with double buffer V-SYNC, all 100 of those frames are being rendered per second, but only 60 of them can actually be displayed on the 60Hz monitor
This is certainly not the case. It has been explained in these forums as well, and it is also explained in less technical terms in the video that @RealNC posted in this thread.

VSYNC's primary function is not to cap the framerate; it accomplishes that as a side effect of buffer back-pressure. I'll explain in more detail below.

In double buffer VSYNC, the game renders to the second buffer. If it finishes fast enough, it will block when calling swapBuffers() or present() because there is no other buffer available. It will unblock when the screen finishes reading the first buffer.

The same holds for triple buffer VSYNC (not Fast Sync, though). The game will initially render to the second frame buffer. If it finishes fast enough, it will render to the third frame buffer (at this point, the framerate is not yet capped). Then it will block because no more buffers are available: the swapBuffers() call in the game code will block. Thus, the framerate will be capped to the screen's maximum refresh rate. However, it will have queued one extra buffer compared to double buffer VSYNC, introducing input lag. No more frames will be queued (this may depend on the maximum pre-rendered frames setting; perhaps 1 pre-rendered frame will be queued, which is not an actual rendered frame but the instructions that will create it). In either case, when all frame buffers and frame queue slots have been filled, the game's render thread will block. This is entirely different from your understanding.
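
A minimal C++ sketch of that blocking behavior, with hypothetical stand-ins for the game's calls (a real driver waits on the vblank signal and refills slots as frames display; here that's reduced to a timer and a counter):

#include <chrono>
#include <thread>

void sampleInput() { /* hypothetical: read mouse/keyboard state */ }
void renderScene() { /* hypothetical: ~10ms of GPU work */ }

int freeSlots = 2; // 1 queued frame for double buffer, 2 for triple

// Toy model of buffer back-pressure: present() returns immediately
// while a free buffer slot exists and blocks once every slot is full.
// That block is what caps the framerate as a side effect.
void present()
{
    if (freeSlots > 0) {
        --freeSlots;  // a slot is free: queue the frame, return at once
        return;
    }
    // All slots full: block until the next vblank displays a frame,
    // freeing a slot that this frame then immediately re-occupies.
    std::this_thread::sleep_for(std::chrono::milliseconds(16));
}

int main()
{
    for (int frame = 0; frame < 600; ++frame) {
        sampleInput(); // input sampled here is only shown after every
        renderScene(); // already-queued frame has been scanned out
        present();
    }
}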

You can check out this post to better understand how frame buffers and pre-rendered frames work with VSYNC:
With vsync, the game is only blocked from rendering more frames once all frame buffers have been filled. So even though vsync does limit the game's render rate, it only does so at a point where it's too late because it doesn't actually care about frame rates. It doesn't look at the frame times. It only looks at the vblank signal of the graphics card (the vblank signal is the period between monitor scanouts, where switching to a new frame is safe and won't tear.) So really, the frame limiting aspect of vsync is a secondary effect, not a primary function.

and this (which references back to blurbusters :lol: )
Vsync does not actually cap the frame rate in the same way a frame limiter does. This is a popular misconception about vsync. What it does is sync the output of new frames to the monitor's "vblank" signal (the point between the monitor having finished scanning out the current frame and is preparing to scan out the next.) However, this is an asynchronous operation in modern systems and that means the game is allowed to go back and work on preparing more frames. The game is only prevented from doing that when it's too late.

So because there is no real frame cap, the game is preparing new frames as fast as it can. Once all possible frame buffers and all pre-render queues have been filled, only then will the game be prevented from queuing more frames to be rendered or displayed. That means when all these buffered and queued frames are displayed later on, they're based on very old input. Old input equals input lag.
Also, when a game runs uncapped, you can check that the GPU usage is 100%, hear the fan spinning louder, and check the framerate from within the game's OSD if supported. When VSYNC is enabled, you can see that the GPU usage drops, the fan spins slower, etc., because fewer frames are rendered. To better see this difference, you need a game that runs much higher than the screen's refresh rate when uncapped.

EDIT:

Also check out this thread for a more technical explanation.


Re: Adaptive VSYNC VS VSYNC input lag

Post by jorimt » 13 Nov 2018, 09:31

petrakeas wrote:This is certainly not the case. It has been explained in these forums as well, and it is also explained in less technical terms in the video that @RealNC posted in this thread.

VSYNC's primary function is not to cap the framerate; it accomplishes that as a side effect of buffer back-pressure.
My explanation was intentionally oversimplified, apparently grossly so to garner such a response from you, as I am actually aware of how V-SYNC works.

I'm also well aware of all of the other sources you provided, and I actually answered a question on the first page of the last thread you linked in your post "EDIT."

I think we're getting caught on semantics at this point...

Replace my "100 FPS (or really any framerate above the display's max refresh rate)" and "all 100 of those frames are being rendered per second," in my last post with "frametime."

"Frametime" reflects render time, and render time can, at points, be "faster" than the max refresh rate with the framerate above it with V-SYNC ON. This is why you don't see the FPS meter on, say, Afterburner, go over your max refresh rate with V-SYNC ON, but why you can still see the frametime meter sometimes dip (lower frametime = higher framerate) when the framerate exceeds your refresh rate with V-SYNC ON.

Lower frametimes (which, again, reflect "faster than refresh rate" render times) are what cause the over-queuing of V-SYNC buffers without a proper FPS limit in place, which, in turn, ultimately causes, yes, the repeated render "blocking," and thus added input lag with V-SYNC ON.

And when I said "V-SYNC does not stop the system from rendering more frames" in my last post (again, obviously too much oversimplification in trying to explain it to you on my part), I meant it doesn't stop it in time to prevent the added delivery delay when the framerate is well above the refresh rate without an FPS limit below the refresh rate in place.

You didn't even seem to understand the difference between an FPS limiter and Adaptive V-SYNC (or at least the difference in their benefit when using standalone V-SYNC with an FPS limit), and earlier in this thread you were suggesting Adaptive V-SYNC possibly had lower input lag than standalone V-SYNC with framerates above the refresh rate (which, to my knowledge, are both identical in that scenario, as they both use the same driver-level V-SYNC solution), so I wasn't sure how much I had to dumb my explanation down. Obviously, I went too far the other way for you in my last post.

Noted, and hopefully clarified enough now to be to your liking :? ;)


Re: Adaptive VSYNC VS VSYNC input lag

Post by petrakeas » 13 Nov 2018, 11:32

This makes sense now ;) Sometimes, oversimplifying is not the best approach (at least for me).

I was already aware, though, of how frame render time and FPS play together, but it was a nice explanation nevertheless (better than the one presented in your article ;) ).

The reason I suspected (not tested yet, but I will test it when I find some time) that Adaptive VSync may be faster than NVCP VSync is that I think the latter may add one extra frame buffer to the queue and thus add 1 extra frame of latency.

What I haven't figured out, though (slightly off-topic), is how maximum pre-rendered frames interact with VSync's frame buffers. I have read some threads, but we can't be sure how Nvidia has implemented this. For example, with double-buffered VSync and 1 maximum pre-rendered frame, what is the maximum number of queued frames possible (if no framerate limit is engaged by the game or an external tool)? It could be 2: 1 fully rendered frame buffer waiting to be flipped with the other buffer, plus 1 pre-rendered frame (commands waiting to be executed). Or it could be just 1: 1 fully rendered frame (the GPU blocks until the buffer is flipped).
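
For what it's worth, the two hypotheses work out like this, a small C++ sketch (explicitly not a statement about Nvidia's actual implementation, which, as noted above, is unknown):

#include <cstdio>

int main()
{
    // Worst-case latency each hypothetical queue depth would add at
    // 60Hz: one refresh period per queued frame.
    const double refreshMs = 1000.0 / 60.0;
    struct { const char* scenario; int queued; } hypotheses[] = {
        {"1 back buffer + 1 pre-rendered frame", 2},
        {"1 back buffer, GPU blocks on flip", 1},
    };
    for (const auto& h : hypotheses)
        std::printf("%-38s -> up to ~%.1f ms of queue latency\n",
                    h.scenario, h.queued * refreshMs);
}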
