What exactly causes V-sync to introduce input lag?

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more!

Re: What exactly causes V-sync to introduce input lag?

Postby stirner » 28 Dec 2016, 06:27

Without drivers installed (NVIDIA), my cursor gets noticeably less laggy and tears noticeably.
stirner
 
Posts: 74
Joined: 07 Aug 2014, 04:55

Re: What exactly causes V-sync to introduce input lag?

Postby silikone » 31 Dec 2016, 20:11

RealNC wrote:When the game submits the frame for output, it keeps working on new frames. It doesn't actually wait for the vblank itself. The OS/driver is doing the waiting for vblank, but the game doesn't actually care. The game can submit another frame later on, even though the previous one is still waiting for the vblank. So now you have two frames waiting to be displayed, containing 33.3ms old input data. Then a third frame comes in, and now we're on 50ms input lag.

Actually this can get so bad that it's sometimes possible to sit on 100ms worth of frames... 100ms input lag depending on how many buffers exist in the whole chain.

A frame limiter is needed to avoid this issue. What the frame limiter does is get between the game and the OS/driver, and intercept the frame submission. The game thinks it calls the OS function, but in reality it calls the frame limiter's replacement of that function. The new function doesn't just accept the new frame for output. It first looks how long it has been since the previous frame has been displayed. For a 60FPS cap, it looks if at least 16.7ms have passed since the last scanout. If not, it just waits and most importantly, it makes the game also wait since it doesn't yet accept the new frame.

Since the game is blocked until at least 16.7ms have passed since the previous frame, the game cannot compute new frames that are older than 16.7ms.
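The interception RealNC describes can be sketched in Python. This is a toy illustration only: FrameLimiter and present() are made-up names standing in for the real hooked OS submission call, not any actual limiter's implementation.

```python
import time

# Toy sketch of a frame limiter, assuming a hypothetical present()
# hook standing in for the real OS frame-submission call.
FRAME_TIME = 1.0 / 60.0  # 16.7 ms target for a 60 FPS cap

class FrameLimiter:
    def __init__(self, frame_time=FRAME_TIME):
        self.frame_time = frame_time
        self.last_present = None

    def present(self, frame):
        # Block the caller (the game) until at least one frame time
        # has passed since the previous submission, so the game can
        # never queue up frames based on stale input.
        if self.last_present is not None:
            remaining = self.frame_time - (time.perf_counter() - self.last_present)
            if remaining > 0:
                time.sleep(remaining)
        self.last_present = time.perf_counter()
        # ...here the real, unhooked OS function would be called...
        return frame
```

Because present() does not return until the frame time has elapsed, the game's render loop is throttled too, which is exactly the blocking behavior described above.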


Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.
silikone
 
Posts: 24
Joined: 02 Aug 2014, 12:27

Re: What exactly causes V-sync to introduce input lag?

Postby Sparky » 31 Dec 2016, 20:23

silikone wrote:Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.


With vsync on, double buffering vs triple buffering means the GPU stops working when it's finished one frame, vs finished two frames. With double buffering, you have one framebuffer being scanned out to the display, and the other framebuffer either getting worked on or sitting around to get displayed. With triple buffering, the GPU starts working on the next frame while the second frame is still waiting to get displayed.

Furthermore, there's a distinction between triple buffering that drops frames when extra frames are waiting to be displayed (fullscreen windowed, or fast sync), and triple buffering that doesn't drop frames (which adds latency when your framerate is limited by your refresh rate).
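The latency difference can be seen in a toy simulation (Python; the timing model is an illustrative assumption, not real driver behavior). It assumes the GPU renders faster than the refresh rate, so vsync is the bottleneck:

```python
from collections import deque

def average_input_age(queue_depth, render_ms, refresh_ms, n_scanouts=100):
    """Toy model: the game samples input and renders a frame every
    render_ms; the display scans out every refresh_ms. The driver
    queues at most queue_depth finished frames and blocks the game
    while the queue is full. Assumes render_ms < refresh_ms.
    Returns the average age (ms) of the input behind displayed frames."""
    t = 0.0                  # current game time
    queue = deque()          # input-sample timestamps of queued frames
    ages = []
    next_scanout = refresh_ms
    while len(ages) < n_scanouts:
        if len(queue) < queue_depth:
            queue.append(t)          # render another frame
            t += render_ms
        else:
            t = next_scanout         # game blocked until a scanout
        if t >= next_scanout and queue:
            ages.append(next_scanout - queue.popleft())
            next_scanout += refresh_ms
    return sum(ages) / len(ages)
```

With render_ms=5 and refresh_ms=16.7, a queue depth of 1 (double buffering) settles at roughly one refresh of input age, while a depth of 2 (non-dropping triple buffering) settles at roughly two.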
Sparky
 
Posts: 443
Joined: 15 Jan 2014, 02:29

Re: What exactly causes V-sync to introduce input lag?

Postby RealNC » 31 Dec 2016, 21:23

silikone wrote:Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.

There's no way to force double buffering. Or at least I don't know of a way.

However, a frame limiter set to just slightly below the refresh rate (about 0.005FPS less) makes sure the buffers do NOT get filled. When the frame buffers fill up, you get additional input lag because the game engine now sits on sampled input for a long time; that input gets older and older as time passes, which in turn means more input lag. The pre-render buffers are filled with data that depends on old input (you can at least set their size to 1; the setting is called "pre-rendered frames" on NVIDIA and "flip queue size" on AMD). It's not uncommon to see something like 100ms of total input lag in some cases because of this. It happens when the game is rendering too fast: all buffers fill up, the game engine sits on old input data, and once that input data finally makes it into the output buffer, it needs to wait around even more, making it even more outdated. That's where these 100+ms input lag numbers come from.

The above is known as "vsync back-pressure." It results in frames getting displayed that are based on user input that is way too old.

A frame limiter makes sure the game doesn't render too fast. That means no buffer congestion. The buffers stay empty and the input data used to render the frames is always "fresh". So even if you can't force double buffering, a frame limiter just makes the extra frame buffer be empty. The result is the same as if you had double buffering, even though the game itself has set up three buffers. The frame limiter makes sure it just isn't used, even if it's there.

However, that also means that - since the buffers stay empty - you have no protection against FPS drops. You can get the usual behavior of being locked to 30FPS when the game can't maintain a stable 60FPS. But if you have a powerful system and can always keep 60FPS, a frame limiter is going to work like a charm.
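As a quick arithmetic check of the "slightly below refresh" trick (numbers are illustrative): with a cap 0.005FPS under a 60Hz refresh, each frame takes fractionally longer than a scanout, so the queue drains instead of filling:

```python
refresh_hz = 60.0
cap_fps = refresh_hz - 0.005       # limiter set just below refresh

refresh_ms = 1000.0 / refresh_hz   # ~16.6667 ms per scanout
frame_ms = 1000.0 / cap_fps        # ~16.6681 ms per capped frame

# Each frame arrives slightly later than its scanout slot, so any
# queued frame is consumed before the next one is ready; the buffers
# stay empty and the input stays fresh.
surplus_us = (frame_ms - refresh_ms) * 1000.0  # ~1.4 microseconds/frame
```

A surplus that tiny doesn't measurably affect smoothness, but it's enough to prevent the queue from ever accumulating frames.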
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
RealNC
 
Posts: 694
Joined: 24 Dec 2013, 18:32

Re: What exactly causes V-sync to introduce input lag?

Postby silikone » 31 Dec 2016, 21:49

RealNC wrote:The pre-render buffers are filled with data that depends on old input. Although you can set the buffer size to 1; the setting is called "pre-rendered frames" on Nvidia and "flip queue size" on AMD. It's not uncommon to see something like 100ms total input lag in some cases because of this.


Ah, I was wondering where this was being used. This seems to match my experience of the input lag slowly building up as the frame rate exceeds the refresh rate. During this brief moment, the in-game FPS display also goes beyond the refresh rate before finally settling, which I assume means that all of the buffers have been filled.

If double-buffering cannot be forced, what is the option in the NVIDIA control panel for?
silikone
 
Posts: 24
Joined: 02 Aug 2014, 12:27

Re: What exactly causes V-sync to introduce input lag?

Postby RealNC » 01 Jan 2017, 00:05

silikone wrote:If double-buffering cannot be forced, what is the option in the NVIDIA control panel for?

It's for OpenGL games. There you can force triple buffering. For D3D games, you can only try and force triple buffering with RadeonPro, if the game doesn't already use TB. If it does, you cannot disable it.

So: you can try and force TB on D3D games that use DB, but you cannot force DB in games that use TB.
RealNC
 
Posts: 694
Joined: 24 Dec 2013, 18:32

Re: What exactly causes V-sync to introduce input lag?

Postby Chief Blur Buster » 03 Jan 2017, 11:08

A common trick to enable non-blocking triple buffering is VSYNC OFF + Windowed mode.

Windowed mode is always VSYNC ON by default on Windows 7 and later. It's desktop compositing where all the graphics windows/layers are merged into one frame and then flipped at the next VSYNC.

For apps running VSYNC OFF, the Windows desktop compositor behaves as the final display buffer and displays the most recently rendered frame, with zero tearing, a la real non-blocking triple buffering.

Tearing-free triple buffering (of any kind) is still more latency than VSYNC OFF (with all the tearing artifacts), but less latency than plain old double-buffered VSYNC ON.

In latency order (when not using other framerate limiters), from worst to best:

VSYNC ON (windowed mode) -- Blocking queue of multiple buffers (double buffer & the windows compositing buffer on top of it all!)
VSYNC ON (full screen mode) -- Plain old double-buffering
VSYNC OFF (windowed mode) -- Non-blocking triple buffer, no tearing artifacts.
VSYNC OFF (full screen mode) -- Tearing artifacts

This is simply a rule of thumb. There are exceptions (games may intentionally behave differently in different modes, and graphics drivers may do weirdness -- like different buffer queue depths in windowed versus full screen -- that adds or removes latency and skews this order).
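The non-blocking triple buffering in this list ("fast sync", or VSYNC OFF in windowed mode) can be modelled with a small sketch (Python; the function and its timing model are illustrative assumptions):

```python
def fast_sync_input_age(render_ms, refresh_ms, n_scanouts=100):
    """Toy model of non-blocking triple buffering: the game renders
    continuously, and each scanout shows the most recently *completed*
    frame, silently dropping older ones. Input for a frame is sampled
    one render time before it completes. Returns average input age (ms)."""
    ages = []
    for k in range(1, n_scanouts + 1):
        scanout = k * refresh_ms
        last_completed = (scanout // render_ms) * render_ms
        ages.append(scanout - (last_completed - render_ms))
    return sum(ages) / len(ages)
```

With render_ms=5 and refresh_ms=16.7 this lands between one and two render times (well under a full refresh), matching its place in the ordering: more latency than raw VSYNC OFF, less than double-buffered VSYNC ON.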
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
Chief Blur Buster
Site Admin
 
Posts: 2715
Joined: 05 Dec 2013, 15:44

Re: What exactly causes V-sync to introduce input lag?

Postby RealNC » 03 Jan 2017, 14:09

Chief Blur Buster wrote:This is simply rule of thumb. There are exceptions (games may intentionally behave differently in different modes, and graphics drivers may do weirdness

Yep. With NVIDIA and OpenGL or Vulkan, borderless windowed actually disables compositing. The driver redirects the whole thing into a semi-exclusive fullscreen mode. So vsync off means tearing there.

DirectX is not affected by this. Just GL and Vulkan.
RealNC
 
Posts: 694
Joined: 24 Dec 2013, 18:32
