What exactly causes V-sync to introduce input lag?

Talk to software developers and aspiring geeks. Programming tips. Improve motion fluidity. Reduce input lag. Come Present() yourself!
stirner
Posts: 74
Joined: 07 Aug 2014, 04:55

Re: What exactly causes V-sync to introduce input lag?

Post by stirner » 28 Dec 2016, 06:27

Without drivers installed (NVIDIA), my cursor gets noticeably less laggy and noticeably tears.

silikone
Posts: 57
Joined: 02 Aug 2014, 12:27

Re: What exactly causes V-sync to introduce input lag?

Post by silikone » 31 Dec 2016, 20:11

RealNC wrote:When the game submits the frame for output, it keeps working on new frames. It doesn't actually wait for the vblank itself. The OS/driver is doing the waiting for vblank, but the game doesn't actually care. The game can submit another frame later on, even though the previous one is still waiting for the vblank. So now you have two frames waiting to be displayed, containing 33.3ms old input data. Then a third frame comes in, and now we're on 50ms input lag.

Actually this can get so bad that it's sometimes possible to sit on 100ms worth of frames... 100ms input lag depending on how many buffers exist in the whole chain.

A frame limiter is needed to avoid this issue. What the frame limiter does is get between the game and the OS/driver, and intercept the frame submission. The game thinks it calls the OS function, but in reality it calls the frame limiter's replacement of that function. The new function doesn't just accept the new frame for output. It first checks how long it has been since the previous frame was displayed. For a 60FPS cap, it checks whether at least 16.7ms have passed since the last scanout. If not, it just waits and, most importantly, it makes the game also wait, since it doesn't yet accept the new frame.

Since the game is blocked until at least 16.7ms have passed since the previous frame, the game cannot compute new frames that are older than 16.7ms.
Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.
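To make the limiter mechanism quoted above concrete, here is a minimal C++ sketch of such a hook; the names LimitedPresent/RealPresent and the sleep-based wait are illustrative assumptions, not any real limiter's code (real limiters typically busy-wait for precision):

```cpp
#include <chrono>
#include <thread>

// Illustrative stand-in for the real OS/driver frame submission.
void RealPresent() { /* hand the frame to the OS/driver */ }

// 16.7ms target for a 60FPS cap.
constexpr std::chrono::duration<double, std::milli> kTargetFrameTime{1000.0 / 60.0};

// The game calls this instead of the real Present(). It refuses to
// accept a new frame until the target frame time has elapsed, which
// blocks the game and keeps it from running ahead on stale input.
void LimitedPresent() {
    static auto lastSubmit = std::chrono::steady_clock::now();
    std::this_thread::sleep_until(lastSubmit + kTargetFrameTime);
    RealPresent();
    lastSubmit = std::chrono::steady_clock::now();
}
```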

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: What exactly causes V-sync to introduce input lag?

Post by Sparky » 31 Dec 2016, 20:23

silikone wrote: Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.
With vsync on, the difference between double and triple buffering is when the GPU stops working: after finishing one frame ahead, versus two. With double buffering, you have one framebuffer being scanned out to the display, and the other framebuffer either getting worked on or sitting around waiting to be displayed. With triple buffering, the GPU can start working on the next frame while the second frame is still waiting to be displayed.

Furthermore, there's a distinction between triple buffering that drops frames when there are extra frames waiting to be displayed (fullscreen windowed, or fast sync), and triple buffering that doesn't drop frames (which adds latency when your framerate is limited by your refresh rate).
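The two policies can be pictured as a small queue of finished frames. This is pure pseudologic with made-up names, not actual driver code:

```cpp
#include <deque>

struct Frame { /* rendered image */ };

std::deque<Frame> waiting;                 // finished frames awaiting scanout
void wait_for_vblank() { /* stub: block until the next v_blank */ }

// FIFO triple buffering: nothing is ever dropped, so when the game
// outruns the refresh rate the queue stays full and every frame waits
// its turn -- that waiting is the added latency.
void submit_fifo(Frame f) {
    while (waiting.size() >= 2) wait_for_vblank(); // blocks the game
    waiting.push_back(f);
}

// Drop-frames triple buffering (windowed mode, or fast sync): the
// newest frame replaces the one still waiting, so scanout always gets
// the freshest frame and the game is never blocked.
void submit_drop(Frame f) {
    if (waiting.size() >= 2) waiting.pop_back(); // discard the stale frame
    waiting.push_back(f);
}
```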

RealNC
Site Admin
Posts: 3740
Joined: 24 Dec 2013, 18:32

Re: What exactly causes V-sync to introduce input lag?

Post by RealNC » 31 Dec 2016, 21:23

silikone wrote:Where do all these buffers come from, though? I have been trying to activate double-buffered V-sync, but it seems to behave like a triple-buffer for whatever reason.
There's no way to force double buffering. Or at least I don't know of a way.

However, a frame limiter set to just slightly below the refresh rate (0.005FPS less or so) makes sure the buffers do NOT get filled. When the frame buffers get filled, you get additional input lag, because the game engine now sits on sampled input for a long time, which means that input gets older and older as time passes, which in turn means more input lag.

The pre-render buffers are filled with data that depends on old input. You can set the buffer size to 1; the setting is called "pre-rendered frames" on Nvidia and "flip queue size" on AMD. It's not uncommon to see something like 100ms total input lag in some cases because of this. It happens when the game is rendering too fast: all the buffers fill up, the game engine sits on old input data, and once that input data finally makes it into the output buffer, it needs to wait around even more, making it even more outdated. That's where these 100+ms input lag numbers come from.

The above is known as "vsync back-pressure." It results in frames getting displayed that are based on user input that is way too old.

A frame limiter makes sure the game doesn't render too fast. That means no buffer congestion. The buffers stay empty and the input data used to render the frames is always "fresh". So even if you can't force double buffering, a frame limiter simply keeps the extra frame buffer empty. The result is the same as if you had double buffering, even though the game itself has set up three buffers; the frame limiter makes sure the third one just isn't used, even though it's there.

However, that also means that - since the buffers stay empty - you have no protection against FPS drops. You can get the usual behavior of being locked to 30FPS when the game can't maintain a stable 60FPS. But if you have a powerful system that can always keep 60FPS, a frame limiter is going to work like a charm.
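The arithmetic behind the "slightly below refresh" cap is worth seeing. A quick back-of-the-envelope sketch, using the 0.005FPS figure from above:

```cpp
#include <cstdio>

int main() {
    const double refresh = 60.0;            // display refresh (Hz)
    const double cap     = refresh - 0.005; // limiter setting (FPS)
    // Scanout consumes (refresh - cap) more frames per second than the
    // game produces, so any frame stuck in the queue drains away:
    std::printf("queue drains by one frame every %.0f seconds\n",
                1.0 / (refresh - cap));     // -> 200 seconds
    return 0;
}
```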
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

silikone
Posts: 57
Joined: 02 Aug 2014, 12:27

Re: What exactly causes V-sync to introduce input lag?

Post by silikone » 31 Dec 2016, 21:49

RealNC wrote:The pre-render buffers are filled with data that depends on old input. You can set the buffer size to 1; the setting is called "pre-rendered frames" on Nvidia and "flip queue size" on AMD. It's not uncommon to see something like 100ms total input lag in some cases because of this.
Ah, I was wondering where this was being used. This seems to match my experience of the input lag slowly building up as the frame rate exceeds the refresh rate. During this brief moment, the in-game FPS display also goes beyond the refresh rate before finally settling, which I assume means that all of the buffers have been filled.

If double-buffering cannot be forced, what is the option in the NVIDIA control panel for?

RealNC
Site Admin
Posts: 3740
Joined: 24 Dec 2013, 18:32

Re: What exactly causes V-sync to introduce input lag?

Post by RealNC » 01 Jan 2017, 00:05

silikone wrote:If double-buffering cannot be forced, what is the option in the NVIDIA control panel for?
It's for OpenGL games. There you can force triple buffering. For D3D games, you can only try and force triple buffering with RadeonPro, if the game doesn't already use TB. If it does, you cannot disable it.

So: you can try and force TB on D3D games that use DB, but you cannot force DB in games that use TB.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: What exactly causes V-sync to introduce input lag?

Post by Chief Blur Buster » 03 Jan 2017, 11:08

A common trick to enable non-blocking triple buffering is VSYNC OFF + Windowed mode.

Windowed mode is always VSYNC ON by default on Windows 7 and later. That's because of desktop compositing, where all the graphics windows/layers are merged into one frame and then flipped at the next VSYNC.

For apps running VSYNC OFF, the Windows desktop compositor behaves as the final display buffer, and displays the most recently rendered frame with zero tearing, a la real non-blocking triple buffering.

Tearing-free triple buffering (of any kind) still has more latency than VSYNC OFF (with all its tearing artifacts), but less latency than plain old double-buffered VSYNC ON.

The latency order (when not using other framerate limiters), from worst to best, is as follows:

VSYNC ON (windowed mode) -- Blocking queue of multiple buffers (double buffer & the windows compositing buffer on top of it all!)
VSYNC ON (full screen mode) -- Plain old double-buffering
VSYNC OFF (windowed mode) -- Non-blocking triple buffer, no tearing artifacts.
VSYNC OFF (full screen mode) -- Tearing artifacts

This is simply a rule of thumb. There are exceptions (games may intentionally behave differently in different modes, and graphics drivers may do weirdness -- like different buffer queue depths in windowed versus full screen -- that adds or removes latency and skews this order).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


RealNC
Site Admin
Posts: 3740
Joined: 24 Dec 2013, 18:32

Re: What exactly causes V-sync to introduce input lag?

Post by RealNC » 03 Jan 2017, 14:09

Chief Blur Buster wrote:This is simply a rule of thumb. There are exceptions (games may intentionally behave differently in different modes, and graphics drivers may do weirdness
Yep. With NVIDIA and OpenGL or Vulkan, borderless windowed actually disables compositing. The driver redirects the whole thing into a semi-exclusive fullscreen mode, so vsync off means tearing there.

DirectX is not affected by this. Just GL and Vulkan.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

silikone
Posts: 57
Joined: 02 Aug 2014, 12:27

Re: What exactly causes V-sync to introduce input lag?

Post by silikone » 25 Dec 2017, 20:52

I did some tests in various games, and I have learned that there is yet another challenge in the way of the perfect V-sync goal.
Quake 3 and 2, two classic OpenGL programs, suffer from imprecise timekeeping. They work with a Windows API that returns milliseconds as integers, resulting in a limited selection of possible FPS caps, such as 125. To my surprise, Quake 1 uses a different function that offers at worst one microsecond of precision, which allowed me to fine-tune the frame rate to fractional values. This is where the display timings kick in. For modes like 480p, I had to drop the frame rate down to 59.94 to achieve low input lag. At 1080p, I was able to increase it to 60. At 120Hz, however, the TestUFO utility reports that the monitor is a bit too slow, demanding a fractional FPS cap once again.
Getting this to work just right with minimal intervention from the user seems like a headache, for even if developers implement typical FPS caps, V-sync is not going to be responsive on many display modes. Even if a display mode is exactly 60Hz, is there any guarantee that the game timer won't drift and end up causing the display to queue up more buffers, assuming it's even lucky enough to have sub-1ms precision?
In theory, is there a solution that developers can implement to ensure responsive V-sync in all but the most unusual cases (possibly by reading the EDID to get a more precise refresh rate)?
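The integer-millisecond limitation described above quantizes the available caps to 1000/n FPS. A small sketch of that arithmetic (not actual Quake code):

```cpp
#include <cstdio>

int main() {
    // A whole-millisecond timer can only produce frame times of n ms,
    // so the reachable caps are 1000/n FPS -- 125, 111.1, ..., 62.5,
    // 58.8. No integer frame time lands on exactly 60 or 59.94 FPS.
    for (int ms = 8; ms <= 17; ++ms)
        std::printf("%2d ms/frame -> %6.2f FPS\n", ms, 1000.0 / ms);
    return 0;
}
```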

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: What exactly causes V-sync to introduce input lag?

Post by Sparky » 25 Dec 2017, 22:51

silikone wrote:In theory, is there a solution that developers can implement to ensure responsive V-sync in all but the most unusual cases (possibly by reading the EDID to get a more precise refresh rate)?
In theory yes, but to get down to the minimum latency you need the game developer to implement a synchronous framerate cap, which uses feedback from the GPU about the timing of both frame completion and v_blank. You need that feedback in order to control the amount of buffering: every bit of buffering adds latency, but if you don't have enough, you'll drop frames and reintroduce stutter.

It's kind of like buffering a live video stream. The more buffer you have, the less likely you are to get hitches in the video playback, but the bigger the delay between when the action happens and when you see it. The problem is that the vast majority of games don't control how big this buffer is, and user workarounds can only get within a frame or so of the desired value, and often drift. The only way for the end user to actually fix the problem is to use a variable refresh display, like G-SYNC or FreeSync, in conjunction with a framerate cap (preferably in-game).
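A rough sketch of the feedback loop such a synchronous cap might run; the helper names and tuning constants here are assumptions, not any shipping implementation:

```cpp
#include <algorithm>

double margin_ms = 2.0; // deliberate buffering: larger = safer, laggier

// Fed back by the GPU after each frame: when the frame finished
// rendering, and when the v_blank it was aiming for occurred (both in
// milliseconds on the same clock).
void on_frame_complete(double finish_ms, double vblank_ms) {
    const double slack = vblank_ms - finish_ms; // how early we finished
    if (slack < 0.0)
        margin_ms += 0.5;  // missed the v_blank: buffer a bit more
    else if (slack > 3.0)
        margin_ms -= 0.1;  // plenty of slack: shave latency
    margin_ms = std::clamp(margin_ms, 0.5, 8.0);
}
```

The game would then start each frame `margin_ms` ahead of the predicted v_blank, trading a little latency for fewer dropped frames.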
