External framerate caps add lag. Use an in-game framerate cap.

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 7484
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

External framerate caps add lag. Use an in-game framerate cap.

Post by Chief Blur Buster » 27 May 2014, 23:33

I made a post on another forum that was well received, so I'm crossposting it here:
Chief Blur Buster wrote:
CoD511 wrote:Personally, I know about the issue and I'd call it an unfixed issue or a bug despite. It's unintended behaviour occurring on their part, that's just what a bug is (or the result of something causing that behaviour) with or without technicality involved. Regardless if a user does a workaround by capping the framerate below the refresh rate per game when possible.
Actually, it is a law-of-physics limitation of sorts, rather than a bug.

There is no easy workaround other than reinventing the 3D rendering paradigm to permit easy driver/hardware-side lagless framerate capping which is otherwise not easily possible.
  • External framerate limiters of any kind add lag in current 3D architectures.
  • Internal framerate limiters can successfully avoid this lag.
Frames should be throttled right before input reads, and BEFORE rendering, not after the frame has already been rendered, which is what external framerate limiters do (e.g. NVInspector, driver capping, VSYNC ON capping, hitting the GSYNC limit). You have lost the lag battle if you have already rendered the frame and are then forced to wait to present it. Game developers need to provide an internal method of framerate capping, for full control over lag, instead of being forced by external limiters to wait after presenting the frame.
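Since no engine source is quoted in the thread, here is a minimal sketch (all names hypothetical, not from any real engine) of the ordering an internal limiter uses: throttle first, then read input, then render and present:

```python
import time

class InternalFrameLimiter:
    """Caps framerate by sleeping BEFORE the input read and render,
    so the presented frame carries the freshest possible input."""

    def __init__(self, fps_cap):
        self.frame_time = 1.0 / fps_cap
        self.next_deadline = time.perf_counter()

    def wait_for_next_frame(self):
        # The throttle happens here, before input sampling, never after
        # the frame has already been rendered.
        now = time.perf_counter()
        if self.next_deadline > now:
            time.sleep(self.next_deadline - now)
            now = self.next_deadline
        self.next_deadline = now + self.frame_time

# Hypothetical game-loop ordering (read_input/render/present are stand-ins):
#   limiter.wait_for_next_frame()   # 1. throttle FIRST
#   state = read_input()            # 2. sample input as late as possible
#   frame = render(state)           # 3. render
#   present(frame)                  # 4. present immediately, no forced wait
```

An external limiter inverts steps 1 and 4: the wait lands after the frame is rendered, so the input inside it has already aged by the time it is shown.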

Only the appearance of lag at, say, fps_max 143 (during 144Hz GSYNC) can legitimately be called a potential issue. It can be improved so that there is no lag penalty when you use an in-game framerate limiter close to the refresh rate. But once an external limit is hit, you yield game-engine control of lag: something outside your game forces frames to wait.

However, new Microsoft and OpenGL APIs could theoretically assist game developers in implementing easy, more automatic "Just in Time VSYNC" (ultra-low-latency VSYNC ON, achieved by timing the rendering right before VSYNC). Likewise, hardware can reduce the pain of this unavoidable law-of-physics effect; in the future, the GSYNC limit can be raised (e.g. 240fps@240Hz) so that when the limit is hit, the lag penalty becomes very small. So engineering solutions exist over the long term, but they aren't as simple as an in-game framerate limiter.

Did you know that 8-bit Nintendo games and arcade games such as Street Fighter were always VSYNC ON? We never complained about lag. VSYNC ON only became evil for lag when 3D accelerators/GPUs arrived, and their framebuffered architecture enforced a mandatory minimum lag from the framebuffering of 3D graphics.

TL;DR: The sudden lag increase when an external limiter of ANY kind is hit is not a bug, but a law-of-physics issue for current rendering architectures. This sort of thing requires creative game-developer-side programming to overcome. External framerate limiters force lag, no matter the tech. For now, game developers MUST proactively work around this. The easiest way is simply to add a configurable internal framerate limiter, similar to Source Engine's fps_max command.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

flood
Posts: 925
Joined: 21 Dec 2013, 01:25

Re: External framerate caps add lag. Use in-game framerate

Post by flood » 30 May 2014, 19:55

an external cap could theoretically work as well as a naive in-game cap (not the just-in-time vsync)
on linux, with intel graphics, I have managed to make a vsync'd program that has just 1 frame of unnecessary input lag. if I sleep for 14ms before rendering (i.e. crude just-in-time vsync), there is almost no input lag relative to the cursor. for some reason, the same code running on an nvidia card results in 2 frames of lag.

RealNC
Site Admin
Posts: 2942
Joined: 24 Dec 2013, 18:32
Contact:

Re: External framerate caps add lag. Use in-game framerate

Post by RealNC » 30 May 2014, 23:51

flood wrote:an external cap could theoretically work as well as a naive in-game cap (not the just-in-time vsync)
on linux, with intel graphics, I have managed to make a vsync'd program that has just 1 frame of unnecessary input lag. if I sleep for 14ms before rendering (i.e. crude just-in-time vsync), there is almost no input lag relative to the cursor.
That's an internal frame cap, since you're doing that in the program itself :-)

In an external frame cap, the sleep for 14ms (or whatever) happens before showing the frame, not before rendering it, and that sleep is done outside the program (by the driver).
Steam · GitHub · Stack Overflow · Twitter
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

flood
Posts: 925
Joined: 21 Dec 2013, 01:25

Re: External framerate caps add lag. Use in-game framerate

Post by flood » 31 May 2014, 00:52

RealNC wrote: That's an internal frame cap, since you're doing that in the program itself :-)
no it isn't. the cap is from the buffer swap which blocks when the graphics card is told to vsync.


actually i take that back; for gsync it is preferable to have a manual in-game cap.

some thoughts:

the ideal situation for vsync'd double buffering is:

(pretend monitor is 100hz)

0-2ms: get input, render frame to back buffer
2ms: call swap buffers:
2-10ms: swap buffers blocks until the next vblank
10ms: buffers have been swapped and program continues running
screen starts drawing frame from front buffer
10-12ms: get input, render frame
etc...

so the frame being drawn at 10ms shows input from 0ms. this means 10ms (1 frame) input lag at the top of the screen.
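The timeline above can be checked with a tiny model (my own illustration, not code from any driver): input is read at t=0, the render finishes at t=render_ms, and the swap then blocks until the next vblank, which is when scanout of that input begins:

```python
def double_buffered_lag_ms(refresh_hz, render_ms):
    """Top-of-screen input lag for ideal double-buffered VSYNC ON.
    Input is read at t=0; the swap blocks until the first vblank at
    or after render completion, when scanout of the frame begins."""
    frame_ms = 1000.0 / refresh_hz
    vblank_ms = frame_ms
    while vblank_ms < render_ms:   # render spilled past this vblank
        vblank_ms += frame_ms
    return vblank_ms
```

For the 100Hz / 2ms-render example above this gives 10ms, i.e. exactly one frame of lag at the top of the screen.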

For some reason, double-buffered vsync'd programs usually have 2 or more frames of input lag... which means that there is some extra buffering somewhere.


the ideal situation for gsync is
0-2ms: get input, render frame to back buffer
2ms: call swap buffers:
2ms: if the screen is ready, swap buffers doesn't block and returns immediately
2ms-12ms: screen draws frame from front buffer
2ms - 11ms: program sleeps to keep a capped fps
11-13ms: get input, render
etc...

so here the input lag at the top of the screen is essentially the time it takes to render the frame.

if the program doesn't have an internal cap, well gsync will end up behaving like a normal double-buffered display, since swap buffers will have to block until the screen is ready.

now if limit-hitting gsync behaves as the ideal double-buffered scenario above, it still isn't too bad, because it would only be a frame of lag (7ms for 144hz). but looking at blur busters' input lag test for csgo and gsync, it seems there's an additional 2 frames of lag in limit-hitting gsync.
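The contrast between the two timelines can be reduced to a toy function (an idealized sketch of my own, ignoring scanout position and driver queues): with an internal cap below the refresh rate the swap never blocks, so top-of-screen lag is just the render time; at the limit, G-SYNC degenerates to double-buffered behavior and costs a full refresh:

```python
def gsync_top_lag_ms(refresh_hz, render_ms, fps_cap=None):
    """Idealized top-of-screen lag under G-SYNC. With an internal
    framerate cap below the refresh rate, swaps return immediately
    and scanout starts right after rendering; when the limit is hit,
    swaps block until the panel is ready, like double buffering."""
    frame_ms = 1000.0 / refresh_hz
    if fps_cap is not None and fps_cap < refresh_hz:
        return render_ms           # swap never blocks
    return frame_ms                # limit-hitting: one full refresh
```

At 144Hz with a 2ms render and a 140fps internal cap, the model gives 2ms of lag; remove the cap and it jumps to a full 6.9ms refresh.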

RealNC
Site Admin
Posts: 2942
Joined: 24 Dec 2013, 18:32
Contact:

Re: External framerate caps add lag. Use in-game framerate

Post by RealNC » 31 May 2014, 01:42

flood wrote:
RealNC wrote: That's an internal frame cap, since you're doing that in the program itself :-)
no it isn't. the cap is from the buffer swap which blocks when the graphics card is told to vsync.
I'm confused. Where is the sleep done? How would you sleep before rendering from outside the program code?

Edit:
I've misread your post. You wrote "just 1 frame of unnecessary input lag". For some reason, I've read this as "just 1 ms of unnecessary input lag".

:P

For the PC with an NVidia card, maybe you get an additional frame of lag due to triple buffering. Try using:

Code: Select all

Option  "TripleBuffer"  "False"
in the "Device" section of xorg.conf.

flood
Posts: 925
Joined: 21 Dec 2013, 01:25

Re: External framerate caps add lag. Use in-game framerate

Post by flood » 31 May 2014, 05:39

glXSwapBuffers is the thing that sleeps when there is vsync

Chief Blur Buster
Site Admin
Posts: 7484
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: External framerate caps add lag. Use in-game framerate

Post by Chief Blur Buster » 31 May 2014, 22:21

flood wrote:the ideal situation for vsync'd double buffering is:
There's a way to get VSYNC ON input lag that's almost 0ms at the top edge of the screen and (1/Hz)th of a second at the bottom edge. This is essentially what "Just-In-Time VSYNC" rendering techniques, such as the one in ezQuake, are able to do:

T-2ms before VSYNC: read input & render
T-1ms before VSYNC: finish rendering
T+0ms VSYNC time: flip, the current refresh cycle begins displaying frame containing input that was read only 2ms ago.

Basically, wait until the last minute before VSYNC (while leaving enough time for a frame to be rendered), read input, render the frame, and then *immediately* flip the framebuffer at VSYNC. Doing this optimally requires a very fast GPU (capable of hundreds of frames per second) on a fast engine, combined with excellent prediction of rendertime; otherwise you often miss VSYNC.
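The T-minus schedule above boils down to a single subtraction (a hypothetical helper of mine, not ezQuake's actual code): wake up at the next vblank time minus the predicted rendertime minus a safety margin, then read input, render, and flip:

```python
def jit_vsync_wake_time(next_vsync_s, predicted_render_s, margin_s=0.001):
    """When to begin the input read + render so the frame finishes
    just before the vblank at next_vsync_s (all times in seconds)."""
    return next_vsync_s - predicted_render_s - margin_s

# Hypothetical loop: sleep until the wake time, read input, render,
# then swap; the flip lands on the vblank carrying ~2ms-old input.
```

The whole technique lives or dies on how good predicted_render_s is: underestimate it and the frame misses the vblank, producing a stutter.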

There is no easy way to reliably accomplish sub-frame latency via external framerate capping, as drivers can't easily tell existing raster-timing-unaware games when to begin reading input and rendering for optimal VSYNC. Drivers could theoretically predict subsequent rendertimes based on a trailing history of rendertimes, but this data is useless without a reliable external way to coax a game into adjusting the timing of its input reads. You could attempt to guide this behavior by having the driver intentionally delay the return from Present() / glFlush() / glXSwapBuffers() during VSYNC ON until right before the next VSYNC, based on how long previous frames took to render, as a crude form of driver-based just-in-time VSYNC that coaxes a game into doing its input reads at the last minute. However, with the multithreaded game engines of today, this would be quite difficult to do reliably, though it might work on certain games (tail-end intentional delay). And with wildly varying graphics render times, the game engine is theoretically best equipped to predict rendertimes, rather than the driver.

So it is technologically possible to achieve VSYNC ON input lag very similar to GSYNC's, but it requires a powerful GPU and application-side creativity, and cannot easily be done at the driver level (except for games compatible with a tail-end intentional strategic delay in returning from Present()/glFlush(), as a method of driver-based just-in-time VSYNC that coaxes the game engine into reading input at the last minute before VSYNC). Obviously, you'd need a GPU that can render at framerates far in excess of your Hz for this to be practical...

But, yes, the external framerate cap penalty can be reduced to approximately 1 frame, as you have pointed out. A simple increase of 8.3ms at 120Hz, which is fine for most people (many of us like VSYNC ON), though 5ms is a big chasm for professional (paid/sponsored) competitive game players.

flood
Posts: 925
Joined: 21 Dec 2013, 01:25

Re: External framerate caps add lag. Use in-game framerate

Post by flood » 31 May 2014, 22:45

There is no easy way to accomplish sub-frame latency via external framerate capping; it still adds, at minimum, 1 frame of input lag in the best case, as you've explained. But you can already do better than that (sub-frame) with internal framerate caps, last-minute rendering, and the minimum possible buffer layers.
well, just-in-time vsync is technically still externally capped, but it's really just a matter of how external/internal is defined

IMO it's a risky solution, as a sudden increase in frame render time can easily cause a missed vsync, leading to nasty stutters. maybe one way would be to allow tearing when the vsync is missed, like nvidia's adaptive vsync.

anyway, gsync in conjunction with an internal framerate limiter seems to be the no-compromise solution to input lag/tearing/stuttering (except for the cost, for us early adopters)

Chief Blur Buster
Site Admin
Posts: 7484
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: External framerate caps add lag. Use in-game framerate

Post by Chief Blur Buster » 31 May 2014, 22:55

flood wrote:IMO it's a risky solution as a sudden increase in frame render time can easily cause a missed vsync, leading to nasty stutters. maybe one way would be to allow tearing if the vsync is missed, like nvidia's adaptive vsync.
This is sort of what San Francisco Rush 2049 used to do (AHigh, the developer of that arcade game machine, is a forum member here). When frames began to take longer, a tearline slowly drifted down a bit from the top edge of the screen, then bounced back upwards off the screen.

Just-in-time VSYNC could also be based on historical frame rendertimes, plus a small safety margin. Usually rendertimes move slowly upwards/downwards (e.g. turning left/right into complex scenery). This is easy to do with older game engines that have consistent processing time that slowly ramps up/down as you turn towards complex scenery; it would be harder with game engines that have lots of single-frame-to-frame fluctuation.

The safety margin could be user-adjustable, e.g. add a 50% rendertime safety margin. So if you're averaging 500fps over the last 10 frames, begin rendering the next frame at (1/500sec + 1/1000sec) before the next blanking interval. You could also add a rendertime-fluctuation detector: if frametimes fluctuate up and down by 1/400sec, do the input read and render at (1/500sec (rendertime detector) + 1/400sec (fluctuation detector) + 1/1000sec (safety margin)) before the VSYNC. Outliers (fluctuations that are suddenly twice as long or more, such as from disk reads or computer freezes) could be rejected so they don't ruin the math going forward.

Lots of trial and error would be warranted. The just-in-time feature could be a checkbox, and the safety margin a slider the user adjusts until stuttering stops or drops to a manageable level.
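A sketch of that predictor (the function name and exact formula are my own guesses at the scheme described, not a tested implementation): trailing-average rendertime, plus a fluctuation term, plus a fixed safety margin, with outliers rejected first:

```python
from statistics import mean

def predicted_start_margin_s(history_s, safety_s=0.001, outlier_factor=2.0):
    """How long before the next vblank to begin the input read and
    render: average recent rendertime + observed fluctuation + a
    fixed safety margin. Frames taking over outlier_factor x the
    running average (disk reads, freezes) are rejected first so
    they don't ruin the estimate."""
    avg = mean(history_s)
    kept = [t for t in history_s if t <= outlier_factor * avg] or history_s
    fluctuation = max(kept) - min(kept)   # crude frame-to-frame jitter
    return mean(kept) + fluctuation + safety_s
```

With ten steady 1/500s frames this returns 1/500 + 1/1000 = 3ms, matching the worked example above, and a single 20ms hitch in the history is discarded rather than inflating the margin.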

RealNC
Site Admin
Posts: 2942
Joined: 24 Dec 2013, 18:32
Contact:

Re: External framerate caps add lag. Use in-game framerate

Post by RealNC » 01 Jun 2014, 00:36

flood wrote:Well just-in-time vsync is technically still externally capped, but it's really just a matter of how external/internal is defined
What would be an internal cap, in your eyes?
