high fps input lag [if GSYNC caps out?]

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

high fps input lag [if GSYNC caps out?]

Post by flood » 21 Dec 2013, 02:07

Does gsync help input lag at all when fps is higher than refresh rate (when the refresh rate is set to the maximum supported by the monitor)?

The current situation is that you can either have
vsync off: minimal input lag but a bit of tearing
vsync on: no tearing but at least two refresh periods' worth of additional input lag.

Actually, thinking about it a bit, I don't think gsync would help at all for games that run at high fps. In an ideal world, games would run at 300fps and have properly implemented triple buffering... but I've yet to see anything that has triple buffering. Until then I'm going to stick with vsync off and live with a bit of tearing.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: high fps input lag

Post by Chief Blur Buster » 21 Dec 2013, 02:57

Good questions and thoughts; it's something I've been thinking about.

Regarding VSYNC ON input lag, I can confirm that VSYNC ON is not necessarily always two frames of lag. It really depends on how the game engine does it, e.g.:

- VSYNC ON, triple buffering, as you've mentioned - least input lag
- VSYNC ON, double buffering, rendering at the very last minute before the VSYNC flip - less input lag because of fresher input reads during last-minute rendering. There's a risk of missing VSYNC, but for predictable render times this technique works well (some emulators such as WinUAE now have a command line option for this). See the sketch after this list.
- VSYNC ON, double buffering, the regular way - more input lag
- VSYNC ON, multiple buffering (improper triple buffering, or worse) - even more input lag
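
To make "last-minute rendering" concrete, here's a minimal sketch of such a loop (illustrative only -- the 144Hz refresh, the 3ms render budget, and the readInput/renderFrame/presentAndWaitForVsync stubs are placeholders, not any particular engine's or emulator's API):

Code: Select all

#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;
using namespace std::chrono;

constexpr auto kRefreshPeriod       = microseconds(6944); // ~1/144 sec
constexpr auto kWorstCaseRenderTime = microseconds(3000); // assumed render budget

void readInput()   { /* poll mouse/keyboard here */ }
void renderFrame() { /* draw the frame into the back buffer */ }

void presentAndWaitForVsync(Clock::time_point vsyncTime) {
    // Stand-in for a vsynced Present(): a real call blocks until the flip;
    // here we simply sleep until the scheduled vsync time.
    std::this_thread::sleep_until(vsyncTime);
}

int main() {
    auto nextVsync = Clock::now() + kRefreshPeriod;
    for (int frame = 0; frame < 600; ++frame) {
        // Idle until just before the flip, so the input read is as fresh as possible.
        std::this_thread::sleep_until(nextVsync - kWorstCaseRenderTime);
        readInput();                       // read ~3 ms before the frame is displayed
        renderFrame();                     // must finish before the flip, or a refresh is missed
        presentAndWaitForVsync(nextVsync); // flip at the vsync boundary
        nextVsync += kRefreshPeriod;
    }
}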

Also, here's a post I made on Overclock.net about G-SYNC, which I'll cross-post here; it answers the G-SYNC part of the question and covers how much latency occurs when you try to push a higher framerate.
Mark Rejhon wrote:Within the GSYNC range of 30fps-144fps, GSYNC is the same as VSYNC OFF to within a 1ms difference (the GSYNC poll time).

When you cap out at 144fps, the input lag diverges, because the game finally has to start waiting for the monitor. There's slightly less lag during VSYNC OFF 300fps than GSYNC 144fps, for example. But if you never play games at 300fps, that's not going to be important to you.

Let's take, for instance, Quake Live, a game that runs capped at 125fps internally. [That was before the 250fps version came out.] In that game, there wouldn't be a significant difference. So when playing Quake Live at 125fps capped:

VSYNC ON:
- Quake Live renders frame
- Quake Live **WAITS** for vsync. Input lag occurs.
- Monitor displays frame

VSYNC OFF
- Quake Live renders frame
- Monitor immediately displays frame, mid-scan (splice!), creating tear

GSYNC
- Quake Live renders frame
- Monitor immediately displays frame, with new refresh cycle beginning *immediately* at the top edge, no tear.

____

The non-waiting frame rate range of GSYNC is 30fps-144fps. Below 30fps, there's a re-refresh (like a DRAM self-refresh cycle) to prevent the screen from going stale. Above 144fps, you end up waiting for the previous refresh to finish before displaying the new refresh.

In the 30fps-144fps range, there's only about a ~1ms difference between GSYNC and VSYNC OFF. If you never cap out at 144fps, no waiting ever occurs. If you try to send more than 144fps in GSYNC mode, the input lag does begin to diverge, because frames now have to wait for the monitor to finish the previous refresh. But from 30fps through 144fps, there's almost no measurable latency difference between GSYNC and VSYNC OFF (~1ms, for the "is-the-monitor-ready" poll).

GSYNC actually doesn't have a regular VSYNC per se, since it's asynchronous rather than SYNChronous, so it's really a special kind of VSYNC OFF (that visually looks like VSYNC ON!) -- because the monitor will start refreshing *immediately* at the top edge (no tearline) rather than *immediately* in mid-scan (tearline occurs). So during the 30fps-144fps range, it combines the pros of VSYNC OFF and VSYNC ON, without the cons of either. The "300fps+" players may still prefer VSYNC OFF if they want those last few milliseconds -- the good news is GSYNC monitors support all three modes (VSYNC ON -and- VSYNC OFF -and- GSYNC).

.......

Now, consider the situation where you want to run an older game such as Counter-Strike: GO, or some game capable of 300fps or thereabouts.

Above 144fps, the input lag diverges between GSYNC and VSYNC OFF only because GSYNC has a 144fps framerate cap. Once you hit 144fps, the game finally has to start waiting for the monitor to finish the previous refresh (much like waiting for VSYNC), so it now behaves like 144fps = 144Hz VSYNC ON. Even so, this is more harmless than VSYNC ON, because when the framerate slows down below 144fps during GSYNC (e.g. 143fps), it will never suddenly halve to 72fps -- it slows down gracefully. There's none of the jarring input lag change from sudden frame rate halvings. It is like driving a car with a continuously variable transmission (CVT) rather than one that shifts gears: you don't feel/hear/notice gear effects. I can't tell apart 142fps, 143fps or 144fps. You do get less input lag at 300fps VSYNC OFF than at 144fps VSYNC OFF, but GSYNC doesn't let you go to 300fps. Even so, the theoretical maximum input lag divergence between GSYNC and VSYNC OFF is 6.9 milliseconds (1/144sec) plus the GSYNC poll time (~1ms), and that's only if you could somehow get infinite-framerate VSYNC OFF (which is impossible). In the real world, the average input lag divergence of 288fps VSYNC OFF relative to 144fps GSYNC would mathematically be 1/288th of a second (1/288sec = 3.4 milliseconds extra lag, + 1ms poll, for GSYNC 144fps versus VSYNC OFF 288fps). Most gamers, except the uber-elite competitive gamers, would not even care about that.

Even with elite competitive gamers, it doesn't matter with Quake Live... it is limited to 125fps, so that's below the GSYNC cap. And when you play Battlefield 4, you ain't getting 300fps, either. Since most of the modern games you play will not cap out at 144fps, realistically you won't be hitting the GSYNC 144fps limit. And 144fps isn't necessarily the final frontier for GSYNC monitors later this decade...
So you got it: GSYNC 144fps diverges from VSYNC OFF 288fps only a tiny bit -- about a 4.5 millisecond difference (1/2 of 6.9ms, plus the 1ms GSYNC "is-monitor-ready-for-new-refresh" poll time that NVIDIA mentioned).

Also, attempting to do 145fps on GSYNC would only make a tiny difference between 1/145sec and 1/144sec = (6.944ms minus 6.897ms) = only a ~48 microsecond latency difference between 145fps and 144fps (excluding the NVIDIA-quoted 1ms GSYNC poll time). So it's not too harmful to latency to try to exceed the cap by a few frames per second. The latency difference between VSYNC OFF and GSYNC grows the higher the FPS goes, so yes, 300fps VSYNC OFF would have more divergence from 144fps GSYNC.

But say you're playing a more GPU-hungry game such as BF4 or BF3, and your GPU is only capable of about 150fps or 160fps -- then you're not being penalized much. And if this bothers you, set fps_max to a slightly lower value, like fps_max 142 or fps_max 143, to give some headroom below the monitor's maximum rate. This allows fresher input reads right before rendering, and rendering right before an immediate refresh (e.g. Direct3D Present() immediately renders & refreshes the screen with no waiting for VSYNC).

The great thing is that G-SYNC monitors also support traditional VSYNC OFF and VSYNC ON operations (and strobed and non-strobed operations), so you do have a great choice between a multitude of modes on a G-SYNC monitor.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: high fps input lag

Post by Chief Blur Buster » 21 Dec 2013, 03:09

Also (depending on the game -- the kind that bogs your GPU down), half of the problem with VSYNC ON is the annoying variable latency effect caused by sudden slowdowns (framerate halvings).

When framerate halves, you suddenly have twice as much buffering-layer input lag (e.g. 33.3ms instead of 16.6ms).

The sudden fluctuations in input latency (caused by slowdowns) throw off aiming, and are a major cause of mis-aiming, especially during the critical busy moments in games. The remainder of the VSYNC ON input latency is the bottlenecking of the frame render rate (e.g. 144fps instead of 300fps).

G-SYNC never does that "fluctuating input lag effect", so it solves the vast majority of the annoying VSYNC ON input latency. In a proper implementation, GSYNC is theoretically more stutter-free than triple buffering (due to theoretically perfect sync between frame rendertimes & frame presentation times -- something not even triple buffering can solve; there's an inherent ultramicrostutter effect that triple buffering can introduce -- most people can't see it, but the frame that actually gets displayed can be 0.7/144ths of a second old or 0.3/144ths of a second old). In real life, things depend on the game engine, and GSYNC can actually make things worse in some games if the game engine doesn't play well with GSYNC. Though that's a minority of games...
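
To illustrate that ultramicrostutter with toy numbers (mine, purely illustrative): a game rendering a perfectly steady 120fps into a triple buffer, but displayed only at fixed 144Hz vsync boundaries, shows frames whose age wobbles from refresh to refresh, even though nothing stuttered on the render side:

Code: Select all

// Toy illustration of triple-buffer "ultramicrostutter": frames finish at a
// perfectly steady 120fps, but are shown only at fixed 1/144sec vsync
// boundaries, so the age of the displayed frame varies from refresh to refresh.
#include <cstdio>

int main() {
    const double renderPeriod  = 1.0 / 120.0;  // assumed steady game frame rate
    const double refreshPeriod = 1.0 / 144.0;  // fixed 144 Hz refresh

    double lastFinished = 0.0;  // completion time of the newest finished frame
    for (int refresh = 1; refresh <= 6; ++refresh) {
        double vsync = refresh * refreshPeriod;
        // The compositor shows the newest frame completed before this vsync.
        while (lastFinished + renderPeriod <= vsync)
            lastFinished += renderPeriod;
        printf("refresh %d: displayed frame is %.2f ms old\n",
               refresh, (vsync - lastFinished) * 1000.0);
    }
    // The age drifts (6.94, 5.56, 4.17, ... ms) -- with G-SYNC the refresh would
    // instead start the moment each frame finished, keeping the age constant.
}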

VSYNC OFF still reigns supreme for some extreme cases (e.g. 300fps), but the delta is surprisingly small (smaller than most people thought).

Good thoughts and questions though, very worthy of discussion on the Blur Busters Forum --
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: high fps input lag

Post by flood » 21 Dec 2013, 04:43

Chief Blur Buster wrote:Within the GSYNC range of 30fps-144fps, GSYNC is the same as VSYNC OFF to within a 1ms difference (the GSYNC poll time).
Not sure about this. It's theoretically possible but there could potentially be excessive buffering in the drivers or the game itself. And afaik that always seems to be the case, except in some dummy programs I've made.

Generally, double buffering is used when vsync is enabled, and that should have at most 1 refresh period of input lag. But for some reason it seems that there are often 2 or more frames of lag (e.g. Windows Aero's compositor, some OpenGL games on iOS, probably the entire Android system).

gsync is enormously helpful but I'm not convinced it does anything about the other problem associated with vsync, which is excessive buffering in the software and/or driver.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: high fps input lag

Post by Chief Blur Buster » 21 Dec 2013, 12:37

flood wrote:Not sure about this. It's theoretically possible but there could potentially be excessive buffering in the drivers or the game itself. And afaik that always seems to be the case, except in some dummy programs I've made.
I talked to people at NVIDIA, and they confirmed key details.

The framebuffer starts getting transmitted over the cable about 1ms (the GSYNC poll) after the render finishes with the Direct3D Present() call. That means, if your framebuffer is simple, the first pixels are on the cable only about 1ms after Direct3D Present() -- provided the previous call to Present() returned at least 1/144sec ago. Also, the monitor does real-time scanout off the wire (as all BENQ and ASUS 120Hz monitors do). Right now, they are polling the monitor (1ms) to ask if it's currently refreshing or not. This poll cycle is the main source of G-SYNC latency at the moment, but they are working on eliminating this remaining major source of latency (if you call 1ms major!). One way to picture this: "on the cable", the only difference between VSYNC OFF and G-SYNC is that the scanout begins immediately at the top edge, rather than as a splice mid-scan (tearing).

I presume you're already familiar with high speed videos of the sequential top-to-bottom scanning? On current ASUS and BENQ monitors, they do real-time scanout "off the wire", and this continues with the G-SYNC monitor as well.
flood wrote:Generally, double buffering is used when vsync is enabled, and that should have at most 1 refresh period of input lag. But for some reason it seems that there are often 2 or more frames of lag (e.g. Windows Aero's compositor, some OpenGL games on iOS, probably the entire Android system).
Yeah, that's very true. The compositor adds another layer.
In this event, turn OFF VSYNC to eliminate one frame of lag -- tearing never shows up because the compositor is still buffering, but you've actually eliminated one layer by turning VSYNC OFF in windowed mode (or borderless windowed mode).
flood wrote:gsync is enormously helpful but I'm not convinced it does anything about the other problem associated with vsync, which is excessive buffering in the software and/or driver.
From the perspective of the frame transmission cycle, GSYNC is the same as VSYNC OFF, except that the "tearline" occurs off the top edge of the screen: the transmission to the monitor begins there, and the display immediately starts scanning from there, rather than mid-scan (a splice mid-refresh -- with traditional VSYNC OFF you're not able to restart the monitor refresh at the top edge, while GSYNC is able to).

Did you know: during each refresh during VSYNC OFF, the input lag of the image above the tearline is always higher than the input lag of the image below the tearline? The freshness of the control reads is different for each image slice (in games that sample controls every time a frame is rendered). If the tearlines are occurring at random places, the input lag at a given specific point on the screen always varies by 1/fps (e.g. a 1/100th of a second variance in input lag if you're playing at 100fps -- a 10ms variance). The higher the framerate, the less input lag variance with VSYNC OFF. So at 300fps, you've reduced this input lag jitter to only 1/300sec (3.3ms). Other than latency itself, this is also one of the big causes of better aiming at 300fps than at 100fps. So this is another big reason why, if you use VSYNC OFF, things look smoother and feel better at framerates vastly exceeding the refresh rate.
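
The arithmetic behind that 1/fps jitter figure, as a tiny sketch (same numbers as above, nothing else assumed):

Code: Select all

// The lag jitter at a fixed screen point under VSYNC OFF is bounded by one
// frametime, i.e. 1/fps -- these are the same numbers quoted above.
#include <cstdio>

int main() {
    const double fpsValues[] = {100.0, 144.0, 300.0};
    for (double fps : fpsValues)
        printf("VSYNC OFF at %3.0f fps: up to %.1f ms of lag jitter\n", fps, 1000.0 / fps);
    // 100 fps -> 10.0 ms, 144 fps -> 6.9 ms, 300 fps -> 3.3 ms
}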

However, G-SYNC theoretically eliminates this variance completely by starting the monitor scan at a predictable location (the top edge), so 144fps zero-jitter may actually outperform 300fps slight-jitter for certain people -- depending on how they play. The trade-off of less lag versus less jitter is an area that still needs to be scientifically evaluated.

Play testing by competitive gamers will be needed to determine preferences, but I predict G-SYNC will be accepted (lag-wise) in the competitive world, at least for modern games that aren't able to massively exceed the monitor cap. And the monitor rate cap will eventually be raised in the future (e.g. 240Hz monitor experiments, like the one cirthix talked about in Area 51).

Good discussion of the technical aspects!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: high fps input lag

Post by flood » 23 Dec 2013, 15:17

assume I want to run my game fullscreen on a gsync monitor

If my game runs between 30 and 144 fps, the best way would be to turn vsync off and there would be no tearing because of gsync (this is my current understanding of gsync)
If my game runs consistently above 144fps, should I set vsync on or off? Does gsync even do anything here?


The input lag associated with double buffering is another discussion and I think I'll make a new thread about it sometime.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: high fps input lag

Post by Chief Blur Buster » 23 Dec 2013, 15:50

flood wrote:assume I want to run my game fullscreen on a gsync monitor
If my game runs between 30 and 144 fps, the best way would be to turn vsync off and there would be no tearing because of gsync (this is my current understanding of gsync)
Correct. Just turn VSYNC OFF in your games.
flood wrote:If my game runs consistently above 144fps, should I set vsync on or off? Does gsync even do anything here?
It'll likely be a personal preference, and will probably also depend on the game engine, and what framerate you could have achieved, too. There are pros and cons, even at the competitive level.

It's not a "You suddenly get input lag once you hit 144fps" scenario for GSYNC. The input lag gradually appears the frame gets rendered faster and faster than 1/144sec. Basically if you can render the frame too fast (e.g. 1/300sec) the input lag is from the delta between 1/300sec (rendering of the keyboard/mouse input reads) and 1/144sec (actual display of refresh). So if your frames are rendered in 1/150sec, for a proper GSYNC implementation, your input lag delta is only (1/144) - (1/150) = 0.27ms lag = 278 microseconds lag between VSYNC OFF 150fps and GSYNC 144fps. Not even a competitive game player would care about that.

The input lag deltas would be:
Frame rendertime 1/150sec = VSYNC OFF 150fps versus GSYNC 144fps = ((1/144) - (1/150)) = 0.27ms delta
Frame rendertime 1/175sec = VSYNC OFF 175fps versus GSYNC 144fps = ((1/144) - (1/175)) = 1.2ms delta
Frame rendertime 1/200sec = VSYNC OFF 200fps versus GSYNC 144fps = ((1/144) - (1/200)) = 1.9ms delta
Frame rendertime 1/250sec = VSYNC OFF 250fps versus GSYNC 144fps = ((1/144) - (1/250)) = 2.9ms delta
Frame rendertime 1/300sec = VSYNC OFF 300fps versus GSYNC 144fps = ((1/144) - (1/300)) = 3.6ms delta
Then add the 1ms GSYNC poll time (this may actually get removed, depending on how NVIDIA's work is going).
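
If you want to reproduce that table, the arithmetic is just (1/144) - (1/fps); here's a minimal sketch (the fps list and the 1ms poll figure come from the numbers above):

Code: Select all

// Reproduces the deltas above: the extra lag of G-SYNC at its 144fps ceiling,
// versus VSYNC OFF at a higher frame rate, is (1/144 - 1/fps), plus the ~1ms poll.
#include <cstdio>

int main() {
    const double refreshHz = 144.0;
    const double pollMs    = 1.0;  // NVIDIA-quoted G-SYNC poll time
    const double fpsValues[] = {150, 175, 200, 250, 300};

    for (double fps : fpsValues) {
        double deltaMs = (1.0 / refreshHz - 1.0 / fps) * 1000.0;
        printf("VSYNC OFF %3.0f fps vs GSYNC 144 fps: %.2f ms delta (%.2f ms with poll)\n",
               fps, deltaMs, deltaMs + pollMs);
    }
}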

One solution is to simply cap your frame rate slightly below the GSYNC maximum rate, such as an fps_max of 143 (game-engine capping of your framerate). That way, good game engines keep input reads fresh and refreshes are never delayed. Basically, a good game reads keyboard/mouse input, then renders the frame (immediately after reading input), then immediately refreshes (immediately after rendering). This avoids the progressively-increasing input latency effect of a too-fast GPU that reads input and renders the next frame very early while the monitor is still refreshing the previous frame, forcing a longer wait whenever rendertimes are very short. So a framerate limiter below the max refresh rate works well with GSYNC in games that read input every time a frame is rendered. Game engines that read input separately from refreshes (e.g. synchronous 250Hz input reads, far above the GSYNC maximum) may actually work better with VSYNC OFF and will not see as much latency reduction from software-based frame rate capping. In that situation, sometimes VSYNC OFF is better (but you then also have to live with the VSYNC OFF microstutters).
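
Here's a minimal sketch of that loop order with a software frame cap (the 143fps cap matches the fps_max example above; the input/render/present stubs are placeholders, not a real engine's API):

Code: Select all

// Sketch of "read input -> render -> present immediately", capped just below
// the G-SYNC ceiling so the monitor is always ready and never makes us wait.
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;
using namespace std::chrono;

constexpr auto kFrameBudget = microseconds(1000000 / 143);  // fps_max 143

void readInput()   { /* poll mouse/keyboard */ }
void renderFrame() { /* draw into the back buffer */ }
void present()     { /* stand-in for Present(); with G-SYNC below the cap it refreshes at once */ }

int main() {
    auto nextFrame = Clock::now();
    for (int frame = 0; frame < 600; ++frame) {
        readInput();     // freshest possible input, read right before rendering
        renderFrame();   // render immediately after the input read
        present();       // refresh immediately after rendering (no vsync wait below 144fps)

        // Burn the leftover time *after* present and *before* the next input read,
        // so the wait never sits between sampling input and displaying it.
        nextFrame += kFrameBudget;
        std::this_thread::sleep_until(nextFrame);
    }
}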
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: high fps input lag

Post by flood » 23 Dec 2013, 16:04

wait so

If my game runs at 234 fps and I set vsync off, will it tear on a gsync monitor?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: high fps input lag

Post by Chief Blur Buster » 23 Dec 2013, 16:23

flood wrote:If my game runs at 234 fps and I set vsync off, will it tear on a gsync monitor?
No, to clarify, it will be capped to 144fps due to GSYNC behaviour.

This situation becomes
Frame rendertime 1/234sec = VSYNC OFF 234fps versus GSYNC 144fps = ((1/144) - (1/234)) = 2.7ms delta

What this means: "You will have 2.7ms more input lag with GSYNC (144fps) than with VSYNC OFF (234fps), if your game is rendering frames in 1/234sec."

(NOTE: Including the GSYNC poll time (1ms) -- that would be about 3.7ms -- this may be removed in the future)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: high fps input lag

Post by flood » 23 Dec 2013, 16:57

Chief Blur Buster wrote:
flood wrote:If my game runs at 234 fps and I set vsync off, will it tear on a gsync monitor?
No, to clarify, it will be capped to 144fps due to GSYNC behaviour.
how does gsync cap it?
The program renders, swaps the front and back buffers immediately, and continues to render...

Post Reply