high fps input lag [if GSYNC caps out?]

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutters, tearing, and reduces input lag. List of G-SYNC Monitors.

Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 03 Jan 2014, 22:29

RealNC wrote:With G-Sync, you would set a framerate target of whatever the vertical sync rate of the monitor is (144FPS for 144Hz monitors.) This is done through the internal frame limiter in the driver, which is controlled by stuff like EVGA Precision, MSI Afterburner or NVidia Inspector. That way, you'll never get input lag.
Actually, that's not always true -- there's an explanation for why external frame capping increases input lag (VSYNC ON is a famous example of an external cap on framerate). Ideally, you should use the in-game frame cap to reduce GSYNC input lag, not the driver's.

Game Frame Capping
The game waits until the last minute before reading input, does fresh input reads [INPUT LAG CHAIN BEGIN], then renders/presents, and the display immediately refreshes [INPUT LAG CHAIN END].

Driver Frame Capping
The game does an input read [INPUT LAG CHAIN BEGIN], then renders/presents, then the driver throttles the frame rate [ADDS EXTRA LAG], then the display immediately refreshes [INPUT LAG CHAIN END].

That's assuming the game engine does input read right before render time.
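
To make the distinction concrete, here is a minimal single-threaded loop sketch of the idea (hypothetical: readInput/simulate/render/present are placeholder stubs, not any real engine's API; the point is only where the wait sits relative to the input read):

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Placeholder stubs for a hypothetical engine -- not a real API.
void readInput() {}
void simulate()  {}
void render()    {}
void present()   {}

void gameLoop(double targetFps) {
    const auto frameTime = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / targetFps));
    auto nextFrame = Clock::now();
    while (true) {
        // In-game cap: the wait happens BEFORE the input read,
        // so the input sample below stays as fresh as possible.
        std::this_thread::sleep_until(nextFrame);
        nextFrame += frameTime;

        readInput();   // [INPUT LAG CHAIN BEGIN]
        simulate();
        render();
        present();     // display refreshes [INPUT LAG CHAIN END]
        // A driver-level cap instead throttles here, after present(),
        // inserting its delay between the input read and the refresh.
    }
}

int main() { gameLoop(144.0); } // runs forever -- it's only a sketch
```
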
RealNC wrote:Actually, I'm surprised that NVidia doesn't do that by default when you enable G-Sync.
Because driver-based frame capping has more lag than game-engine-based frame capping, especially with an architecture like GSYNC.



Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 03 Jan 2014, 22:36

Q83Ia7ta wrote:So if we compare, for example, the monitor Asus VG248QE at 144Hz and the game QuakeLive at a maximum of 250fps, we will have:
GSYNC: monitor's input lag + 2.9ms of input lag from GSYNC;
LightBoost: monitor's input lag + ~4ms of input lag from LightBoost;
VSYNC OFF: monitor's input lag.
Right?
Not exactly. Explaining, alas, will take time.

LightBoost input lag is part of the monitor's input lag, and the lag differential of VSYNC OFF versus VSYNC ON behaves differently with strobed versus non-strobed displays (due to the all-at-once presentation of LightBoost). The deltas at the top edge of the screen (differences in input lag between different modes) are also very different from the deltas at the bottom edge of the screen. Needless to say, I understand what is going on, but explaining it would almost require writing a 100-page book! Over time, I'll certainly post information that gets people closer and closer to understanding input lag -- much like I've successfully helped explain LightBoost/strobing. New TestUFO animations will be forthcoming over the coming months to help educate people on VSYNC behavior, too -- much as I've used these animations to explain persistence/strobing/stutters/etc.

To top it off, one needs to factor in microstutter errors.
VSYNC OFF microstutter -- typically a consistent (1/framerate) second
GSYNC cap-out issue -- see the deltas illustrated on the previous page.

But for now, to get closer to understanding "the chain" -- study AnandTech's "Exploring Input Lag Inside And Out".
http://www.anandtech.com/show/2803/7

[Images: input lag pipeline diagrams from the AnandTech article]
Q83Ia7ta wrote:Will the Asus VG248QE with GSYNC module have less input lag when GSYNC and VSYNC are off?
By approximately 1 millisecond (or two).

Stay tuned.



Re: high fps input lag [if GSYNC caps out?]

Post by Chickenfeed » 04 Jan 2014, 11:07

Very informative stuff, Mark :) I didn't realize that capping frame rate externally via drivers added extra latency (I've been running driver-capped at refresh rate sans vsync for some time).

That said, few games allow engine-side capping (mostly Unreal and Source engine titles), so overall, driver capping has been the most consistent option for me. I've still preferred this to vsync so far, though, as I'm able to easily cap out at 120fps in most things. The game I play the most (Guild Wars 2) ironically isn't one of them. It's extremely CPU-limited, doesn't scale well with SLI, and has extremely variable performance, so it's the game I'm most hopeful about regarding G-Sync (Arma 3 as well, given how demanding it is).

Thanks for all the good info.
Feedanator 6.0
CASE:FT02B|PSU:AX850|CPU:i7 4770K|Mobo:Z87 Pro|Ram:8x2GB GSkill 2400|GPU:SLI GTX 780 3GB|HD:M4 128GB+256GB/F1 1TB|SOUND:Titanium HD,PC350|LCD:VG248QE/QH270-lite|OS:Win8 x64 Pro|INPUT:SS Sensei /w HD9/DasUltSilent|COOLING:H80


Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 04 Jan 2014, 17:44

Chickenfeed wrote:I didn't realize that capping frame rate externally via drivers added extra latency (I've been running driver-capped at refresh rate sans vsync for some time)
This may actually be negligible in many games. At 120Hz, one refresh cycle is about 8 milliseconds, and if input reads are done asynchronously from the game's frame output, the latency difference between driver capping and game capping may be negligible.

However, if a game has a proper framerate limiter, it always has less input lag than driver-based frame rate throttling (e.g. NVInspector) or display-based frame rate throttling (e.g. double-buffered VSYNC ON), because the game has full control over keeping input reads (keyboard/mouse/joystick) fresh.



Re: high fps input lag [if GSYNC caps out?]

Post by nimbulan » 04 Jan 2014, 19:13

With all this discussion about vsync on vs off input lag (250fps off vs 125fps on), two questions keep coming to mind:

How does the decreased input lag benefit the player when it is not visible? That is, you still need to wait for the screen refresh to see the result, so my understanding is that the perceived input lag can never be reduced below one refresh cycle.

Now I do understand that the game client could respond, so pressing the trigger could allow your gun to fire 1/250th of a second faster, but that brings me to my second question:

Why does that matter if the game server can't even utilize input that quickly? Game servers are generally designed to run at a lower tickrate due to performance/bandwidth concerns. Even Counter-Strike has historically been played at a tickrate of 100 for competitive play, with the new Counter-Strike: Global Offensive capping out at 128 (though many servers run at 64). I'd be very surprised if other games such as Battlefield run any higher than 64 ticks. Do Quake Live servers actually function at 250 ticks or is it just the client that is unlocked to 250?


Re: high fps input lag [if GSYNC caps out?]

Post by crun » 04 Jan 2014, 19:57

nimbulan wrote:I'd be very surprised if other games such as Battlefield run any higher than 64 ticks.
Battlefield 4 has a 10Hz tickrate... Battlefield 3 had a variable 10-30Hz tickrate (not sure about that one)


Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 04 Jan 2014, 19:58

nimbulan wrote:With all this discussion about vsync on vs off input lag (250fps off vs 125fps on), two questions keep coming to mind: How does the decreased input lag benefit the player when it is not visible?
Even if you can't see the input lag, the improved responsiveness is still felt. It's very possible to feel the difference between 125fps and 250fps, in the form of better accuracy, better responsiveness, and fewer errors:

(1) Fast 180-degree flicks can involve a mouse moving at 4000 pixels/second (example). At 250fps, the aliasing error is 1/250th of that, rather than 1/125th (e.g. a 16-pixel overshoot rather than a 32-pixel overshoot).

(2) Microstutters during VSYNC OFF are smaller at 250fps than at 125fps. Even at a slow screen-panning speed of half a screen-width per second (~1000 pixels/second), a 1/125 microstutter can mean a stutter range of 8 pixels of onscreen-object-versus-eye-tracking mispositioning (1000 / 125 = 8), while a 1/250 microstutter can mean a stutter range of only 4 pixels (1000 / 250 = 4). Reduced microstutter means better accuracy during fast-aiming maneuvers.

(3) The "snappier" factor: a human gets used to aiming with a specific input lag -- for example, flicking 180 degrees and then quickly settling the crosshairs on a target. You get used to a specific response speed. If you suddenly add or subtract 5ms of input lag, this can manifest as a slight increase in aiming errors (e.g. overshooting by, say, 16 pixels versus 32 pixels, before re-aiming). Equipment changes mid-game can throw you off because your aiming got used to a specific input lag. It's much easier to adapt when input lag is falling than when it is rising, however. Not all games become snappier at higher frame rates, but a lot of games do, especially when input reads happen very close to rendering.

(4) It's almost negligible, but the "feel" of better aiming is widely reported by huge numbers of competitive gamers at ultrahigh framerates. Personally, I have been able to determine that 300fps VSYNC OFF does feel "snappier". I overshoot my targets a little less when I'm flicking 180 degrees and instantly aiming right afterward. Such rapid maneuver sequences push the limits of "snappiness". 32 pixels (4000 pixels/sec divided by 125) on my monitor is the width of a pinky finger -- the difference between correctly aimed and not correctly aimed.
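
For what it's worth, here is the overshoot arithmetic from points (1) and (4) as a quick sketch (the 4000 pixels/second figure is the example number from this post):

```cpp
#include <cstdio>

int main() {
    const double mouseSpeed = 4000.0; // px/sec during a fast 180-degree flick
    for (double fps : {125.0, 250.0}) {
        // One frame of sampling (aliasing) error at this framerate:
        double overshootPx = mouseSpeed / fps;
        std::printf("%.0f fps -> up to %.0f px overshoot\n", fps, overshootPx);
    }
    // Prints: 125 fps -> up to 32 px overshoot
    //         250 fps -> up to 16 px overshoot
}
```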

Based on personal experience, and the experience of competitive gamers, Blur Busters completely believes in the scientific basis of "5ms does matter to some". The key word is "some". It usually does not matter to most, but for some it's perceptible, useful, and non-negligible. In the amateur leagues, differences in skill level and consistency vary so much that this is not measurable. Also, many game engines vary a lot in input lag, so milliseconds of input lag difference are lost in the noise floor of variability (skills, skill levels, networking, etc).

Scientific tests are sorely needed, but public funding doesn't like to pay for impartial video game tests -- how would you like your taxpayer funds to pay for scientific video game studies? Also, many websites don't yet have the scientific equipment to test for these aspects (while Blur Busters is slowly figuring out ways to measure this over the coming months/years).
nimbulan wrote:That is, you still need to wait for the screen refresh to see the result, so my understanding is that the perceived input lag can never be reduced below one refresh cycle.
Depends on when the input read occurs relative to the current screen-scanout timing.

Back in the 1980s, it was possible to achieve sub-frame input lag by doing an input read a few raster lines above a sprite (e.g. in a karate game), then moving the sprite based on that read, so the sprite displayed at its new position within the same refresh. But mid-refresh input reads are not routinely done today (except in certain games, and only during VSYNC OFF, especially when framerates run hugely in excess of the refresh rate).

As displays refresh top-to-bottom (raster scanning behavior): theoretically, you can do an input read while the display is refreshing scan line #200, have an ultrafast GPU render quickly enough before the display finishes refreshing scan line #500, then begin immediately transmitting the frame, and finally have the reaction occur where the crosshairs are (scan line #540 -- the center of a 1080p screen), which is where the human game player is staring. That's assuming a zero-buffered display (e.g. a CRT or an ultrafast TN gaming LCD).

So you see, nothing stops game makers from achieving sub-frame input latencies in a highly optimized game engine. In real life, you've got other latency sources preventing this (e.g. mouse latency, game code latency, display cable transmission latency such as DisplayPort micropackets, LCD pixel response speed, etc). However, nothing prevents a game from achieving sub-frame input lag if it does input reads right at render time.
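
Here is a rough sketch of the scanout arithmetic in that example (assuming ~1080 active scan lines, and ignoring blanking intervals, cable latency, and pixel response for simplicity):

```cpp
#include <cstdio>

int main() {
    const double refreshHz   = 144.0;
    const double activeLines = 1080.0; // blanking ignored for simplicity
    const double usPerLine   = 1e6 / (refreshHz * activeLines);

    const int inputReadLine = 200;  // input sampled mid-scanout
    const int crosshairLine = 540;  // center of a 1080p screen

    // Time available to render and transmit before the scanout
    // reaches the crosshair area within the same refresh cycle:
    double budgetMs = (crosshairLine - inputReadLine) * usPerLine / 1000.0;
    std::printf("Sub-frame render budget: %.2f ms\n", budgetMs); // ~2.19 ms
}
```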

nimbulan wrote:Now I do understand that the game client could respond, so pressing the trigger could allow your gun to fire 1/250th of a second faster, but that brings me to my second question: Why does that matter if the game server can't even utilize input that quickly? Game servers are generally designed to run at a lower tickrate due to performance/bandwidth concerns. Even Counter-Strike has historically been played at a tickrate of 100 for competitive play, with the new Counter-Strike: Global Offensive capping out at 128 (though many servers run at 64). I'd be very surprised if other games such as Battlefield run any higher than 64 ticks. Do Quake Live servers actually function at 250 ticks or is it just the client that is unlocked to 250?
Reacting 1ms faster can put your input read into the previous tick instead of the next tick. So even at 100 ticks per second (10ms per tick), every 1ms of improvement in input lag adds a 1-in-10 chance that a random event rounds off to an earlier tick.
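
A tiny Monte-Carlo sanity check of that rounding claim (a hypothetical simulation, assuming event timing is uniformly random within a 10ms tick):

```cpp
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    const double tickMs = 10.0, lagSavedMs = 1.0; // 100 ticks/sec, 1ms faster
    std::uniform_real_distribution<double> when(0.0, tickMs);

    int earlier = 0;
    const int trials = 1000000;
    for (int i = 0; i < trials; ++i) {
        // An event crosses into the previous tick if it happened within
        // lagSavedMs of the tick boundary before the improvement.
        if (when(rng) < lagSavedMs) ++earlier;
    }
    std::printf("%.1f%% of events moved one tick earlier\n",
                100.0 * earlier / trials); // ~10.0%
}
```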

It's my understanding that Quake Live runs at 125 ticks per second, even at 250fps.
The delta between 1/125 and 1/250 is a 4ms difference (8ms versus 4ms). I'm not sure what techniques Quake Live uses to level the playing field between 125fps and 250fps, but let's say they successfully levelled it. In that case, there's no input lag difference at the game server side. But a faster display will still bypass this, e.g. upgrading from a 60Hz to a 120Hz display, or even from 120Hz to 240Hz (cirthix-style), or upgrading your mouse from 125Hz to 1000Hz (8ms to 1ms poll interval).

This affects your apparent reaction time independently of the game server, and you still get increased chances of your input read being rounded to the previous tick.

Even if Quake Live levels latency for 125fps vs 250fps (no game-world latency difference, via playfield-levelling coding techniques), you still get an advantage from the responsiveness improvements described above.



Re: high fps input lag [if GSYNC caps out?]

Post by Q83Ia7ta » 04 Jan 2014, 22:07

nimbulan wrote:How does the decreased input lag benefit the player when it is not visible? That is, you still need to wait for the screen refresh to see the result, so my understanding is that the perceived input lag can never be reduced below one refresh cycle.
For example, let's take LightBoost and a fast-paced FPS like QuakeWorld (multiplayer Quake 1) or QuakeLive. Most players (70-90%) prefer non-strobed 144Hz/120Hz over LightBoost because the tiny input lag is noticeable. Nobody has measured and published this input lag value. It's not easy to do, because it needs quite unique tools.
nimbulan wrote:Why does that matter if the game server can't even utilize input that quickly? Game servers are generally designed to run at a lower tickrate due to performance/bandwidth concerns. Even Counter-Strike has historically been played at a tickrate of 100 for competitive play, with the new Counter-Strike: Global Offensive capping out at 128 (though many servers run at 64). I'd be very surprised if other games such as Battlefield run any higher than 64 ticks. Do Quake Live servers actually function at 250 ticks or is it just the client that is unlocked to 250?
First of all, almost all calculations are client-side. QuakeLive's server tickrate is 40 (yes, only 40 times per second). But at the same time, the client sends 125 packets per second to the server while showing 250fps to the player. There are lag compensation algorithms in multiplayer games...


Re: high fps input lag [if GSYNC caps out?]

Post by flood » 05 Jan 2014, 00:09

nimbulan wrote: How does the decreased input lag benefit the player when it is not visible?
The player feels more connected to the screen.
I'm not sure about changes around 5ms, but I know for a fact that there is a significant change in feel when going from 15ms to 0ms.


Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 05 Jan 2014, 01:50

Q83Ia7ta wrote:First of all, almost all calculations are client-side. QuakeLive's server tickrate is 40 (yes, only 40 times per second). But at the same time, the client sends 125 packets per second to the server while showing 250fps to the player. There are lag compensation algorithms in multiplayer games...
Thanks for the correction; tickrate isn't necessarily packet rate.
flood wrote:I'm not sure about changes around 5ms, but I know for a fact that there is a significant change in feel when going from 15ms to 0ms.
It varies a lot between game engines with more whole-chain lag and high variability, and game engines with less whole-chain lag and low variability.

Two extreme cases:

Game engines such as Battlefield 3/4 running at only 50fps, using a 10Hz-30Hz tickrate for network gameplay, with huge variances in input lag: one gunshot gets 72ms of whole-chain input lag (buttonpress-to-eyeballs), the next gunshot 93ms, and the one after that 87ms. In this case, you need a substantial improvement (e.g. 30ms) to really notice.

Other (older) game engines running VSYNC OFF at ultrahigh framerates (e.g. 300fps+), on a 144Hz+ display, at a much higher tickrate (e.g. 100 or even 200), on a very powerful GPU, with very consistent gunshot-to-gunshot times (e.g. one gunshot gets a whole-chain input lag of 15ms, the next gunshot 17ms, the next 14ms). In this case, a 5ms change in input lag can begin to be barely felt by elite players in an indirect manner ("hey, my aim is worse" to "hey, my aim is better") every time the lag increases/decreases by 5ms. See my previous reply about a gamer getting used to a specific input lag, and gamers beginning to overshoot/miss if you suddenly change the input lag on them.
-> Even 5ms of aim overshoot during a 4000 pixels/second turning movement is a 20-pixel error! (5/1000ths of 4000 equals 20)
-> A 5ms improvement can mean the screen reaction occurs almost one refresh cycle sooner on a 144Hz monitor (1/144 = 6.9ms)
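
And a quick check of the two arrows above (same assumed numbers as earlier in the thread):

```cpp
#include <cstdio>

int main() {
    const double lagDeltaSec = 0.005;  // a 5ms change in input lag
    const double turnSpeed   = 4000.0; // px/sec during a fast flick

    // Aim error introduced by the lag change alone:
    std::printf("Overshoot: %.0f px\n", lagDeltaSec * turnSpeed);  // 20 px

    // One refresh cycle at 144Hz, for comparison:
    std::printf("144Hz refresh cycle: %.1f ms\n", 1000.0 / 144.0); // ~6.9 ms
}
```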

