high fps input lag [if GSYNC caps out?]

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: high fps input lag [if GSYNC caps out?]

Post by nimbulan » 05 Jan 2014, 03:19

Thank you for all that information, I certainly wasn't expecting such lengthy replies.

I do understand the point about server timing and the possibility of a command being registered in an earlier tick. Does Battlefield 4 really only run at 10 ticks per second? That would definitely explain the laggy hit markers.

The high framerate input lag issue is still rather fuzzy to me. The only way I see it reducing perceived input lag is for the second frame drawn during a refresh to take up the majority of the screen (so that frame contains the crosshair) which you can't guarantee, and that still leaves the issue of screen tearing causing variable input lag across the screen. In the other case, with the first frame drawn during a refresh taking up the majority of the screen, the input lag should be the same as vsync at half the framerate. So what it sounds like to me is rather than simply reducing input lag, you are introducing input lag variability equal to half the refresh rate depending on the location of the screen tear which should have a negative effect on accuracy, not a positive one. Am I completely missing the point here? It honestly sounds like a placebo effect to me (i.e., the player's aim improves because they know their framerate is higher, not because of the higher framerate). I would really like to see a double-blind scientific study on the matter, though like you said, it is unlikely to happen.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 05 Jan 2014, 04:12

nimbulan wrote:The high framerate input lag issue is still rather fuzzy to me. The only way I see it reducing perceived input lag is for the second frame drawn during a refresh to take up the majority of the screen (so that frame contains the crosshair) which you can't guarantee, and that still leaves the issue of screen tearing causing variable input lag across the screen.
Let's say tearline positions occur randomly. Tearlines that occur just above the crosshairs mean the crosshairs have less lag than tearlines that occur further above the crosshairs. Higher framerates mean more tearlines, which means more tearlines landing immediately above the crosshairs, which means freshly rendered frames about to be "scanned down" towards the crosshairs location. Voila. Reduced input lag.

We're assuming a game engine that does input reads at very high frequencies, or input reads at render time. It all depends on how the game decouples input reads (e.g. separate thread) from rendering.
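
As a rough sketch of the idea (my own simplified model, not measured data -- it assumes buffer flips land at random phase relative to the scanout, and input reads happen at render time):

Code:
# Simplified Monte Carlo model (illustrative only): with VSYNC OFF,
# the frame shown at the crosshairs was flipped Uniform(0, 1/fps)
# seconds before the scanout passed that row, so its average "age"
# is about 1/(2*fps). Higher framerate = fresher crosshair pixels.
import random

def avg_crosshair_age_ms(fps, trials=100_000):
    return sum(random.uniform(0, 1.0 / fps) for _ in range(trials)) / trials * 1000

for fps in (72, 144, 300, 600):
    print(f"{fps:4d} fps -> ~{avg_crosshair_age_ms(fps):.2f} ms average frame age")
# prints roughly 6.9, 3.5, 1.7, 0.8 ms -- halving as framerate doubles
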
nimbulan wrote:So what it sounds like to me is rather than simply reducing input lag, you are introducing input lag variability equal to half the refresh rate depending on the location of the screen tear which should have a negative effect on accuracy, not a positive one.
No. It averages out to a positive effect.

That said, an extra consideration arises: stutters and stationary/rolling tearline effects caused by harmonics between framerate and refresh rate. 119fps @ 120Hz will have a slowly-rolling tearline effect that creates one wiggly stutter per second. But once you're away from harmonics and tearline positions begin to look random again, the average net benefit becomes more positive the higher the framerate you go. The higher the framerate, the less stuttery tearing looks. e.g. tearing looks noticeably less microstuttery at 400fps@144Hz than at 200fps@144Hz (these numbers are intentionally away from noticeable harmonic effects). This is because moving object positions vary away from the correct position by 1/400th rather than 1/200th, in the aliasing between frame render timing, frame presentation timing, and current eye tracking position. Input lag can scale too, provided you're doing at least as many input reads as the framerate (e.g. 400 input reads during 400fps). Other game engines will level the playing field by reading input at a tick rate, etc. That doesn't fix the VSYNC OFF microstutters, which clearly go down the higher the framerate goes above the refresh rate.
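
To put a number on the harmonic effect (back-of-envelope arithmetic only):

Code:
# Near a harmonic (fps close to a whole multiple of the refresh rate),
# the tearline crawls instead of landing randomly. The crawl repeats
# about |fps - k*Hz| times per second, where k is the nearest multiple.
def tearline_beat_hz(fps, hz):
    k = round(fps / hz)        # nearest harmonic multiple of the refresh rate
    return abs(fps - k * hz)   # rolling-tearline cycles per second

print(tearline_beat_hz(119, 120))     # 1 -> one slow roll per second
print(tearline_beat_hz(240.37, 120))  # ~0.37 -> slowly vibrating tearline
print(tearline_beat_hz(142, 144))     # 2 -> two stutters per second
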
nimbulan wrote:Am I completely missing the point here?
For some game engines, yes. (especially low tickrates, input reads done independently of refresh rate)
For other game engines, no. (especially if input reads at least match framerate)

With a 1000Hz mouse, you have 1ms granularity between input reads, so benefits are there.

Separately (not an input lag factor, but a microstutter factor): during turning left/right, I can easily tell the difference between VSYNC OFF 200fps@144Hz and VSYNC OFF >400fps@144Hz. I can feel it myself, being a non-competitive casual gamer (but very stutter/motion-blur sensitive). The reduced stutter is definitely NOT a placebo effect -- technically, for something like this, it's a sucker bet to me: I would bet a large amount of money that this is true, and easily win the bet, assuming certain variables and parameters are met (e.g. mouse input reads within 1ms of rendertime). Also, strobed displays (e.g. LightBoost) make stutters/tearing easier to see, so detectability is higher; LightBoost displays have only about 1 pixel of motion blurring during 1000 pixels/second motion (see Photos: 60Hz vs 120Hz vs LightBoost). This is not the input lag part, though, and strobing does very slightly increase input lag in exchange for improved motion clarity.

Meanwhile, I'd love any visiting scientists to do a double-blind study. We need to see more science being done here. Alas, publicly funded science on gaming technology is not normally done, especially with complications such as VSYNC OFF (an invention mainly taken advantage of in 3D video gaming, and not normally accounted for in most science papers).
nimbulan wrote:It honestly sounds like a placebo effect
No it isn't.

Though, skill will definitely compensate for latency differences -- competitive gamers often use 60Hz displays at events, and game engines have leveled the input lag playing field somewhat.

Pro gamers routinely gain a bigger advantage playing at higher framerates than by upgrading to a higher-refresh-rate monitor. Better response is found by upgrading 60fps@60Hz -> 300fps@60Hz (upgraded framerate, constant refresh rate) than by upgrading 120fps@60Hz -> 120fps@120Hz (constant framerate, upgraded refresh rate). Ideally, you want to do both at the same time (e.g. upgrade your framerate and refresh rate simultaneously), but if you were competing in pro leagues and were forced to choose one upgrade over the other, more framerate can be preferred even when framerates are already higher than the refresh rate.

Among competitive game players, better scores do often occur at 300fps@60Hz than at 60fps@60Hz, because of better, snappier mouse response. Input reads are fresher; you have GPU rendertimes of 1/300sec rather than 1/60sec, even if your monitor is only 60Hz. And on zero-buffered displays (current ASUS/BENQ 120Hz/144Hz monitors are zero-buffer displays), what's actually displayed at the crosshairs is a frame the GPU produced only 1/300sec ago, because the tearline (buffer flip) occurred 1/300sec above the crosshairs, and the display took less than 1/300sec to scan out downwards to the crosshairs location. Yes, tearlines occur randomly, but because there are 300 frames per second, the tearlines are 1/300sec apart, and the display scanning takes only 1/300sec to scan from the previous tearline to the next. (Assuming each frame took exactly 1/300sec to render.)

Now if you're doing 600fps @ 60Hz, you've got only 1/600sec between tearlines, and the tearline just above the crosshairs is also only 1/600sec old (e.g. at 600fps, there are about 10 tearlines per 60Hz refresh cycle, so there always ends up being a tearline within 1/10th of the display height above the crosshairs, which always takes 1/600th of a second to scan out towards the crosshairs location).
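
The same arithmetic, generalized (a sketch that assumes a zero-buffer display scanning top to bottom in one refresh period, with perfectly even frametimes):

Code:
# Tearline spacing and worst-case frame age at the crosshairs,
# per the 300fps/600fps @ 60Hz examples above.
def tearline_stats(fps, hz):
    tearlines_per_refresh = fps / hz   # tearlines per refresh cycle
    spacing = hz / fps                 # screen heights between tearlines
    worst_age_ms = 1000.0 / fps        # max scanout time, tearline to crosshairs
    print(f"{fps}fps@{hz}Hz: {tearlines_per_refresh:.0f} tearlines/refresh, "
          f"{spacing:.2f} screen heights apart, <= {worst_age_ms:.2f} ms old")

tearline_stats(300, 60)   # 5 tearlines/refresh, 0.20 apart, <= 3.33 ms
tearline_stats(600, 60)   # 10 tearlines/refresh, 0.10 apart, <= 1.67 ms
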

There are up to as many tearlines per second as there are frames per second (although some of them are sometimes hidden, because they happen to take place during the blanking interval). Note that tearline offsets become half the size at double the framerate, so tearlines become progressively fainter at higher framerates. Less skew (smaller disjoints), thus harder to see.

Obviously, this excludes a lot of the input lag chain (e.g. mouse USB cable latency, other game processing, multiple buffer layers, DisplayPort cable latency, pixel response, etc.), but hopefully this explains the concept better.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: high fps input lag [if GSYNC caps out?]

Post by nimbulan » 05 Jan 2014, 04:54

I'm pretty sure I understand the point you're trying to make now. Turning vsync off and drastically increasing your framerate (2x or more) past your refresh rate reduces input lag, but it also introduces stuttering and some variability in input lag. It doesn't sound like a good tradeoff to me but it definitely seems to be an effect I need to test myself to get a full understanding of it. I probably need more time to get used to my new mouse and having a 120 Hz monitor before I try any tests like that though.

It does make me wonder why there isn't a vsync method that allows 2 frames (or even more) per refresh to reduce input lag. Would that just be too difficult to implement?

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 05 Jan 2014, 04:54

Also, I just ran high speed video input lag tests that definitively confirm CS:GO has less input lag with uncapped VSYNC OFF (framerate massively exceeding refresh rate) than with capped VSYNC OFF (e.g. fps_max 142). This is why I consider it a sucker bet if someone wants to bet money with me that there's no benefit to excess framerate.

Some of these findings are being written up as Part #2 of my GSYNC article, as high speed video tests of the input lag of VSYNC OFF versus GSYNC. I don't think any other website has ever done this before -- measuring the input lag of GSYNC this way.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 05 Jan 2014, 04:55

nimbulan wrote:I'm pretty sure I understand the point you're trying to make now. Turning vsync off and drastically increasing your framerate (2x or more) past your refresh rate reduces input lag, but it also introduces stuttering and some variability in input lag.
Actually, that's not what I meant.

To clarify, there's actually always stuttering at all framerates with VSYNC OFF, below or above refresh rate. Tearing always creates microstutter effects, which also varies depending on the harmonic effects. You do reduce microstutter a lot when you hit multiples (e.g. 120fps@120Hz, or 240fps@120Hz, etc) but you introduce the harmonics such as the stationary/rolling tearline effect. Sustained visibility of tearlines is far more noticeable than single-frame visibility of tearlines.

There's always variability in input lag at all frame rates. Variability of input lag goes down, the higher the framerate you go, so that isn't a tradeoff.

For many games that do high-frequency input reads (e.g. properly responsive with a 1000Hz mouse, at least locally for turning/aiming) or input reads synchronous with rendering, there are no tradeoffs to using a higher framerate with VSYNC OFF -- unless you are used to playing at the harmonics (for the improved fluidity, at the tradeoff of the stationary tearline effect).
nimbulan wrote:It does make me wonder why there isn't a vsync method that allows 2 frames (or even more) per refresh to reduce input lag. Would that just be too difficult to implement?
That creates undesirable harmonic effects. 240fps @ 120Hz creates two stationary tearlines, since they recur in the same positions in subsequent refresh cycles.

You could essentially do what you want by doing VSYNC OFF, then setting the game's own framerate cap (e.g. fps_max 240). But try it out: use a fast GPU such as a Titan or R9 290X, and run an older game such as Counter-Strike or Half-Life. Strafe sideways in front of bright/dark vertical edges (or poles). Witness stationary, near-stationary, vibrating, and rolling tearlines. That's the harmonic effect you're witnessing.

To help conceptually explain this better -- remember that each scanline is output at a constant rate. At 1080p@120Hz, you're outputting about (1080 lines of pixels + 45 lines of blanking interval) x 120Hz = 135,000 scan lines per second = 135KHz horizontal scan rate. One row of pixels is transmitted from computer to monitor every 1/135,000 second in this situation, and zero-buffered displays output these rows of pixels immediately (scanout). Each refresh cycle is 1125 scan lines (1080 visible, plus 45 lines of vertical blanking interval, also known as vertical sync, which is where "VSYNC" comes from). You see these numbers as "Vertical Total" in NVIDIA Custom Resolution or the ToastyX Custom Resolution Utility. If you hit a harmonic (e.g. exactly 240fps at 120Hz), you have two splices occurring at the same vertical positions during successive refreshes. If the match is not exact (e.g. 239.9fps@120Hz or 240.37fps@120Hz), the tearline positions vary slightly in subsequent refreshes (a vibrating or slowly rolling tearline effect).
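
Worked out (a sketch, assuming the standard 1125-line Vertical Total for 1080p):

Code:
# Horizontal scan rate arithmetic from the paragraph above.
visible_lines = 1080
vblank_lines = 45
hz = 120

vertical_total = visible_lines + vblank_lines  # 1125 scan lines per refresh
hscan_rate = vertical_total * hz               # 135,000 scan lines per second
line_time_us = 1_000_000 / hscan_rate          # ~7.4 microseconds per pixel row
print(f"{hscan_rate} lines/sec ({hscan_rate / 1000:.0f} KHz), "
      f"{line_time_us:.1f} us per scanline")
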

To prevent harmonics, you pretty much want to use a framerate cap that's a bit off, such as fps_max 142 during 144Hz. Even so, that can still create 2 stutters per second (the 2fps mismatch rolls the tearline pattern through the screen twice per second).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: high fps input lag [if GSYNC caps out?]

Post by RealNC » 05 Jan 2014, 07:35

Chief Blur Buster wrote:Also, I just ran high speed video input lag tests that definitively confirm CS:GO has less input lag with uncapped VSYNC OFF (framerate massively exceeding refresh rate) than with capped VSYNC OFF (e.g. fps_max 142). This is why I consider it a sucker bet if someone wants to bet money with me that there's no benefit to excess framerate.
Though with a 2500K CPU at 4GHz and a GTX 780, at maximum details (including the HBAO+ "quality" setting and 4x supersampling for transparencies in the NVCP), the game will not go above 120fps to begin with :mrgreen: So the question then becomes: is it worth dishing out 1000 bucks in GPU costs just to get those extra few milliseconds less input lag...

Personally, I play the game with vsync on, and "fps_max 59" followed by "fps_max 60". Yes, there's some weirdness going on: if I only use "fps_max 60", then there's still some input lag with vsync on. If I cap to 59 first and then to 60, the input lag goes away. (Setting to 59 would be enough for this, but that introduces a stutter every second. Setting to 60 afterwards prevents that.)

Could be a bug in the engine, I guess. Capping to 59 first seems to enable something in the game engine that results in less input lag, and that "something" stays even after changing back to 60. Sometimes the effect goes away between map changes, so I've bound these two commands to the PageUp and PageDown keys...
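
For reference, the binds look something like this in the console or autoexec.cfg (standard Source engine bind syntax):

Code:
bind "pgup" "fps_max 59"
bind "pgdn" "fps_max 60"
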
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

dreamss
Posts: 6
Joined: 05 Jan 2014, 08:02

Re: high fps input lag [if GSYNC caps out?]

Post by dreamss » 05 Jan 2014, 08:14

It's funny how people forget to factor the interrupt timer into the gameplay experience.

nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: high fps input lag [if GSYNC caps out?]

Post by nimbulan » 05 Jan 2014, 14:59

Chief Blur Buster wrote:
nimbulan wrote:I'm pretty sure I understand the point you're trying to make now. Turning vsync off and drastically increasing your framerate (2x or more) past your refresh rate reduces input lag, but it also introduces stuttering and some variability in input lag.
Actually, that's not what I meant.

To clarify, there's actually always stuttering at all framerates with VSYNC OFF, below or above refresh rate. Tearing always creates microstutter effects, which also varies depending on the harmonic effects. You do reduce microstutter a lot when you hit multiples (e.g. 120fps@120Hz, or 240fps@120Hz, etc) but you introduce the harmonics such as the stationary/rolling tearline effect. Sustained visibility of tearlines is far more noticeable than single-frame visibility of tearlines.

There's always variability in input lag at all frame rates. Variability of input lag goes down, the higher the framerate you go, so that isn't a tradeoff.
I think you're misunderstanding me. I'm trying to compare 120 fps vsync on to 240 fps (or higher) vsync off, not both vsync off. Doubling your framerate is undoubtedly better when both tests are done with vsync off but I'm still not convinced the slight reduction in input lag compensates for the stutter and variability that is introduced by turning vsync off to achieve a higher framerate. Input lag should be completely constant with vsync on, since frame renders are started immediately after each refresh, so you can guarantee that input is read at the same time relative to each refresh.
Chief Blur Buster wrote:
nimbulan wrote:It does make me wonder why there isn't a vsync method that allows 2 frames (or even more) per refresh to reduce input lag. Would that just be too difficult to implement?
That creates undesirable harmonic effects. 240fps @ 120Hz creates two stationary tearlines, since they reoccur in the same positions in subsequent refresh cycles.
How would it tear when vsync is on? What I'm postulating is a vsync method that allows multiple frames to be drawn per refresh but simply discards the intermediate frames, in order to maintain a framerate higher than the screen's refresh rate and reduce input lag, yet still keep the tear-free picture of vsync. Basically, vsync at 240Hz on a 120Hz monitor.

I apologize if it feels like you're talking to a brick wall. This whole concept of reducing input lag through extremely high framerates is quite new to me. I've heard about CS players doing it in the past but never understood the reason for it until this thread.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: high fps input lag [if GSYNC caps out?]

Post by Chief Blur Buster » 05 Jan 2014, 21:49

nimbulan wrote:I think you're misunderstanding me. I'm trying to compare 120 fps vsync on to 240 fps (or higher) vsync off, not both vsync off.
Aha, now I understand. My apologies!
nimbulan wrote:Doubling your framerate is undoubtedly better when both tests are done with vsync off but I'm still not convinced the slight reduction in input lag compensates for the stutter and variability that is introduced by turning vsync off to achieve a higher framerate. Input lag should be completely constant with vsync on, since frame renders are started immediately after each refresh, so you can guarantee that input is read at the same time relative to each refresh.
Right, assuming input reads are done close to the frames. Sometimes the input reads are in a separate thread, but with a 1000Hz mouse, we've got input reads within 1ms of each frame's rendertime (limiting the variability of the time difference between input read timing and game render timing).
nimbulan wrote:It does make me wonder why there isn't a vsync method that allows 2 frames (or even more) per refresh to reduce input lag. Would that just be too difficult to implement?
Now that I understand better: 240fps VSYNC ON at 120Hz is essentially triple buffering with VSYNC ON and a frame cap of 240. I imagine you can do this already in some games by setting things up this way. This might reduce triple-buffering microstutters significantly, while also significantly reducing input lag.

Has anybody tried triple buffering, combined with an fps cap of 240 (for 120Hz) or 288 (for 144Hz)? It might be an interesting experiment to see if the triple-buffering microstutters disappear or mostly disappear.
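
Conceptually, the behavior being described would look something like this (a pseudocode sketch of the idea only -- not any real graphics API; render_frame, present, and the vblank callback are hypothetical):

Code:
# Pseudocode sketch: triple-buffer-style "discard" presentation.
# The renderer keeps producing frames at the cap (e.g. fps_max 240);
# at each vblank the display takes only the newest completed frame,
# so stale intermediate frames are dropped instead of queued
# (queuing is what creates the classic VSYNC ON backpressure lag).
import time

latest_frame = None

def render_loop(render_frame, fps_cap=240):
    global latest_frame
    while True:
        latest_frame = render_frame()   # newest completed frame wins
        time.sleep(1.0 / fps_cap)       # crude frame cap, for illustration

def on_vblank(present):                 # called once per refresh (e.g. 120Hz)
    if latest_frame is not None:
        present(latest_frame)           # older pending frames are discarded
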
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
