Blur Busters Forums


Render scale and frame buffers question

Everything about input lag. Tips, testing methods, mouse lag, display lag, game engine lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more!

Render scale and frame buffers question

Postby flagbender » 29 Dec 2016, 09:46

The game I'm referring to is Overwatch, but I think some of the concepts can be applied to other games and engines.

Overwatch is a game which suffers from input lag due to frame buffering(?). I think it might be something similar to the OneFrameThreadLag option being enabled in UE3 games. From my understanding, it causes some input lag when the framerate is limited by the GPU.

1. Does this mean that when capping the framerate in-game so that the GPU is not fully utilized, there is less input lag than not capping?
2. When/if the framerate dips below the cap, would there be a difference in input lag between the GPU causing the fps drop vs the CPU?
3. Finally, if this frame buffer could be removed or reduced (which is an upcoming option in the game), would this option normalize input lag in all situations (i.e. framerate capped or uncapped, framerate dipping below the cap or not)? Would there be a universal input lag reduction, or would there still be more lag when the framerate drops below the cap?

Also, Overwatch has an in-game setting for render scale (internal resolution?), presumably similar to Battlefield's and UE4's options. If you use a setting other than the one corresponding to 100% of your resolution, could it inherently cause input lag due to having to scale the internal resolution to the output resolution, even if the GPU is not being maxed out?
Last edited by flagbender on 29 Dec 2016, 10:03, edited 1 time in total.
flagbender
 
Posts: 14
Joined: 23 Feb 2016, 10:51

Re: Render scale and frame buffers question

Postby Trip » 29 Dec 2016, 17:39

I think it is similar to the Unreal Engine option. It has to do with buffering frames on the CPU side of the engine (https://www.reddit.com/r/Overwatch/comm ... ing_input/). I tried the PTR setting, and even though I am not GPU bottlenecked, I still feel that the game is more responsive with reduced buffering enabled.
I don't think render scale adds any significant input delay, because it basically lets the system render a lower-res image, then upscales it on the GPU and presents it to the monitor at whatever its resolution is. So the monitor just sees a 1080p image if you have that resolution set, not a 1080 * 0.75 = 810p one.
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: Render scale and frame buffers question

Postby flagbender » 30 Dec 2016, 03:12

So regarding render scale, the question was whether the option adds any delay at all, and you think it adds at least a little. To be honest, that's enough to make me leave it alone.

About the buffering then, do you think it would make the input lag the same regardless of whether the framerate was dipping or not?
flagbender
 
Posts: 14
Joined: 23 Feb 2016, 10:51

Re: Render scale and frame buffers question

Postby RealNC » 30 Dec 2016, 03:40

It adds the same delay as any other graphics option that affects performance. Lower FPS = longer frame times. 50FPS has more input lag than 100FPS.

So you need to leave ALL options alone and set everything to minimum, because EVERY option that lowers FPS adds delay.
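The arithmetic behind this is just the reciprocal of the frame rate — a trivial sketch:

```python
# Frame time is the reciprocal of the frame rate; it sets the floor on
# how stale a frame's input can be by the time that frame is displayed.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(frame_time_ms(50))   # 20.0 ms per frame
print(frame_time_ms(100))  # 10.0 ms per frame
```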
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
RealNC
 
Posts: 1221
Joined: 24 Dec 2013, 18:32

Re: Render scale and frame buffers question

Postby flagbender » 30 Dec 2016, 04:02

RealNC wrote:It adds the same delay as any other graphics option that affects performance. Lower FPS = longer frame times. 50FPS has more input lag than 100FPS.

So you need to leave ALL options alone and set everything to minimum, because EVERY option that lowers FPS adds delay.


Thanks, that's all I wanted to know regarding this option. I have my framerate capped to 118fps and my GPU is already capable of a stable framerate at 100 or 150% render scale. This being the case, I'll be using 150% render scale.
flagbender
 
Posts: 14
Joined: 23 Feb 2016, 10:51

Re: Render scale and frame buffers question

Postby RealNC » 30 Dec 2016, 04:19

Note that even if you cap the frame rate, you can still get increased latency. It depends on the game engine and whether it tries to reduce latency by sampling input as late as possible and implements a low-latency frame capper.

I don't know if OW does this or not. In general, if a game samples input as late as possible and you would get 1000FPS when uncapped, this would result in 1ms input latency. If you cap to 100FPS, which has a 10ms frame time, the game (if it could perfectly predict how much time is needed to render a frame) could still give you 1ms input lag by using a 1000FPS frame time internally (waiting for 9ms and sampling input only in the last 1ms). If you then increase the graphics settings to something that would result in 150FPS uncapped, this would give you about 6.7ms input lag.

Obviously no game engine can scale that well, because you can't predict the frame time with 100% accuracy. But the principle still applies: if a game does have frame time prediction for input lag reduction, raising graphics settings will increase input lag even when you frame cap.
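To illustrate the principle (this is my own sketch under stated assumptions — the function and parameter names are hypothetical, not Overwatch's actual implementation), a late-sampling capper with frame time prediction looks roughly like this:

```python
import time

def capped_loop(cap_fps, predicted_render_s, sample_input, render, num_frames):
    """Sketch of a latency-reducing internal frame capper.

    Instead of rendering immediately and sleeping off the rest of the frame
    budget, it waits first, samples input as late as possible, then renders
    so the frame completes right at its deadline."""
    frame_budget = 1.0 / cap_fps
    deadline = time.perf_counter() + frame_budget
    frames = []
    for _ in range(num_frames):
        # Everything before (deadline - predicted render time) is pure
        # waiting, so input sampled after it is as fresh as the prediction
        # allows. A real engine would sleep most of this, then spin.
        while time.perf_counter() < deadline - predicted_render_s:
            pass
        state = sample_input()        # latest possible input snapshot
        frames.append(render(state))  # ideally finishes at the deadline
        deadline += frame_budget
    return frames
```

With a 100FPS cap and a 1ms predicted render time, this loop waits ~9ms and samples input only in the last ~1ms — the scenario described above. Mispredict the render time and you either miss the deadline or give back some of the latency win.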

Which is why in the end I use my subjective experience to set up the graphics settings. If I max them out but the game feels OK, I just use that. If you can't feel any difference between render scale 100 and 150, then just use 150.

Unfortunately, I don't have any sort of list of games that do this. I only know of the method which is used to reduce input lag, but not of which games do this and which don't :-/

Also, if you use an external capper (like RTSS or the nvidia one through Inspector), all of the above doesn't apply. Only an internal frame capper can do input lag reduction. With an RTSS cap of 100FPS, you'll be getting an average delay of 10ms. The only external frame capper I'm aware of that has frame time prediction is GeDoSaTo (which AFAIK only works with DX9 games; I don't believe it's been ported to DX10/11/12.)
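As a rough model of the difference (my own simplification, not RTSS's or any engine's actual code):

```python
def avg_input_lag_ms(cap_fps: float, render_ms: float, late_sampling: bool) -> float:
    """Rough model: a plain external capper samples input at the start of the
    frame interval, so lag is roughly the whole frame time; a late-sampling
    internal capper defers until just before rendering, so lag is roughly
    the render time alone."""
    frame_ms = 1000.0 / cap_fps
    return render_ms if late_sampling else frame_ms

print(avg_input_lag_ms(100, 1.0, late_sampling=False))  # 10.0 ms (external cap)
print(avg_input_lag_ms(100, 1.0, late_sampling=True))   # 1.0 ms (predictive internal cap)
```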
RealNC
 
Posts: 1221
Joined: 24 Dec 2013, 18:32

Re: Render scale and frame buffers question

Postby flagbender » 30 Dec 2016, 05:35

Huh wow. Firstly, thanks very much for the lesson.

Secondly, based on a YouTube video called "Overwatch Input Lag (Button 2 Pixel) Analysis - PC vs. Console", I think OW probably implements a low-latency framerate capper, since the difference in lag between uncapped (300fps) and capped (154fps) was only 1-2ms, not the >3ms you would expect with a regular capper, if I understood your post.

Also, this explains why I sometimes feel very different mouse movement despite my framerate being capped at a solid 118fps. There are times, especially before the game starts while I'm waiting in a lobby, when the mouse feels incredibly light. There are also times in the middle of a huge fight when movement feels sluggish, despite the in-game fps counter showing no deviation at all. I had no reason to suspect input lag could fluctuate under a framerate cap, so it couldn't have been placebo. I had assumed it was just small frametime deviations causing stutter; now I know it's probably due to OW's framerate capper implementation.

This actually flies in the face of the advice to cap framerate in order to keep input lag consistent for precise aiming. I don't really know how to feel about this. It means that A. even two machines running with the same fps cap will have different average input lag depending on their power, and B. you can almost never get consistent input lag.

I wonder if you could elaborate on how this correlates with my original questions about the frame buffering?
flagbender
 
Posts: 14
Joined: 23 Feb 2016, 10:51

Re: Render scale and frame buffers question

Postby RealNC » 30 Dec 2016, 06:35

flagbender wrote:This actually flies in the face of the advice to cap framerate in order to keep input lag consistent for precise aiming. I don't really know how to feel about this. It means that A. even two machines running with the same fps cap will have different average input lag depending on their power, and B. you can almost never get consistent input lag.

Consistent input lag is very hard to achieve. The only game I know of that allows you to set your own frame time, and thus achieve a constant input delay, is ezQuake (a community Quake engine). ezQuake allows you to configure the frame cap in such a way that vsync on has almost the same input lag as vsync off.

That's basically the only game I know of that does this. With DX9 games, you can also use GeDoSaTo's frame capper and achieve a constant input delay. In theory... I don't use GeDoSaTo myself :-P

I wonder if you could elaborate on how this correlates with my original questions about the frame buffering?

I don't think frame buffering has anything to do with this. If the GPU is maxed out, that just means it takes longer to render a frame. So if OW does frame time prediction in order to lower input lag, higher GPU usage means higher frame times and thus increased input lag. In games that do frame time prediction, the less load there is on the GPU, the lower input lag gets. In other words, the lower you set the graphics settings (less GPU load, lower frame rendering time), the less input lag you get, regardless of whether you're below the frame cap or not.
RealNC
 
Posts: 1221
Joined: 24 Dec 2013, 18:32

Re: Render scale and frame buffers question

Postby flagbender » 01 Jan 2017, 00:16

Ok, thanks. Your answer was really helpful, but to be honest I still have a lot of questions, so I'll probably go and read up more on it later.

Happy New Year :)
flagbender
 
Posts: 14
Joined: 23 Feb 2016, 10:51

