G-SYNC and framerate caps

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing and reduces input lag. List of G-SYNC Monitors.
Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-SYNC and framerate caps

Post by Sparky » 10 Nov 2015, 19:17

ciopenhauer wrote: Sorry for bumping an old thread, but this seems to be the place to ask: Has anything changed? Do we still need to cap FPS to ~135 in order to prevent input lag? Does anyone know?
Yes, and preferably with an in-game framerate cap. It's a question of where the bottleneck is. If the bottleneck is at the very end of the render chain (the monitor), then you have several frames queued up, each waiting for the one ahead of it to clear the next step in the render pipeline.
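
A minimal sketch of what an in-game style cap looks like, assuming a 135 fps target on a 144 Hz G-SYNC panel; the pollInput/simulate/render calls are placeholders, not any particular engine's API. The point is that the game throttles itself before queuing another frame, so frames never stack up waiting on the monitor.

Code: Select all

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double capFps = 135.0;   // assumed cap, just below the panel's max refresh
    const auto framePeriod = std::chrono::duration<double>(1.0 / capFps);
    auto nextFrame = clock::now();

    while (true) {
        // pollInput(); simulate(); render(); present();   // placeholder game work

        nextFrame += std::chrono::duration_cast<clock::duration>(framePeriod);
        std::this_thread::sleep_until(nextFrame);           // wait here, not in the driver's queue
    }
}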

ciopenhauer
Posts: 18
Joined: 06 Nov 2014, 06:04

Re: G-SYNC and framerate caps

Post by ciopenhauer » 12 Nov 2015, 13:04

Thank you. I'm using ENB to cap frames to 135 in Fallout 4. I assume it's better than capping with RivaTuner, or at least the same.

lexlazootin
Posts: 1251
Joined: 16 Dec 2014, 02:57

Re: G-SYNC and framerate caps

Post by lexlazootin » 20 Nov 2015, 08:49

Maybe I don't understand the question, but G-SYNC only works below the maximum refresh rate; if the framerate goes any higher it either v-syncs or tears. So just cap at 135.

Ahigh
Posts: 95
Joined: 17 Dec 2013, 19:22

Re: G-SYNC and framerate caps

Post by Ahigh » 23 Dec 2015, 08:43

The short answer people need to know is, "it depends."

When I write video game software for Windows, I process input events in a separate thread, and I run the game update, including physics, in another separate thread. In this design (which is how games whose behavior depends on input arrival times should be built), the effect of lag on the actual simulation does not depend on framerate at all. Instead, every time a new visual frame is needed, two independent simulation frames from the game/physics update are sampled. My games update at 1000 Hz, so to display a frame at, say, t = 666.6 ms, I pull the physics states at t = 666 ms and t = 667 ms and blend the positions, with 60% of the weight coming from the state at t = 667 ms and 40% from the state at t = 666 ms. Nothing the framerate does changes the fact that I integrate the game physics EXACTLY THE SAME. The cost is that, on average, I incur 0.5 ms of added latency from interpolating between two physics frames in order for this approach to work.
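
As a rough illustration of that scheme (not Ahigh's actual code; the State struct, integrate() and the constants are made up for the example), here is a fixed 1000 Hz physics step with the renderer blending the two surrounding states:

Code: Select all

#include <chrono>

struct State { double x = 0.0, v = 1.0; };   // toy physics state

State integrate(State s, double dt) {        // one 1 ms physics step
    s.x += s.v * dt;
    return s;
}

int main() {
    const double dt = 0.001;                 // 1000 Hz simulation
    double simTime = 0.0;
    State prev{}, curr{};

    const double displayTime = 0.6666;       // a frame wanted at t = 666.6 ms
    while (simTime < displayTime) {          // physics always steps at the fixed rate
        prev = curr;
        curr = integrate(curr, dt);
        simTime += dt;
    }
    // Blend the two surrounding states: here 60% of t = 667 ms, 40% of t = 666 ms.
    double alpha = (displayTime - (simTime - dt)) / dt;
    double renderX = prev.x * (1.0 - alpha) + curr.x * alpha;
    (void)renderX;                           // this is what the renderer would draw
}

The physics result is identical no matter when or how often frames are requested; the interpolation is what costs the 0.5 ms of average latency mentioned above.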

I think many here are (likely correctly) talking about round-off error that comes from integrating physics in the same thread that draws the graphics.

Consider a game that waits on v-sync, reads the queued inputs in the same thread that does the drawing, and then has the visual frame ready much sooner than the display will accept it. At maximum framerate, when the draw time is negligible, that game can incur up to a full frame of latency: the frame is ready too quickly and has to wait, whereas at lower framerates it can be swapped in as soon as it is finished because the display is ready to start immediately. In every case, this approach also suffers a round-off error in the time domain, a temporal inaccuracy caused by the game's design, that is a problem specific to G-SYNC-enabled games: visual frames that take longer to draw carry more input latency than frames that draw quickly.

So I think the idea that max fps ADDS latency may be slightly off. What actually happens is that the less consistent the framerate, the more temporal inaccuracies your game has, and when you cap the fps those artifacts should go AWAY. Again, all of this applies to games that are not designed to work properly under G-SYNC. For any single-threaded game, a fixed and consistent framerate (and latency) is the obvious, simple way to keep everything smooth, and even then, with inputs polled at 100 Hz, you typically have a 20-40 Hz beat frequency between input arrival times and your video output frequency, with or without G-SYNC.

Anyone who doubts what I am saying can add a random draw-time delay of between 1 and 16 ms to a G-SYNC application and tell me whether they notice anything not smooth. G-SYNC working smoothly depends on draw times that are not erratic: you can't know how long a frame will take to draw, yet you have to decide where everything is positioned before you start drawing.
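
For anyone who wants to try that experiment, a hedged sketch (renderFrame/presentFrame are placeholders for whatever loop your test application already has):

Code: Select all

#include <chrono>
#include <random>
#include <thread>

int main() {
    std::mt19937 rng{std::random_device{}()};
    std::uniform_int_distribution<int> extraMs(1, 16);    // random extra "draw time"

    while (true) {
        // renderFrame();                                  // normal drawing here
        std::this_thread::sleep_for(std::chrono::milliseconds(extraMs(rng)));
        // presentFrame();                                 // with G-SYNC, frames now arrive at erratic times
    }
}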

A game application that is aware of these problems will add a PD filter to control the actual draw time, preventing draw times from varying too greatly from one frame to the next, and will intentionally degrade the framerate on some frames to improve visual smoothness. That is: predict the draw time, insert a delay if the frame draws faster than predicted, and predict the next draw time to be a little shorter.
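
A minimal sketch of that pacing idea, with the PD filter reduced to a simple adaptive predictor; the structure and constants are assumptions for illustration, not the controller used in any shipped game:

Code: Select all

#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    double predictedDrawMs = 8.0;                          // initial draw-time guess
    const double gain = 0.1;                               // how quickly the prediction adapts

    while (true) {
        auto start = clock::now();
        // renderFrame();                                   // actual drawing goes here
        double actualMs =
            std::chrono::duration<double, std::milli>(clock::now() - start).count();

        // If the frame finished early, pad it out to the prediction so delivery stays even,
        // intentionally giving up a little framerate for smoothness.
        if (actualMs < predictedDrawMs) {
            std::this_thread::sleep_for(
                std::chrono::duration<double, std::milli>(predictedDrawMs - actualMs));
        }
        // Nudge the prediction toward the measured draw time for the next frame.
        predictedDrawMs += gain * (actualMs - predictedDrawMs);
        // presentFrame();
    }
}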

Vegas 2047 and CasinoKat, two games whose code I designed at NanoTech Gaming Labs, use this approach. We use a 120 Hz display and 1000 Hz physics, and everything is 100% perfectly smooth. We have patents on how we achieve this.

http://nanotechgaming.com/patents/nanotech-hvs.pdf

On Vegas 2047, we also use interrupt-driven inputs to get the arrival time accurate to sub-nanosecond precision. The entire game is about timing, so it's pretty important to get this stuff right.

So, in summary: while it is true that waiting for v-sync adds visual latency compared to starting to display the data immediately, the temporal inaccuracies that result from not knowing the draw time in advance are, in general, more noticeable to a human being than the absolute latency value. That is, if you notice something, it's the errors, not the absolute latency. And errors (things not being placed at the right spot at the right time) are what G-SYNC aims to fix, except that you still need at least a good GUESS for how long a frame will take to render when you decide where things will be positioned, BEFORE you start drawing.

I hope this helps with some of the counter-intuitive parts of this discussion, for anyone who is hung up thinking they are perceiving "latency" when they are more likely perceiving "temporal inaccuracies", which is what G-SYNC, generally speaking, solves, except for this nit-picking point about games written by folks who don't have a firm grasp of what a temporal inaccuracy even is.

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-SYNC and framerate caps

Post by Sparky » 24 Dec 2015, 10:13

You get 1 ms of input timing variance from USB polling alone, so I don't really understand what you mean by sub-nanosecond accuracy. It is nice to see some game developers paying attention to consistent draw/render times, as this lets you remove latency compared to normal v-sync by waiting until the last moment to start drawing a frame.

Also, you can tell the difference between high latency and inconsistent latency: the first causes the cursor to feel slow/laggy, and the second causes it to feel inaccurate. For the whole system, mouse click to monitor, I'd like to see absolute latency stay under 20 ms and variance under 10 ms (with v-sync/G-SYNC on and a 120 Hz display). For the actual input timing into game physics, you can do a lot better, and that can be important where the timing between two inputs matters more than reacting to something on screen.


Edit: Vegas 2047 wasn't a PC game, so ignore that bit. I do have some things to say about low-latency debouncing (SR latch with an on-mom switch?), but this is probably the wrong thread for that.
