Is gsync that good ?

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
Modin
Posts: 11
Joined: 02 Jul 2018, 09:52

Re: Is gsync that good ?

Post by Modin » 02 Jul 2018, 11:42

Modinstaller here. I wanted to post a reply sooner but I got locked out of my account: a captcha lock due to too many failed password attempts (shit happens ...), and there's no way to get through the captcha since it's outdated. Even a week later, I'm still locked out of it, so I just made a new one. You might want to look into it: https://i.imgur.com/wKyDEBn.png

Anyway I had a few more questions to ask if you don't mind :P

On your vrr demonstration web page, I can clearly see a huge difference between vrr and vsync ... at 60 fps. Is this an oversight ? From what you've all been telling me thus far, gsync doesn't do anything more than vsync at or above the display's maximum refresh rate. On a 144hz monitor, at 144 fps with vsync, vrr wouldn't make any difference, would it ?

Now I've got one last question which doesn't have anything to do with gsync, but I figured I might as well ask. Is it correct to state that vsync does not add any noticeable input lag if used on a rig that wouldn't output many more frames per second than the monitor's maximum refresh rate ? For example on a 60 hz monitor, with a gpu capable of outputting 60-80 fps, vsync should have less of an impact on input lag than with a gpu capable of outputting 300+ fps, right ? But vsync is generally a problem because of frametime variance - if vsync didn't add any input lag on your rig because your gpu isn't crazy fast, then your fps would probably tank below 60 in some cases and there you go with input lag again on top of halved fps. If not, then your gpu probably has low enough frametimes so that vsync noticeably increases input lag. Is all of this right ? Just trying to see if I'm starting to understand this shit right or if I'm just hopeless.

And thanks for the answers as always :)

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: Is gsync that good ?

Post by RealNC » 02 Jul 2018, 12:34

Modin wrote:On your vrr demonstration web page, I can clearly see a huge difference between vrr and vsync ... at 60 fps. Is this an oversight ? From what you've all been telling me thus far, gsync doesn't do anything more than vsync at or above the display's maximum refresh rate. On a 144hz monitor, at 144 fps with vsync, vrr wouldn't make any difference, would it ?
At 60Hz and 60FPS, vsync and VRR should look the same in the simulation. If they don't, then that's a bug in the simulation.
Is it correct to state that vsync does not add any noticeable input lag if used on a rig that wouldn't output many more frames per second than the monitor's maximum refresh rate ? For example on a 60 hz monitor, with a gpu capable of outputting 60-80 fps, vsync should have less of an impact on input lag than with a gpu capable of outputting 300+ fps, right ? But vsync is generally a problem because of frametime variance - if vsync didn't add any input lag on your rig because your gpu isn't crazy fast, then your fps would probably tank below 60 in some cases and there you go with input lag again on top of halved fps. If not, then your gpu probably has low enough frametimes so that vsync noticeably increases input lag. Is all of this right ? Just trying to see if I'm starting to understand this shit right or if I'm just hopeless.
Vsync will buffer frames regardless of whether the game would render at 65FPS or 300FPS. The same input lag will pile up. Even at 61FPS, you pile up 1 extra frame each second, so after about 3 or 4 seconds, full vsync lag is reached. When this happens, the game is throttled to exactly 60FPS, but it now "lives" 3 or 4 frames in the past. This is called "vsync back-pressure."
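
If it helps, here's a toy model of that pile-up in Python. It's not what the driver literally does, and the 3-frame queue depth is just an assumption, but the queueing math is the point: the game produces 61 frames per second, the display drains exactly 60, and the surplus sits in a buffer until the buffer is full.

Code:

# Toy model of vsync back-pressure: the game renders 61 FPS, the display
# consumes exactly 60 FPS. Whatever doesn't fit waits in a queue.
RENDER_FPS = 61
REFRESH_HZ = 60
MAX_QUEUE = 3                      # assumed buffer depth before the game gets blocked

queued = 0
for second in range(1, 6):
    queued = min(MAX_QUEUE, queued + RENDER_FPS - REFRESH_HZ)
    lag_ms = queued * 1000 / REFRESH_HZ
    print(f"after {second}s: {queued} queued frame(s), ~{lag_ms:.0f} ms of extra lag")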

This is why we use the "low lag vsync trick" to prevent that from happening:

https://www.blurbusters.com/howto-low-lag-vsync-on
https://forums.guru3d.com/threads/the-t ... st-5380262

Doing this will lower vsync input lag to acceptable levels. It will never be as low as g-sync or vsync off, but at 120FPS@120Hz or better, the difference becomes really small. 120FPS@120Hz with the low lag vsync trick is very responsive. You can still feel a difference compared to vsync off or g-sync, but unless you're playing Quake or CS:GO, I'd say it's perfectly fine.
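
And to show why the cap in that trick works, here's the same toy model with the FPS limited a hair below the refresh rate (the exact cap value depends on your actual refresh rate; 59.93 is just an example): the queue never gets a chance to fill up.

Code:

# Same toy model, but the game is capped slightly below the refresh rate,
# so no back-pressure can build up.
CAP_FPS = 59.93
REFRESH_HZ = 60.0

queued = 0.0
for second in range(1, 6):
    queued = max(0.0, queued + CAP_FPS - REFRESH_HZ)   # can never grow past 0
    print(f"after {second}s: {queued:.2f} queued frames, ~{queued * 1000 / REFRESH_HZ:.1f} ms of extra lag")
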
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Modin
Posts: 11
Joined: 02 Jul 2018, 09:52

Re: Is gsync that good ?

Post by Modin » 02 Jul 2018, 15:17

3 or 4 frames in the past ? Alright, I think I have the wrong idea about how vsync works.

The way I understand it, the gpu has a back buffer and a front buffer. It renders a frame into the back buffer, then when it's done swaps it with the front buffer before rendering the next frame. The monitor scans the front buffer, so if a swap happens during a scan, you've got tearing. What vsync does is prevent the swap from happening until the scan is done.
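
In pseudocode, I picture that loop roughly like this (just my mental model written out as a toy Python sketch, so it might well be wrong):

Code:

import time

REFRESH_INTERVAL = 1 / 60          # 60 Hz display

def wait_for_vblank():
    time.sleep(REFRESH_INTERVAL)   # stand-in for the real hardware vsync wait

front, back = "buffer A", "buffer B"
for frame in range(3):
    print(f"frame {frame}: sample input, render into {back}")
    wait_for_vblank()              # vsync: don't swap until the current scan-out is done
    front, back = back, front      # the finished frame becomes the front buffer
    print(f"frame {frame}: display now scans out {front}")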

That's what I've been reading, anyway. But it's obviously something different. Can you give a quick explanation on how it works ? I'm utterly confused about this whole thing ...

Edit : so I've tried to figure it out myself, and it seems that my understanding is actually correct. But I can't find info anywhere on why the hell vsync would actually add any input lag ? Why the need to buffer 3-4 frames when the gpu is obviously fast enough to render frames on the fly ? How can I reduce this ? Are we talking about the "maximum pre-rendered frames" setting or something else ?

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: Is gsync that good ?

Post by RealNC » 03 Jul 2018, 21:38

The input lag reduction is there (it's been measured), and as far as I understand it, this is why:

In the past (probably pre-DX9, not sure), presenting frames was a synchronous operation. In a double buffer setup, you render a frame and then present it. The present function would actually block your game's rendering thread from doing any other rendering-related work until the current frame had finished scanning out on the monitor. Again, I don't know when exactly this was abandoned on PC. I do know for a fact that old consoles (like the PS1) operated this way: you always have exactly two buffers, front and back, and there is never any additional buffering unless you explicitly want it to happen.

These days, this is no longer the case. Frame presentation is asynchronous, and the game is not prevented from doing additional work even while the current frame is still being scanned out by the monitor. The game can start reading input from the player in preparation for rendering another frame right away. As a result, there are more rendering-related operations "in flight" than just the two you get with an old-school synchronous present. This is heavily influenced by the pre-render queue, where the game is allowed to prepare multiple frames (2 by default) even if there are already two rendered frames yet to be displayed (the frame that was just presented, and the frame that is still being scanned out by the monitor). So you can end up with 4 frames (rendered frames plus prepared, yet-to-be-rendered ones) stuck in buffers, still waiting to be displayed. Microsoft's API documentation explicitly makes it clear that developers should not try to interfere with this, as asynchronous presentation results in the best possible performance.
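
Here's a rough sketch of that "frames in flight" idea in Python. It's a toy queue model, not real D3D code, and the 2+2 depths are just the defaults mentioned above.

Code:

from collections import deque

PRE_RENDER_QUEUE = 2       # frames the CPU may prepare ahead (driver default)
SWAP_CHAIN_DEPTH = 2       # just-presented frame + frame being scanned out
REFRESH_MS = 1000 / 60

in_flight = deque()
next_frame = 0
for vblank in range(5):
    # The game keeps preparing/submitting frames until every slot ahead of
    # the display is full. Player input for a frame is sampled at this point.
    while len(in_flight) < PRE_RENDER_QUEUE + SWAP_CHAIN_DEPTH:
        in_flight.append(next_frame)
        next_frame += 1
    shown = in_flight.popleft()        # only one frame can leave the pipe per refresh
    age = (next_frame - 1) - shown     # how far the display lags behind the newest input
    print(f"vblank {vblank}: showing frame {shown}, "
          f"newest prepared frame is {next_frame - 1} (~{age * REFRESH_MS:.0f} ms behind)")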

This is a case of "buffer bloat": too much buffering for the sake of multi-threading. In the past we had 1 CPU core, and thus it didn't make sense to let the game start working on new frames while the current one was still being scanned out. Now, while the previous frame is being scanned out, the current finished frame is held in a buffer, another one is being submitted for rendering, and the game is free to start preparing yet another one. This is possible because we have multiple CPU cores.

Of course this results in player input being too old by the time it actually makes it to the screen, because we have vsync enabled. It doesn't matter how many CPU cores we have: in the end, frames can only make it to the screen 16.7ms apart, so at some point the game is finally prevented from doing any additional work in advance. But by then it's too late. You have all these queued frames and operations that piled up, waiting for that 16.7ms bottleneck. The game is simply being blocked at completely the wrong time: at a point where it has already sampled input from the player. Some games are smart enough to not let that happen. Many games, though, do not even attempt to prevent it.
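
The arithmetic behind that bottleneck, assuming the worst-case 4-deep pipe from above:

Code:

# Back-of-the-envelope: every frame stuck ahead of the display at 60 Hz
# adds one refresh interval (~16.7 ms) to the age of the input you see.
refresh_ms = 1000 / 60
for frames_queued in (1, 2, 3, 4):
    print(f"{frames_queued} frame(s) queued ahead of the display ~= "
          f"{frames_queued * refresh_ms:.0f} ms of added input lag")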

Using a CPU-based frame limiter like RTSS makes games behave more like old-school, synchronous frame presentation. The game is blocked and prevented from trying to render or prepare any more frames (and thus prevented from sampling player input) until the current frame has been fully displayed, and as a result you get input lag that resembles classic double buffering.
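
In loop form, a CPU-side limiter boils down to something like this. This is a very simplified sketch, nowhere near what RTSS actually does internally; the point is only that the game gets blocked before it samples fresh input for the next frame.

Code:

import time

# Very simplified CPU-side frame limiter: block the game *before* it samples
# input and starts the next frame, so the input it uses is as fresh as possible.
TARGET_FPS = 59.93                 # just below a 60 Hz refresh, as an example
FRAME_TIME = 1.0 / TARGET_FPS

next_deadline = time.perf_counter()
for frame in range(5):
    while time.perf_counter() < next_deadline:
        pass                       # busy-wait for precise timing (real limiters are smarter)
    next_deadline += FRAME_TIME
    print(f"frame {frame}: sample input now, then render and present")
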
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Modin
Posts: 11
Joined: 02 Jul 2018, 09:52

Re: Is gsync that good ?

Post by Modin » 04 Jul 2018, 17:20

So out of the 4 frames of delay you're talking about, 1 is the first scan, 1 is the second scan and 2 are the pre-rendered frames ?

Are we talking about the setting in the nvidia control panel or something specific to games ? If I set it to 1, would that be 3 frames of delay instead of 2 ? 16 ms doesn't really seem that big of a deal.

I read that 0 pre-rendered frames was an option a while ago, but a minimum of 1 was set at some point. The explanation was that without any pre-rendered frames, the cpu could be late at queuing another frame to be rendered, and that could create uneven motion as well as a performance hit. Is that right ?

But what doesn't make sense to me is that frames would only pre-render during vsync. What gives ? Why not pre-render frames without vsync as well ?

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: Is gsync that good ?

Post by RealNC » 04 Jul 2018, 19:21

Modin wrote:So out of the 4 frames of delay you're talking about, 1 is the first scan, 1 is the second scan and 2 are the pre-rendered frames ?
And probably one more, due to the game having sampled player input and then getting blocked because the pre-render queue is exhausted. Even though there's no frame involved, it's just as bad as a buffered frame, because the game is forced to sit on old player input and is only allowed to prepare a new frame based on that input after 16.7ms have passed.
Are we talking about the setting in the nvidia control panel or something specific to games
The setting in the nvidia panel. Games themselves can choose a different pre-render length though, so even if you don't set it to 1, some games will only ever prepare one frame at most.
If I set it to 1, would that be 3 frames of delay instead of 2 ? 16 ms doesn't really seem that big of a deal.
Depends on the game. It can be 3 instead of 4 if the game doesn't take care to avoid sampling player input too early.
I read that 0 pre-rendered frames was an option a while ago, but a minimum of 1 was set at some point. The explanation was that without any pre-rendered frames, the cpu could be late at queuing another frame to be rendered, and that could create uneven motion as well as a performance hit. Is that right ?
I don't know. But using RTSS pretty much makes it behave as if you've set MPRF to 0, even though that's not a valid setting.
But what doesn't make sense to me is that frames would only pre-render during vsync. What gives ? Why not pre-render frames without vsync as well ?
It works without vsync as well. But without vsync, the game never hits the vsync bottleneck; it's free to pre-render and output as fast as it can, so there's no back-pressure. That holds only as long as the GPU can keep up, though. If the GPU is maxed out completely, then you can get pre-render back-pressure: the CPU runs way ahead of the GPU, and you get the same input lag pile-up as with vsync. The GPU in this case isn't slowing down because it's waiting for the vsync signal, it has simply run out of steam. You should be able to observe this effect in GPU-limited games, like Witcher 3, for example. Run it at 4K DSR, which will completely saturate the GPU. If you set MPRF to a high value, you will see the pre-render pile-up even with vsync off. As soon as you use RTSS to cap the FPS to a value the game is actually able to reach, that pile-up completely disappears and the input lag goes away. (This is why RTSS capping is also very useful with g-sync in GPU-heavy games.)
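
Same queueing idea, but with a maxed-out GPU as the bottleneck instead of the vsync wait (again just a toy model with assumed numbers, not measurements):

Code:

# Toy model: the CPU could prepare 200 FPS worth of frames, but a saturated
# GPU only renders 45 FPS (GPU-limited, vsync off). Uncapped vs. an RTSS-style cap.
GPU_FPS = 45
MAX_QUEUE = 3                          # assumed pre-render queue depth

def steady_state_lag(cpu_fps):
    queued = 0
    for _ in range(5):                 # a few seconds to reach steady state
        queued = min(MAX_QUEUE, max(0, queued + cpu_fps - GPU_FPS))
    frame_ms = 1000 / GPU_FPS
    return frame_ms + queued * frame_ms    # render time + time spent stuck in the queue

print(f"uncapped (CPU at 200 FPS): ~{steady_state_lag(200):.0f} ms before a frame hits the screen")
print(f"capped at 44 FPS (just below what the GPU manages): ~{steady_state_lag(44):.0f} ms")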

When all is said and done though, even if the above assumptions are not entirely correct, the fact remains that input lag tests have shown that capping your FPS in this manner can get you from something like 100ms or more of input lag down to ~50ms at 60Hz, depending on the game. That is a very big difference that can be felt. It can take you from "mouse feels floaty like a boat" to "this is almost snappy."
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
