Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
RealNC
Site Admin
Posts: 3756
Joined: 24 Dec 2013, 18:32

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 26 Aug 2017, 02:54

We don't have data on the NV limiter when it comes to MPRF (Maximum Pre-Rendered Frames). Most of us avoid that limiter since it adds too much input lag.

With RTSS or an in-game limiter, if you hit the cap, MPRF has no effect. If you don't hit an FPS cap, then MPRF will add input lag. Always. Vsync doesn't matter. Neither does g-sync. The rule is very simple, really:

Is the frame rate capped and the cap is reached? No input lag from MPRF.
All other cases: MPRF adds input lag.
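
Some back-of-the-envelope numbers for that rule (a rough illustrative sketch, assuming a GPU-bound game where the driver's pre-rendered frames queue actually fills):

Code: Select all

#include <cstdio>

int main() {
    // GPU-bound at 60 FPS: the CPU outruns the GPU, so the driver lets it
    // queue up to MPRF frames ahead of what is being displayed.
    const double gpu_frame_ms = 1000.0 / 60.0;
    for (int mprf = 1; mprf <= 3; ++mprf)
        // Input sampled for a frame can sit behind up to `mprf` queued frames.
        std::printf("MPRF %d: up to ~%.1f ms of added input lag\n",
                    mprf, mprf * gpu_frame_ms);
}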
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Aemony
Posts: 4
Joined: 26 Aug 2017, 20:27

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Aemony » 26 Aug 2017, 21:05

Yo, great article. Just read through the "Input Lag & Optimal Settings" page and I really liked it!

Anyway, two minor points:

* In the "Optimal G-SYNC Settings", please mention that "Preferred refresh rate" in Nvidia Control Panel should be set to "Highest available" for full clarification. Using "Application-controlled" on this setting understandably limits the refresh rate to whatever refresh rate the game requested (usually 60 Hz) unless manually overridden using third-party software.

* Even if G-Sync is enabled for "Full screen mode" only, some games can be forced to run in a G-Sync'd borderless window by forcing Flip Model presentation using the Special K tool. This is often hit or miss, though, and I don't think all engines support it. This bypasses whatever window mode Nvidia is using in their drivers. I have no idea how it relates to the "DWM Woes?" chapter, but felt it was worth mentioning in case nobody else had.

Also, a minor request to jorimt: If you're up for it, I'd love to see how Special K's framerate limiter stacks up to the alternatives. I don't think anyone has done a thorough comparison test of that, and it includes both a regular sleep-based framerate limiter, as well as a busy-wait oriented limiter that might be more stable, at the cost of maxing out a CPU core.
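
For anyone unfamiliar with the distinction, the two styles look roughly like this (a minimal sketch of the general techniques, not Special K's actual code):

Code: Select all

#include <chrono>
#include <thread>

using clk = std::chrono::steady_clock;

// Sleep-based: cheap on the CPU, but at the mercy of OS timer granularity
// (often ~1 ms or worse), so frame pacing can jitter.
void limit_sleep(clk::time_point deadline) {
    std::this_thread::sleep_until(deadline);
}

// Busy-wait: spins until the exact deadline. Very stable pacing, at the
// cost of pegging one core at 100%.
void limit_busy_wait(clk::time_point deadline) {
    while (clk::now() < deadline)
        std::this_thread::yield();
}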

Thanks again for a great article!

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 26 Aug 2017, 21:56

Vleeswolf wrote:If you run at half-rate vsync, in particular, the effect of the MPRF setting is pretty noticeable on some games, with 1 giving the lowest lag.
This situation may be the exception. I believe that while this mode effectively forces a double buffer lock to half refresh, the display is still technically running at 60Hz (or whatever base max refresh rate you have set), so unless you're using an FPS limiter in combination with 1/2 refresh rate V-SYNC, the framerate still isn't being limited by an FPS limiter or the refresh rate, and thus MPRF should still have an effect.
Aemony wrote:* In the "Optimal G-SYNC Settings", please mention that "Preferred refresh rate" in Nvidia Control Panel should be set to "Highest available" for full clarity. Using "Application-controlled" for this setting understandably limits the refresh rate to whatever refresh rate the game requested (usually 60 Hz) unless manually overridden using third-party software.
I thought I was already pretty clear in "G-SYNC 101: Control Panel" when I stated:

"'Highest available' automatically engages when G-SYNC is enabled, and overrides the in-game refresh rate selector (if present), defaulting to the highest supported refresh rate of the display. This is useful for games that don’t include a selector, and ensures the display’s native refresh rate is utilized."

Since "Highest available" engages automatically with G-SYNC, I don't see the need to insist it be enabled. That, and "Application-controlled" does have its uses in games that have a refresh rate selector (a.k.a Overwatch, etc.), and for games that don't, while it isn't convenient, you can usually set the desktop to the desired refresh rate before launch (CS:GO and some others work like this).

Finally, I wouldn't always recommend "Highest available," since there are edge cases where, if you have a 240Hz G-SYNC monitor connected as a secondary and a 144Hz G-SYNC monitor connected as a primary (like I do), that setting forces the 240Hz monitor to 144Hz unless "Application-controlled" is enabled instead. This goes for any mixed refresh rate multi-monitor setup (darn you Microsoft...or whoever).
Aemony wrote:* Even if G-Sync is enabled for "Full screen mode" only, some games can be forced to run in a G-Sync'd borderless window by forcing Flip Model presentation using the Special K tool. This is often hit or miss, though, and I don't think all engines support it. This bypasses whatever window mode Nvidia is using in their drivers. I have no idea how it relates to the "DWM Woes?" chapter, but felt it was worth mentioning in case nobody else had.
Hm, that just sounds like a roundabout way of enabling G-SYNC's borderless/windowed mode; interesting note nonetheless.
Aemony wrote:Also, a minor request to jorimt: If you're up for it, I'd love to see how Special K's framerate limiter stacks up to the alternatives. I don't think anyone has done a thorough comparison test of that, and it includes both a regular sleep-based framerate limiter, as well as a busy-wait oriented limiter that might be more stable, at the cost of maxing out a CPU core.
I've had that request once before, and it's already on my to-do list. It will have to wait until I create a dedicated framerate limiter article, however, along with an eventual MPRF article as well.

We have higher priorities on our list currently though, so no ETA on either, but both subjects are something I eventually want to cover.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Aemony
Posts: 4
Joined: 26 Aug 2017, 20:27

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Aemony » 26 Aug 2017, 22:36

jorimt wrote:I thought I was already pretty clear in "G-SYNC 101: Control Panel" when I stated:

"'Highest available' automatically engages when G-SYNC is enabled, and overrides the in-game refresh rate selector (if present), defaulting to the highest supported refresh rate of the display. This is useful for games that don’t include a selector, and ensures the display’s native refresh rate is utilized."

Since "Highest available" engages automatically with G-SYNC, I don't see the need to insist it be enabled. That, and "Application-controlled" does have its uses in games that have a refresh rate selector (a.k.a Overwatch, etc.), and for games that don't, while it isn't convenient, you can usually set the desktop to the desired refresh rate before launch (CS:GO and some others work like this).
Ohh, okay, I see your point now. I mostly thought about being extra clear on that part, since a lot of users will just scroll down to the optimal settings (or copy/pasta them) and change theirs accordingly, in a situation where they might've already mistakenly changed "Preferred refresh rate" back to "Application-controlled," hence limiting themselves, which they might not want to do.

jorimt wrote:
Aemony wrote:* Even if G-Sync is enabled for "Full screen mode" only, some games can be forced to run in a G-Sync'd borderless window by forcing Flip Model presentation using the Special K tool. This is often hit or miss, though, and I don't think all engines support it. This bypasses whatever window mode Nvidia is using in their drivers. I have no idea how it relates to the "DWM Woes?" chapter, but felt it was worth mentioning in case nobody else had.
Hm, that just sounds like a roundabout way of enabling G-SYNC's borderless/windowed mode; interesting note nonetheless.
I think it's a better way than Nvidia's built-in mode, myself, though obviously I don't have any actual data backing that up. I never actually use Nvidia's built-in window mode, since it tends to cause all kinds of issues when it syncs the refresh rate of the monitor to a window that isn't maximized on the monitor. Not to mention it sometimes seems to get into a state where it limits the framerates of focused game windows to much lower than they should actually run at. So while the game window actually had focus, G-Sync would limit it to whatever framerate it was stuck at (usually much lower than the refresh rate), then when you unfocused it, the game would spin up to the max it could render. Infuriating, and it required disabling G-Sync's window mode plus a restart of the computer to solve. :(

Also, I think what made me think of the whole flip model presentation was the sentence about not being able to introduce tearing in windowed mode, which it is possible to introduce using that override (though again, engine support is minimal).

jorimt wrote:
Aemony wrote:Also, a minor request to jorimt: If you're up for it, I'd love to see how Special K's framerate limiter stacks up to the alternatives. I don't think anyone has done a thorough comparison test of that, and it includes both a regular sleep-based framerate limiter, as well as a busy-wait oriented limiter that might be more stable, at the cost of maxing out a CPU core.
I've had that request once before, and it's already on my to-do list. It will have to wait until I create a dedicated framerate limiter article, however, along with an eventual MPRF article as well.

We have higher priorities on our list currently though, so no ETA on either, but both subjects are something I eventually want to cover.
Ooo! I'm looking forward to it regardless of when (if ever) it lands. :)

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 27 Aug 2017, 09:48

Aemony wrote: I think it's a better way than Nvidia's built-in mode, myself, though obviously I don't have any actual data backing that up. I never actually use Nvidia's built-in window mode, since it tends to cause all kinds of issues when it syncs the refresh rate of the monitor to a window that isn't maximized on the monitor. Not to mention it sometimes seems to get into a state where it limits the framerates of focused game windows to much lower than they should actually run at. So while the game window actually had focus, G-Sync would limit it to whatever framerate it was stuck at (usually much lower than the refresh rate), then when you unfocused it, the game would spin up to the max it could render. Infuriating, and it required disabling G-Sync's window mode plus a restart of the computer to solve. :(

Also, I think what made me think of the whole flip model presentation was the sentence about not being able to introduce tearing in windowed mode, which it is possible to introduce using that override (though again, engine support is minimal).
I don't know, I'd have to test it, though I doubt it does anything the Nvidia option doesn't (more likely it enables the option or behavior per app somehow).

I have to agree that the borderless/windowed G-SYNC mode can be buggy at best, but much of that is due to the limitations of the operating system, and is currently out of Nvidia's hands.

Most of the behaviors you mentioned I already described in the "Control Panel" section of the article, and some of the others can be caused by multi-monitor setups (another OS limitation). For this reason, I myself avoid the mode, and only enable it when I'm playing a borderless game. I'd like to see improvements to that setting in the future, for sure.

As for borderless/windowed mode and tearing, I just couldn't replicate it with my setup, while others could, which means this behavior could vary from setup to setup. Maybe Special K could have induced it in my test games, maybe not, but since I was testing for the most common, "out-of-the-box" scenarios possible, it really wouldn't have counted toward that goal anyway.

Appreciate the insights and suggestions (Special K testing being one I will take up eventually) thus far, regardless.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Durante
Posts: 4
Joined: 12 May 2015, 05:58

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Durante » 02 Sep 2017, 05:11

RealNC wrote:Btw, I should mention here that all of this is based on general knowledge about the matter. I'm not an authority on these things. Someone who actually worked on things that involved these matters should be able to confirm or deny.

Where's Durante when you need him :geek:
Took a while, but here I am :P

Now seriously, I was just reading this thread (since a friend asked me about the exact differences between G-sync with V-sync on/off in specific frame ranges, and jorimt's set of articles is the best resource I know on the subject), and I found the discussion of frame limiting latency impact that came up around the post I quoted (in June) very interesting.

I had some ideas about "predictive" external FPS capping a few years back: http://blog.metaclassofnil.com/?p=715
It never really worked out in my implementation, probably because I didn't have any way to track the true V-sync timing at that point, and also because my implementation methodology was off. I do still think the overall idea is viable, however. Of course, with most modern games you might well be halting the wrong thread this way.
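
For what it's worth, the concept boiled down to something like this (a sketch of the idea only, not the original implementation; the names and the margin parameter are made up):

Code: Select all

#include <chrono>
#include <thread>

using clk = std::chrono::steady_clock;
using ms  = std::chrono::duration<double, std::milli>;

// Instead of waiting *after* a frame is rendered (a normal limiter), wait
// *before* starting it, so input is sampled as late as possible and the
// frame still finishes just before its deadline.
void predictive_wait(clk::time_point next_vsync, ms predicted_render_time,
                     ms safety_margin) {
    auto start_at = next_vsync - std::chrono::duration_cast<clk::duration>(
                        predicted_render_time + safety_margin);
    std::this_thread::sleep_until(start_at);
    // ...sample input and render here. If the prediction is too optimistic,
    // the frame misses its deadline -- hence the margin, and hence the need
    // to track the true V-sync timing.
}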

RealNC
Site Admin
Posts: 3756
Joined: 24 Dec 2013, 18:32

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 02 Sep 2017, 10:00

Welcome back :P

What we haven't seen yet from anyone out there is a "throttling" limiter. Meaning: do frame limiting even when it's not needed, by always applying a few ms of wait time. In theory this should be easy to add to an existing limiter:

Code: Select all

if (frame_time < target_frame_time) {
    // Apply the limit just like we always did.
} else {
    // Apply a small limit here, sacrificing 1-2 FPS, where previously this was a no-op.
}
This should (I'm 99.99% certain) reduce input lag to the lowest value that any external, non-predictive limiter can achieve.

No one has done this yet, even though it looks like a dead-simple modification to any frame limiter :-/
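
Fleshed out a bit, it could look something like this (still just a sketch; the always-on wait value is a made-up tunable):

Code: Select all

#include <chrono>
#include <thread>

using clk = std::chrono::steady_clock;
using ms  = std::chrono::duration<double, std::milli>;

// Called once per frame; frame_start is when the current frame began.
void throttling_limit(clk::time_point frame_start, ms target_frame_time) {
    const ms throttle{0.5}; // tunable: small always-on wait (costs a few FPS)
    auto elapsed = std::chrono::duration_cast<ms>(clk::now() - frame_start);
    if (elapsed < target_frame_time) {
        // Under the cap: wait out the remainder, as any limiter does.
        std::this_thread::sleep_for(target_frame_time - elapsed);
    } else {
        // Over the cap: previously a no-op. Back off a little anyway, so the
        // render queue can never fill up and input stays fresh.
        std::this_thread::sleep_for(throttle);
    }
}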
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

masterotaku
Posts: 436
Joined: 20 Dec 2013, 04:01

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by masterotaku » 02 Sep 2017, 10:14

What about Special K for FPS limiting? https://github.com/Kaldaien/SpecialK/releases

It has an optional busy-wait limiter, framerate variance tolerance, and lots of features. And it has a GUI where you can change the settings in real time.

I tried it in Trails of Cold Steel and it's very smooth, but it hammers one of my cores so much that it reaches >90 degrees, lol. So I have that disabled, but I keep the mod for custom textures.
CPU: Intel Core i7 7700K @ 4.9GHz
GPU: Gainward Phoenix 1080 GLH
RAM: GSkill Ripjaws Z 3866MHz CL19
Motherboard: Gigabyte Gaming M5 Z270
Monitor: Asus PG278QR

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Sparky » 03 Sep 2017, 00:35

Durante wrote:
RealNC wrote:Btw, I should mention here that all of this is based on general knowledge about the matter. I'm not an authority on these things. Someone who actually worked on things that involved these matters should be able to confirm or deny.

Where's Durante when you need him :geek:
Took a while, but here I am :P

Now seriously, I was just reading this thread (since a friend asked me about the exact differences between G-sync with V-sync on/off in specific frame ranges, and jorimt's set of articles is the best resource I know on the subject), and I found the discussion of frame limiting latency impact that came up around the post I quoted (in June) very interesting.

I had some ideas about "predictive" external FPS capping a few years back: http://blog.metaclassofnil.com/?p=715
It never really worked out in my implementation, probably because I didn't have any way to track the true V-sync timing at that point, and also because my implementation methodology was off. I do still think the overall idea is viable, however. Of course, with most modern games you might well be halting the wrong thread this way.
I had some ideas about this. Obviously there's backpressure, but how hard would it be to modify the IO library to block the main game thread or render thread when it tries to gather mouse input, and use that block for rate limiting? This obviously won't work for all games, since multiple threads might be reading input, but for at least a few games it should give you the lowest possible latency.
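
Conceptually, something like this (the hook mechanism itself is omitted, and all names here are made up for illustration):

Code: Select all

#include <chrono>
#include <thread>

using clk = std::chrono::steady_clock;

static clk::time_point g_next_slot = clk::now();
static const auto g_frame_interval = std::chrono::microseconds(16667); // ~60 FPS

// Stand-in for the real, un-hooked input function.
static void original_poll_mouse(int* x, int* y) { *x = 0; *y = 0; }

// Interposed version: the rate-limiting wait happens right before input is
// read, so the game builds each frame from the freshest possible mouse state.
void hooked_poll_mouse(int* x, int* y) {
    std::this_thread::sleep_until(g_next_slot);
    g_next_slot += g_frame_interval;
    original_poll_mouse(x, y);
}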

mminedune
Posts: 31
Joined: 01 Feb 2014, 08:12

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by mminedune » 04 Sep 2017, 12:19

OK, I'm back, and after all this time with G-SYNC, this is what I found.

The tearing at the bottom of the screen only happens when the CPU is stressed, like in multiplayer or in single-player games that can stress the CPU.
How hard depends on your setup and your CPU's threshold. Setting a frame cap to relieve CPU stress will help, as will setting the GPU power limit to 120%.

Also, V-SYNC on with G-SYNC is not good. For the most part it won't be noticed, because frames are already in sync thanks to G-SYNC. BUT when the tearing anomaly happens on the lower part of the screen, V-SYNC on will cause stutter/hitching, while V-SYNC off will tear. These issues are most apparent when panning up and down with the mouse; it can look like the whole lower half of the screen is tearing/stuttering.
And TBH, IMO, minor tearing is less annoying than dealing with stutter.
*Nvidia recommends V-SYNC on because it's a foolproof way of capping frames; it serves no purpose with G-SYNC at all*

A game like Battlefield 1 is a really good example, as it's the most CPU-stressful game ATM. Now back to V-SYNC, and another reason why it shouldn't be turned on. For people who don't know: games that run on Frostbite, like the Battlefield games, use triple buffering by default when you turn on V-SYNC. There is no setting in the game options to turn it off; you have to run a console command to turn it off per session, or set a config to disable it permanently.
With it on alongside G-SYNC you get slight input lag, and screen clarity suffers when panning because of the additional frame buffer; panning around with the mouse looks hazy. You can turn it off with the RenderDevice.TripleBufferingEnable 0 command (see below), but V-SYNC still should not be used, because you will get stutter.
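
For reference, that's the per-session console command; if I recall correctly, the same line can also be placed in a user.cfg in the game's folder to make it permanent:

Code: Select all

RenderDevice.TripleBufferingEnable 0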

And I found Nvidia Profile Inspector and RivaTuner work best for setting a frame cap with G-SYNC, as they set it at the driver level. I don't recommend setting it in-game; a game engine-level frame cap can mess with G-SYNC and frametimes.

Also, good news: these issues don't seem to happen with the DX12 and Vulkan APIs, probably because of better CPU utilization.
DX11 puts most of its workload on only one core.



Good luck
