Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
Zazie
Posts: 21
Joined: 13 Oct 2016, 14:45

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Zazie » 18 Dec 2020, 13:57

Sorry, should vsync be enabled in the NVCP?

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 18 Dec 2020, 14:13

Zazie wrote:
18 Dec 2020, 13:57
Sorry, should vsync be enabled in the NVCP?
Not sure what you're getting at, so I'll have to be broad here. Feel free to clarify what you're asking if the below doesn't answer your question:
https://blurbusters.com/gsync/gsync101- ... ttings/14/
Nvidia Control Panel V-SYNC vs. In-game V-SYNC

While NVCP V-SYNC offers no input lag reduction over in-game V-SYNC (and, when used with G-SYNC + an FPS limit, it will never engage), some in-game V-SYNC solutions may introduce their own frame buffer or frame pacing behaviors, automatically enable triple buffer V-SYNC (not optimal for G-SYNC's native double buffer), or simply not function at all; NVCP V-SYNC is thus the safest bet.

There are rare occasions, however, where V-SYNC will only function with the in-game option enabled, so if tearing or other anomalous behavior is observed with NVCP V-SYNC (or vice versa), each solution should be tried until said behavior is resolved.
And:
https://blurbusters.com/gsync/gsync101- ... ttings/15/
Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn’t G-SYNC supposed to fix that?

The answer is frametime variances.

“Frametime” denotes how long a single frame takes to render. “Framerate” is the number of frames displayed within a one-second period; in other words, the inverse of the average frametime.

At 144Hz, a single frame takes 6.9ms to display (this number depends on the max refresh rate of the display, see here), so at a sustained framerate of 144 frames per second, the average frametime is 6.9ms per frame.

In reality, however, frametime varies from frame to frame, so just because an average framerate of 144 FPS works out to an average frametime of 6.9ms per frame doesn’t mean all 144 of those frames in each second take exactly 6.9ms; one frame could render in 10ms, the next in 6ms, but at the end of each second, enough hit the 6.9ms render target to average 144 FPS.
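To make that arithmetic concrete, here is a small Python sketch (illustrative numbers only, not measured frametimes) showing how per-frame render times can vary while the one-second average still lands near 144 FPS:

```python
# Frametime vs. framerate: per-frame render times vary, but the
# one-second total is what determines the reported FPS.
# (Illustrative numbers only, not measured frametimes.)

def average_fps(frametimes_ms):
    """Average framerate over a list of frametimes in milliseconds."""
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

# One slow frame (10ms) and one fast frame (6ms) mixed in with frames
# near the ~6.9ms target of a 144Hz display.
frametimes = [10.0, 6.0] + [6.944] * 142

print(f"average frametime: {sum(frametimes) / len(frametimes):.2f} ms")
print(f"average framerate: {average_fps(frametimes):.1f} FPS")
```

With these numbers the average works out to roughly 6.96ms per frame and about 143.7 FPS, despite individual frames ranging from 6ms to 10ms.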

So what happens when just one of those 144 frames renders in, say, 6.8ms (147 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range. G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) instead allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.

And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.
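The delivery behavior described above can be sketched as a simplified model (my illustration of the described behavior, not NVIDIA's actual module logic):

```python
# Simplified model of G-SYNC frame delivery with V-SYNC "Off" vs. "On"
# (an illustration of the behavior described above, not NVIDIA's code).

def deliver_frame(ready_at_ms, prev_scanout_ends_ms, vsync_on):
    """Return (scan_start_ms, tears) for a frame that becomes ready at
    ready_at_ms while the previous frame's scanout ends at
    prev_scanout_ends_ms."""
    if ready_at_ms >= prev_scanout_ends_ms:
        # Previous frame has fully displayed; no conflict either way.
        return ready_at_ms, False
    if vsync_on:
        # "Frametime compensation": hold the frame until the current
        # scanout finishes; added lag = prev_scanout_ends_ms - ready_at_ms.
        return prev_scanout_ends_ms, False
    # V-SYNC off: the frame starts scanning mid-cycle -> visible tear line.
    return ready_at_ms, True

# A 6.8ms frame arriving before the 6.9ms scanout completes:
print(deliver_frame(6.8, 6.9, vsync_on=False))  # scans early -> tears
print(deliver_frame(6.8, 6.9, vsync_on=True))   # held 0.1ms -> no tear
```

Note the held frame is delayed only by the 0.1ms remaining in the previous scanout, which matches the "virtually no input lag is added" point above.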
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

braavosraider
Posts: 11
Joined: 17 Jan 2021, 23:42

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by braavosraider » 21 Mar 2021, 06:08

jorimt wrote:
19 Jun 2017, 08:54
This is the official discussion topic for Blur Buster's 15-part "G-SYNC 101" series featured on blurbusters.com:
http://www.blurbusters.com/gsync/gsync101/


It is a continuation of my now archived original "G-Sync 101 w/Chart (WIP)" topic here:
http://forums.blurbusters.com/viewtopic.php?f=5&t=3073


I welcome further input, questions, and discussion regarding the G-SYNC 101 series, or G-SYNC functionality in general...
Hi jorimt, will I have a slight or a big input lag increase if I use V-SYNC + G-SYNC with a 110 FPS cap (using RTSS and a 280Hz monitor, the VG279QM)? Or do I have to turn off V-SYNC?

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 27 Mar 2021, 10:25

braavosraider wrote:
21 Mar 2021, 06:08
Hi jorimt, will I have a slight or a big input lag increase if I use V-SYNC + G-SYNC with a 110 FPS cap (using RTSS and a 280Hz monitor, the VG279QM)? Or do I have to turn off V-SYNC?
Your primary "input lag increase" in that scenario would, if anything, be the lowered framerate (110 FPS = 9.1ms render time per frame, 280 FPS = 3.6ms render time per frame).

Otherwise, the only input lag difference between G-SYNC on + V-SYNC off and G-SYNC on + V-SYNC on is the tearing itself (<1ms difference).
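The per-frame render times quoted above come straight from the frametime formula, which can be checked in a couple of lines:

```python
# Render time per frame at a given framerate: frametime_ms = 1000 / FPS.

def frametime_ms(fps):
    """Milliseconds per frame at the given framerate."""
    return 1000.0 / fps

for fps in (110, 280):
    print(f"{fps} FPS -> {frametime_ms(fps):.1f} ms per frame")
```

This prints 9.1ms per frame at 110 FPS and 3.6ms per frame at 280 FPS, matching the figures above.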

JasonBB
Posts: 2
Joined: 31 Jan 2022, 10:42

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by JasonBB » 31 Jan 2022, 11:27

Hello! Thanks so much for this awesome guide. I have a few questions I was hoping you could clarify.

1: You recommend keeping in-game V-Sync off & NVCP V-Sync on. Does this apply to games that are not running in exclusive fullscreen as well? NVIDIA made the following comment in their https://www.nvidia.com/en-us/geforce/gu ... ion-guide/:

"As a side note, VSYNC ON in the NVIDIA Control Panel will only work for Fullscreen applications. In addition, MS Hybrid-based laptops do not support VSYNC ON. If you are gaming in windowed mode or on one of these laptops, and want to utilize G-SYNC + VSYNC + Reflex mode, use in-game VSYNC."

2: What is the advantage of using NVCP V-Sync over in-game V-Sync? Battle(non)sense recommends using in-game V-Sync, not NVCP V-Sync.
https://youtu.be/Gub1bI12ODY?t=82

3: You recommend using Low Latency Mode On if an FPS limiter is being used. However, Battle(non)sense recommends keeping this set to Off.
https://youtu.be/Gub1bI12ODY?t=82

JasonBB
Posts: 2
Joined: 31 Jan 2022, 10:42

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by JasonBB » 31 Jan 2022, 11:33

I also wanted to ask another question...

You recommend that, when not using a frame rate limiter, Low Latency Mode should be set to Ultra. However, NVIDIA says that the frame rate limiter and NULL should be used together:

https://nvidia.custhelp.com/app/answers ... C-and-more

"Reducing System Latency: Enable Max Frame Rate and set your power management mode to “Prefer maximum performance” to reduce latency. While in this mode, the GPU is kept at higher frequencies to process frames as quickly as possible. To maximize latency reduction in GPU bound scenarios where FPS is consistent, set Max Frame Rate to a framerate slightly below the average FPS and turn Low Latency Mode to Ultra."

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 31 Jan 2022, 12:16

JasonBB wrote:
31 Jan 2022, 11:27
1: You recommend keeping in-game V-Sync off & NVCP V-Sync on. Does this apply to games that are not running in exclusive fullscreen as well? NVIDIA made the following comment in their https://www.nvidia.com/en-us/geforce/gu ... ion-guide/:

"As a side note, VSYNC ON in the NVIDIA Control Panel will only work for Fullscreen applications. In addition, MS Hybrid-based laptops do not support VSYNC ON. If you are gaming in windowed mode or on one of these laptops, and want to utilize G-SYNC + VSYNC + Reflex mode, use in-game VSYNC."
Traditional "exclusive fullscreen" is rarely used any longer. Due to things such as the fullscreen optimizations setting at the OS-level, MPO (where true borderless and windowed modes can now tear), and DX12 and Vulkan's newer flip models, nearly every modern "fullscreen" mode acts as hybrid borderless/windowed exclusive fullscreen with adaptive composition.

As such, yes, I still recommend G-SYNC + V-SYNC in general. The worst that can happen in true borderless/windowed games, assuming MPO doesn't kick in, is the NVCP V-SYNC setting being ignored and composition V-SYNC being used instead (which should simply adhere to the VBLANK the same way as NVCP V-SYNC with G-SYNC and framerates within the refresh rate does).
JasonBB wrote:
31 Jan 2022, 11:27
2: What is the advantage of using NVCP V-Sync over in-game V-Sync? Battle(non)sense recommends using in-game V-Sync, not NVCP V-Sync.
https://youtu.be/Gub1bI12ODY?t=82
That NVCP V-SYNC is nearly always guaranteed to engage, whereas the same can't be said for all in-game solutions.

That said, 99% of the time, either NVCP or in-game V-SYNC can be paired with G-SYNC, though there are some key situations where ULLM or Reflex may require NVCP V-SYNC to be enabled for their auto FPS limiting functionality to engage.
JasonBB wrote:
31 Jan 2022, 11:27
3: You recommend using Low Latency Mode On if an FPS limiter is being used. However, Battle(non)sense recommends keeping this set to Off.
https://youtu.be/Gub1bI12ODY?t=82
I don't necessarily recommend LLM "On"; readers kept asking me what LLM should be set to, so I added the guidance that if you want to have it enabled with G-SYNC, and you want to set a manual FPS limit, "On" is typically safe to use, as in the best case, it will reduce input lag by up to 1 frame in GPU-bound scenarios (in supported games and APIs), and at worst, it will do nothing in games that don't support it.
JasonBB wrote:
31 Jan 2022, 11:33
You recommend that, when not using a frame rate limiter, Low Latency Mode should be set to Ultra. However, NVIDIA says that the frame rate limiter and NULL should be used together:

https://nvidia.custhelp.com/app/answers ... C-and-more

"Reducing System Latency: Enable Max Frame Rate and set your power management mode to “Prefer maximum performance” to reduce latency. While in this mode, the GPU is kept at higher frequencies to process frames as quickly as possible. To maximize latency reduction in GPU bound scenarios where FPS is consistent, set Max Frame Rate to a framerate slightly below the average FPS and turn Low Latency Mode to Ultra."
That's for non-G-SYNC scenarios, where LLM "Ultra" doesn't set an auto FPS limit.

smoothnobody
Posts: 10
Joined: 13 Feb 2022, 17:04

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by smoothnobody » 13 Feb 2022, 23:40

jorimt wrote:
31 Jan 2022, 12:16
I don't necessarily recommend LLM "On"; readers kept asking me what LLM should be set to, so I added the guidance that if you want to have it enabled with G-SYNC, and you want to set a manual FPS limit, "On" is typically safe to use, as in the best case, it will reduce input lag by up to 1 frame in GPU-bound scenarios (in supported games and APIs), and at worst, it will do nothing in games that don't support it.
what about for those who don't have a gsync display? LLM yay or nay?

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 14 Feb 2022, 11:13

smoothnobody wrote:
13 Feb 2022, 23:40
what about for those who don't have a gsync display? LLM yay or nay?
If your system is GPU-limited in the given game (aka GPU usage is 99%+ most or all of the time), said game isn't running DX12 or Vulkan (which handle the render queue internally), and it doesn't support Reflex, then you can try either LLM "On" or "Ultra."

Without G-SYNC, LLM "On" sets MPRF to "1" in games that support external manipulation of the render queue (no easy way to tell which do/don't), whereas LLM "Ultra" tries to deliver frame information from the CPU to the GPU "just in time" in GPU-bound non-G-SYNC scenarios as to reduce render queue latency.

If your system is capable enough, and LLM Ultra actually applies in the given game, it will typically reduce render queue latency by around 1 to 1.5 frames whenever GPU usage is maxed. If your system isn't capable enough for whatever reason, it may instead create more stutter in the same scenario.

So, really, whether you should use LLM or not (with or without G-SYNC) highly depends on the games in question and the system running them.

smoothnobody
Posts: 10
Joined: 13 Feb 2022, 17:04

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by smoothnobody » 15 Feb 2022, 02:27

jorimt wrote:
14 Feb 2022, 11:13
So, really, whether you should use LLM or not (with or without G-SYNC) highly depends on the games in question and the system running them.
makes sense. if you had an RTX 3080, and you didn't want to be bothered with turning LLM on and off based on what game you're playing, what would be your set-it-and-forget-it setting?
