Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing and reduces input lag. List of G-SYNC Monitors.
gameinn
Posts: 43
Joined: 16 Nov 2020, 10:11

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by gameinn » 15 Feb 2022, 20:45

My god I'm stupid.

Game #3 would be Overwatch. Sorry about that.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 15 Feb 2022, 21:33

gameinn wrote:
15 Feb 2022, 20:45
My god I'm stupid.

Game #3 would be Overwatch. Sorry about that.
No worries.

If you enable Reflex, it will set its own limit. I've tested the Reflex limiter in Overwatch, and it's as low latency as the in-game limiter, so it's your choice:
viewtopic.php?p=72633#p72182

As to whether you should enable/disable Reflex in Overwatch, it depends on whether your GPU usage maxes out at any point. If it does, enable Reflex to prevent render queue latency in those instances; if not, it won't harm anything, but it also won't further reduce latency when GPU usage isn't maxed.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

smoothnobody
Posts: 10
Joined: 13 Feb 2022, 17:04

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by smoothnobody » 15 Feb 2022, 22:26

so, thanks to the chief blur buster, i now know the difference between g-sync and g-sync compatible. i have g-sync compatible. not sure if this makes a difference to your recommendation, but do you guys still recommend LLM ultra as my set it and forget it? if i'm using ultra, i'm assuming i should no longer set the "max frame rate" to 3 FPS below refresh rate?

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 15 Feb 2022, 23:11

smoothnobody wrote:
15 Feb 2022, 22:26
so, thanks to the chief blur buster, i now know the difference between g-sync and g-sync compatible. i have g-sync compatible. not sure if this makes a difference to your recommendation
It doesn't change my base recommended optimal G-SYNC settings.
smoothnobody wrote:
15 Feb 2022, 22:26
but do you guys still recommend LLM ultra as my set it and forget it? if i'm using ultra, i'm assuming i should no longer set the "max frame rate" to 3 FPS below refresh rate?
LLM Ultra's auto FPS limit doesn't work in every game, so it can't be relied on for FPS limiting with G-SYNC in all cases.

LLM settings are highly conditional, mainly applying to situations in which your GPU usage is maxed (for sustained periods or otherwise), and are not directly related to VRR operation, aside from the auto FPS limiter included with Ultra when used in combination with G-SYNC + V-SYNC, where supported.

gameinn
Posts: 43
Joined: 16 Nov 2020, 10:11

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by gameinn » 16 Feb 2022, 02:43

Sorry if this is a stupid question but I thought I would make sure.

So in my LG settings I can choose between Basic and Extended in Adaptive Sync. When I google this, people are saying this range is only up to 144 Hz? Huh? Should I be running at 140 even if I have a 165/180 Hz screen due to this?

Also, I don't know if I came across a really weird bug in Overwatch. If I set a frame rate limit to say 375: https://i.imgur.com/vDD9D6k.jpeg

The fps stays way below it at 157: https://i.imgur.com/nTOBsnn.jpeg

However if I disable Nvidia reflex then the fps stays at 164: https://i.imgur.com/zlQDQWe.jpeg

If I set the fps limiter to something in the g sync "range" such as 112 it seems to work: https://i.imgur.com/wJeNyef.jpeg

I guess it makes sense since v sync is enabled in Nvidia control panel, but why does enabling nvidia reflex change the fps cap from 164 to 157?

I don't know if you know much about it but my SIM values seem to be much worse with reflex enabled?

On: https://i.imgur.com/ADYsBPA.jpeg
Off: https://i.imgur.com/QwSwHc1.jpeg

smoothnobody
Posts: 10
Joined: 13 Feb 2022, 17:04

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by smoothnobody » 16 Feb 2022, 03:33

jorimt wrote:
15 Feb 2022, 23:11
LLM Ultra's auto FPS limit doesn't work in every game, so it can't be relied on for FPS limiting with G-SYNC in all cases.

LLM settings are highly conditional.
if the ultra FPS limiter doesn't work in every game, why not just use LLM on + NVCP FPS limiter? i saw you guys discussing ultra, but unless it went over my head, i wasn't seeing the benefit of ultra over on.

if LLM doesn't work in the game i'm playing, or the condition isn't met for it to be beneficial, can both on and ultra be detrimental to have turned on? i know stutter was mentioned, but i wasn't sure if stutter applied to both on and ultra, if there were other unwanted effects, or what the conditions had to be to cause problems. the potential for stutter and having to adjust things based on the game being played doesn't seem like set it and forget it. sounds like LLM off is the only never-have-to-mess-with-it setting.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 16 Feb 2022, 09:58

smoothnobody wrote:
16 Feb 2022, 03:33
if ultra FPS limiter doesn't work in every game, why not just use LLM on + NVCP FPS limiter?
LLM "On" doesn't work in every game either. LLM doesn't work in DX12 or Vulkan at all, for instance, since those APIs handle the render queue internally and don't allow external manipulation of said queue.
smoothnobody wrote:
16 Feb 2022, 03:33
i saw you guys discussing ultra, but unless it went over my head, i wasn't seeing the benefit of ultra over on.
With G-SYNC, "Ultra" sets an auto fps limit slightly below the refresh rate to keep G-SYNC in range, whereas "On" doesn't.

More broadly, "Ultra" uses a different method to reduce the pre-rendered frames in the render queue with a "just in time" delivery component, whereas "On" simply sets the max pre-rendered frames to "1." But again, whether they'll actually apply when enabled is game and engine dependent.

Reflex is superior: it operates at the engine level, replaces both LLM modes, and is guaranteed to work whenever the GPU is maxed in games that feature it, so use it for the same purposes where available (Reflex also overrides and disables LLM when active anyway).
smoothnobody wrote:
16 Feb 2022, 03:33
if LLM doesn't work in the game i'm playing, or the condition isn't met for it to be beneficial, can both on and ultra be detrimental to have turned on? i know stutter was mentioned, but i wasn't sure if stutter applied to both on and ultra, if there were other unwanted effects, or what the conditions had to be to cause problems. the potential for stutter and having to adjust things based on the game being played doesn't seem like set it and forget it. sounds like LLM off is the only never-have-to-mess-with-it setting.
If LLM actually works in the given game and your system is up to the task, it will reduce the maximum number of pre-rendered frames that can be held in the render queue whenever the CPU is waiting on a maxed GPU.

In a GPU-bound scenario, the GPU is too busy to take new frame information from the CPU and is still working on finishing the previous frame information the CPU gave it. While the CPU waits, it creates additional frame information for future frames in advance. The render queue holds each of these "advanced" frames, which can pile up the longer the CPU waits for the GPU to finish the previous frame(s).

If too many pile up, once the GPU is finally ready to take information from the CPU for creation of the next frame, it will start with the oldest/first frame information the CPU prepared, which prevents it from looking like things skipped forward in time (they did; the engine keeps running regardless of when the latest frame information reaches the display). This prevents the appearance of stutter, but it also means the latest frame is now based on older input information.

LLM reduces the maximum number of frames that can be held in the render queue at a time to "1." But this means that if the GPU takes longer than 1 frame to retrieve frame information from the CPU, say a span of 3 frames instead, it will take the oldest frame information (the "1" pre-rendered frame LLM allowed to be generated) for the next frame, but it will then have to rely on real-time frame information for the frame after that, which may be an entire frame (or more) ahead of the pre-rendered frame, causing the appearance of stutter; and repeat.

Anyway, the render queue is a convoluted and confusing manufacturing line that amounts to user input time travel.

That said, if your GPU usage isn't maxed at any point, the above process won't apply, since the GPU should now never be busy enough to make the CPU wait, so there's no frame information to pile up.

And on that point, unlike LLM, Reflex also avoids this by never letting the GPU actually become maxed, so the render queue effectively stays empty of pre-rendered frames.

So basically, if you globally enable LLM "On" or "Ultra," and IF it actually applies to the given game, then at any point the GPU usage is maxed, there will typically be up to an average of a 1 to 1/12 frame reduction of latency at the render queue level. Whether it causes stutter depends on the parity of the CPU and GPU on the given system, and on how long the GPU usage is maxed at any given point, which can vary even from scene to scene. If LLM doesn't apply in the given game, it simply won't take effect, so there's no harm in having it enabled.
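The render queue behavior described above can be sketched with a toy discrete-time model (my own illustration, not from the G-SYNC 101 article): the CPU samples input and pre-renders one frame per tick up to the queue cap, while a maxed GPU takes several ticks per frame, so a larger cap means each displayed frame is built from older input.

```python
from collections import deque

def simulate(max_queue, gpu_ticks_per_frame, n_frames=8):
    """Toy model of a maxed GPU: the CPU pre-renders one frame per tick
    (up to max_queue); the GPU takes gpu_ticks_per_frame ticks per frame.
    Returns the input age (in ticks) of each displayed frame."""
    queue = deque()   # timestamps of the input behind each queued frame
    ages = []
    busy, submitted, tick = 0, None, 0
    while len(ages) < n_frames:
        if len(queue) < max_queue:
            queue.append(tick)           # CPU samples input, queues a frame
        if busy == 0 and queue:
            submitted = queue.popleft()  # GPU picks up the oldest frame
            busy = gpu_ticks_per_frame
        if busy > 0:
            busy -= 1
            if busy == 0:
                ages.append(tick - submitted)  # frame hits the display
        tick += 1
    return ages

print(simulate(max_queue=3, gpu_ticks_per_frame=3))  # input age settles at 10
print(simulate(max_queue=1, gpu_ticks_per_frame=3))  # input age settles at 4
```

With a cap of 3 (a typical default queue), the input behind each displayed frame settles at 10 ticks old; capping at 1 (LLM's behavior) settles at 4 ticks, i.e. lower latency, at the cost of the "skip forward" effect described above whenever the GPU outruns the single queued frame.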

TL;DR: it "depends."

gameinn
Posts: 43
Joined: 16 Nov 2020, 10:11

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by gameinn » 16 Feb 2022, 15:53

Any ideas for my issues above? @jorimt

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 16 Feb 2022, 16:15

gameinn wrote:
16 Feb 2022, 02:43
So in my LG settings I can choose between Basic and Extended in Adaptive Sync. When I google this people are saying this range is only upto 144 Hz? Huh? Should I be running at 140 even if I have a 165/180 Hz screen due to this?
According to the RTINGS review (https://www.rtings.com/monitor/reviews/lg/27gp850-b), your monitor model has a 144Hz limit over HDMI. G-SYNC Compatible mode and 165/180Hz are only supported over DisplayPort.
gameinn wrote:
16 Feb 2022, 02:43
Also, I don't know if I came across a really weird bug in Overwatch. If I set a frame rate limit to say 375: https://i.imgur.com/vDD9D6k.jpeg

The fps stays way below it at 157: https://i.imgur.com/nTOBsnn.jpeg

However if I disable Nvidia reflex then the fps stays at 164: https://i.imgur.com/zlQDQWe.jpeg

If I set the fps limiter to something in the g sync "range" such as 112 it seems to work: https://i.imgur.com/wJeNyef.jpeg
That's not a bug. Reflex sets its own auto limit when enabled with G-SYNC, which at 165Hz is ~157 FPS.

I've tested the Reflex limiter in Overwatch. It's as low latency as the in-game limiter:
viewtopic.php?p=72633#p72182
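As a side note, the auto cap Reflex (and LLM Ultra) applies with G-SYNC is commonly reported to follow refresh - refresh^2/3600; treat this as a community-observed rule of thumb, not official NVIDIA documentation:

```python
def reflex_auto_cap(hz):
    """Community-observed formula for the Reflex/LLM Ultra auto FPS cap
    with G-SYNC + V-SYNC; not official NVIDIA documentation."""
    return int(hz - hz * hz / 3600)

print(reflex_auto_cap(165))  # → 157, matching the cap observed at 165Hz
print(reflex_auto_cap(240))  # → 224
```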
gameinn wrote:
16 Feb 2022, 02:43
I don't know if you know much about it but my SIM values seem to be much worse with reflex enabled?

On: https://i.imgur.com/ADYsBPA.jpeg
Off: https://i.imgur.com/QwSwHc1.jpeg
I answered a similar question in the comments section of my G-SYNC article:
“SIM” is directly tied to average framerate in that game; the higher the framerate, the lower the SIM number.

G-SYNC obviously only works when the framerate is within the refresh rate, which means its highest framerate is limited to just under the max refresh rate of the display.
As for your captures, this is what those numbers per slash represent:
SIM: min frametime / avg frametime / max frametime

The average frametime with Reflex enabled in your capture is 6.3ms, which almost perfectly aligns with the average frametime of 157 FPS at 6.4ms, whereas the average frametime with Reflex disabled and a 164 FPS limit in your capture is 6.1ms, which is the average frametime of 164 FPS.

So what you're seeing is entirely expected, though I'd add that 164 FPS at 165Hz isn't low enough to keep G-SYNC in range at all times anyway, so ironically, you're getting more latency (assuming you're using G-SYNC + V-SYNC) with that than with Reflex's auto limit in that scenario.
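The frametime averages referenced above are just 1000 / FPS; a quick sanity check (the helper name is mine):

```python
def frametime_ms(fps):
    """Average frametime in milliseconds at a given framerate."""
    return 1000.0 / fps

print(round(frametime_ms(157), 1))  # → 6.4 (Reflex auto cap at 165Hz)
print(round(frametime_ms(164), 1))  # → 6.1 (manual 164 FPS limit)
```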

gameinn
Posts: 43
Joined: 16 Nov 2020, 10:11

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by gameinn » 16 Feb 2022, 20:46

Thanks. The thing concerning me was the right SIM value being much higher than in the other screenshot.

I don't know how long the values take to change, so I expect that if I kept it open for more than 10 seconds it would stabilise lower, but then that doesn't explain why the non-Reflex screenshot didn't show this behavior.
