Input Delay on "Gsync Compatible" monitors

TripleWrx1987
Posts: 1
Joined: 19 Dec 2020, 11:38

Input Delay on "Gsync Compatible" monitors

Post by TripleWrx1987 » 19 Dec 2020, 11:51

Hi,

I have read all the information provided at "/gsync/gsync101-input-lag-tests-and-settings/", and I have watched Battle(non)sense's videos on gsync and input lag, but some things are still not clear to me.

I own a VG279QM monitor, which is "G-SYNC Compatible". From my Google searches there is very little info about it, but from what I gathered, G-SYNC Compatible uses Adaptive-Sync over the cable connection (DisplayPort), unlike a native G-SYNC monitor, which has a dedicated chip for this.

So, here are a few of my questions.

What is the input delay difference on a Gsync native vs Gsync compatible monitor?

Does the Gsync ON + Vsync ON + 3 FPS below max refresh rate setup actually work on a "Gsync compatible" monitor? Because gsync compatible seems to be a form of adaptive sync, and when I googled this, a user on reddit said:
"Adaptive Vsync just turns off Vsync if you go below your target fps. G-Sync adjusts the fps of your monitor to match your gpu and is only utilized if you have an nvidia gpu and a g-sync monitor."

So if Gsync Compatible is just a modified form of adaptive sync, does it work properly with Gsync ON + Vsync ON + the 3 FPS below cap, or would that be counterproductive?

I usually leave gsync fully off in all games unless I can actually see tearing with my own eyes. One of the games where I do need gsync is League of Legends. With Gsync/Vsync OFF, when I move the camera around at fast speeds, I can see quite a lot of screen tearing. If I do Gsync ON + Vsync OFF + the 3 FPS below max refresh rate cap, I get very little screen tearing, and finally, if I do Gsync ON + Vsync ON + the 3 FPS cap, I cannot see any screen tearing at all anymore.

So the above does work as explained, but then again, it could just be my brain giving me a placebo effect based on what effect I expect to see; I don't exactly have the best eyesight at my age.

And for my last question,

Why does Blur Busters recommend Vsync ON in the Nvidia Control Panel, whereas Battle(non)sense recommends enabling it in the game?


Personally, as far as I could tell, with Gsync ON + Vsync ON in the control panel, I could still see some screen tearing in League of Legends, but if I enabled Vsync inside the game, I saw zero screen tearing.

Thank you for taking the time to read this.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Input Delay on "Gsync Compatible" monitors

Post by jorimt » 19 Dec 2020, 13:32

TripleWrx1987 wrote:
19 Dec 2020, 11:51
I own a VG279QM monitor, which is "G-SYNC Compatible". From my Google searches there is very little info about it, but from what I gathered, G-SYNC Compatible uses Adaptive-Sync over the cable connection (DisplayPort), unlike a native G-SYNC monitor, which has a dedicated chip for this.
That model is an official G-SYNC compatible monitor tested by Nvidia, so while it doesn't have a module, adequate VRR operation is guaranteed.
TripleWrx1987 wrote:
19 Dec 2020, 11:51
What is the input delay difference on a Gsync native vs Gsync compatible monitor?
Beyond individual monitor processing and GtG differences, there shouldn't be one.
TripleWrx1987 wrote:
19 Dec 2020, 11:51
Does the Gsync ON + Vsync ON + 3 FPS below max refresh rate setup actually work on a "Gsync compatible" monitor?
Yes.
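
To make the cap concrete, here's a minimal sketch of the arithmetic in Python, assuming the VG279QM's max refresh rate is 280Hz (the function name is just for illustration; the -3 FPS margin is the figure from the G-SYNC 101 article):

Code:

    def gsync_fps_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
        # Cap the framerate just below the max refresh rate so frame
        # delivery stays inside the VRR range and V-SYNC never engages.
        return max_refresh_hz - margin_fps

    print(gsync_fps_cap(280))  # -> 277 FPS limit for a 280Hz panel
    print(gsync_fps_cap(144))  # -> 141 FPS limit for a 144Hz panel

The same rule applies whether the monitor is native G-SYNC or G-SYNC Compatible.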
TripleWrx1987 wrote:
19 Dec 2020, 11:51
Because gsync compatible seems to be a form of adaptive sync, and when I googled this, a user on reddit said:
"Adaptive Vsync just turns off Vsync if you go below your target fps. G-Sync adjusts the fps of your monitor to match your gpu and is only utilized if you have an nvidia gpu and a g-sync monitor."
- Adaptive V-SYNC = normal double buffer V-SYNC that disables when the FPS is below the refresh rate.
- Adaptive Sync = the global term for all forms of VRR, be it G-SYNC, FreeSync, etc.
TripleWrx1987 wrote:
19 Dec 2020, 11:51
So if Gsync Compatible is just a modified form of adaptive sync, does it work properly with Gsync ON + Vsync ON + the 3 FPS below cap, or would that be counterproductive?
See above, it isn't "Adaptive V-SYNC," and adaptive V-SYNC isn't the same thing as "Adaptive Sync." Base VRR functionality on a G-SYNC Compatible monitor and a native G-SYNC monitor w/module does not differ.
TripleWrx1987 wrote:
19 Dec 2020, 11:51
I usually leave gsync fully off in all games unless I can actually see tearing with my own eyes. One of the games where I do need gsync is League of Legends. With Gsync/Vsync OFF, when I move the camera around at fast speeds, I can see quite a lot of screen tearing. If I do Gsync ON + Vsync OFF + the 3 FPS below max refresh rate cap, I get very little screen tearing, and finally, if I do Gsync ON + Vsync ON + the 3 FPS cap, I cannot see any screen tearing at all anymore.
The higher the refresh rate, the less evident tearing artifacts are. At 240Hz, they are indeed very slight.

Also, what you're seeing is expected behavior; G-SYNC's primary function is to remove tearing without adding input lag or stutter within its range.

Additionally, with G-SYNC on + V-SYNC off, tearing is still allowed, even within the G-SYNC range, whereas with G-SYNC on + V-SYNC on, it is not. See:
https://blurbusters.com/gsync/gsync101- ... ttings/15/
Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn't G-SYNC supposed to fix that?

The answer is frametime variances.

“Frametime” denotes how long a single frame takes to render. “Framerate” is the totaled average of each frame’s render time within a one second period.

At 144Hz, a single frame takes 6.9ms to display (the number of which depends on the max refresh rate of the display, see here), so if the framerate is 144 per second, then the average frametime of 144 FPS is 6.9ms per frame.

In reality, however, frametime from frame to frame varies, so just because an average framerate of 144 per second has an average frametime of 6.9ms per frame, doesn’t mean all 144 of those frames in each second amount to an exact 6.9ms per; one frame could render in 10ms, the next could render in 6ms, but at the end of each second, enough will hit the 6.9ms render target to average 144 FPS per.

So what happens when just one of those 144 frames renders in, say, 6.8ms (146 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range, whereas G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.

And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.
--------
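To put rough numbers on the frametime variance described in the quote above, here's a minimal sketch in Python (the per-frame render times are hypothetical, and this is a simplification of the behavior, not an exact model of the module):

Code:

    refresh_hz = 144
    scanout_ms = 1000 / refresh_hz  # ~6.94ms to physically draw one refresh cycle

    # Hypothetical frametimes that average out near the 6.9ms / 144 FPS target.
    frametimes_ms = [6.9, 10.0, 6.0, 6.8]

    for ft in frametimes_ms:
        if ft < scanout_ms:
            # Frame completed before the previous scanout finished:
            # G-SYNC + V-SYNC "Off" lets it scan in immediately (a tear line),
            # G-SYNC + V-SYNC "On" holds it for the remainder of the scanout.
            print(f"{ft:4.1f}ms frame: early by {scanout_ms - ft:.2f}ms "
                  "(tears with V-SYNC off; briefly held with V-SYNC on)")
        else:
            print(f"{ft:4.1f}ms frame: previous scanout already complete, no tear")

The worst-case hold time equals the remaining scanout time, which is why V-SYNC "On" adds virtually no input lag within the G-SYNC range.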
TripleWrx1987 wrote:
19 Dec 2020, 11:51
Why does Blur Busters recommend Vsync ON in the Nvidia Control Panel, whereas Battle(non)sense recommends enabling it in the game?
As stated in my article:
https://blurbusters.com/gsync/gsync101- ... ttings/14/
Nvidia Control Panel V-SYNC vs. In-game V-SYNC

While NVCP V-SYNC has no input lag reduction over in-game V-SYNC, and when used with G-SYNC + FPS limit, it will never engage, some in-game V-SYNC solutions may introduce their own frame buffer or frame pacing behaviors, enable triple buffer V-SYNC automatically (not optimal for the native double buffer of G-SYNC), or simply not function at all, and, thus, NVCP V-SYNC is the safest bet.

There are rare occasions, however, where V-SYNC will only function with the in-game option enabled, so if tearing or other anomalous behavior is observed with NVCP V-SYNC (or vice versa), each solution should be tried until said behavior is resolved.
--------
TripleWrx1987 wrote:
19 Dec 2020, 11:51
Personally, as far as I could tell, with Gsync ON + Vsync ON in the control panel, I could still see some screen tearing in League of Legends, but if I enabled Vsync inside the game, I saw zero screen tearing.
That means the Nvidia V-SYNC option is not working with LoL, which is why I stated what I did in the last part of the above quote from my article.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

