
GSYNC with low and inconsistent FPS

Posted: 18 Aug 2025, 05:50
by Lord Stumpf
Hello there!

I am pretty new to G-SYNC and feel a little lost in this somewhat specific scenario I've found myself in.

I am playing on a rather old machine, which is fine since I usually play rather old games, except for one more recent fast-paced arena shooter.

In said game I get 100 FPS at best, but it will usually hover between the mid 70s and low 90s in action-heavy situations. (I am playing on a 144 Hz monitor.)

Some time ago a friend introduced me to G-SYNC, claiming that it would reduce input lag and sharpen the image. While I felt a positive impact when my FPS was stable in an empty lobby, lag during action kinda felt worse than usual.

Should I be using G-SYNC at all when I am that far below my monitor's ideal FPS, and if so, what settings should I keep an eye on?

Below I'll put some technical data in case anyone can use it:

My specs
  • GTX 1080ti
  • i7-6700k
  • Asus Z170k
  • 16 GB RAM
  • Game is being run on an SSD
Recommended game specs
  • Intel i7-11700 or AMD Ryzen 5 5600
  • NVIDIA RTX 2080 or Radeon RX6700
  • 8 GB RAM
In-game, everything is set to lowest, and NVIDIA Low Latency + Boost is enabled.

I run G-SYNC and V-SYNC simultaneously through the NVIDIA Control Panel, where I also limited my FPS to 100.

The game is called Tribes 3: Rivals, in case anyone cares.

Thank you for your time.

Re: GSYNC with low and inconsistent FPS

Posted: 20 Aug 2025, 09:58
by jorimt
Lord Stumpf wrote:
18 Aug 2025, 05:50
Some time ago a friend introduced me to G-SYNC, claiming that it would reduce input lag and sharpen the image. While I felt a positive impact when my FPS was stable in an empty lobby, lag during action kinda felt worse than usual.

Should I be using G-SYNC at all when I am that far below my monitor's ideal FPS, and if so, what settings should I keep an eye on?
G-SYNC reduces latency and stutter over standalone V-SYNC for framerates within the refresh rate. It does not reduce latency over no sync (G-SYNC off + V-SYNC off) in the same scenario. As for it "sharpening the image," G-SYNC has no direct influence on motion clarity.

G-SYNC was originally created to prevent tearing for fluctuating framerates within the physical refresh rate without adding the latency or stutter standalone V-SYNC does in the same scenario, and it is responsible for one thing and one thing only: dynamically steering the tearline off-screen into the VBLANK by adjusting the number of times the screen refreshes per second to the current average framerate.

That's it.

So if you can't tolerate tearing artifacts, G-SYNC is the lowest latency tear-free solution available.

If, however, tearing artifacts don't bother you, the lowest possible latency and highest perceived responsiveness in your 100 FPS 144Hz scenario is no sync.

Otherwise, your stated G-SYNC on + NVCP V-SYNC on + FPS limit within the refresh rate configuration is correct.
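To make that difference concrete, here's a rough toy model (my own sketch with made-up panel numbers, not NVIDIA's implementation) of how a VRR display's refresh interval tracks the frame interval within its range, versus fixed-refresh V-SYNC quantizing frame delivery to whole refresh periods:

```python
import math

# Toy model (not NVIDIA's implementation): within the VRR range, the
# display simply waits in VBLANK until the next frame is presented, so
# the refresh interval tracks the frame interval. With fixed-refresh
# V-SYNC, a late frame must wait for the next whole scanout instead.
# The panel range below is hypothetical.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144
MIN_INTERVAL = 1000 / VRR_MAX_HZ   # ~6.94 ms
MAX_INTERVAL = 1000 / VRR_MIN_HZ   # ~20.83 ms

def vrr_refresh_interval(frame_interval_ms: float) -> float:
    """Refresh interval equals the frame interval, clamped to the VRR range."""
    return min(max(frame_interval_ms, MIN_INTERVAL), MAX_INTERVAL)

def fixed_vsync_interval(frame_interval_ms: float, refresh_hz: float = 144) -> float:
    """Fixed-refresh V-SYNC rounds delivery up to whole refresh periods."""
    period = 1000 / refresh_hz
    return math.ceil(frame_interval_ms / period) * period

for ft in (7.0, 11.4, 13.2, 9.8):  # fluctuating frame times, ~76-143 FPS
    print(f"frame {ft:5.1f} ms -> VRR {vrr_refresh_interval(ft):5.2f} ms, "
          f"fixed V-SYNC {fixed_vsync_interval(ft):5.2f} ms")
```

Note how every fluctuating frame time gets rounded up to 13.89 ms (two refresh periods) under fixed-refresh V-SYNC, which is exactly the added stutter and latency VRR avoids.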

Re: GSYNC with low and inconsistent FPS

Posted: 20 Aug 2025, 13:15
by kyube
jorimt wrote:
20 Aug 2025, 09:58
G-SYNC was originally created to prevent tearing for fluctuating framerates within the physical refresh rate without adding the latency or stutter standalone V-SYNC does in the same scenario, and it is responsible for one thing and one thing only: dynamically steering the tearline off-screen into the VBLANK by adjusting the number of times the screen refreshes per second to the current average framerate.
If I may interject in your explanation, I've been curious about G-SYNC and, by extension, FreeSync.
I haven't been able to find information on what the precision of VRR is (in microseconds/nanoseconds) or which specific metric in the pipeline it hooks onto.

To be specific, the pipeline in the attached image,

or the pipeline here: https://www.nvidia.com/en-us/geforce/ne ... ed-section

To which particular PresentMon metric does the monitor synchronize itself when G-SYNC is enabled?
(Note: PresentMon 1.x and 2.x differ in what they consider frame time to be. There's also the MsBetweenSimulationStart metric (an NV Reflex-specific metric), which, according to other sources, one can consider the “true” frame time of a game.)

What is the precision (in ms/µs/ns) of this procedure?
Do NV's G-SYNC, AMD's FreeSync, VESA's Adaptive-Sync, and HDMI's VRR differ in precision?

This is important to me because refresh rates (or their equivalent in the time domain) are never integer; they're always fractional in nature. I'd like to know how many decimal places of precision it has (or the time-domain equivalent), as I haven't been able to find this information anywhere online.
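For context on why they're always fractional: a panel's actual refresh rate falls out of its pixel clock divided by its total timing (visible plus blanking), so "144 Hz" is rarely exactly 144.000 Hz. A quick sketch with hypothetical timing values (not taken from any real monitor's EDID):

```python
# Hedged sketch: why refresh rates are fractional. The refresh rate is
# derived from the pixel clock and the total (visible + blanking) frame
# timing. All values below are hypothetical, illustration only.

pixel_clock_hz = 586_580_000   # hypothetical 586.58 MHz pixel clock
htotal = 2720                  # total pixels per scanline, incl. blanking
vtotal = 1498                  # total lines per frame, incl. blanking

refresh_hz = pixel_clock_hz / (htotal * vtotal)
frame_time_us = 1_000_000 / refresh_hz

print(f"refresh    = {refresh_hz:.6f} Hz")    # not a whole number
print(f"frame time = {frame_time_us:.3f} us")
```

With these made-up numbers the "144 Hz" mode actually lands just under 144 Hz, which is the fractional behavior you're describing.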

Another question I have is: why does G-SYNC On + V-SYNC Off not behave out of the box like G-SYNC On + V-SYNC On?
From your guide:
G-SYNC + V-SYNC “Off” disables the G-SYNC module’s ability to compensate for sudden frametime variances, meaning, instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC “Off” will opt to start the next frame scan in the current scanout instead. This results in simultaneous delivery of more than one frame in a single scanout (tearing).

G-SYNC + V-SYNC “On”:
This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC “Off,” G-SYNC + V-SYNC “On” allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.

Frametime compensation with V-SYNC “On” is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).
This explanation doesn't make me understand why this difference exists.
May I ask what your source for the "originally intended to function" claim is?
I believe there's some ambiguity in your text regarding this topic. Or maybe something in your explanation is escaping my mind.

Re: GSYNC with low and inconsistent FPS

Posted: 21 Aug 2025, 11:39
by jorimt
kyube wrote:
20 Aug 2025, 13:15
I haven't been able to find information on what the precision of VRR is (in microseconds/nanoseconds) or which specific metric in the pipeline it hooks onto.
I'm not privy to that information, and I'm not sure anyone beyond the Nvidia engineers is either.
kyube wrote:
20 Aug 2025, 13:15
Do NV's G-SYNC, AMD's FreeSync, VESA's Adaptive-Sync, and HDMI's VRR differ in precision?
Hardware G-SYNC does it via a dedicated module, while all non-module VRR does it over the cable in software, predominantly at the driver level.

Hypothetically, the module has more precision, but probably not enough to be noticed in regular usage, especially if all else is equal (instantaneous scanout at the monitor level, full LFC support, etc.).
kyube wrote:
20 Aug 2025, 13:15
Another question I have is: why does G-SYNC On + V-SYNC Off not behave out of the box like G-SYNC On + V-SYNC On?
Because disabling the V-SYNC option also prevents strict VBLANK adherence, since V-SYNC means waiting for the VBLANK, with or without VRR.
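A rough way to model that difference (my own simplified sketch with made-up timings, not actual driver or module code):

```python
# Toy model (my own simplification, not driver code) of what the V-SYNC
# toggle changes under G-SYNC when a frame finishes mid-scanout.

SCANOUT_MS = 6.94   # approx. time to draw one full frame at a 144 Hz max rate

def present(frame_ready_ms: float, scanout_start_ms: float, vsync_on: bool):
    """Return (flip_time_ms, tears) for a frame that becomes ready while
    the previous frame's scanout (begun at scanout_start_ms) may still
    be in progress."""
    scanout_end = scanout_start_ms + SCANOUT_MS
    mid_scanout = scanout_start_ms < frame_ready_ms < scanout_end
    if mid_scanout and vsync_on:
        # V-SYNC "On": wait for the VBLANK, so the flip is deferred to
        # the end of the current scanout and no tearing occurs.
        return scanout_end, False
    if mid_scanout:
        # V-SYNC "Off": flip immediately inside the current scanout,
        # splitting the screen between two frames (tearing).
        return frame_ready_ms, True
    # Frame arrived during VBLANK: it flips cleanly either way.
    return frame_ready_ms, False

print(present(3.0, 0.0, vsync_on=True))    # -> (6.94, False): waits, no tear
print(present(3.0, 0.0, vsync_on=False))   # -> (3.0, True): immediate, tears
```

The key point the sketch tries to capture: the toggle only matters for frames that land mid-scanout, which is exactly the "sudden frametime variance" case from the guide.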

I have a dedicated Closing FAQ entry for that. Read #2:
https://blurbusters.com/gsync/gsync101- ... ttings/15/

I've also provided more rudimentary explanations in the article comments multiple times. For example:

vrr+v-sync-option-comment-2.png
or

vrr+v-sync-option-comment-1.jpg
kyube wrote:
20 Aug 2025, 13:15
May I ask what your source for the "originally intended to function" claim is?
See emboldened:
https://blurbusters.com/gsync/gsync101- ... -settings/
G-SYNC & V-SYNC

G-SYNC (GPU Synchronization) works on the same principle as double buffer V-SYNC; buffer A begins to render frame A, and upon completion, scans it to the display. Meanwhile, as buffer A finishes scanning its first frame, buffer B begins to render frame B, and upon completion, scans it to the display, repeat.

The primary difference between G-SYNC and V-SYNC is the method in which rendered frames are synchronized. With V-SYNC, the GPU’s render rate is synchronized to the fixed refresh rate of the display. With G-SYNC, the display’s VRR (variable refresh rate) is synchronized to the GPU’s render rate.

Upon its release, G-SYNC’s ability to fall back on fixed refresh rate V-SYNC behavior when exceeding the maximum refresh rate of the display was built-in and non-optional. A 2015 driver update later exposed the option.

This update led to recurring confusion, creating a misconception that G-SYNC and V-SYNC are entirely separate options. However, with G-SYNC enabled, the “Vertical sync” option in the control panel no longer acts as V-SYNC, and actually dictates whether, one, the G-SYNC module compensates for frametime variances output by the system (which prevents tearing at all times. G-SYNC + V-SYNC “Off” disables this behavior; see G-SYNC 101: Range), and two, whether G-SYNC falls back on fixed refresh rate V-SYNC behavior; if V-SYNC is “On,” G-SYNC will revert to V-SYNC behavior above its range, if V-SYNC is “Off,” G-SYNC will disable above its range, and tearing will begin display wide.

Within its range, G-SYNC is the only syncing method active, no matter the V-SYNC “On” or “Off” setting.
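The above-/within-range behavior described there boils down to something like this (a simplified summary sketch in my own wording, not actual driver logic):

```python
# Simplified summary (my own sketch) of which sync method is active
# under G-SYNC, depending on framerate and the NVCP V-SYNC toggle.

def sync_behavior(fps: float, refresh_hz: float, vsync_on: bool) -> str:
    if fps <= refresh_hz:
        # Within range, G-SYNC alone is active regardless of the toggle.
        return "G-SYNC (refresh tracks framerate)"
    # Above range, the V-SYNC toggle decides the fallback.
    if vsync_on:
        return "fixed-refresh V-SYNC (framerate capped at refresh)"
    return "no sync (display-wide tearing)"

print(sync_behavior(100, 144, vsync_on=True))    # within range: G-SYNC
print(sync_behavior(200, 144, vsync_on=True))    # above range: V-SYNC fallback
print(sync_behavior(200, 144, vsync_on=False))   # above range: tearing
```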
For my sources, refer to the "Official" links under the "References & Sources" section in the OP of my original thread:
viewtopic.php?f=5&t=3073