How does GSYNC work technically?

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutters, tearing, and reduces input lag. List of G-SYNC Monitors.
hemanursawarrior
Posts: 7
Joined: 09 Apr 2020, 13:08

How does GSYNC work technically?

Post by hemanursawarrior » 04 Dec 2022, 17:10

I went through the GSYNC 101 pages and still wasn't quite sure how all the pieces fit together. Was also having some trouble finding a detailed source elsewhere that really put all the pieces together (which was a bit odd given the ubiquity of the technologies).

1. How does refresh rate work? Why is it possible for another frame to draw into the same refresh cycle? I inferred that refresh rate meant that the new frames are read once every refresh cycle, but that doesn't seem to be correct. Unless the refresh cycle and the scanout don't have to be in sync? But if they are out of sync outside of VSYNC, is it because the refresh rate isn't truly consistent?

2. If the default scenario results in different frames being drawn in the same scanout, in what way is GSYNC without VSYNC an improvement? Does GSYNC introduce a variable VBI? And does it match each refresh cycle to the current frame's frametime? Or does it use some sort of running average to figure out what VBI to use?

3. Ultimately, what is the improvement of GSYNC+VSYNC together? It seems to mean that the next scanout will have the latest frame. Since I didn't understand how the earlier parts work, I only vaguely have the idea that it should result in the shortest latency from frame to next scanout, but I don't truly understand why that's the case, or how it differs from VSYNC alone.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How does GSYNC work technically?

Post by jorimt » 05 Dec 2022, 09:20

hemanursawarrior wrote:
04 Dec 2022, 17:10
Was also having some trouble finding a detailed source elsewhere that really put all the pieces together (which was a bit odd given the ubiquity of the technologies).
The only source other than us that covers G-SYNC operation with any real accuracy or depth is Battle(non)sense.
hemanursawarrior wrote:
04 Dec 2022, 17:10
1. How does refresh rate work? Why is it possible for another frame to draw into the same refresh cycle? I inferred that refresh rate meant that the new frames are read once every refresh cycle, but that doesn't seem to be correct. Unless the refresh cycle and the scanout don't have to be in sync? But if they are out of sync outside of VSYNC, is it because the refresh rate isn't truly consistent?

2. If the default scenario results in different frames being drawn in the same scanout, in what way is GSYNC without VSYNC an improvement? Does GSYNC introduce a variable VBI? And does it match each refresh cycle to the current frame's frametime? Or does it use some sort of running average to figure out what VBI to use?
This was generally covered here:
https://blurbusters.com/gsync/gsync101- ... ettings/6/

And here:
https://blurbusters.com/gsync/gsync101- ... ettings/9/
hemanursawarrior wrote:
04 Dec 2022, 17:10
3. Ultimately, what is the improvement of GSYNC+VSYNC together? It seems to mean that the next scanout will have the latest frame. Since I didn't understand how the earlier parts work, I only vaguely have the idea that it should result in the shortest latency from frame to next scanout, but I don't truly understand why that's the case, or how it differs from VSYNC alone.
Refer to entry #2 in my Closing FAQ:
https://blurbusters.com/gsync/gsync101- ... ttings/15/
Wait, why should I enable V-SYNC with G-SYNC again? And why am I still seeing tearing with G-SYNC enabled and V-SYNC disabled? Isn’t G-SYNC supposed to fix that?

The answer is frametime variances.

“Frametime” denotes how long a single frame takes to render. “Framerate” is the number of frames delivered within a one-second period; it corresponds to the average of each frame’s render time over that second.

At 144Hz, a single frame takes 6.9ms to display (the exact number depends on the max refresh rate of the display), so if the framerate is 144 per second, the average frametime at 144 FPS is 6.9ms per frame.

In reality, however, frametime varies from frame to frame, so just because an average framerate of 144 per second has an average frametime of 6.9ms per frame doesn’t mean all 144 of those frames in each second take exactly 6.9ms each; one frame could render in 10ms, the next in 6ms, but at the end of each second, enough will hit the 6.9ms render target to average 144 FPS.
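To put rough numbers on that, here's a minimal Python sketch (the individual frametimes in the list are made up purely for illustration):

```python
# Framerate is derived from frametimes: 144 FPS corresponds to an *average*
# frametime of ~6.9ms, but individual frames can vary around that average.
refresh_hz = 144
avg_frametime_ms = 1000 / refresh_hz
print(f"Average frametime at {refresh_hz} FPS: {avg_frametime_ms:.2f} ms")

# Hypothetical per-frame render times (ms) that still average out to ~144 FPS:
frametimes_ms = [10.0, 6.0, 6.9, 6.9, 5.0, 6.9, 6.9]
avg = sum(frametimes_ms) / len(frametimes_ms)
print(f"Sample average: {avg:.2f} ms (~{1000 / avg:.0f} FPS), "
      f"yet individual frames range from {min(frametimes_ms)} to {max(frametimes_ms)} ms")
```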

So what happens when just one of those 144 frames renders in, say, 6.8ms (~147 FPS) instead of 6.9ms (144 FPS) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range, whereas G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.

And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.
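A minimal sketch of that early-frame case (the timings and the "hold until the current scanout ends" behavior are simplified from the description above, not taken from the G-SYNC module itself):

```python
# One frame arrives slightly before the previous 144Hz scanout has finished.
scanout_ms = 6.9             # approx. time to scan one full frame at 144Hz
prev_scanout_start = 0.0
new_frame_ready = 6.8        # this frame rendered 0.1 ms "too early"

prev_scanout_end = prev_scanout_start + scanout_ms

if new_frame_ready < prev_scanout_end:
    # G-SYNC + V-SYNC "Off": the new frame starts scanning immediately -> tear line
    tear_fraction = new_frame_ready / scanout_ms
    print(f"V-SYNC Off: tear line ~{tear_fraction:.0%} of the way down the screen")

    # G-SYNC + V-SYNC "On" (frametime compensation): the frame is held briefly instead
    hold_ms = prev_scanout_end - new_frame_ready
    print(f"V-SYNC On: frame held {hold_ms:.1f} ms until the scanout completes, no tear")
```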

For further explanations on this subject see part 1 “Control Panel,” part 4 “Range,” and part 6 “G-SYNC vs. V-SYNC OFF w/FPS Limit” of this article, or read the excerpts below…

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32

Re: How does GSYNC work technically?

Post by RealNC » 05 Dec 2022, 09:49

hemanursawarrior wrote:
04 Dec 2022, 17:10
How does refresh rate work?
When the display draws a frame, which it does left to right, top to bottom, it reads each pixel in that same order as it scans it out. If the pixels it hasn't yet read change, which happens when the GPU changes the frame buffer to a new frame, then the display simply reads the new pixels and scans those out.

It's not like the display can actually read the whole frame at once. The actual signal that goes through the cable is rather slow. At 60Hz, the signal needs 16.7ms to transmit the whole frame. The signal itself is continuous and never stops. It's a constant stream of data, and the source of that data is the frame buffer in the GPU, which the signal transmitter reads from. The signal transmitter doesn't care and doesn't know whether the frame buffer changes or not. It simply reads data from it and encodes that data for transmission to the display.

When all the data in the frame buffer has been transmitted, the transmitter doesn't simply start reading again immediately from the beginning of the frame buffer. It instead transmits a so-called "vertical blanking interval" signal, or "vblank" for short. It isn't meant to display anything; it was originally there to give the display time to move the CRT electron gun back to the top-left. Modern displays don't strictly need that of course, but the vblank is still there because LCDs needed to be able to handle the same signal.

Variable refresh rate displays exploit the vblank. When VRR is active, the GPU simply doesn't end the vblank signal. It elongates it and only ends it when a new frame is available. Since the display will not begin scanning out a new frame until after the vblank has ended, the GPU can control the display scanout by manipulating the length of the vblank.
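Here's a toy model of that in Python (just a sketch of the idea; the fixed scanout time and zero-minimum vblank are simplifications, not actual signal parameters):

```python
# VRR in a nutshell: scanout takes a fixed time, and the GPU stretches the vblank
# until the game presents the next frame.
SCANOUT_MS = 6.9   # approx. fixed scanout duration at the display's max refresh rate

def vrr_cycle(next_frame_after_ms):
    """One refresh cycle: returns (scanout, vblank) given when the next frame is
    presented, measured from the start of the current scanout. Real signals keep
    a small minimum vblank; that's omitted here for simplicity."""
    vblank_ms = max(next_frame_after_ms - SCANOUT_MS, 0.0)
    return SCANOUT_MS, vblank_ms

for frametime in (6.9, 10.0, 14.0):       # ~144, 100, ~71 FPS
    scan, vblank = vrr_cycle(frametime)
    print(f"frametime {frametime:4.1f} ms -> scanout {scan} ms + vblank {vblank:.1f} ms")
```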

It's wonderfully simple, actually. A very nice hack that works really well.

hemanursawarrior
Posts: 7
Joined: 09 Apr 2020, 13:08

Re: How does GSYNC work technically?

Post by hemanursawarrior » 05 Dec 2022, 12:28

To clarify, the idea of the refresh cycle/refresh rate is really an abstraction? The actual process is just reading from the buffer to scanout + VBI. And this entire unit is taken to be the "refresh cycle"?

If so, I think that helps a lot because although I saw the diagram, I was still struggling with thinking that there were two separate mechanisms going on.
Since the display will not begin scanning out a new frame until after the vblank has ended, the GPU can control the display scanout by manipulating the length of the vblank.
I thought it was written somewhere the scanout was always at the max rate? Is this saying that scanout will not happen until vblank is over?

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32

Re: How does GSYNC work technically?

Post by RealNC » 05 Dec 2022, 13:26

hemanursawarrior wrote:
05 Dec 2022, 12:28
To clarify, the idea of the refresh cycle/refresh rate is really an abstraction? The actual process is just reading from the buffer to scanout + VBI. And this entire unit is taken to be the "refresh cycle"?
Yep. Exactly.
Since the display will not begin scanning out a new frame until after the vblank has ended, the GPU can control the display scanout by manipulating the length of the vblank.
I thought it was written somewhere the scanout was always at the max rate? Is this saying that scanout will not happen until vblank is over?
The scanout speed is at max rate, meaning how long it takes for the signal to transmit all pixels from top to bottom. At 144Hz, that's roughly 6.9ms, and it's fixed. If the current FPS is lower, say 100 FPS, that just means the VBI is increased. Each frame, even though the display is now effectively running at 100Hz, still takes ~6.9ms to be scanned out. But after the scanout happens, the rest of the time is padded out by the VBI; in this case, about 3.1ms of VBI padding.

So basically, no matter what the current FPS is, each frame gets scanned out at max fixed speed, and then the GPU just says "wait" to the display until the game presents the next frame. And "wait" just means not ending the VBI. Game presents frame, the GPU ends the VBI.
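In rough numbers (a simplified sketch; real displays keep a small minimum VBI even at max refresh, which is ignored here):

```python
# The scanout always runs at max speed; the VBI just absorbs whatever time is left
# until the game presents the next frame.
max_refresh_hz = 144
scanout_ms = 1000 / max_refresh_hz        # ~6.9 ms per scanout, regardless of current FPS

for fps in (144, 100, 60):
    refresh_cycle_ms = 1000 / fps
    vbi_padding_ms = refresh_cycle_ms - scanout_ms
    print(f"{fps:3d} FPS: {scanout_ms:.1f} ms scanout + {vbi_padding_ms:.1f} ms VBI padding "
          f"= {refresh_cycle_ms:.1f} ms per refresh cycle")
```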

hemanursawarrior
Posts: 7
Joined: 09 Apr 2020, 13:08

Re: How does GSYNC work technically?

Post by hemanursawarrior » 05 Dec 2022, 19:48

What causes tearing then? Is it that the GPU is dumping a new frame into the frame buffer before the scanout is finished?

Then GSYNC+VSYNC off shouldn't do much to fix this since nothing is preventing new frames from being dumped in.

VSYNC alone tries to fix this by having an extra buffer, so the frame can't change before scanout is finished, but it also means that you might wait 1+ extra refresh cycles/scanouts for the next frame to be ready.

So GSYNC+VSYNC together will let scanout run as soon as the next frame is ready. So the actual gain over VSYNC is the delta between when the frame was ready and when the next fixed scanout would have happened. For some reason that doesn't sound that big, but maybe I'm not quantifying it right.

And in terms of input lag, I guess tearing/drawing multiple frames into the monitor hits diminishing returns on how fast humans can react anyways.

RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32

Re: How does GSYNC work technically?

Post by RealNC » 05 Dec 2022, 20:14

hemanursawarrior wrote:
05 Dec 2022, 19:48
What causes tearing then? Is it that the GPU is dumping a new frame into the frame buffer before the scanout is finished?
Yes. To be more precise, if the frame buffer is changed outside of the VBI, you get a mix of two different frames. The tear line is the point at which the display switches to scanning out the new frame. If the frame buffer is changed inside the VBI, there can't be a tear line, since the display isn't scanning out anything at that point.
Then GSYNC+VSYNC off shouldn't do much to fix this since nothing is preventing new frames from being dumped in.
Only if the next frame comes too soon. If FPS is lower than the max refresh rate, then there's no problem, since that means the frame buffer changes during the VBI. If FPS is faster than the max refresh rate, then you get tearing.

But it's better to think in terms of frame times, not FPS. At 144Hz, each frame needs to come no sooner than 6.9ms after the previous one. If you cap FPS to 140, that's almost guaranteed to be the case. Just in case some frames come too soon (there's some variance when it comes to frame pacing), you can enable vsync on top of gsync. However, since you're capping your FPS, that vsync won't introduce typical vsync backpressure lag. It's pretty much "for free".
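A quick sanity check of why a cap slightly below the refresh rate does the trick (back-of-the-envelope only; 140 is just the commonly suggested cap here):

```python
# With a cap just below the max refresh rate, the minimum possible frametime is
# longer than the scanout time, so the buffer flip lands inside the VBI.
max_refresh_hz = 144
scanout_ms = 1000 / max_refresh_hz           # ~6.9 ms to scan out one frame

for fps_cap in (160, 144, 140):
    min_frametime_ms = 1000 / fps_cap
    margin_ms = min_frametime_ms - scanout_ms
    verdict = ("flip lands in the VBI (no tearing)" if margin_ms > 0
               else "frames can arrive mid-scanout (tearing possible)")
    print(f"cap {fps_cap:3d} FPS: min frametime {min_frametime_ms:.2f} ms, "
          f"margin {margin_ms:+.2f} ms -> {verdict}")
```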

As to why that is, you can read this post:

https://forums.guru3d.com/threads/anti- ... st-5714400

Keep in mind that backpressure can have different causes: an overloaded GPU, or an artificial bottleneck like the one induced by vsync. Capping your FPS prevents the bottleneck, thus vsync on top of gsync adds no significant latency. This is why nvidia decided to force an FPS cap if you enable vsync on top of gsync while "low latency mode" is set to "ultra" in the nvidia panel. Getting rid of vsync backpressure through an FPS cap really removes a lot of lag. You don't even need gsync for that to happen. If you use vsync without gsync, you still eliminate tons of lag if you cap your FPS below your refresh rate.
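A toy illustration of the backpressure point (a sketch under assumed conditions: a 3-frame render-ahead queue and otherwise steady timings; actual queue depths vary by game and driver settings):

```python
# Plain V-SYNC with an uncapped framerate: finished frames pile up in a render-ahead
# queue behind the fixed refresh, and each one waits several refresh cycles to display.
REFRESH_MS = 6.9     # approx. 144Hz refresh period
QUEUE_DEPTH = 3      # assumed render-ahead / buffering limit

def steady_state_queue_latency(render_ms):
    """Rough extra wait from 'frame finished' to 'frame displayed' once things settle."""
    if render_ms < REFRESH_MS:
        # GPU outpaces the display: the queue fills up and stays full (backpressure).
        return QUEUE_DEPTH * REFRESH_MS
    # The GPU (or an FPS cap) is the limiter: frames are consumed as fast as they arrive.
    return 0.0

print(f"uncapped, 4.0 ms renders:   ~{steady_state_queue_latency(4.0):.1f} ms of queue lag")
print(f"capped to 140 FPS (7.1 ms): ~{steady_state_queue_latency(7.14):.1f} ms of queue lag")
```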

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: How does GSYNC work technically?

Post by jorimt » 05 Dec 2022, 20:53

hemanursawarrior wrote:
05 Dec 2022, 19:48
While I'm glad to see RealNC is able to relay these points in a less conceptual way for you, I will say that virtually all of your questions in this thread were covered pretty succinctly in my article.

The only reason I'm not personally breaking it down again is that I'd be repeating myself (which I've done countless times in the past five years, both in the forums and in my article's comment section), just in different terms, which RealNC has thankfully had the courtesy to do in my place.
RealNC wrote:
05 Dec 2022, 20:14
Just in case some frames come too soon (there's some variance when it comes to frame pacing), you can enable vsync on top of gsync. However, since you're capping your FPS, that vsync won't introduce typical vsync backpressure lag. It's pretty much "for free".
And one important thing to mention on this point, @hemanursawarrior, is that G-SYNC + V-SYNC is "G-SYNC."

It's actually G-SYNC without the V-SYNC option enabled that is incomplete. I.e. it can't fully be considered G-SYNC, since it will only adhere to the VBLANK (a.k.a. prevent tearing) whenever the frametime of the given frame is no shorter than the scanout time of the current physical refresh rate, even if the average framerate is still within said refresh rate.

Again, as stated in the Closing FAQ entry I shared in my previous reply, the "V-SYNC" component forces G-SYNC to adhere to the VBLANK regardless of the individual frametime of each frame (which is how G-SYNC operation was originally released and intended to function).

Further, this "V-SYNC" component of G-SYNC operation is enforced not only for frames that come "too soon," but also for any frames that come "too late," such as those during frametime spikes.

Technically, G-SYNC is still V-SYNC, just flipped on its head. "V-SYNC" has become somewhat of a dirty word, but that's only because the traditional method forced the GPU output to sync to the display, causing the additional stutter (with framerates below the refresh rate) and "backpressure" latency (with framerates above the refresh rate).

So long as the average framerate remains within the refresh rate, G-SYNC (+ the V-SYNC "option") avoids both issues by instead syncing the display to the GPU output as described in my article and further by RealNC in this thread (I.E. a fancy form of dynamic VBI padding).
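To make the "flipped on its head" contrast concrete, here's a small sketch comparing how long a finished frame waits for the display in each model (the frame-completion times are made up, and the G-SYNC case assumes the previous scanout has already finished and the framerate is within the VRR range):

```python
# Traditional V-SYNC: the GPU output waits for the display's next fixed refresh.
# G-SYNC (+ V-SYNC option): the display waits for the GPU, then starts scanning out
# as soon as the frame is presented (dynamic VBI padding in between).
REFRESH_MS = 6.9                       # fixed refresh period at 144Hz (approx.)
frame_ready_ms = [2.0, 9.5, 20.3]      # made-up moments when frames finish rendering

for t in frame_ready_ms:
    next_fixed_refresh = (int(t // REFRESH_MS) + 1) * REFRESH_MS
    fixed_vsync_wait = next_fixed_refresh - t
    print(f"frame ready at {t:5.1f} ms: "
          f"fixed V-SYNC waits {fixed_vsync_wait:.1f} ms for the next refresh, "
          f"G-SYNC starts the scanout ~immediately")
```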

hemanursawarrior
Posts: 7
Joined: 09 Apr 2020, 13:08

Re: How does GSYNC work technically?

Post by hemanursawarrior » 06 Dec 2022, 01:32

Re: RealNC
Are you saying that for GSYNC+VSYNC off there should be a minimal amount of tearing if FPS is capped below max refresh rate? When I was playing around with it, I saw the same sort of tearing (objects looking weird) as I panned the screen. Is this caused by the intermittent variance? Seemed quite noticeable.

Also according to this, suppose I cap FPS to 141, and my machine can run at 141 stably, then there should be minimal difference between VSYNC and GSYNC+VSYNC. It should be more or less the same?

The place where GSYNC would really come in is if the FPS dips here and there, and then GSYNC basically gets rid of the various side effects that VSYNC only would have introduced.

hemanursawarrior
Posts: 7
Joined: 09 Apr 2020, 13:08

Re: How does GSYNC work technically?

Post by hemanursawarrior » 06 Dec 2022, 01:39

jorimt, RealNC, appreciate both of your efforts to explain this. I read through the explanations in the guides, but there were still gaps in my understanding, in how I parsed some of the ideas, and I couldn't get to a clear model.

Sometimes it just takes some hand-holding and translating concepts into the same mental language.
