G-Sync's 1ms Polling Rate: My Findings & Questions

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing and reduces input lag. List of G-SYNC Monitors.
Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Sparky » 07 Nov 2016, 12:16

jorimt wrote:Alright, so what I'm calling "tweening" is called "strobing," and it actually begins at 40 fps and below, correct?
Er, no. Strobing is backlight pulsing, which is called ULMB and various other names, and it isn't normally available while G-Sync is active, though there is a hack that enables it. In general it's just repeated frames; tweening would work too, though I haven't seen it used. I believe G-Sync does the repeating from its own onboard memory, so the more specific term is 'panel self-refresh'.
If so, I can reflect that on the chart easily. I know the strobing method used is akin to older 60Hz TVs with so-called "120Hz" or "240Hz" modes that repeat the current refresh at certain intervals (most use black frame insertion, I believe); I just didn't know the technical term, or exactly when it started with G-Sync.

As for my word choice of "polling," I'm simply pulling it, yet again, from that article:
http://www.blurbusters.com/gsync/preview2/

"We currently suspect that fps_max 143 is frequently colliding near the G-SYNC frame rate cap, possibly having something to do with NVIDIA’s technique in polling the monitor whether the monitor is ready for the next refresh. I did hear they are working on eliminating polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

I did not test other fps_max settings such as fps_max 130, fps_max 140, which might get closer to the G-SYNC cap without triggering the G-SYNC capped-out slow down behavior. Normally, G-SYNC eliminates waiting for the monitor’s next refresh interval:

G-SYNC Not Capped Out:
Input Read -> Render Frame -> Display Refresh Immediately

When G-SYNC is capped out at maximum refresh rate, the behavior is identical to VSYNC ON, where the game ends up waiting for the refresh.

G-SYNC Capped Out
Input Read -> Render Frame -> Wait For Monitor Refresh Cycle -> Display Refresh"


And here in a later comment:
http://www.blurbusters.com/gsync/preview2/#comment-2591

"You want to use the highest possible frame rate cap, that’s at least several frames per second below the G-SYNC maximum rate, in order to prevent G-SYNC from being fully capped out. Testing each run took a lot of time, so I didn’t end up having time to test in-between frame caps (other than fps_max 120, 143 and 300).

Technically, input latency should “fade in” when G-SYNC caps out, so hopefully future drivers can solve this behavior, by allowing fps_max 144 to also have low latency. Even doing an fps_max 150 should still have lower input lag than fps_max 300 using G-SYNC, since the scanout of the previous refresh cycle would be more finished 1/150sec later, rather than 1/300sec later.
That's only true for a handful of frames; an fps cap of 150 on 144Hz would take less than a second to get up to 5 frames of latency. If you want to eliminate tearing you're limited to displaying 144 frames per second, and if your cap is over that, what happens to the extra frames? You can either drop them (fast sync, which adds a random delay between 0 and 1/framerate), or you can make them wait in line to get displayed (which adds several frames of input lag). To do 144Hz V-Sync without tons of latency or inconsistent latency you'd need a synchronous framerate cap early in the render chain, to hold up the game engine until the next frame gets displayed. Sort of like stop lights on freeway on-ramps that keep the freeway from getting gridlocked.
Theoretically, the driver only needs to wait a fraction of a millisecond at that time and begin transmitting the next refresh cycle immediately after the previous refresh finished. I believe the fact that latency occurred at fps_max 143 to be a strange quirk, possibly caused by the G-SYNC polling algorithm used. I’m hoping future drivers will solve this, so that I can use fps_max 144 without lag. It should be technically possible, in my opinion. It might even be theoretically possible to begin transmitting the next frame to the monitor before the display refresh is finished, by possibly utilizing some spare room in the 768MB of memory found on the G-SYNC board (To my knowledge, this isn’t currently being done, and isn’t the purpose of the 768MB memory). Either way, I think this is hopefully an easily solvable issue, as there should only be a latency fade-in effect when G-SYNC caps out at fps_max 143, fps_max 144, fps_max 150 — rather than an abrupt additional latency. I’ll likely contact NVIDIA directly and see what their comments are, about this."
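
To put rough numbers on the backlog point above, here is a minimal sketch (my own arithmetic, not how the driver actually works; it assumes an unbounded display queue and a perfectly steady 150 fps cap on a 144 Hz panel):

```python
# Minimal sketch: if a game is capped above the refresh rate and no frames
# are dropped, the queue of frames waiting to be displayed grows by
# (cap - refresh) frames every second, and each queued frame is roughly
# 1/144 s of added latency.
FPS_CAP = 150
REFRESH_HZ = 144

def queued_frames(seconds, fps_cap=FPS_CAP, refresh_hz=REFRESH_HZ):
    return (fps_cap - refresh_hz) * seconds

for t in (0.25, 0.5, 0.83, 1.0):
    frames = queued_frames(t)
    print(f"after {t:.2f} s: ~{frames:.1f} queued frames "
          f"(~{frames * 1000 / REFRESH_HZ:.0f} ms extra latency)")

# after 0.25 s: ~1.5 queued frames (~10 ms extra latency)
# after 0.50 s: ~3.0 queued frames (~21 ms extra latency)
# after 0.83 s: ~5.0 queued frames (~35 ms extra latency)
# after 1.00 s: ~6.0 queued frames (~42 ms extra latency)
```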

Finally, he states the polling time is "1ms" on a post in this thread:
http://forums.blurbusters.com/viewtopic ... lling#p221

"I talked to people at NVIDIA, and they confirmed key details.

The framebuffer starts getting transmitted out of the cable after about 1ms (the GSYNC poll) after the render-finish of Direct3D Present() call. That means, if your framebuffer is simple, the first pixels are on the cable after only 1ms after Direct3D Present() -- this is provided the previous call to Present() returned at least 1/144sec ago. Also, the monitor does real-time scanout off the wire (as all BENQ and ASUS 120Hz monitors does). Right now, they are polling the monitor (1ms) to ask if it's currently refreshing or not. This poll cycle is the main source of G-SYNC latency at the moment, but they are working on eliminating this remaining major source of latency (if you call 1ms major!). One way to picture this, "on the cable", the only difference between VSYNC OFF and G-SYNC is that the scanout begins at the top edge immediately, rather than a splice mid-scan (tearing)."


After reading all that, and doing my own simple tests, I found that between 120ish fps and the 144Hz ceiling, some weird stuff is going on, especially with G-Sync on + V-Sync off.

Here's the bottom line. Put yourself in the average G-Sync user's shoes. You buy a G-Sync monitor, you bring it home, plug it in, you pull up Google, and you type in "best G-Sync settings." A variety of results pop up, mostly from Reddit, which, on average, have two recommendations in common:

1. Set a global 135 fps cap with RTSS to avoid G-Sync ceiling and additional input lag.
2. Disable V-Sync in the control panel and in-game, since it adds tons of input lag, no exceptions.
1 is a good idea, but an in-game cap is even better (1 frame lower latency). The exact number you need to cap at hasn't really been demonstrated. 2 is pointless at best, and given this thread, might do nothing but reduce the acceptable maximum refresh rate.

The beginner G-Sync user then sets his display up according to the above "instructions" (*eye roll*) and launches his game. He begins to see tearing at the bottom of the screen, and worse yet, the whole screen tears sometimes (unbeknownst to him) due to the frametime spikes that happen below G-Sync's range. He angrily pulls up the GeForce forums and either starts a "G-Sync is Broken!!!" thread, or comments in the latest driver thread, exclaiming that G-Sync support is broken and that Nvidia is legally liable.

Obviously, most of this is untrue. As we already know, a 135 fps cap may not always be enough to stop the tearing seen on a 144Hz display with G-Sync on + V-Sync off. That, and external fps caps add additional input lag (I still don't know how much :\) over in-game limiters. Secondly, G-Sync on + V-Sync on isn't broken; with a proper fps cap, it is actually preferred, and is the only way to achieve a 100% tear free experience when those frametime spikes crop up.

I simply want to clear up this misinformation once and for all.
Yup.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 07 Nov 2016, 13:07

Okay, so is this getting closer?

[attached image: G-Sync scenarios chart]

If I'm relatively on the right track so far, a couple of questions remain:

1. Does RTSS always add an additional 1 frame of input latency over an in-game cap? And would this latency increase apply to both G-Sync + V-Sync on and G-Sync + V-Sync off?

2. An fps limit (or the fps itself) in the 120-135ish range with G-Sync + V-Sync off is basically solved. G-Sync's 1ms polling rate (or whatever you want to call it) is more than likely the cause of the tearing (or phase shift, as you put it) we see at the bottom of the screen. However, in that same range, it's basically still up in the air as to what exactly is happening with G-Sync + V-Sync on, which is why I have that scenario marked "???" in the chart. Right?

Any further corrections?
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series


Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Sparky » 07 Nov 2016, 13:49

jorimt wrote:Okay, so is this getting closer?
The 30fps thing is a limit of the actual LCD panel; the self-refresh keeps that limit from ever being reached. Say the framerate is 25: the monitor will self-refresh at 50Hz. If the framerate is 14fps, it will self-refresh twice for every new frame (an effective 42Hz refresh rate), and so on.
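
A quick sketch of that repeat logic (my own illustration, assuming a ~30 Hz panel floor; the exact minimum varies by panel):

```python
# Illustration of the self-refresh behavior described above (assumed ~30 Hz
# panel floor): the module repeats the last frame enough times that the
# panel never refreshes below its minimum rate.
PANEL_MIN_HZ = 30

def effective_refresh(fps, panel_min_hz=PANEL_MIN_HZ):
    times_shown = 1
    while fps * times_shown < panel_min_hz:
        times_shown += 1
    return times_shown, fps * times_shown

for fps in (25, 14, 10):
    times_shown, hz = effective_refresh(fps)
    print(f"{fps} fps -> each frame shown {times_shown}x, panel runs at {hz} Hz")
# 25 fps -> each frame shown 2x, panel runs at 50 Hz
# 14 fps -> each frame shown 3x, panel runs at 42 Hz
# 10 fps -> each frame shown 3x, panel runs at 30 Hz
```
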
If I'm relatively on the right track so far, a couple of questions remain:

1. Does RTSS always add an additional 1 frame of input latency over an in-game cap? And would this latency increase apply to both G-Sync + V-Sync on and G-Sync + V-Sync off?
Only if the framerate cap is the thing that's limiting the framerate, and yes it would apply regardless of what your v-sync settings are.
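
For context, what that one extra frame works out to in milliseconds (simple arithmetic; the actual penalty depends on the game and the limiter):

```python
# One frame of added latency, expressed in milliseconds at a few common caps.
for fps_cap in (60, 120, 138, 142):
    print(f"cap {fps_cap} fps: 1 frame = {1000.0 / fps_cap:.1f} ms")
# cap 60 fps: 1 frame = 16.7 ms
# cap 120 fps: 1 frame = 8.3 ms
# cap 138 fps: 1 frame = 7.2 ms
# cap 142 fps: 1 frame = 7.0 ms
```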

2. An fps limit (or the fps itself) in the 120-135ish range with G-Sync + V-Sync off is basically solved. G-Sync's 1ms polling rate (or whatever you want to call it) is more than likely the cause of the tearing (or phase shift, as you put it) we see at the bottom of the screen. However, in that same range, it's basically still up in the air as to what exactly is happening with G-Sync + V-Sync on, which is why I have that scenario marked "???" in the chart. Right?
As far as I know, G-sync with vsync on behaves the same at 135fps as it does at 120fps. One of the big problems is that most of the articles on g-sync and freesync were written when they first came out, and contain inaccurate or outdated information.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 07 Nov 2016, 16:07

Alright, try this one:

[attached image: updated G-Sync scenarios chart]

I say "120-125" within the polling rate, because of both my findings, and the original article's findings. I say "126-139" beyond the polling rate, and "140-144" exceeds G-Sync range because of this:
http://www.blurbusters.com/gsync/preview2/#comment-2762
Since this article was written, several of us did several tests.

fps_max 130 — works good
fps_max 135 — works good
fps_max 138 — works good
fps_max 140 — slight hints of extra lag
fps_max 142 — as bad as fps_max 143

However, I am hoping that this is really just an NVIDIA driver issue (hopefully not an unfixable game behaviour), and that fps_max 143 should work great with newer drivers. By early March, Blur Busters Forums is adding a new “Input Lag” subforum area, so that people can test things out.
Note that he didn't specify whether or not they used the same methods as they did in the article, or if they did several tests solely by eye/feel to get those results. I'm sticking with a "120 fps" in-game cap as optimal, because that is (again) what both my findings and the article's findings showed to eliminate latency (or, in my tests, tearing).

For the "Exceeds 1ms Polling Range" section on the chart, I've gone ahead and made it "V-Sync On: Not Engaged, Reduced Input Latency." My current logic, thanks to your explanations, is that any cap below the "140ish-144" G-Sync ceiling is going to reduce input latency. Does it reduce latency as much as a 120 fps cap? Maybe, maybe not, not enough data (or I'm being thick again, and you can correct me). As for V-Sync behavior in that range, that's still what I'm not 100% clear on. However, I think it's safe to say, whatever it's doing to mask tearing in the 126-139 fps range, it's doesn't appear to be conflicting with G-Sync's function.

Further input is welcome, as usual.
Last edited by jorimt on 09 Nov 2016, 15:57, edited 2 times in total.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Glide » 08 Nov 2016, 09:40

I'm confused by what you think this "1ms polling range" is.
At first, I thought you were just talking about it operating outside of the G-Sync range, causing it to tear when V-Sync is disabled.

You know that it's polling at 1000Hz, right? That's where the "1ms" comes from.
What does that have to do with this 120-125 FPS (Hz) range?

I would think that this tearing that you're seeing above 125 FPS or so must be specific to whatever game or framerate limiting tool you're using rather than anything to do with it being polled at 1000Hz.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 08 Nov 2016, 10:43

Theories are made to be tested, so test away.

Mine is founded on the assumption that G-Sync's 1ms polling rate is directly related to frametime.

In my testing, tearing at the bottom of the screen begins (note, G-Sync is STILL engaged at this point) just about when the frametime difference between the current fps/cap and the maximum refresh rate (144Hz/fps) is less than 1.0ms.

I tested it at 100Hz and 165Hz native refresh rates as well, and found the same result; the tearing at the very bottom of the screen stopped when the frametime difference between the fps cap and the max refresh was 1.0ms or more. The tearing at the bottom slowly grew in height the closer it got to the G-Sync ceiling, until full V-Sync off behavior engaged at around the 140+ range, as expected.
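
For reference, the frametime arithmetic behind that 1.0ms threshold (just my working theory expressed as math, not a confirmed description of what the module does):

```python
# Gap between the capped frametime and the display's max-refresh frametime.
# Working theory from the tests above: bottom-of-screen tearing with
# G-Sync + V-Sync off appears once this gap drops below ~1.0 ms.
def frametime_gap_ms(fps_cap, max_refresh_hz):
    return 1000.0 / fps_cap - 1000.0 / max_refresh_hz

for cap in (120, 125, 130, 138):
    print(f"144 Hz, cap {cap} fps: gap = {frametime_gap_ms(cap, 144):.2f} ms")
# 144 Hz, cap 120 fps: gap = 1.39 ms
# 144 Hz, cap 125 fps: gap = 1.06 ms
# 144 Hz, cap 130 fps: gap = 0.75 ms
# 144 Hz, cap 138 fps: gap = 0.30 ms

def highest_cap_with_1ms_gap(max_refresh_hz, window_ms=1.0):
    # Highest fps whose frametime is still >= 1 ms longer than the refresh period.
    return 1000.0 / (1000.0 / max_refresh_hz + window_ms)

for hz in (100, 144, 165):
    print(f"{hz} Hz: gap stays >= 1 ms up to ~{highest_cap_with_1ms_gap(hz):.0f} fps")
# 100 Hz: gap stays >= 1 ms up to ~91 fps
# 144 Hz: gap stays >= 1 ms up to ~126 fps
# 165 Hz: gap stays >= 1 ms up to ~142 fps
```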

My understanding is that G-Sync polls for changes from the GPU every 1ms (much like Afterburner polls the system every second for graph updates), and then reflects those changes to the screen by matching the display's refresh rate to the fps output by the system.

My observation is that if it takes G-Sync 1ms to poll each time, a 126-139 fps cap on a 144Hz display isn't giving it enough of a window to sync the entire screen, which results in the partial tearing seen at the bottom. How this is masked in that same range with G-Sync + V-Sync on, however, I'm still not clear on.

Whether I'm right or wrong, the least that can be said is that G-Sync was never intended to fully function without V-Sync on.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 09 Nov 2016, 15:54

I'm looking now to add Fast Sync scenarios to the chart, and have a couple of questions.

First off, my current understanding of Fast Sync:

1. Fast Sync is at its most functional when the framerate exceeds the display's maximum refresh rate by at least 2 or 3x, as Fast Sync's third buffer selects from the best of the excess frames to display as the final rendered frame.
2. Fast Sync input latency increase over V-Sync off is up to 1 frame, but will vary from 0 to 1.
3. Even in the most optimal scenarios, Fast Sync will introduce frame pacing issues, especially as its sample rate grows smaller.

Assuming I'm correct thus far (corrections are welcome), what is Fast Sync doing when paired with G-Sync compared to being paired with V-Sync on? Does it act the same as G-Sync + V-Sync on within the G-Sync range, or are there differences? If so, what are they?

On a side note, my monitor indicates G-Sync is working with a yellow number readout that fluctuates in real time with the current refresh rate. What I found interesting is that with G-Sync + Fast Sync on and no framerate limiter, the meter would cap out at 139 (not 144, like it does with G-Sync + V-Sync on). To me, that further confirms 140 fps is where G-Sync begins to hit its ceiling. Of course, it could also have another explanation. Thoughts are welcome, as usual.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by RealNC » 10 Nov 2016, 00:40

If your frame rate is high enough (~300FPS) that fastsync begins to make sense, then that means you're way above the gsync range. So that combination doesn't make much sense. Going from 140FPS gsync to 150FPS fastsync is gonna look shitty, as it goes from perfectly smooth animation to microstutter.

If there's ever gonna be a 240Hz gsync monitor, then this might perhaps change. Until then, 150FPS fast sync on 144Hz has huge microstutter, at ~250FPS it begins to be a bit more tolerable, and at ~300FPS it finally becomes acceptable.
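
To put rough numbers on the microstutter, a quick sketch (my own arithmetic, assuming fast sync simply shows whichever rendered frame finished most recently at each refresh):

```python
# With fast sync, the frame displayed at each 144 Hz refresh can be anywhere
# from "just finished" to one full render interval (1000/fps ms) old. The
# larger that jitter is relative to the refresh period, the more visible the
# microstutter. (Rough illustration, not a measurement.)
REFRESH_MS = 1000.0 / 144

for fps in (150, 250, 300, 500):
    jitter_ms = 1000.0 / fps
    print(f"{fps} fps: frame-age jitter up to {jitter_ms:.2f} ms "
          f"({jitter_ms / REFRESH_MS:.0%} of a refresh period)")
# 150 fps: frame-age jitter up to 6.67 ms (96% of a refresh period)
# 250 fps: frame-age jitter up to 4.00 ms (58% of a refresh period)
# 300 fps: frame-age jitter up to 3.33 ms (48% of a refresh period)
# 500 fps: frame-age jitter up to 2.00 ms (29% of a refresh period)
```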

As a side note, fast sync does not add one frame of input lag (since it doesn't seem to always be frame capping to multiples of the refresh rate). It's virtually the same input lag as vsync off. However, it has one frame's worth of frame latency, just like vsync on. Input lag and frame latency are two different things :-) For example, you can have high frame latency but zero input lag.

However, I'm not sure what Nvidia has done to fastsync. The first release seemed to cap the frame rate to multiples of the refresh rate, but with recent drivers it seems that's not the case anymore, or at least not always. It's kinda hard to tell. When there is no cap, there's just the frame latency, but no input lag. When there is a cap, then of course that's gonna add input lag. But again, I have no idea what the strategy in the driver looks like.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 10 Nov 2016, 11:57

Thanks for the reply.

I'm attaching what I expect to be the semi-final version of the chart for the time being:

[attached image: semi-final G-Sync scenarios chart]

Beyond the "1ms Polling Rate sections/numbers" (which we can't seem to 100% confirm as of yet), please point out any corrections that need to be made.

For now however, I think I've fit in about all I can (I may add a 60 Hz scenario at some point). I believe I've learned about as much as I wanted to when I started all this, and I really appreciate all the input, especially from @Sparky.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 11 Nov 2016, 13:32

Someone just linked me a video in the official Nvidia forums:
https://www.youtube.com/watch?v=F8bFWk61KWA


It's a good video, and affirms a lot of the basics of G-Sync functionality. It also shows that Nvidia Inspector's fps limiter is adding up to 3 frames of additional latency (!!!).

Unfortunately, he didn't test G-Sync + V-Sync on + an in-game fps cap, he didn't test RTSS for latency, and he only tested the cap at 142 fps. For whatever reason, he didn't notice the tearing that occurs with G-Sync + V-Sync off at higher fps caps, but again, it is hard to spot if you aren't looking for it.

I really wish I had a high speed camera setup like that to do my own tests at this point.

Finally, some users in the Nvidia forum I'm conversing with are having uncharacteristic G-Sync issues with SLI setups. Do any of you know if G-Sync has any reason to react differently with SLI?
