G-Sync's 1ms Polling Rate: My Findings & Questions

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag.
jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 06 Nov 2016, 09:28

lexlazootin wrote:Are you suggesting that there isn't any input lag difference between 120-135fps in G-Sync mode? I've got a camera and I'll be happy to test it, but I'm a little confused on what's going on here.
Also, you can adjust your vertical timings to remove the tearing at the bottom of your screen, BTW.
I am not. I am asking which one it is.

I've established that due to its 1ms polling rate, G-Sync on + V-Sync off on a 144Hz monitor needs roughly 1ms of frametime padding below the max refresh (via a 120 in-game fps cap) to fully function on its own without introducing tearing. E.g. 30-120 appears to be its full range on a 144Hz monitor, not 30-135ish, as once thought. At this point, I am wondering if anyone is actually reading my OP (please do, if you haven't).
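For reference, here's the padding arithmetic as a minimal sketch (the 144Hz ceiling and the cap values are the ones discussed in this thread; nothing else is assumed):

```python
# Frametime padding provided by an fps cap below the max refresh.
def padding_ms(max_hz: float, fps_cap: float) -> float:
    """Capped frametime minus the refresh period, in milliseconds."""
    return 1000.0 / fps_cap - 1000.0 / max_hz

for cap in (135, 130, 125, 120):
    print(f"{cap} fps cap on 144Hz -> {padding_ms(144, cap):.2f}ms of padding")
# 135 -> 0.46ms (< 1ms, can still tear per the observation above)
# 130 -> 0.75ms
# 125 -> 1.06ms
# 120 -> 1.39ms (> 1ms, tear-free in these tests)
```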

What I'm saying is: what in the heck has it been doing above that range this entire time, both with V-Sync on and off? Can G-Sync continue to partially function above said range (and if so, how exactly), or is what we are seeing in this 120-135 range something more akin to certain adaptive V-Sync methods' behavior, where the tearing can be isolated to less noticeable areas (the very top or bottom) of the screen?

And obviously, if we're only talking about input lag, this would solely affect G-Sync on + V-Sync on; with G-Sync on + V-Sync off, there really isn't going to be any increase/decrease of input lag, no matter the fps cap.

Again, my question is two-fold. One, what is G-Sync on + V-Sync off actually doing in the 120-135 fps range, and if it's causing tearing at the bottom of the screen, how is G-Sync still functioning? Two, with G-Sync on + V-Sync on, how much additional input lag is there in the 120-135 range over a 120 fps cap? And how/how much is G-Sync still functioning in this range with V-Sync on (or is it really at all)?

Sorry to repeat myself, but everyone thus far is having a very difficult time grasping what I'm trying to say.

Answers are welcome. Tests via high speed camera setups are more so.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

lexlazootin
Posts: 1251
Joined: 16 Dec 2014, 02:57

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by lexlazootin » 06 Nov 2016, 10:40

I read your post a few times, but I was just having trouble following it.

"to fully function on its own without introducing tearing. E.g. 30-120 appears to be its full range on a 144Hz monitor, not 30-135ish"

Well, this just isn't true for all games; give Half-Life a go and you'll find that it runs all the way up to 135 without tearing just fine. Also, I don't experience tearing in CS:GO until ~128fps.
jorimt wrote:What I'm saying is: what in the heck has it been doing above that range this entire time, both with V-Sync on and off? Can G-Sync continue to partially function above said range (and if so, how exactly), or is what we are seeing in this 120-135 range something more akin to certain adaptive V-Sync methods' behavior, where the tearing can be isolated to less noticeable areas (the very top or bottom) of the screen?

Answers are welcome. Tests via high speed camera setups are more so.
Well, if you adjust your vertical timings to lower than +100 of your active res, you can get a consistent tear line in the middle of your screen that G-Sync will sync everything up to.

You can even play around with G-Sync by adjusting the framerate to 0.1 above your max Hz and watching the tear line scroll up the screen, then switching it to just below your max Hz to have the tear line scroll back down the screen till it catches up with the G-Sync tear line again.

I'm pretty sure it's the same G-Sync all the way up to 144, and it's frame pacing that's causing the issues. If you want to try the tear line thing, just import this .bin in CRU: http://www.mediafire.com/file/q7mfmtp7f ... yncres.zip

It's the one at 1024x768@196 with the tear line; just raise or lower the vertical total to move it out of the way.
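The relationship in play here is refresh = pixel clock / (horizontal total x vertical total). A minimal sketch, with the pixel clock and totals assumed for illustration (they are not read from the linked .bin):

```python
# refresh = pixel_clock / (h_total * v_total): raising the vertical
# total lowers the refresh rate, which is how the tear line can be
# repositioned. All constants below are assumptions for illustration.
PIXEL_CLOCK_HZ = 212_000_000  # assumed ~212 MHz
H_TOTAL = 1344                # assumed horizontal total for a 1024-wide mode

def refresh_hz(v_total: int) -> float:
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

for v_total in (806, 868, 968):
    print(f"v_total={v_total} -> {refresh_hz(v_total):.1f}Hz")
# 806 -> ~195.7Hz (close to the 1024x768@196 mode mentioned above)
# 868 -> ~181.7Hz
# 968 -> ~163.0Hz
```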

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 06 Nov 2016, 11:22

So it's merely a coincidence that capping your fps roughly 1ms (frametime) below your max refresh rate eliminates tearing in most instances with G-Sync + V-Sync off, even though this lines up almost exactly with G-Sync's 1ms polling rate and the findings in the original article that a 120 fps cap eliminates additional input latency with G-Sync + V-Sync on?

I'm simply disputing G-Sync's max working range (without the help/interference of V-Sync on or off) here. I cannot believe it has ended up being so difficult to comprehend.

Also, if you read my OP (i.e. the first post of this thread), I had an edited section that showed the different max fps cap needed to eliminate tearing with G-Sync + V-Sync off, depending on whether you use RTSS or the in-game limiter. Yes, it can vary a little, but it usually ends up being 1.0ms to 1.5ms frametime below the max refresh. And the tearing I'm talking about between 120-135 fps only happens in the very last, say, 25-100px of the screen, and it is very, very difficult to spot; you have to be looking for it (the tearing isn't screen-wide).

I very much appreciate your response, but I think I've made myself as clear as I'm going to be able to at this point. I really hope the Chief Blur Buster will chime in at some point with whatever clarity he has to offer. Until then...

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Sparky » 06 Nov 2016, 13:17

Do you understand what I mean when I say it's backpressure that causes the vast majority of vsync input lag? As long as the monitor can display more frames than it's receiving, it won't cause backpressure.

Imagine a toll booth at the end of a highway that can take payments from 15 cars per minute. As long as fewer than 15 cars arrive per minute, each car is delayed by a few seconds to pay, and then goes on its way. Now imagine there are 20 cars arriving per minute. At the end of the first minute there are 5 cars waiting in line, at the end of the second minute there are 10. This continues until the whole highway is backed up with cars. It sounds like what you're seeing is more like the first scenario than the second.

As long as the tear stays at the bottom of the screen, the display is still displaying frames as frequently as they arrive; it's just a phase offset.
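The toll-booth arithmetic, as a minimal sketch (the 15 and 20 cars-per-minute rates are the ones from the analogy above):

```python
# Toll-booth backpressure sketch: cars (frames) queue up only when the
# arrival rate exceeds the service (display) rate, and the queue is lag.
def queue_depths(arrive_per_min: int, serve_per_min: int, minutes: int):
    depth, history = 0, []
    for _ in range(minutes):
        depth = max(0, depth + arrive_per_min - serve_per_min)
        history.append(depth)
    return history

print(queue_depths(14, 15, 4))  # [0, 0, 0, 0]    -> no backpressure
print(queue_depths(20, 15, 4))  # [5, 10, 15, 20] -> backlog (lag) grows
```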

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 06 Nov 2016, 13:29

I do.

To clarify, I'm not a technician. My vernacular is limited to visual phenomena that can be observed on the user-end, purely through practical testing. So forgive me if there is any misunderstanding in my limited presentation here.

This would still mean G-Sync's 1ms polling rate is the cause of this tearing (or "phase offset") at anything above a 120 fps cap with G-Sync on + V-Sync off. In other words, G-Sync is still working in the 120-135 range, but its inability to fully "catch up" in that time due to the 1ms polling rate would theoretically result in the tearing (or, again, "phase offset") seen at the bottom of the screen, correct?

Assuming we're on the same page there (we may not be), as opposed to V-Sync "off," how does V-Sync "on" mask that tearing with G-Sync in the same 120-135 range and not further inhibit G-Sync's function?
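Purely to illustrate the arithmetic behind that question (this is speculation, not a confirmed G-Sync mechanism; the 1440-line panel height and the overlap values are assumptions, and blanking time is ignored):

```python
# Speculative arithmetic only: if the next frame arrives while the
# current 144Hz scanout still has `overlap_ms` left, the splice lands
# that fraction of the panel from the bottom.
SCANOUT_MS = 1000.0 / 144  # ~6.94ms per full scanout (blanking ignored)
HEIGHT_PX = 1440           # assumed panel height

def tear_from_bottom_px(overlap_ms: float) -> float:
    return HEIGHT_PX * overlap_ms / SCANOUT_MS

for overlap in (0.1, 0.25, 0.5):
    print(f"{overlap}ms overlap -> tear ~{tear_from_bottom_px(overlap):.0f}px from the bottom")
# 0.1ms -> ~21px, 0.25ms -> ~52px, 0.5ms -> ~104px, i.e. roughly the
# 25-100px band described earlier in this thread
```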

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Sparky » 06 Nov 2016, 19:58

It's hard to answer a "how" without inside information or specific testing. High-speed video of the tearing would help, especially if the thing being displayed includes per-frame render times. The video can give a few hints, like the position of the tear vs. framerate, whether its position is consistent, whether the tear is on every frame or just occasional frames, etc.

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 06 Nov 2016, 20:57

Understood.

I'm attaching a "G-Sync Range" chart I just made to clarify what I've observed thus far. It's a sum-up of nearly my entire knowledge on the subject in a single image. Of course, it is incomplete and contains some speculation, but hopefully it will make the concepts I've been attempting to get across much clearer (I've also embedded it in the thread's OP).

[Attached image: "G-Sync Range" chart]

Further input will be very appreciated...

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by Sparky » 07 Nov 2016, 02:00

Everything below about 40Hz is inaccurate, as the monitor will self-refresh until it receives a new frame. As for above 120, 'polling' is almost certainly the wrong word, as I don't think DisplayPort works like that. USB uses polling to resolve bus contention, but DisplayPort doesn't have that problem, as the main data channels are one-way. The new frame just arrives when it arrives, and it's up to the monitor to display it correctly.

There are a few possibilities for a cause of tearing between 120 and 139fps. One is just frametime variance that puts the instantaneous framerate over 144Hz. Another possibility is a bug that causes the monitor or driver to oscillate in and out of VRR mode.

Hypothetical possibility: Maybe a G-Sync frame takes an extra millisecond to get the overdrive settings calculated compared to a fixed refresh rate frame, so the next frame arrives before the current frame is done displaying. And because a new frame arrived while the current frame is still being scanned, the monitor thinks the framerate is over 144Hz and switches to fixed refresh mode. Now, this new frame ends this scan and makes up the entire next scan, but there's no new frame when this frame finishes scanning, so the monitor switches back into VRR mode and waits for a new frame.

That's just one of many possible causes of that tearing; we simply don't have enough information to determine the root cause. Keep in mind that the V-Sync off fallback for G-Sync was something of a rushed afterthought feature; the V-Sync on fallback is a more mature implementation.
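For the first possibility above (frametime variance), a minimal sketch of how an average framerate below the ceiling can still produce individual frames above it; the Gaussian jitter model and its parameters are assumptions, not measured game behavior:

```python
# Frametime-variance sketch: an *average* of 130fps can still contain
# individual frames whose instantaneous rate exceeds 144Hz. Gaussian
# jitter (0.6ms sigma) is an assumed model for illustration.
import random

random.seed(1)
CEILING_MS = 1000.0 / 144  # ~6.94ms; any frametime below this is >144Hz
frametimes = [random.gauss(1000.0 / 130, 0.6) for _ in range(1000)]
over_ceiling = sum(ft < CEILING_MS for ft in frametimes)
print(f"{over_ceiling / 10:.1f}% of frames momentarily exceed 144Hz")
```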

masterotaku
Posts: 436
Joined: 20 Dec 2013, 04:01

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by masterotaku » 07 Nov 2016, 02:49

Sparky wrote:Everything below about 40Hz is inaccurate, as the monitor will self-refresh until it receives a new frame.
If my G-Sync+ULMB mode can be extrapolated to all G-Sync monitors and non-strobed G-Sync, that's true. <40fps is where double strobing happens.
CPU: Intel Core i7 7700K @ 4.9GHz
GPU: Gainward Phoenix 1080 GLH
RAM: GSkill Ripjaws Z 3866MHz CL19
Motherboard: Gigabyte Gaming M5 Z270
Monitor: Asus PG278QR

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync's 1ms Polling Rate: My Findings & Questions

Post by jorimt » 07 Nov 2016, 11:07

Alright, so what I'm calling "tweening" is called "strobing," and it actually begins at 40 fps and below, correct? If so, I can reflect that on the chart easily. I know the method used (whatever it's called) is akin to how older 60Hz TVs with so-called "120Hz" or "240Hz" modes repeat the current refresh at certain intervals (some use black frame insertion, some interpolation, I believe; corrections welcome), I just didn't know the technical term, or exactly when it kicks in with G-Sync.

I also found this post on the NeoGAF G-Sync thread; how accurate is it? (I want to get the chart threshold/wording right):
http://www.neogaf.com/forum/showpost.ph ... tcount=605
At 30-45 fps, the panel will refresh at 30-45 Hz, which is really low. You will probably observe flickering as the display updates at these low rates.

Below 30 fps, the G-Sync module uses a trick to avoid the refresh rate falling even further; it repeats frames. At 29 fps, the display is actually refreshing at 58 Hz, with doubled frames. This helps create a smoother appearance, but adds latency and will make controls feel laggy. Basically, you don't want to be beneath 40 fps at all, but if the drops are momentary, the experience is much better than with a standard fixed-rate display and VSync.
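As a minimal sketch of the frame-repeating behavior that quote describes (the repeat-count selection loop is an assumption about the implementation; only the 29fps -> 58Hz example comes from the quote):

```python
# Frame-repeat sketch: below the module's minimum (30fps per the quote),
# repeat each frame until the physical refresh lands back in range.
def physical_hz(fps: float, min_hz: float = 30.0) -> float:
    repeats = 1
    while fps * repeats < min_hz:
        repeats += 1
    return fps * repeats

for fps in (29, 20, 14):
    print(f"{fps}fps -> panel refreshes at {physical_hz(fps):.0f}Hz")
# 29fps -> 58Hz (doubled), 20fps -> 40Hz (doubled), 14fps -> 42Hz (tripled)
```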
As for my word choice of "polling," I'm simply pulling it, yet again, from that article:
http://www.blurbusters.com/gsync/preview2/
We currently suspect that fps_max 143 is frequently colliding near the G-SYNC frame rate cap, possibly having something to do with NVIDIA’s technique in polling the monitor whether the monitor is ready for the next refresh. I did hear they are working on eliminating polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

I did not test other fps_max settings such as fps_max 130, fps_max 140, which might get closer to the G-SYNC cap without triggering the G-SYNC capped-out slow down behavior. Normally, G-SYNC eliminates waiting for the monitor’s next refresh interval:

G-SYNC Not Capped Out:
Input Read -> Render Frame -> Display Refresh Immediately

When G-SYNC is capped out at maximum refresh rate, the behavior is identical to VSYNC ON, where the game ends up waiting for the refresh.

G-SYNC Capped Out
Input Read -> Render Frame -> Wait For Monitor Refresh Cycle -> Display Refresh
And here in a later comment:
http://www.blurbusters.com/gsync/preview2/#comment-2591
You want to use the highest possible frame rate cap, that’s at least several frames per second below the G-SYNC maximum rate, in order to prevent G-SYNC from being fully capped out. Testing each run took a lot of time, so I didn’t end up having time to test in-between frame caps (other than fps_max 120, 143 and 300).

Technically, input latency should “fade in” when G-SYNC caps out, so hopefully future drivers can solve this behavior, by allowing fps_max 144 to also have low latency. Even doing an fps_max 150 should still have lower input lag than fps_max 300 using G-SYNC, since the scanout of the previous refresh cycle would be more finished 1/150sec later, rather than 1/300sec later. Theoretically, the drivers only need to wait a fraction of a millisecond at that time and begin transmitting the next refresh cycle immediately after the previous refresh finished. I believe the fact that latency occurred at fps_max 143 to be a strange quirk, possibly caused by the G-SYNC polling algorithm used. I’m hoping future drivers will solve this, so that I can use fps_max 144 without lag. It should be technically possible, in my opinion. It might even be theoretically possible to begin transmitting the next frame to the monitor before the display refresh is finished, by possibly utilizing some spare room in the 768MB of memory found on the G-SYNC board (To my knowledge, this isn’t currently being done, and isn’t the purpose of the 768MB memory). Either way, I think this is hopefully an easily solvable issue, as there should only be a latency fade-in effect when G-SYNC caps out at fps_max 143, fps_max 144, fps_max 150 — rather than an abrupt additional latency. I’ll likely contact NVIDIA directly and see what their comments are about this.
Finally, he states the polling time is "1ms" on a post in this thread:
http://forums.blurbusters.com/viewtopic ... lling#p221
I talked to people at NVIDIA, and they confirmed key details.

The framebuffer starts getting transmitted out of the cable after about 1ms (the GSYNC poll) after the render-finish of Direct3D Present() call. That means, if your framebuffer is simple, the first pixels are on the cable after only 1ms after Direct3D Present() -- this is provided the previous call to Present() returned at least 1/144sec ago. Also, the monitor does real-time scanout off the wire (as all BENQ and ASUS 120Hz monitors do). Right now, they are polling the monitor (1ms) to ask if it's currently refreshing or not. This poll cycle is the main source of G-SYNC latency at the moment, but they are working on eliminating this remaining major source of latency (if you call 1ms major!). One way to picture this, "on the cable", the only difference between VSYNC OFF and G-SYNC is that the scanout begins at the top edge immediately, rather than a splice mid-scan (tearing).
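To put those quoted pipelines into a rough timing model (a sketch only; the ~1ms poll figure is the one quoted above, and the remaining-scanout term is an assumed simplification of "Wait For Monitor Refresh Cycle"):

```python
# Rough model of the two quoted pipelines. POLL_MS is the ~1ms figure
# quoted above; remaining_scanout_ms is an assumed input for how much
# of the monitor's current refresh is left when the frame is ready.
REFRESH_MS = 1000.0 / 144  # ~6.94ms per refresh cycle
POLL_MS = 1.0

def delivery_delay_ms(capped_out: bool, remaining_scanout_ms: float) -> float:
    """Delay from render-finish (Present()) to first pixels on the cable."""
    if not capped_out:
        # Not capped out: Input Read -> Render -> Display Refresh Immediately
        return POLL_MS
    # Capped out: behaves like V-SYNC ON, waiting out the current refresh
    return POLL_MS + remaining_scanout_ms

print(delivery_delay_ms(False, 3.0))        # 1.0   -> scanout starts at once
print(delivery_delay_ms(True, 3.0))         # 4.0   -> extra wait for refresh
print(delivery_delay_ms(True, REFRESH_MS))  # ~7.94 -> worst case when capped
```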
After reading all that, and doing my own simple tests, I found that between 120ish fps and the 144Hz ceiling, some weird stuff is going on, especially with G-Sync on + V-Sync off.

Here's the bottom line. Put yourself in the average G-Sync user's shoes. You buy a G-Sync monitor, you bring it home, plug it in, you pull up Google, and you type in "best G-Sync settings." A variety of results pop up, mostly from Reddit, which, on average, have two recommendations in common:

1. Set a global 135 fps cap with RTSS to avoid the G-Sync ceiling and additional input lag.
2. Disable V-Sync in the control panel and in-game, since it adds tons of input lag, no exceptions.

You then set your display up according to the above "instructions" and launch a game. You begin to see tearing at the bottom of the screen, and worse yet, the whole screen sometimes tears (unbeknownst to you) due to the frametime spikes that happen below G-Sync's range. You angrily pull up the GeForce forum and either start a "G-Sync is Broken!!!" thread, or comment in the latest driver thread, exclaiming that G-Sync support is broken and that NVIDIA is legally liable.

Obviously, most of this is untrue. As we already know, a 135 fps cap may not always be enough to stop the tearing seen on a 144Hz display with G-Sync on + V-Sync off. That, and external fps caps add additional input lag (I still don't know how much :\) over in-game limiters. Secondly, G-Sync on + V-Sync on isn't broken; with a proper fps cap, it is actually preferred, and is the only way to achieve a 100% tear-free experience when those frametime spikes crop up.

I simply want to clear up this misinformation once and for all.
