In G-Sync's relatively short history, Blur Busters' excellent article (http://www.blurbusters.com/gsync/preview2/) is the only source of information regarding G-Sync's inner workings and the impact of framerate limiters on input latency. As Nvidia remains silent on the subject, I'm really hoping I can get a reply on this post and its contents directly from the Chief Blur Buster.

(EDIT) My latest findings can be found in the "G-Sync 101 w/Chart (WIP)" thread here:
http://forums.blurbusters.com/viewtopic.php?f=5&t=3073
Since its release, informed G-Sync users have gone by this comment (http://www.blurbusters.com/gsync/preview2/#comment-2762) to prevent reaching G-Sync's refresh rate limit and reduce input latency. And with G-Sync on + V-Sync on, a framerate limit of 135 fps on a 144Hz display seemed to suffice; that is, until Nvidia exposed the option to disable V-Sync with G-Sync enabled.
Now, I and many other G-Sync users have noticed that, even with a 135 fps cap (or thereabouts) on a 144Hz display, tearing can be observed at the very bottom of the screen. And I believe I have discovered why: G-Sync's 1ms polling rate.
What follows is a reshuffling of what I have written across several posts in a thread on Nvidia's official forums here:
https://forums.geforce.com/default/topi ... 1/#5010801
I began my test in Overwatch on my 144Hz XB271HU. Launching the game and entering practice mode, I positioned myself in a fixed spot, then started strafing left/right while watching for tearing at the bottom portion of the screen. Using the game's built-in framerate limiter, I began at 144 fps and reduced the limit in single-frame increments until the tearing disappeared at 120 fps.
What I thought was interesting is that this finding was in line with the article's:

[quote]At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.[/quote]

The same framerate limit at which input lag with G-Sync + V-Sync on disappeared in the article is where tearing disappeared with G-Sync + V-Sync off in my test. It then dawned on me that if G-Sync has a 1ms polling rate, the tearing/input latency must fully disappear at 120 fps on a 144Hz monitor, because 120 fps and 144 fps are roughly 1ms of frametime apart (8.3ms - 6.9ms = 1.4ms).
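To put rough numbers on this (keeping in mind the 1ms polling window is my hypothesis, not a confirmed spec), here's a quick frametime headroom check in Python for a few caps on a 144Hz panel:

[code]
# Frametime headroom at various caps on a 144Hz display; the 1ms
# threshold below is my polling-rate hypothesis, not a confirmed spec.

REFRESH_HZ = 144
refresh_ft = 1000.0 / REFRESH_HZ  # ~6.94ms per refresh at 144Hz

for fps in (144, 135, 125, 120):
    frame_ft = 1000.0 / fps            # frametime at this cap
    headroom = frame_ft - refresh_ft   # time G-Sync has to react
    verdict = "no tearing" if headroom >= 1.0 else "tears at bottom?"
    print(f"{fps:>3} fps: frametime {frame_ft:5.2f}ms, "
          f"headroom {headroom:+.2f}ms -> {verdict}")
[/code]

By this arithmetic, the traditional 135 fps cap only leaves ~0.5ms of headroom, which would explain the tearing at the bottom of the screen, while 120 fps leaves ~1.4ms.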
Apparently, this 120 fps framerate limit gives G-Sync the 1ms window it needs to work its magic on the entire screen, and thus prevents the tearing we see at the bottom with G-Sync on + V-Sync off and framerate limits above 120 fps.
This 24 fps difference obviously doesn't scale linearly across displays with different maximum refresh rates; instead, it looks as if you need to limit the framerate so that its frametime sits roughly 1ms above your display's refresh frametime: 100Hz = 90 fps, 144Hz = 120 fps, and so on.
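Generalizing that rule of thumb (again, assuming the 1ms polling window), the ceiling for any display can be sketched as:

[code]
# Highest framerate whose frametime stays at least the assumed 1ms
# polling window above the display's refresh frametime.

def fps_ceiling(refresh_hz, polling_ms=1.0):
    return 1000.0 / (1000.0 / refresh_hz + polling_ms)

for hz in (100, 120, 144, 165, 240):
    print(f"{hz:>3}Hz -> cap below ~{fps_ceiling(hz):.0f} fps")
[/code]

Rounding down and leaving extra slack for limiter behavior (see the next paragraph) lands close to the 90 and 120 fps values above.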
I also found that in-game framerate limiters appear to limit the framerate, not the frametime, whereas RTSS appears to limit the frametime, not the framerate. This means in-game limiters need a lower framerate limit to avoid exceeding G-Sync's 1ms polling range, as their frametimes can drift ±0.5ms or more. With RTSS, by contrast, I could push the framerate slightly higher (125 fps vs. 120 fps) before tearing occurred at the bottom of the screen, because the frametime remained relatively static.
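To illustrate the distinction as I understand it, here is a purely hypothetical Python sketch of the two limiting strategies; I have no knowledge of how RTSS or any given game actually implements its limiter:

[code]
import time

TARGET = 1.0 / 120  # 120 fps cap -> ~8.33ms per frame

def render_frame():
    pass  # stand-in for the game's actual per-frame work

# Frametime-style limiter (my read on RTSS's behavior): each frame
# waits a full TARGET interval after the previous one, so per-frame
# frametimes never dip below the target.
def frametime_limited():
    last = time.perf_counter()
    while True:
        render_frame()
        remaining = (last + TARGET) - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        last = time.perf_counter()

# Framerate-style limiter (my read on many in-game limiters): it
# holds the *average* framerate to the target on a fixed schedule,
# so a slow frame lets the following frames release early, and
# individual frametimes drift above and below the target.
def framerate_limited():
    start = time.perf_counter()
    frames = 0
    while True:
        render_frame()
        frames += 1
        remaining = (start + frames * TARGET) - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
[/code]

In the second loop, a single slow frame is repaid by frames released early, briefly pushing frametimes below 8.33ms; under my 1ms theory, that drift is what eats into G-Sync's polling window and forces the lower cap with in-game limiters.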
All that said, now to my questions. First of all, am I right about any of this? Secondly, and this has been driving me mad for a while: how much additional input latency does RTSS add over in-game framerate limiters? I assume it varies depending on the game, but is it as much as V-Sync introduces, or much less on average? I ask because many games don't include framerate limiters, and RTSS becomes a necessary evil.
Thirdly, if I am correct about the 1ms polling range, and G-Sync + V-Sync off only tears at the bottom portion of the screen with a framerate limit above 120 fps (e.g. the traditional 135 fps), doesn't that area simply represent where G-Sync ran out of time to sync the frame, and, at worst, isn't it the only area where input latency/rendering time differs from the rest of the screen, even with G-Sync + V-Sync on?
Finally, it is my understanding that with G-Sync on + V-Sync off, even with the appropriate framerate limit, tearing can still occur below G-Sync's 30Hz range, mainly during frametime spikes caused by asset loads and the like. Many take this as a sign that G-Sync is broken, but it's clear that V-Sync was always intended as a fallback for G-Sync below/above its range.
Thanks in advance; I'd welcome replies and input from any and all who have knowledge on this subject.