Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
MaxiJazz
Posts: 16
Joined: 14 Mar 2017, 05:13

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by MaxiJazz » 17 Apr 2018, 06:31

What about Windows 10 DWM?
I looked at the "G-SYNC range" chart, and it says that IF V-SYNC is off AND fps < 36, tearing will occur.
But it doesn't occur for me.
I tested with many games (ACO, NFS Payback, Diablo 3, Overwatch, Painkiller...), forced V-SYNC off in both the driver and the game, always using fullscreen mode, with fullscreen optimizations both on and off. Not a single tear if fps < 36. (I even forced it below 20 and still saw no tearing in these scenarios.)

If fps is above my refresh rate, tearing occurs as it should.

So is it because of the new Windows 10 DWM, or am I doing something wrong?

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Sparky » 17 Apr 2018, 07:24

The monitor starts doubling frames before it has to, so that it can still display the new frame on time. You may get tearing occasionally when framerate suddenly drops, but if your framerates are consistently low there shouldn't be tearing.
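As a rough, hypothetical sketch of that pre-emptive frame-doubling behavior (the 36Hz/144Hz range bounds are illustrative assumptions, not measured values for any specific monitor):

```python
# Hypothetical sketch of how a VRR module might repeat refreshes below
# the minimum refresh rate so the panel never idles long enough to tear.
# MIN_HZ/MAX_HZ are illustrative assumptions, not measured values.

MIN_HZ = 36.0   # assumed bottom of the VRR range
MAX_HZ = 144.0  # assumed top of the VRR range

def refreshes_per_frame(fps: float) -> int:
    """Return how many times one rendered frame must be repeated so the
    panel's effective refresh rate stays at or above MIN_HZ (fps > 0)."""
    multiplier = 1
    while fps * multiplier < MIN_HZ:
        multiplier += 1
    return multiplier

# A 20 fps game is shown at an effective 40 Hz by repeating each frame
# twice, which is why low-but-steady framerates don't tear.
print(refreshes_per_frame(20))  # -> 2
print(refreshes_per_frame(10))  # -> 4
print(refreshes_per_frame(60))  # -> 1
```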

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 17 Apr 2018, 12:53

Chief Blur Buster wrote:Below ~30Hz, not ~36Hz. (at least for GSYNC)
FYI to anyone still not clear on this: that <36 number was approximate, and was based on my original research, which applied to the earlier 144Hz G-SYNC TN models that started inserting duplicate refreshes around that range, as demonstrated by pcper.com with an oscilloscope:

[Image: pcper.com oscilloscope capture showing duplicate refresh insertion]

It's possible that newer G-SYNC monitors, and/or those with higher max refresh rates, have a different starting point for the minimum refresh range.
MaxiJazz wrote:What about Windows 10 DWM?
I looked at the "G-SYNC range" chart, and it says that IF V-SYNC is off AND fps < 36, tearing will occur.
But it doesn't occur for me.
I tested with many games (ACO, NFS Payback, Diablo 3, Overwatch, Painkiller...), forced V-SYNC off in both the driver and the game, always using fullscreen mode, with fullscreen optimizations both on and off. Not a single tear if fps < 36. (I even forced it below 20 and still saw no tearing in these scenarios.)

If fps is above my refresh rate, tearing occurs as it should.

So is it because of the new Windows 10 DWM, or am I doing something wrong?
As Sparky already noted, tearing only occurs in that scenario when the framerate changes are abrupt.

Read again and you'll see my article was only referring to tearing from "Frametime Variances" in the <36 range. If there is a gradual or steady transition between the <36 range and higher/lower framerates, or a constant framerate at or below the <36 range, you won't experience tearing with G-SYNC + V-SYNC Off; but as soon as you hit a frametime spike, it will tear. This goes for any framerate with G-SYNC + V-SYNC Off.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 17 Apr 2018, 13:23

Thanks for the clarification! Yes, you're right, pcper did some fantastic analysis on this low-framerate behavior.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Image
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Malinkadink
Posts: 25
Joined: 21 Dec 2017, 21:20

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Malinkadink » 14 May 2018, 22:51

I want to lay this to rest for myself: do we have a concrete answer as to whether windowed/borderless G-SYNC adds a frame of delay? I know these 101 tests showed no added delay -- just 1ms, which is well within the margin of error -- but Battlenonsense ran similar tests and found that windowed G-SYNC added a frame of delay compared to fullscreen.

Who am I to believe?

I'd like to think there's no added delay, but actually telling the difference isn't always easy, and I don't have the equipment to do any tests myself. It makes sense that G-SYNC overrides the DWM in Windows 10 to do its job, thereby circumventing the triple-buffered V-SYNC, but I just want to be sure that's the case across all titles.

User avatar
lexlazootin
Posts: 1251
Joined: 16 Dec 2014, 02:57

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by lexlazootin » 15 May 2018, 03:48

Did he run 'windowed' or 'borderless windowed'? Because just simply windowed doesn't work.

Malinkadink
Posts: 25
Joined: 21 Dec 2017, 21:20

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Malinkadink » 15 May 2018, 09:20

lexlazootin wrote:Did he run 'windowed' or 'borderless windowed'? Because just simply windowed doesn't work.
He used both windowed and borderless windowed at a 142 fps cap with V-SYNC off. His results were from April 2017, while the G-SYNC 101 tests were published June 2017, so there is a two-month gap, but I don't think that had anything to do with it. I think the culprit may be that he ran all his tests with V-SYNC off, whereas the 101 tests had V-SYNC on in the NVCP.

I myself actually ran V-SYNC off in the NVCP for a time, until I learned that at the high end of the refresh rate you'd actually get tearing, and needed V-SYNC on in the NVCP to eliminate tearing completely when capping 2-3 fps below the max Hz.

So my guess is that running those tests with V-SYNC off left G-SYNC in a bit of a limbo, as 2 fps below maximum isn't always enough to keep G-SYNC engaged, whereas 3 fps is a safer bet -- which is even mentioned as being the case in the 101 tests:

"As for the “perfect” number, going by the results, and taking into consideration variances in accuracy from FPS limiter to FPS limiter, along with differences in performance from system to system, a -3 FPS limit is the safest bet, and is my new recommendation. A lower FPS limit, at least for the purpose of avoiding the G-SYNC ceiling, will simply rob frames."

I suppose I've answered my own question then; it seems the difference in testing methodology caused the discrepancy between what Battlenonsense showed and what the G-SYNC 101 tests here showed.

User avatar
jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by jorimt » 15 May 2018, 14:13

I actually used a -2 fps limit across the board in all those tests. I simply came to the conclusion that the absolute safe minimum would be -3, though this could vary with certain in-game limiters, which can fluctuate like crazy. -2 is perfectly safe in most instances regardless.

As for G-SYNC borderless/windowed and DWM lag, I wasn't originally going to even test for that scenario until someone suggested it, and when I did, I was surprised at the results. As I say in the article, "Further testing may be required, but it appears on the latest public build of Windows 10 with out-of-the-box settings (with or without “Game Mode”), G-SYNC somehow bypasses the 1 frame of delay added by the DWM."

It could vary by system, but on my setup in the article (which I've now upgraded from), I couldn't get it to tear in borderless or windowed without G-SYNC, where others can, so it's possible that may have been a factor, but then that means I was testing the worst case scenario, and there was still no more lag in G-SYNC borderless/windowed than in exclusive fullscreen.

The slight increase for the windowed mode in some instances could be because the frame update starts right at the top of the screen with G-SYNC, and with windowed, there was obviously the window title bar at the very top, which made detection of the start of the frame impossible in that area. Otherwise, yeah, everything was well within my margin of error.

Battlenonsense also got a notable reduction in input lag with Game Mode, whereas my undocumented results in this thread showed no improvement whatsoever.

System differences make direct comparison of some testing scenarios difficult.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 15 May 2018, 14:15

In an ideal situation, if well optimized, the occasional input lag from pushing a tearline off the bottom edge of the screen is only microseconds. For example, at a 160 KHz horizontal scanrate (a display scanning 160,000 pixel rows per second), a tearline that's about 20 pixels from the bottom edge of the screen only requires a 20/160,000th of a second wait (~0.125ms) of briefly-activated VSYNC ON to push that tearline off the bottom edge of the screen; then things are back to G-SYNC normal.

Displays scan top-to-bottom, and tearlines coincide with interruptions to frame scanout, so tearlines shift down with waits. The closer the tearline is to the bottom edge of the screen, the shorter the wait that would have been needed to avoid creating it.
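That arithmetic can be sketched as a toy calculation, using the example numbers from this post:

```python
# Toy calculation: how long a briefly-activated VSYNC ON wait must be to
# push a tearline off the bottom edge of the screen, given the display's
# horizontal scanrate (pixel rows scanned out per second).

def tearline_push_delay_ms(pixels_from_bottom: int, scanrate_hz: float) -> float:
    """Delay (ms) to scan out the remaining pixel rows below the tearline."""
    return pixels_from_bottom / scanrate_hz * 1000.0

# The example above: 160 KHz scanrate, tearline 20 pixel rows from the
# bottom edge -> 20/160,000th of a second.
print(tearline_push_delay_ms(20, 160_000))  # -> 0.125 (ms)
```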

At some point, once you've capped sufficiently far below max Hz, the bottom-edge tearlines of G-SYNC + VSYNC OFF may only appear once a second, or even less often. At that point, G-SYNC + VSYNC ON only adds input lag momentarily -- only on that particular refresh cycle -- and only by a sub-millisecond amount. That's why a sufficiently large cap stops showing a difference between G-SYNC + VSYNC OFF and G-SYNC + VSYNC ON: the rare tearline now costs mere microseconds of delay, on fewer than 1 in 100 frames.

There is a point of diminishing returns once we're touching the capping threshold.

If tearlines never happen, then VSYNC ON versus VSYNC OFF doesn't affect the G-SYNC equation, because neither ever engages. Those modes only engage for frametimes faster than max Hz (the fastest refresh cycle time) -- i.e. frames completed quicker than a refresh cycle. Capping prevents that from happening and keeps the display in continual VRR. As explained in jorimt's G-SYNC 101, capping avoids the lag problems of hitting VRR max.

That said, if you've capped well enough that G-SYNC + VSYNC OFF (with RTSS capping) creates only a very rare tearline at the very bottom edge of the screen, then the momentary input lag of 'pushing' that tearline off the screen is insignificant at the display scanrates involved.

Nitpicking on details, though: once we're at the very threshold, the fuzzy line no longer matters, and the lag surge is no longer measurable in test results once the cap is roughly ~2 to ~3 fps below max.

At some point, adding 1 fps to the cap margin has no effect on lag results, even if occasional microsecond delays are still happening. Those who are very picky can use larger cap margins like 3, 4, or 5 fps, but that is generally overkill once the test results no longer move by a single millisecond.

Summary:
-- The lower the max refresh rate (e.g. emulators at a 60fps cap), the tighter the capping margin can be (e.g. 1fps below).
-- The more accurate the capping (e.g. RTSS instead of an in-game limiter), the tighter the capping margin can be.

This is useful for users of 4K 60Hz G-SYNC or FreeSync monitors -- who cannot use a higher refresh rate -- but who want to run emulators. They can run RTSS (an accurate cap) at roughly 59fps or 59.5fps and intentionally slow down their emulator slightly, in order to get the majority of the input-lag savings. Or they can overclock their display to 61Hz and use a 60fps cap. The lowness of the Hz, combined with the accuracy of RTSS, allows you to greatly tighten the capping margin in a situation where you need low-lag 60fps on a 60Hz-only VRR display and would otherwise have zero room for a margin.

But for a lot of people, having VRR means having access to higher Hz too. So a common 141fps cap for 144Hz GSYNC monitors is quite popular.
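As an illustrative sketch of this capping guidance (the limiter-accuracy figures below are assumptions for illustration, not measurements):

```python
# Illustrative sketch of the capping guidance above: the safe cap is the
# max refresh rate minus a margin covering the limiter's inaccuracy.
# The error figures passed in are assumptions, not measured values.

def safe_fps_cap(max_hz: float, limiter_error_fps: float) -> float:
    """Cap far enough below max Hz that limiter jitter never exceeds it."""
    return max_hz - limiter_error_fps

# An accurate external limiter (RTSS-style) permits a tight margin on a
# low-Hz display...
print(safe_fps_cap(60.0, 0.5))   # -> 59.5 (emulator on a 60 Hz VRR display)
# ...while a jittery in-game limiter on a 144 Hz display needs the usual
# -3 fps headroom.
print(safe_fps_cap(144.0, 3.0))  # -> 141.0
```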

User avatar
RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 22 May 2018, 12:40

Unwinder (author of RTSS) has just posted the changes for the beta of RTSS 7.2.0:
  • Added power user oriented profile setting, allowing you to specify the limit directly as a target frametime with 1 microsecond precision
  • Added power user oriented profile setting, allowing you to adjust throttle time. Throttle time adjustment is aimed to reduce input lag when framerate is below the target limit or without limiting the framerate
This is implementing a frame limiting method we came up with a while ago in this thread. If you just limit a game to -3FPS below max refresh rate, but the game is not able to reach that frame rate because the GPU is at full load, you get increased input lag. For your CSGOs, OWs, Quakes, etc, this is not an issue. But it is an issue for your Witcher 3s, your Fallout 4s, etc. And for 4K users or DSR/VSR users. The only solution to avoid input lag inconsistency is to cap near the lowest FPS you get, to ensure that the game is reaching the cap for the majority of the time.

This upcoming RTSS feature will prevent the GPU from being maxed out, and the game will behave as if it is always reaching the target frame cap. If you cap to 141FPS for example, but the game falls to 135FPS, RTSS will behave as if the cap is 134FPS (or 134.9FPS, the throttle amount is configurable.)

This should bring much improved input lag consistency to GPU-heavy games. It's not only useful for G-SYNC/FreeSync, but also for plain old VSYNC OFF without VRR in GPU-bound games.
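A rough sketch of how such a throttle could behave (my interpretation of the behavior described above, not Unwinder's actual implementation):

```python
# Rough sketch of the described dynamic-throttle idea: when the game
# undershoots the cap, the limiter enforces a cap just below the achieved
# framerate so the GPU never runs fully maxed out. This is an
# interpretation of the described behavior, not RTSS's real code.

def effective_cap(target_fps: float, achieved_fps: float,
                  throttle_fps: float = 1.0) -> float:
    """Return the framerate the limiter actually enforces at this moment."""
    if achieved_fps >= target_fps:
        return target_fps                 # game reaches the cap: use it
    return achieved_fps - throttle_fps    # undershoot: back off slightly

# The example from the post: a 141 fps cap, game falls to 135 fps, so the
# limiter behaves as if the cap were 134 fps.
print(effective_cap(141.0, 150.0))  # -> 141.0 (GPU has headroom)
print(effective_cap(141.0, 135.0))  # -> 134.0 (GPU-bound, cap follows down)
```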
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Post Reply