Thanks to the OP and the community for this detailed analysis of G-Sync.
I like G-Sync very much, but I noticed my monitor has a weird issue when I turn G-Sync on, and I'm not sure if my monitor is defective, or whether it's a cable issue or a firmware/driver issue.
My monitor is the XL2420G. When I turn on G-Sync and load up some games (Guild Wars 2, Overwatch, For Honor), I notice a slight brightness fluctuation (it's like the brightness blinks from, e.g., 90 down to 89 and then back up to 90).
The "blinking brightness" is very noticeable on game loading screens,
in game menus (especially in For Honor, less so in Overwatch),
and less during gameplay, but the issue is still sometimes there.
I find this annoying.
Is this normal? Or should I RMA my monitor?
My system is running a
1070 Founders Edition with dual monitors: BenQ XL2420G + BenQ XL2430T
Regards
[Thread Superseded] G-Sync 101 w/Chart (WIP)
Re: G-Sync 101 w/Chart (WIP)
If it is what I think it is, it's normal. The issue and its explanation were documented by PC Perspective a few years back. This was before there were IPS G-Sync panels, and it appeared to affect all of the TN-based panels at the time.
Long story short, with G-Sync, the screen has to refresh itself more at near zero framerates (loading screens, or momentary drop from a very high fps to a very low fps, etc), else it would go black. This increase in refreshes creates a slight brightness fluctuation compared to normal operation (1-2%), thus the flicker.
More details here:
https://www.pcper.com/reviews/Editorial ... Flickering
https://www.youtube.com/watch?v=ujgRjsmwtgY
I have a newer IPS G-Sync panel myself, and while rare, I have noted the issue on certain loading screens with G-Sync enabled (Half-Life 2, etc). It virtually never happens on my panel, and when it does, it is extremely slight, and usually only visible on bright/solid loading screens when the framerate is near 0 for extended periods of time. It's possible IPS panel technology is less susceptible to the flicker when compared to TN panels, but yes, it is expected.
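The frame-repeat behavior described above can be sketched roughly like this. This is a simplified illustrative model, not NVIDIA's actual algorithm; the 30Hz panel minimum, the 144Hz maximum, and the multiplier logic are assumptions for the sake of the example:

```python
def effective_refresh_rate(fps, min_hz=30.0, max_hz=144.0):
    """Pick a panel refresh rate for a given frame rate.

    Within the VRR range the panel refreshes 1:1 with the game.
    Below the panel minimum, each frame is repeated (doubled,
    tripled, ...) to keep the panel refreshing, instead of
    letting the image decay toward black.
    """
    if fps >= min_hz:
        return min(fps, max_hz)  # 1:1 refresh within the VRR range
    # Find the smallest repeat multiplier that lifts the refresh
    # rate back above the panel minimum.
    multiplier = 2
    while fps * multiplier < min_hz:
        multiplier += 1
    return fps * multiplier

# A loading screen hitching down to 5 fps still refreshes the
# panel at 30 Hz (5 fps x 6 repeats) rather than going dark.
```

The brightness flicker comes from these transitions: the panel's effective refresh cadence changes abruptly when the module switches in and out of frame repetition, which shifts the average brightness by a percent or two.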
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series
Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)
Re: G-Sync 101 w/Chart (WIP)
Basically, when you stop refreshing a panel it slowly fades to white over a few seconds, and that starts immediately. So there's a slight decrease in contrast and increase in brightness when you go from a high refresh rate to the "minimum" refresh rate that you get when your framerate hitches down to zero during a load. G-sync and freesync mitigate this by repeating old frames, but you're still stuck with a panel technology that was intended to be refreshed regularly.
Re: G-Sync 101 w/Chart (WIP)
Just wanted to add that for the lowest input lag we must also set max pre-rendered frames to 1 in NVCP, and also in the game if it has the option. Maybe this has something to do with why people have reported input lag when using G-Sync + V-Sync?
Another thing to mention is the Windows "bcdedit /set useplatformclock true" command, which should toggle whether to use the [supposedly more accurate] motherboard hardware clock for CPU timing, which RTSS may or may not be using.
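For reference, the toggle in question looks like this (run from an elevated Command Prompt; the change takes effect after a reboot):

```shell
:: Force Windows to use the HPET (platform clock) as its timer source
bcdedit /set useplatformclock true

:: Revert to the default timer selection
bcdedit /deletevalue useplatformclock
```

Note this only sets a boot-time preference; as discussed in this thread, whether Windows actually behaves differently in practice is debated.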
- lexlazootin
Re: G-Sync 101 w/Chart (WIP)
@sippycup
Yep, this happens because when your monitor's refresh rate drops below ~45Hz, the FRC stops working. I have the same monitor and it happens to me too on loading screens.
Re: G-Sync 101 w/Chart (WIP)
pneu wrote: Just wanted to add that for lowest input lag we must also set max prerendered frames to 1 in NVCP, and also in the game if it has the option. Maybe this has something to do with why people have reported input lag when using gsync+vsync?

Another thing to mention is the Windows "bcdedit /set useplatformclock true" command, which should toggle whether to use the [supposedly more accurate] motherboard hardware clock for CPU timing, which RTSS may or may not be using.

Ah yes, max pre-rendered frames. Honestly, I considered including this setting in my tests, but it's far too difficult to measure consistently.
Lowering that setting does not 100% guarantee lower input latency in all circumstances, and it can react differently (wildly, really) depending on the game and system in question. That, and some games already run with it set at a low number internally.
At best, you can reduce input latency by a single frame, at worst, setting it to "1" can cause a lot more frametime spikes, which happens to me far more often, so I just leave the setting at default.
As for HPET, there has been endless speculation about it over the years, but no concrete proof that it is beneficial, or that the setting does anything over the default Windows settings (Windows decides when to use it either way in most instances).
lexlazootin wrote: @sippycup Yep, this happens because when your monitor's refresh rate drops below ~45Hz, the FRC stops working. I have the same monitor and it happens to me too on loading screens.

I didn't even think of that; my display is native 8-bit, no FRC. That could definitely be exacerbating the issue on that particular model.
(jorimt: /jor-uhm-tee/)
Re: G-Sync 101 w/Chart (WIP)
Mine is native 8-bit as well, and yet I experienced what sippycup is describing, at least on some loading screens.
Monitor: Gigabyte M27Q X
- lexlazootin
Re: G-Sync 101 w/Chart (WIP)
I'm pretty sure it stops working as soon as the frames start to get doubled to deal with the minimum refresh rate limit.
Re: G-Sync 101 w/Chart (WIP)
Strange, this RTSS...
My game (MechWarrior Online, on CryEngine) has an internal FPS cap that introduces zero latency.
Setting that to ~57 fps and then, while in-game, enabling RTSS to reduce FPS to 55 and override that internal cap does not introduce any additional input lag. With my low-latency Predator screen I can just about feel the difference between 125Hz and 1000Hz polling rate, which is roughly a 4ms response difference in practice.
I would sure as hell be able to feel a one-frame delay in that case, right? Which should be higher.
On another note, it seems as if MWO behaves differently with HPET vs. TSC, as the internal FPS cap is also affected by the timer used and deviates by 1 fps or so. I'm not sure if it's just the in-game FPS graph that shows the difference or whether it actually changes the FPS as well. Pretty sure it's the latter.
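As a back-of-the-envelope sanity check on that ~4ms figure: the average sampling delay added by polling is half the polling interval, so comparing 125Hz against 1000Hz gives a difference in that ballpark.

```python
def avg_poll_delay_ms(polling_hz):
    """Average input-sampling delay in ms: half the polling interval."""
    interval_ms = 1000.0 / polling_hz
    return interval_ms / 2

# 125 Hz polls every 8 ms  -> 4.0 ms average delay
# 1000 Hz polls every 1 ms -> 0.5 ms average delay
delta = avg_poll_delay_ms(125) - avg_poll_delay_ms(1000)  # 3.5 ms
```

So the perceived ~4ms difference matches the worst-case gap (8ms vs 1ms) more than the average, but either way it is in the low single-digit millisecond range.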
LTSC 21H2 Post-install Script
https://github.com/Marctraider/LiveScript-LTSC-21H2
System: MSI Z390 MEG Ace - 2080 Super (300W mod) - 9900K 5GHz Fixed Core (De-lid) - 32GB DDR3-3733-CL18 - Xonar Essence STX II
Re: G-Sync 101 w/Chart (WIP)
MT_ wrote: Setting that to ~57 fps and then, while in-game, enabling RTSS to reduce FPS to 55 and override that internal cap does not introduce any additional input lag.

You only get that with V-Sync. With G-Sync there shouldn't be any added latency.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.