[Thread Superseded] G-Sync 101 w/Chart (WIP)

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
Locked
kurtextrem
Posts: 41
Joined: 05 Mar 2017, 03:35
Location: Munich, Germany

Re: G-Sync 101 w/Chart (WIP)

Post by kurtextrem » 08 May 2017, 05:50

RealNC, by any chance, do you own Rainbow Six Siege? While playing, my CPU stays around 80%-100% utilization, and it feels like setting MPRF to 1 introduces stuttering, while 2 doesn't. The mouse feels different too because of the stuttering.
And I still wonder whether the Virtual Reality pre-rendered frames setting does anything if you're not using a VR headset?
Acer XF250Q, R6 competitive player

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 08 May 2017, 09:36

lexlazootin wrote:Just noticed that if you force V-Sync off in NVCP, you get tearing from the top of the screen if your fps goes below double framerate mode (~45 fps).

Found this out speedrunning HL; when I set my fps to 5, it would tear all over :P
I said about as much in my article:
http://www.blurbusters.com/gsync/gsync101-range/
In the Upper fps range, tearing will be limited to the bottom of the display. In the Lower fps range (<36) where frametime spikes can occur (see “What are Frametime Spikes?” section further below), it will result in a complete middle tearline.
Perhaps you're differentiating v-sync "Force off" vs. "Use 3D application setting" + v-sync off in-game in this instance (?), but I'm pretty sure the effect is the same.

I'm not sure why it's so difficult for people to catch on, but G-SYNC + v-sync off is well and truly "Adaptive G-SYNC." The only difference between it and Adaptive V-Sync is that it disengages itself both above and below a certain fps threshold.
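Roughly, that behavior breaks down like this (a simplified sketch, not actual driver logic; the range values are example assumptions for a 144 Hz panel, with the <36 fps figure taken from the article excerpt above):

# Simplified model of which mechanism handles a given framerate
# under G-SYNC with v-sync on vs. off.
GSYNC_MIN_FPS = 36    # below this, frametime spikes can slip a tearline in
GSYNC_MAX_FPS = 144   # the panel's maximum refresh rate

def active_sync(fps, vsync_on):
    if fps > GSYNC_MAX_FPS:
        # Above the range, the v-sync setting decides the fallback behavior.
        return "v-sync behavior (added lag)" if vsync_on else "no sync (tearing)"
    if fps < GSYNC_MIN_FPS and not vsync_on:
        return "G-SYNC, but frametime spikes may produce a tearline"
    return "G-SYNC (refresh rate tracks framerate)"

print(active_sync(100, vsync_on=False))  # within range -> G-SYNC handles it
print(active_sync(200, vsync_on=False))  # disengages above the range -> tearing
print(active_sync(200, vsync_on=True))   # reverts to v-sync behavior instead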
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

pneu
Posts: 35
Joined: 01 Mar 2015, 12:12

Re: G-Sync 101 w/Chart (WIP)

Post by pneu » 15 May 2017, 23:00

jorimt wrote:
RTSS Update (01/24/2017): A recent video by Battle(non)sense (https://youtu.be/rs0PYCpBJjc?t=2m32s) has posed the possibility that RTSS is adding 1 frame of input latency with G-Sync. I'm currently investigating the cause of this discrepancy, and will update here when I learn more.

RTSS Update (03/26/2017): RTSS does indeed appear to introduce up to 1 additional frame of latency, even with G-Sync:
I hate to be a pain but are you able to retest with these RTSS settings:

-application detection level: Low
-framerate limit: 138
-on screen display rendering mode: vector 3D
-on screen display coordinate space: viewport
-game applying the vsync vs NVCP applying the vsync
-All PCI devices in message signaled interrupt mode (http://forums.guru3d.com/showthread.php?t=378044)

I suspect the application detection level is affecting input lag levels, and I believe 142 is not a sufficiently low limit to keep it in the G-SYNC range 100.0% of the time either. The "raster 3D" mode of the on-screen display is more advanced and supports more fonts, which could potentially take longer to render as well.

Having said this, I find myself surprised to seemingly be detecting differences in input lag with RTSS vs. the in-game limiter. It only seems to occur on asset loading, such as when getting to a new section of a map, where I think I can feel the input lag growing at the mouse cursor for a few seconds and then settling back to normal, whereas with the in-game limiter I can't seem to detect this. It's so hard to tell objectively though, and you are one of only two guys with the objective tools. It's a problem in iRacing, for example, because even the slightest deviation in input lag, even briefly of 7 ms, can cause you to miss your turn-in point by a few meters, which amounts to a big loss in time. It would be better if it actually stayed at 7 ms, because then you could adjust to it, but when it's spiking randomly +/-7 ms you can't compensate for that, which RTSS seems to be exacerbating. I read somewhere that it's because RTSS is stalling the whole rendering pipeline, whereas the in-game limiter is only sleeping inside the render thread while other threads are still free to do things like calculate the next scene, etc.
Last edited by pneu on 15 May 2017, 23:52, edited 4 times in total.

pneu
Posts: 35
Joined: 01 Mar 2015, 12:12

Re: G-Sync 101 w/Chart (WIP)

Post by pneu » 15 May 2017, 23:05

jorimt wrote: @RealNC, The Witcher 3 has an asset streaming system that is prone to creating frequent frametime spikes (at least on initial load)
I hope you are running it off an SSD, because I have noticed the same thing in The Witcher 2 comparing SSD vs. HDD: the former is free of frame spikes on asset loading, while the latter is only free of spikes after the first run through the map, because on the second run through Windows is reading it from its memory cache, so it actually loads from RAM the second time (or possibly from the cache on the HDD itself, but not Superfetch/Prefetch, because I have those disabled). I observe a similar thing in other games: on the first run through everything is loading off disk, and disk reads have a latency of ~25 ms for an HDD vs. ~4 ms for an SSD (according to Windows Resource Monitor), but on the second run through it's all from memory and much faster, with no pauses when loading new sections of the map.
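If anyone wants to see that cache effect for themselves, here is a rough sketch (the file path is just a placeholder for any large game asset):

# Quick illustration of the OS file cache described above: the first (cold)
# read comes off the disk, the second (warm) read is served from RAM.
import time

def timed_read_ms(path):
    start = time.perf_counter()
    with open(path, "rb") as f:
        data = f.read()
    return len(data), (time.perf_counter() - start) * 1000

path = "some_large_asset.pak"        # placeholder: point this at any big file
size, cold_ms = timed_read_ms(path)  # off the HDD/SSD
size, warm_ms = timed_read_ms(path)  # from Windows' memory cache
print(f"{size / 1e6:.0f} MB  cold: {cold_ms:.1f} ms  warm: {warm_ms:.1f} ms")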
Last edited by pneu on 15 May 2017, 23:53, edited 1 time in total.

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Post by RealNC » 15 May 2017, 23:47

kurtextrem wrote:RealNC, by any chance, do you own Rainbow Six Siege? While playing, my CPU stays around 80%-100% utilization, and it feels like setting MPRF to 1 introduces stuttering, while 2 doesn't. The mouse feels different too because of the stuttering.
And I still wonder whether the Virtual Reality pre-rendered frames setting does anything if you're not using a VR headset?
I don't have that game.

The VR setting overrides MPRF when VR is active. It's forced to 1 by default because MPRF > 1 adds latency, and that's a big no-no for VR. Otherwise, it's the same setting, just applied as an override while VR is active.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32

Re: G-Sync 101 w/Chart (WIP)

Post by RealNC » 16 May 2017, 00:02

pneu wrote:I suspect the application detection level is affecting input lag levels
Unlikely. The detection level is about hooking into more APIs and/or rendering methods. It's an "it works or it doesn't" setting.
pneu wrote:The "raster 3d" mode of on screen display is more advanced and supports more fonts which could potentially take longer to render as well.
Also unlikely. These are loaded once, not on every OSD update. There are differences in render times, but today they should be in the microsecond range. They only had an observable impact back when RTSS was first developed (we were using DX8 GPUs back then, which were orders of magnitude slower than today's cards).
pneu wrote:Having said this, I find myself surprised to seemingly be detecting differences in input lag with RTSS vs. the in-game limiter. It only seems to occur on asset loading, such as when getting to a new section of a map, where I think I can feel the input lag growing at the mouse cursor for a few seconds and then settling back to normal
I think this is because of the way RTSS works. The reason RTSS is so accurate is that it effectively freezes the game to prevent it from rendering more frames. This also means that asset loading can be blocked if the game's asset loading mechanism is tied to the main loop (meaning it doesn't run on an independent thread; "independent" means a thread that doesn't lock on the main thread). It could also be that RTSS blocks all threads of the game; I don't really know. In that case, all games would exhibit the issue, although that sounds unlikely: blocking the whole process would probably translate into audio and/or network issues. Using a 1 FPS cap (or lower, like 0.1 FPS by editing the profile file) would probably make the sound stutter.
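A toy sketch of that difference in where the wait happens (this is not RTSS's actual mechanism, just the idea):

# Toy model of the two limiter styles: if asset loading is tied to the main
# loop, a cap that stalls the loop stalls the loading as well; if loading
# runs on its own thread, only rendering is paced by the cap.
import threading, time

FRAME_TIME = 1.0 / 138   # assumed 138 fps cap

def wait_for_next_slot(next_slot):
    next_slot += FRAME_TIME
    delay = next_slot - time.perf_counter()
    if delay > 0:
        time.sleep(delay)
    return next_slot

def loop_with_loading_on_main_thread(render, load_assets):
    slot = time.perf_counter()
    while True:
        render()
        load_assets()                    # held back whenever the loop is stalled
        slot = wait_for_next_slot(slot)  # everything above waits here

def loop_with_loading_on_worker_thread(render, load_assets):
    threading.Thread(target=load_assets, daemon=True).start()  # runs freely
    slot = time.perf_counter()
    while True:
        render()                         # only rendering is paced by the cap
        slot = wait_for_next_slot(slot)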
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 16 May 2017, 10:42

pneu wrote: I hate to be a pain but are you able to retest with these RTSS settings:

-application detection level: Low
-framerate limit: 138
-on screen display rendering mode: vector 3D
-on screen display coordinate space: viewport
-game applying the vsync vs NVCP applying the vsync
-All PCI devices in message signaled interrupt mode (http://forums.guru3d.com/showthread.php?t=378044)

I suspect the application detection level is affecting input lag levels, and I believe 142 is not a sufficiently low limit to keep it in the G-SYNC range 100.0% of the time either. The "raster 3D" mode of the on-screen display is more advanced and supports more fonts, which could potentially take longer to render as well.

Having said this, I find myself surprised to seemingly be detecting differences in input lag with RTSS vs. the in-game limiter. It only seems to occur on asset loading, such as when getting to a new section of a map, where I think I can feel the input lag growing at the mouse cursor for a few seconds and then settling back to normal, whereas with the in-game limiter I can't seem to detect this. It's so hard to tell objectively though, and you are one of only two guys with the objective tools. It's a problem in iRacing, for example, because even the slightest deviation in input lag, even briefly of 7 ms, can cause you to miss your turn-in point by a few meters, which amounts to a big loss in time. It would be better if it actually stayed at 7 ms, because then you could adjust to it, but when it's spiking randomly +/-7 ms you can't compensate for that, which RTSS seems to be exacerbating. I read somewhere that it's because RTSS is stalling the whole rendering pipeline, whereas the in-game limiter is only sleeping inside the render thread while other threads are still free to do things like calculate the next scene, etc.
The delay RTSS introduces has nothing to do with the RTSS, in-game, or control panel settings. RTSS simply doesn't begin regulation of the framerate early enough in the rendering cycle, and it couldn't unless it were part of the game engine it's being used with.

I'm currently very busy finishing up my upcoming G-SYNC 101: Input Latency article, so I can assure you that, with G-SYNC, 2 frames below the refresh rate is already safe, and 3 frames is super safe.
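In plain numbers (144 Hz and 120 Hz are used only as examples):

# The cap values being described, as simple arithmetic.
def gsync_cap(refresh_hz, margin=2):
    # Cap a couple of fps below the refresh rate so the framerate
    # can't push the display out of its G-SYNC range.
    return refresh_hz - margin

print(gsync_cap(144))            # 142 -> "already safe"
print(gsync_cap(144, margin=3))  # 141 -> "super safe"
print(gsync_cap(120))            # 118 for a 120 Hz panel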

I will be including an in-game vs RTSS vs Nvidia Inspector FPS limiter test, but only for one game. Perhaps I can dedicate a future article to testing RTSS across a series of games, but that would come much later.

As for the "PCI devices in message signaled interrupt mode," that's really getting into the weeds, and as far as I know, with Windows 10, IRQ settings and conflicts are a thing of the past, and it is all automatically managed by the system.
pneu wrote:
jorimt wrote: @RealNC, The Witcher 3 has an asset streaming system that is prone to creating frequent frametime spikes (at least on initial load)
I hope you are running it off an SSD, because I have noticed the same thing in The Witcher 2 comparing SSD vs. HDD: the former is free of frame spikes on asset loading, while the latter is only free of spikes after the first run through the map, because on the second run through Windows is reading it from its memory cache, so it actually loads from RAM the second time (or possibly from the cache on the HDD itself, but not Superfetch/Prefetch, because I have those disabled). I observe a similar thing in other games: on the first run through everything is loading off disk, and disk reads have a latency of ~25 ms for an HDD vs. ~4 ms for an SSD (according to Windows Resource Monitor), but on the second run through it's all from memory and much faster, with no pauses when loading new sections of the map.
I've run it off both an SSD and an HDD, and there's no change in frametime spikes due to asset loading, but loading screens are considerably shortened, yes, which is the typical improvement seen with an SSD.

The Witcher 3 appears to stream in most of its assets upon initial load-up over a period of a few minutes, which then appear to cache unless the game is restarted or a save is manually reloaded. So while an SSD may load the initial assets faster, it still has to load them in, which results in hitches here and there regardless, at least in my experience.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

streetunder
Posts: 2
Joined: 27 Apr 2017, 12:14

Re: G-Sync 101 w/Chart (WIP)

Post by streetunder » 16 May 2017, 14:13

jorimt wrote:
streetunder wrote: Then i was surprised, i still get tearing in all games, so i went online to search for a solution or to find out if have a hardware problem, since i always heard that G-Sync doesnt need V-Sync to work...

Yesterday I found out about this thread and read the 1st post, and now everything is more "clear," so thank you very much for the information provided; you are helping lots of people. 8-)
I'm glad to hear you've found my information useful. And yes, the G-SYNC "v-sync on/off" setting has confused many. But the best way to put it is that the setting has been poorly labeled; with G-SYNC enabled, v-sync never activates, regardless of the v-sync setting. The closest it gets is when you exceed your refresh rate, at which point G-SYNC + v-sync on begins to behave like traditional v-sync, as it can no longer adjust the refresh rate to the framerate.

My latest information on the G-SYNC v-sync on/off settings can be found here:
http://www.blurbusters.com/gsync/gsync101-range/

Much of my original post is outdated, and will become relatively irrelevant once my article is complete.
streetunder wrote: 1) If G-Sync is made to work with V-Sync, then why Nvidia never said anything about it?
The driver automatically turns G-Sync on by default and sets v-sync to "controlled by the 3D application".
If G-Sync was made to work with V-Sync from the beginning, there's no advantage to using G-Sync at all, because if you turn V-Sync on, G-Sync will never work, and you will get V-Sync input lag (it will basically work like a screen without G-Sync).
So you need to limit the framerate in a third-party program, set it 2 fps below the max refresh rate, for everything to work at 100% and to get G-Sync to work right:
- This is a mess of technology, and badly designed; why has Nvidia not fixed this yet? :evil:
G-Sync was released in 2013 or before, almost 4 years ago.

There isn't an easy answer for this, as it's technically oriented and multi-faceted. First of all, when G-SYNC first released, there was no "v-sync" option. G-SYNC was simply an option alongside v-sync in the v-sync dropdown of the control panel. Later, due to popular demand, they exposed the G-SYNC "v-sync" option, so that when you exceeded your refresh rate, with G-SYNC + v-sync off, G-SYNC would deactivate, instead of reverting to v-sync behavior and adding input latency.

Since G-SYNC eliminates tearing and sync-induced input latency by adjusting the refresh rate to the framerate, it is limited by the max refresh rate of the display. G-SYNC was originally meant to be a replacement for v-sync, and made to be used within the refresh rate of the display. It also helps smooth framerate delivery, and is best used when you can't reach the max refresh rate in a given game and you are experiencing a highly variable framerate.
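Put as a simplified model (the 30-144 Hz window and the frame-repeating behavior below it are assumptions for illustration, not measured values):

# Within its range, the panel's refresh rate simply tracks the framerate;
# above it, it can't go any faster; below it, frames are repeated so the
# panel stays inside its supported window.
PANEL_MIN_HZ = 30    # assumed example window
PANEL_MAX_HZ = 144

def panel_refresh_hz(fps):
    if fps > PANEL_MAX_HZ:
        return PANEL_MAX_HZ                # the hard ceiling described above
    if fps < PANEL_MIN_HZ:
        repeats = -(-PANEL_MIN_HZ // fps)  # ceiling division
        return fps * repeats               # e.g. 20 fps shown at 40 Hz
    return fps                             # refresh rate follows framerate

for fps in (200, 144, 90, 45, 20):
    print(fps, "fps ->", panel_refresh_hz(fps), "Hz")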

If you're constantly exceeding 120 fps on your 120 Hz display, and by a significant amount in the majority of games you play (unlikely at max settings in the latest triple-A games, for instance), you're better off playing with v-sync off on a high refresh rate monitor, if input latency is a prime concern. Also, the framerate limit is only necessary for games that exceed your refresh rate 99% of the time; if you're averaging 80 fps @144Hz, the framerate limit isn't needed. So it isn't G-SYNC that is "a mess of technology, and badly designed," but merely a limitation of the monitors G-SYNC is implemented on. The higher the refresh rates on G-SYNC displays become, the more benefit you will see from the technology.

streetunder wrote: 2) My screen is 120hz, so i need to limit the frame rate to 118fps for everything to work as it should?

I'm currently performing input latency re-tests across all available refresh rates for my upcoming article, so I'll have way more details to share then. But yes, 2 fps under your refresh rate should be enough to prevent additional sync-induced input latency.

streetunder wrote: 3) If i have G-Sync on with V-Sync off, do i get any benefits from G-Sync?
I know I will get tearing, but at low fps G-Sync smooths the frame rate, right?

The benefit you'll get over G-SYNC + v-sync on is partial tearing, which means the next frame will be delivered slightly faster (sometimes) at the same framerate limit, usually in the bottom area of the screen.
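As a rough picture of why the tear sits where it does (a simplified model; it assumes scanout always runs at the panel's max-refresh speed, and the 1440-line / 144 Hz figures are just examples):

# If a new frame is flipped mid-scanout, the tearline appears wherever the
# scanout happened to be at that moment.
PANEL_LINES = 1440       # assumed panel height in lines
SCANOUT_MS = 1000 / 144  # assumed scanout duration at max refresh

def tearline_row(ms_into_scanout):
    fraction = min(ms_into_scanout / SCANOUT_MS, 1.0)
    return int(fraction * PANEL_LINES)

print(tearline_row(6.5))  # frame arrives late in the scanout -> tear near the bottom
print(tearline_row(3.5))  # a frametime spike lands mid-scanout -> tear near the middle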

I've only tested the desktop implementation of G-SYNC however, and I know that laptop implementation may differ. Battle(non)sense has a good video testing G-SYNC input latency on laptops here:
https://www.youtube.com/watch?v=ADYzuMe17q8

streetunder wrote: 4) In the screen options menu I see a 3D option; does my screen support 3D?

I'm not an expert on 3D settings, so I don't know, maybe? Your monitor manual will probably tell you.

streetunder wrote: 5) What happens if I turn G-Sync On with Adaptive Sync On?

Nothing, or nothing good. At best, it will simply act like G-SYNC + v-sync off.

streetunder wrote: 6) Does G-Sync affect performance at all?
With G-Sync On, FireStrike 16.531
With G-Sync Off, FireStrike 17.682
-How is this possible?

For the latest G-SYNC modules with the latest hardware on desktop, no, there isn't a performance impact. I'm not sure about laptops. But the difference in score you're probably seeing there is 1) margin of error between tests, and 2) assuming you were using G-SYNC + v-sync on, G-SYNC limits the framerate to the max refresh of your display, whereas v-sync off (and G-SYNC disabled) allows your system to output frames above your refresh rate. Higher framerates = lower frametimes = better score. You should never benchmark with any syncing method or external framerate limit active.
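The arithmetic behind that last point (120 Hz matches the display mentioned above; 150 fps is just an assumed uncapped figure):

# Frametime is the inverse of framerate, so capping at the refresh rate
# inflates the average frametime and drags the benchmark score down.
def frametime_ms(fps):
    return 1000.0 / fps

print(frametime_ms(120))  # ~8.33 ms per frame when held to a 120 Hz ceiling
print(frametime_ms(150))  # ~6.67 ms per frame when left uncapped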


Thank you very much for the reply,

When will your upcoming article about all refresh rates be finished?

RealNC wrote:G-Sync on laptops runs a shader on the GPU, since there is no hardware g-sync module. That shader does overdrive calculations, and possibly other, as-of-yet undisclosed work on the GPU. This can slightly decrease performance.

On desktops, no shader is used. All calculations are done by the G-Sync module.

I don't think the performance difference would be as high as shown in your FireStrike bench, though.


Thank you for the reply,

I don't notice any performance difference with G-Sync off or on; maybe 3DMark is not compatible with G-Sync.

When I run 3DMark with G-Sync, I get a warning message to turn G-Sync off.

P.S.

Keep up the good work guys!

pneu
Posts: 35
Joined: 01 Mar 2015, 12:12

Re: G-Sync 101 w/Chart (WIP)

Post by pneu » 16 May 2017, 15:36

Understood, thanks.

I am still interested to know whether in-game v-sync vs. NVCP v-sync affects the input lag. I don't know if it's a placebo, but I seem to feel the input lag is lower with the in-game v-sync.

jorimt
Posts: 2481
Joined: 04 Nov 2016, 10:44
Location: USA

Re: G-Sync 101 w/Chart (WIP)

Post by jorimt » 16 May 2017, 15:37

streetunder wrote: Thank you very much for the reply,

When will your upcoming article about all refresh rates be finished?
Happy to offer it.

The G-SYNC 101: Input Latency article will be released sometime within the coming 1-3 weeks. I'm still working on it (literally hundreds of test samples still to go through), so that's as close to an estimate as I can give at this time.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Locked