I have been using RTSS + VSYNC to simulate double buffered VSYNC as explained in this Blur Busters article. Even though I can't take measurements, I can feel the difference in input lag versus using normal VSYNC.
I tried using the "adaptive" setting in Nvidia Control Panel and removed the framerate cap from RTSS. I noticed that the input lag felt the same as with the above method, and it was much better than normal VSYNC. So I think that "adaptive vsync" uses 2 buffers rather than the 3 that normal VSYNC uses on Nvidia cards. If that's the case, it makes the whole RTSS + VSYNC workaround unnecessary.
The games I tested were Overwatch and Just Cause 3 in fullscreen mode on Windows 10 with a GTX 1080.
I couldn't find any input lag tests that use "adaptive VSYNC". Are there any input lag tests that can confirm my hypothesis?
Adaptive VSYNC VS VSYNC input lag
Re: Adaptive VSYNC VS VSYNC input lag
I don't think anyone did any tests.
With that being said, the driver, at least for DX games, cannot decide whether to use double buffer or triple buffer vsync. That's up to the game.
If you have a 24Hz capable monitor, you might be able to tell with more certainty. A single frame makes a huge latency difference at 24Hz (41.6ms.)
Steam • GitHub • Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
Re: Adaptive VSYNC VS VSYNC input lag
RealNC wrote: With that being said, the driver, at least for DX games, cannot decide whether to use double buffer or triple buffer vsync. That's up to the game.
If the driver has no control over it in DX games, how are fast sync, adaptive vsync, and regular vsync implemented at the driver level? For example, fast sync theoretically requires 3 buffers.
Re: Adaptive VSYNC VS VSYNC input lag
petrakeas wrote: If the driver has no control over it in DX games, how are fast sync, adaptive vsync, and regular vsync implemented at the driver level? For example, fast sync theoretically requires 3 buffers.
Fast sync runs the game with vsync off. Also, DX triple buffering is completely different from fast sync triple buffering. Interestingly, a video came out today from Digital Foundry that is somewhat relevant:
https://www.youtube.com/watch?v=seyAzw9zEoY
As for driver vsync, it will just force vsync in whatever buffer mode the game uses. If in-game vsync is disabled, driver vsync will result in double buffering. If in-game vsync is enabled and the game has a triple buffer vsync option, driver vsync will also be triple buffered. At least that was the case in the past; the last time I tested this was a few years ago, but I don't think anything has changed since then.
Re: Adaptive VSYNC VS VSYNC input lag
That's the video I saw, and it gave me the idea to check the input lag of adaptive vsync. I think the video has an error, though. It says that borderless fullscreen windowed mode uses 3-buffer vsync, while in reality it uses what the video calls true triple buffer vsync (similar to fast sync).
Yes, I know the difference between fast sync and 3 buffer vsync.
So, what you are saying is that adaptive vsync can't have different input lag from regular vsync?
Re: Adaptive VSYNC VS VSYNC input lag
petrakeas wrote: So, what you are saying is that adaptive vsync can't have different input lag from regular vsync?
NVIDIA's Adaptive V-SYNC is automatic "V-SYNC ON" whenever the framerate is above the monitor's max refresh rate, and automatic "V-SYNC OFF" whenever the framerate is below the monitor's max refresh rate.
So yes, NVIDIA's Adaptive V-SYNC has less input lag than other forms of V-SYNC, but only when the framerate is below the monitor's max refresh rate, and then you get full tearing, since it's the same thing as V-SYNC OFF in that instance.
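The behavior described above can be sketched as a simple per-frame decision. This is a toy model of my own for illustration, not NVIDIA's actual driver logic:

```python
# Toy model of Adaptive V-SYNC's per-frame decision: sync normally when
# the game keeps up with the refresh rate, disable sync (and tear) when
# it can't, instead of letting double buffering halve the framerate.
def adaptive_vsync_mode(frame_time_ms: float, refresh_hz: float) -> str:
    refresh_interval_ms = 1000.0 / refresh_hz
    if frame_time_ms <= refresh_interval_ms:
        return "vsync on"   # framerate at or above the refresh rate
    return "vsync off"      # framerate below the refresh rate
```

For example, at 60 Hz (16.7 ms per refresh), a 10 ms frame gets "vsync on" while a 25 ms frame gets "vsync off", matching the above description of when each mode kicks in.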
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series
Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)
Re: Adaptive VSYNC VS VSYNC input lag
jorimt wrote: So yes, NVIDIA's Adaptive V-SYNC has less input lag than other forms of V-SYNC, but only when the framerate is below the monitor's max refresh rate, and then you get full tearing, since it's the same thing as V-SYNC OFF in that instance.
In the games I played, the frames got rendered fast enough that vsync was never disabled with adaptive vsync. However, I did experience significantly less input lag than with regular vsync, without any screen tearing. My hypothesis is that the adaptive v-sync implementation differs from normal vsync in a way that results in less input lag. Perhaps fewer buffers are queued for the display.
I am surprised that adaptive vsync hasn't been tested for input lag with a high-speed camera, and that no one seems to have even tried it before settling on the framecap + vsync method.
Re: Adaptive VSYNC VS VSYNC input lag
Well, it could be. We don't know what tricks Nvidia implements in its drivers.
But nobody tested it. Again, if your display is 24Hz capable, you could try adaptive vs full vsync and see if there's a clear difference.
Re: Adaptive VSYNC VS VSYNC input lag
petrakeas wrote: My hypothesis is that the adaptive v-sync implementation differs from normal vsync in a way that results in less input lag. Perhaps fewer buffers are queued for the display.
As far as I'm aware, Adaptive V-SYNC is identical to standard V-SYNC when the framerate is above the refresh rate, and was created to remedy the double buffer FPS lock when the framerate went below the refresh rate by automatically disabling V-SYNC.
[H]ardOCP tested Adaptive V-SYNC when it originally released, and the article seems to back my assumption up, specifically in this paragraph:
This game scenario case [Mass Effect 3] where the framerate is always higher than the refresh rate indicates that Adaptive VSync has the same result as VSync on. There is really no benefit to it in this case because the framerate was faster than the refresh rate to begin with. Therefore, in any game where the framerate is faster than the refresh rate, Adaptive VSync has little benefit. The benefit comes in when the framerate is under the refresh rate as you will see below.
Full article link:
https://www.hardocp.com/article/2012/04 ... y_review/2
And yes, they tested framerate rather than input lag, so it doesn't 100% rule out your hypothesis, but it's also safe to assume that if Adaptive V-SYNC had a notable input lag reduction vs. standard V-SYNC above the refresh rate, NVIDIA would have promoted the heck out of that fact when it originally released.
If there truly is a reduction of input lag with Adaptive V-SYNC vs. standard V-SYNC above the refresh rate, the only explanation I can think of is that NVIDIA may have tweaked the driver since Adaptive V-SYNC's original release to use Fast Sync as a base (a form of true triple buffer implementation above the refresh rate) instead of double buffer V-SYNC. That would be pretty easy to tell, though, as you'd get recurring stutter (more or less, depending on how far the framerate is above the refresh rate) when directly compared to double buffer V-SYNC in that scenario.
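That recurring stutter is easy to illustrate with a toy model (my own sketch, not NVIDIA's implementation): Fast Sync displays the most recently completed frame at each scanout and discards the rest, so when the render rate isn't an exact multiple of the refresh rate, the displayed frame advances unevenly.

```python
# Toy Fast Sync model: at each refresh, show the newest frame the
# renderer has finished by that instant; surplus frames are discarded.
def fast_sync_shown_frames(render_fps, refresh_hz, refreshes):
    shown = []
    for i in range(1, refreshes + 1):
        t = i / refresh_hz                 # wall-clock time of this scanout
        shown.append(int(t * render_fps))  # index of newest finished frame
    return shown

# Rendering at 100 fps on a 60 Hz display, the shown frame index
# advances by 2, 2, 1, 2, 2, ... -- an uneven cadence seen as stutter.
```

For example, `fast_sync_shown_frames(100, 60, 6)` gives frame indices [1, 3, 5, 6, 8, 10]: some refreshes skip two rendered frames, others only one, which is the frame-pacing unevenness described above.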
Re: Adaptive VSYNC VS VSYNC input lag
@RealNC
I'll test it on my TV at 24 Hz and come back with results.
Anyway, you can also give it a shot and see how it feels.
jorimt wrote: If there is truly a reduction of input lag with Adaptive V-SYNC vs. standard V-SYNC above the refresh rate, the only thing I could think is that NVIDIA may have tweaked the driver since Adaptive V-SYNC's original release for it to use Fast Sync as a base
I don't think that this is necessary. Using just 2 buffers instead of 3 (as most modern games do with regular vsync) could result in less input lag. However, RealNC states that this can't be controlled by the driver. I am not familiar with DX, so I can't argue with that.