Adaptive vsync vs Regular vsync Nvidia driver
Is the input lag the same between the two, when both are at 60 Hz, or at 30 Hz with half refresh rate vsync? To me it feels that adaptive vsync is more responsive and has less input lag, but I am not sure.
Re: Adaptive vsync vs Regular vsync Nvidia driver
sebek1 wrote: "Is the input lag the same between the two, when both are at 60 Hz, or at 30 Hz with half refresh rate vsync? To me it feels that adaptive vsync is more responsive and has less input lag, but I am not sure."

The only difference is that adaptive v-sync turns v-sync off when your framerate drops below your refresh rate.
Re: Adaptive vsync vs Regular vsync Nvidia driver
I also feel that "Adaptive VSYNC" has smaller input lag than regular VSYNC. I think it uses double buffered VSYNC and thus has lower input lag than classic VSYNC, which uses 3 buffers. Are there any input lag tests for this?
Re: Adaptive vsync vs Regular vsync Nvidia driver
petrakeas wrote: "I also feel that 'Adaptive VSYNC' has smaller input lag than regular VSYNC. I think it uses double buffered VSYNC and thus has lower input lag than classic VSYNC, which uses 3 buffers. Are there any input lag tests for this?"

Normal v-sync is double buffered, and has the same input lag as "Adaptive VSYNC" when it's active. Triple buffering has two possibilities: you can drop excess frames, or keep all frames and display them sequentially.

Below the refresh rate, double buffered v-sync will drop your framerate to half your refresh rate in the event you're GPU limited. Triple buffering will not, but will increase the variance in input lag. Both triple buffering methods behave the same below the refresh rate.

When framerate is above the refresh rate, dropping frames will reduce the average input lag substantially compared to the other v-sync methods, while triple buffering without dropping frames will have one extra frame of latency compared to double buffered (so on a 60 Hz monitor that will be something like ~100 ms instead of ~83 ms).

If you cap framerate below your refresh rate, all the v-sync methods behave pretty similarly (much less absolute latency, but more variance in latency, than double buffered v-sync).
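A quick sanity check on those numbers (my own back-of-envelope arithmetic, not a measurement): holding one extra completed frame in the queue costs exactly one refresh interval of latency, which at 60 Hz matches the gap between the ~100 ms and ~83 ms figures above.

```python
# Back-of-envelope model: each extra queued frame adds one refresh interval.
REFRESH_HZ = 60
FRAME_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

# Triple buffering without dropping frames keeps one extra completed frame
# queued compared to double buffering, so it adds one refresh of latency.
extra_latency_ms = 1 * FRAME_MS

print(f"extra latency from one queued frame: {extra_latency_ms:.1f} ms")  # 16.7 ms
print(f"delta between the posted figures:    {100 - 83} ms")              # 17 ms
```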
Use g-sync if you have it, and cap framerate in game just below your monitor's max refresh rate. (use rtss if the game doesn't have a framerate cap)
Fast sync is a good option if you can maintain framerate > 2x your refresh rate, and want to use blur reduction instead of g-sync.
v-sync off if you want to prioritize latency.
uncapped double buffered v-sync if you want to prioritize smoothness of animation over everything else.
v-sync on, capped just under refresh rate, if you want to reduce latency while still avoiding tearing.
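For the "cap just below refresh" advice above, a small helper can compute the cap. The 3 fps margin is the figure the Blur Busters G-SYNC 101 article arrives at for G-SYNC; treat it as a rule of thumb for the other modes, and the helper name as my own illustration.

```python
# Hypothetical helper: pick a framerate cap slightly below the refresh rate,
# to set in-game or in RTSS. The default 3 fps margin follows the G-SYNC 101
# recommendation; it is a rule of thumb, not a hard specification.
def suggested_cap(refresh_hz: float, margin_fps: float = 3.0) -> float:
    return refresh_hz - margin_fps

print(suggested_cap(144))  # 141.0
print(suggested_cap(60))   # 57.0
```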
Here are a few relevant threads:
viewtopic.php?f=10&t=1381
viewtopic.php?f=10&t=2168
viewtopic.php?f=10&t=3005
viewtopic.php?f=10&t=4602
and a graph from my old testing: https://docs.google.com/spreadsheets/d/ ... nteractive
And finally, there are a lot of input lag tests in the g-sync 101 article: https://www.blurbusters.com/gsync/gsync ... ettings/4/
Re: Adaptive vsync vs Regular vsync Nvidia driver
Thanks for the detailed response. I was already aware of these methods though. Unfortunately, I don't have a G-SYNC monitor yet. I also tried the new scanline-based vsync-like method of RTSS, which looks promising and feels really responsive.