Re: g-sync input lag vs XL2411T vs XL2420Z vs VG248QE
Posted: 10 Mar 2014, 19:13
Greetings all again!
I went to test G-SYNC a bit more, since I was really curious what happens at the cap. Before I go on, one thing about the method: it's a fairly old method (a few years) that was developed for event purposes (infinite.cz is an event company) to test the video and audio latency of various setups - and we're talking about big LED screens or video walls made of seamless plasma panels, as well as various devices along the way (splitters, repeaters). For that we never cared about +/- 1 ms accuracy; we just needed to discover what was causing +50 ms. Since the measuring device is USB, many of you are probably aware that by default USB has a 125 Hz polling rate, which means +/- 4 ms of inaccuracy caused by USB alone. This wasn't corrected for in the original tests, given that most mice still run at 125 Hz, and since we're measuring button-to-pixel, I believe it's more representative that way. 18 measurements were done for each test; the result is the average.
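As a sanity check on that +/- 4 ms figure, here's a quick sketch of my own (not part of the original test method): a button press lands at a random point inside an 8 ms USB polling interval, so the host sees it 0-8 ms late, i.e. 4 ms on average:

```python
import random

POLL_HZ = 125              # default USB polling rate
POLL_MS = 1000 / POLL_HZ   # 8 ms between polls

# A press lands at a uniformly random point inside a polling interval,
# so the added delay is uniform in [0, 8) ms: +/- 4 ms around the mean.
samples = [random.uniform(0, POLL_MS) for _ in range(100_000)]
mean_delay = sum(samples) / len(samples)
print(f"mean added delay: {mean_delay:.2f} ms")   # close to 4 ms
print(f"max added delay:  {max(samples):.2f} ms") # just under 8 ms
```

A 1000 Hz rate shrinks the interval to 1 ms, which is where the roughly -4 ms shift in the averages comes from.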
However, since I was going to measure a lot of values, doing 18 runs for each would have taken days. And since I'm not paid to do this, I don't really have days. So I applied the 1000 Hz fix; the values shown below are therefore approximately 4 ms lower than the values posted in the original article.
The first test was to cap the frame rate at 125 fps, then suddenly switch the limiter to 200 fps and measure the latency of each frame as well as the real fps. Results below.
We can see that the game was allowed to run at 200 fps for some time, with the lag increasing each frame. Once we reached vsync-level values, the game was capped by the drivers (waiting for Present() to return).
I did the same test with 400 fps instead of 200. The cap was reached in 8 frames.
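This behaviour can be reproduced with a toy model (my own sketch, not NVIDIA's actual pipeline): a renderer producing frames faster than the display consumes them, in front of an assumed 3-deep frame queue. The frame rates and queue depth here are hypothetical numbers, but the shape matches what I measured - per-frame lag grows until the queue is full, then Present() blocks:

```python
RENDER_MS  = 2.5   # renderer produces a frame every 2.5 ms (400 fps cap)
DISPLAY_MS = 7.4   # display consumes a frame every ~7.4 ms (assumed ceiling)
QUEUE      = 3     # assumed number of in-flight frame buffers

r = d = 0.0        # render / display timestamps
shown = []         # time each frame was displayed
lags  = []         # per-frame queue latency

for i in range(10):
    if i >= QUEUE:
        # Queue full: Present() blocks until frame i-QUEUE has been shown.
        r = max(r, shown[i - QUEUE])
    if shown:
        d = max(r, d + DISPLAY_MS)
    else:
        d = r  # first frame is shown as soon as it's rendered (toy assumption)
    shown.append(d)
    lags.append(d - r)
    print(f"frame {i}: lag {d - r:4.1f} ms")
    r += RENDER_MS
```

The lag climbs from 0 and settles at QUEUE * DISPLAY_MS, which is the variable input delay the queue adds once saturated.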
Naturally I did the reverse test, going from 200 fps back to 125.
The values slowly fall back to their original levels. When I limited the fps to 60, it dropped to the "no vsync" values in just 3 frames.
Conclusion
In my humble opinion, it looks like there are up to 3 buffers along the way that can store frames and compensate for short frame rate drops, causing variable input delay. This definitely looks like more of a feature than a bug - short frame rate drops can occur when a game executes a scripted sequence, does an unexpected load, etc., and this feature can smooth them over. Cool. However, it also introduces variable input lag, which can be a problem for competitive players.
If I'm not wrong, it would be nice to have a little more control over this feature. The current behaviour could stay the default, for the smoothest movement. A second option could be to cap the game from the very first frame, keeping input lag at a minimum at all times (similar to an fps cap). A third option could be something similar to triple buffering - allow 2 frames in advance while not capping the frame rate at all. (I know there is a triple buffering option in the NVIDIA drivers; I tried turning it on and off, but it had no effect.) The second option can already be achieved by setting max_fps or by using an external frame rate cap utility (I'll try to produce one if I find some spare time). I have no idea how to emulate triple buffering when the game thinks it runs in "vsync off" mode.
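For the second option, here's a minimal sketch of the core of a sleep-based limiter, just to illustrate the idea. All names are mine; a real external cap utility would have to hook Present() in the target process, and serious limiters busy-wait the last stretch because OS sleep granularity is coarse:

```python
import time

TARGET_FPS = 125
FRAME_S = 1.0 / TARGET_FPS

def render_loop(frames=20, work_s=0.002):
    """Toy render loop: do some 'work', then sleep off the rest of the slot."""
    deadline = time.perf_counter()
    stamps = []
    for _ in range(frames):
        time.sleep(work_s)        # stand-in for simulate + render
        deadline += FRAME_S       # absolute deadlines so overshoot can't drift
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        stamps.append(time.perf_counter())
    return stamps

stamps = render_loop()
fps = (len(stamps) - 1) / (stamps[-1] - stamps[0])
print(f"measured ~{fps:.0f} fps")
```

The key detail is advancing an absolute deadline instead of sleeping a fixed amount each frame, so sleep overshoot on one frame is paid back on the next.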
EDIT: I also hope this isn't old news; these things aren't really my field.