Re: External framerate caps adds lag. Use in-game framerate
Posted: 08 Aug 2014, 16:28
Oh, I'm aware of that. What I was asking is how external limiters, as opposed to internal ones, add latency. If what you initially stated is in fact what is happening, then an external limiter makes the GPU render the frame and then hold the buffer swap. But delaying the buffer swap would mean a constant frame frequency and therefore fluid game simulation without microstuttering.

Since that is not quite what I see in practice (variance decreases notably, but some remains, and FRAPS, as far as I know, measures frametimes in the pipeline rather than buffer swap times), I think external frame limiters do not delay the buffer swap after a render finishes, but instead delay the following render pass to maintain the capped rendering interval. If the latter is true, there should not be any added latency, but there would be more microstuttering than if the GPU delayed the buffer swap.

Chief Blur Buster wrote:
However, best possible frame frequency stability isn't equal to best possible input lag.
[...]
The image data gets onto the cable within microseconds of the buffer switch. It does not occur at the coarse granularity of whole refresh intervals. There is a concept known as the "horizontal scanrate" (number of scanlines drawn per second), which carried over from the old CRT days. Data is transmitted at a finite speed over the cable. At 120Hz 1080p, the horizontal scanrate is typically about 135 kHz (as seen in tools like Custom Resolution Utility), which is 135,000 rows of pixels per second. One row of pixels gets pushed into the video cable in 1/135000th of a second. You can do a buffer flip at any time during a 1/135000th second granularity, rather than a 1/120th sec granularity. This is how tearlines occur: they are essentially refresh-cycle "splices" between an old refresh and a new refresh, caused by a buffer flip in the middle of a refresh cycle!
[...]
Again, the smoothness goal (consistent frame delivery) is different from the low-lag goal. They can sometimes be mutually opposing goals, as a trade-off.
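The scanline timing in the quote works out with simple arithmetic. A quick check in Python, taking the ~135 kHz horizontal scanrate for 1080p at 120Hz from the quote above:

```python
# Quick check of the horizontal-scanrate numbers from the quote.
# Assumption: ~135,000 scanlines per second at 1080p 120 Hz,
# as reported by tools like Custom Resolution Utility.
hscan_rate = 135_000   # scanlines per second (horizontal scanrate)
refresh_hz = 120       # refresh cycles per second

line_time_us = 1_000_000 / hscan_rate        # microseconds per scanline
lines_per_refresh = hscan_rate / refresh_hz  # 1080 visible lines + blanking

print(f"{line_time_us:.2f} us per scanline")         # ~7.41 us
print(f"{lines_per_refresh:.0f} lines per refresh")  # 1125
```

So a buffer flip can land on any of 1125 scanline slots per refresh cycle, not just at refresh boundaries, which is exactly why a mid-refresh flip produces a tearline.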
However, what you are saying seems to be true. I used a 1000fps camera, and with the external NVIDIA Inspector limiter it takes on average one refresh cycle longer for changes to appear at the bottom of the screen after input, compared to using fps_max. The Source engine also acts strangely with external caps in some regards: server rendering intervals are capped as well when you host an offline listenserver, and it takes longer for the server to load up.
But yeah, even if I don't yet quite understand how, external caps do seem to add latency. The argument that internal limiters are accounted for in the engine's own calculations while external limiters are not also makes sense. I switched back to using fps_max, even though I could really feel the reduced frame frequency variance the external cap gave me.
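The two limiter strategies discussed above can be sketched as a toy model. This is my own illustration, not any actual limiter's implementation; `render` and `swap` are hypothetical stand-ins for the engine's render pass and the driver's buffer flip:

```python
import time

def cap_delay_swap(render, swap, cap_fps, frames):
    """Variant A: render immediately, then hold the finished frame and
    swap on a fixed schedule. Even swap cadence (smooth), but the frame's
    input is already stale when it reaches the screen (added latency)."""
    interval = 1.0 / cap_fps
    next_swap = time.perf_counter()
    for _ in range(frames):
        render()
        now = time.perf_counter()
        if now < next_swap:
            time.sleep(next_swap - now)  # wait with the buffer swap
        swap()
        next_swap += interval

def cap_delay_render(render, swap, cap_fps, frames):
    """Variant B: sleep first, then render and swap right away. The frame
    goes out as soon as it is finished (no swap-delay latency), but swap
    times now jitter with the render time (more microstutter)."""
    interval = 1.0 / cap_fps
    next_start = time.perf_counter()
    for _ in range(frames):
        now = time.perf_counter()
        if now < next_start:
            time.sleep(next_start - now)  # delay the following render pass
        next_start += interval
        render()
        swap()
```

An in-game limiter like fps_max is closest to Variant B but sits inside the engine loop, so input sampling, simulation, and rendering all happen right before the swap, which fits the lower latency measured with the 1000fps camera.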
I think the best implementation of VSYNC is achieved by using triple buffering and capping your framerate just below your refresh rate. That essentially eliminates input lag caused by VSYNC, because the slightly faster refresh cycle and shorter rendering times make sure that not a single old frame sits buffered; every displayed frame is the most recently rendered one. Of course, you still have the "input lag" inherent to VSYNC, because you don't get multiple frames per refresh cycle with constantly fresher data, but one refresh cycle of input lag is more than bearable when in return you get stable frametimes and no tearing.

Chief Blur Buster wrote:If you want perfect fluidity on a non-GSYNC monitor, and don't care about lag, use the following:
1. VSYNC ON (external capping)
2. Double buffering, not triple buffering
3. Gaming mouse with good 1000Hz mode
4. Powerful enough GPU
5. Frame rate exactly matching refresh rate.
6. Adjust detail down until framerate stays capped out without dropping.
7. Run game engines that are capable of capping out at frame rates matching refresh rate.
Motion can look really beautiful (Sega arcade or Nintendo 8-bit platformer style smoothness of scrolling) when you combine #1-7 simultaneously. It especially looks really good with motion-blur-reducing strobe-backlight technologies (LightBoost, ULMB, Turbo240, and BENQ Blur Reduction).
I actually use VSYNC ON for solo gameplay when lag is not critical since motion on strobe backlights (LightBoost/ULMB) looks nicer with VSYNC ON.
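The "cap just below refresh" trick from earlier can be illustrated with a toy queue model (my own simplification, not anyone's actual driver behaviour): frames finish at the capped rate, the display drains one per refresh, and the average number of finished-but-undisplayed frames stands in for VSYNC back-pressure latency.

```python
def avg_queued_frames(cap_fps, refresh_hz, refreshes, buffers=2):
    """Toy model: frames finish every 1/cap_fps s, the display consumes
    one per refresh, and at most `buffers` finished frames can wait
    (standing in for triple buffering). Returns the mean number of
    finished frames waiting at each refresh -- a rough proxy for the
    extra VSYNC queue latency."""
    dt_refresh = 1.0 / refresh_hz
    dt_frame = 1.0 / cap_fps
    t, produce_t, queue = 0.0, 0.0, 0
    samples = []
    for _ in range(refreshes):
        t += dt_refresh
        while produce_t <= t and queue < buffers:  # GPU keeps rendering
            queue += 1
            produce_t += dt_frame
        samples.append(queue)
        if queue:
            queue -= 1  # display consumes one frame per refresh
    return sum(samples) / len(samples)

# Uncapped fast rendering keeps the buffer queue full (extra latency);
# capping just below the refresh rate keeps it nearly empty.
print(avg_queued_frames(300, 120, 1000))  # 2.0 (queue always full)
print(avg_queued_frames(119, 120, 1000))  # just under 1.0
```

In this model the below-refresh cap hovers at roughly one queued frame (the one being scanned out), matching the claim that every displayed frame is the most recently rendered one, at the cost of an occasional repeated frame when the queue briefly runs dry.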
Is VSYNC always externally handled though? When I disable VSYNC in the NVIDIA control panel and activate it in CS:GO, it does not work. And when I set NVIDIA to "use the game setting" and enable it in CS:GO, I experience the same behaviour I did with the external cap (limited rendering interval for server framerate, longer loading times). So I guess VSYNC is always external?