Alright, I think I understand a portion of what you're trying to say now...

RealNC wrote: Because the in-game limiter can reduce the impact of frame latency on total input lag. At 100FPS, frame latency is 10ms. That is part of the total input lag. An in-game limiter, however, can remove some of that 10ms from the total latency if the game renders at higher FPS.
Case 1: If the game needs 1ms (1000FPS) to render a frame, an in-game 100FPS cap can potentially decrease total input lag by up to 9ms.
Case 2: If the game renders a frame in 10ms (100FPS), then a 100FPS in-game cap cannot reduce latency further.
RTSS, on the other hand, will always behave as if the second case is true. Regardless of whether the game renders the frame in 1ms or in 10ms, you get the 10ms frame latency of 100FPS added to the total input lag. That latency is the same as if the game were running uncapped at 100FPS.
If you think about how external, non-predictive frame limiting is done (as RTSS does it), it makes sense. The algorithm is quite simple: if the frame time is less than the target frame time (the cap), wait until the target frame time is reached. That means regardless of how much time it took to render a frame, the result will always be the same as if the frame took Nms to render, where N is our target frame time. Thus, input lag is the same as if the game were rendering at the target frame rate.
In other words, RTSS is perfectly "neutral." It neither reduces nor increases latency. The in-game limiter reduces latency. (And NVidia's limiter increases latency.)
And, again, that's only "in theory". In practice, RTSS reduces latency in many games due to the frame buffering issue.
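The quoted reasoning can be sketched in a few lines of Python. This is a simplification under the best-case assumption that an in-game limiter waits before sampling input, while an external limiter waits after rendering; the function names are mine, not anything from RTSS:

```python
TARGET_MS = 1000 / 100  # 100FPS cap -> 10ms frame interval

def external_limiter_latency(render_ms):
    # Non-predictive (RTSS-style): input is sampled, the frame renders,
    # then the limiter waits out the remainder of the interval before
    # the frame is presented. Latency is always the full interval.
    return max(render_ms, TARGET_MS)

def in_game_limiter_latency(render_ms):
    # Best case for an in-game limiter: the wait happens before input
    # sampling, so only the actual render time separates input from
    # presentation.
    return render_ms

# Case 1: frame renders in 1ms -> up to 9ms saved by the in-game cap.
assert external_limiter_latency(1) - in_game_limiter_latency(1) == 9
# Case 2: frame renders in 10ms -> no difference; the cap is "neutral".
assert external_limiter_latency(10) - in_game_limiter_latency(10) == 0
```

The asserts line up with the two cases above: the external cap always costs the full 10ms interval, so it can never beat running uncapped at the same framerate.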
In-game Framerate Limiter:
Lets frames render at any frametime, meaning a frame can render faster than the set FPS limit allows, but each frame will still be delivered at intervals of the target framerate.
RTSS Framerate Limiter:
Frametime cannot drop below the set FPS limit's interval, which means each frame, at best, will be scanned in only after the full target frametime has elapsed.
This would explain why RTSS has tighter frametimes, as frametime = framerate limit. But if that's the case, I'm not 100% certain why RTSS still delays an entire frame every few frames; i.e. why we're seeing less than 1 frame of delay (+5ms at 144Hz on average).
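The "tighter frametimes" point can be illustrated numerically. The render times below are made-up values for a game with plenty of GPU headroom, not measurements:

```python
# Hypothetical per-frame render times (ms) with GPU headroom to spare:
render_times = [1.2, 3.8, 2.1, 6.5, 1.9]
TARGET_MS = 10.0  # 100FPS cap

# An external limiter pads every frame out to the full target interval,
# so the presented frametimes come out nearly flat regardless of how
# much the underlying render times vary.
presented = [max(t, TARGET_MS) for t in render_times]
assert presented == [10.0, 10.0, 10.0, 10.0, 10.0]
```

An in-game limiter that waits before input sampling doesn't pad the presented frame the same way, which is one plausible reason its frametime graph looks noisier even at the same average framerate.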
Unless I'm misunderstanding, this is where you're off, however:
If this were the case, even if you were using G-SYNC + V-SYNC, you'd see a tear. The fastest a single frame can be scanned in from top to bottom is dictated by the static scanout time of the panel. So even with V-SYNC OFF, if you have a 100 FPS in-game limit at 144Hz and one frame renders in 1ms, it's still going to be delivered at 100 frames per second intervals and scan in over 6.9ms per frame; and, with G-SYNC, it must wait for the previous scanout, also 6.9ms, to complete before displaying.
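For reference, the numbers in that example follow directly from the refresh rate and the cap (my arithmetic, nothing beyond what the post states):

```python
REFRESH_HZ = 144
CAP_FPS = 100

scanout_ms = 1000 / REFRESH_HZ   # static top-to-bottom scanout time
delivery_ms = 1000 / CAP_FPS     # interval between delivered frames

# A 144Hz panel scans each frame out in ~6.9ms no matter how fast the
# frame rendered; with G-SYNC, the next frame must also wait for the
# previous ~6.9ms scanout to finish before it can start displaying.
assert round(scanout_ms, 1) == 6.9
assert delivery_ms == 10.0
```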