spacediver wrote:
I *think* I'm starting to understand.
So there is:
1) the delay between input and that input being reflected in a (buffered) frame.
2) the delay between the creation of that buffered frame and that frame being drawn on the screen.
Now what about the server or game world simulation (dunno what the correct term is here)? Let's say it takes 100 ms for a button press to be reflected in the next available buffered frame, and another 100 ms for that frame to be displayed on my screen. In the gameworld, when does that button press occur? Is it T+100ms (where T is the moment of button press)?

It's a bit more complicated. I'll try to explain what happens to your input as it works its way through the system:
Okay: you press the button. The first thing that notices is the microcontroller in the mouse, which sends the press to the PC the next time it's polled.
[physical button actuation]
[mouse calculation](should be negligible, but some mice are programmed poorly) 0~several ms. (very few reviewers measure this)
[usb polling delay](random delay between 0 and usb polling interval) Most gaming mice have 1ms polling, so 0~1ms.
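As a rough sketch (my own toy model, nothing measured), that polling delay is uniform over one polling interval, so on average it works out to half the interval:

```python
import random

def polling_delay_s(poll_hz=1000):
    """Random delay (seconds) between the button press and the next
    USB poll, uniform over one polling interval. Toy model only."""
    return random.uniform(0.0, 1.0 / poll_hz)

# Averaged over many presses, a 1000 Hz mouse adds ~0.5 ms here.
avg_delay_s = sum(polling_delay_s() for _ in range(100_000)) / 100_000
```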
Next, your input waits in a buffer until the next time the game engine (CPU) checks for it.
[input buffer](random delay between 0 ms and 1/framerate)
[in game framerate cap] (delays the CPU's start on next frame while allowing input buffer to keep gathering user input)
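The input-buffer wait can be modeled the same way (again my own toy model; the 144 fps figure is just an example of a capped engine framerate):

```python
import random

def input_buffer_wait_ms(engine_fps=144):
    """Random wait (ms) between a buffered input arriving and the
    engine's next input read, uniform over one frame interval at the
    (possibly capped) engine framerate. Illustrative model only."""
    return random.uniform(0.0, 1000.0 / engine_fps)

# At 144 fps the frame interval is ~6.94 ms, so the average wait is ~3.47 ms.
avg_wait_ms = sum(input_buffer_wait_ms() for _ in range(100_000)) / 100_000
```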
Next, the CPU grabs input and starts work on the frame, deciding what the GPU should draw. This can only start if the driver buffer is empty (unless you have a render-ahead limit greater than zero).
[CPU calculations] (less than or equal to 1/framerate, depending on whether you're CPU limited)
Next, the frame waits until the GPU is ready to start working on a new frame.
[Driver buffer](if you're limited by CPU or an in-game framerate cap this is 0 ms. If you're not CPU limited, this = 1/framerate - CPU calculation time.)
[gpu framerate cap] (I could be wrong about this, but I think a gpu framerate cap just stops the GPU from working on a frame based on how long it took to finish the last frame. If it spoofs a full driver buffer when it's really empty, that could achieve lower input lag. Hard to find good information on this, and it could be implemented differently in all the different driver-based framerate limiters. Frame pacing algorithms can also throw a monkey wrench in here. I'll post a follow-up question in flood's input lag thread.) Edit: On AMD cards, RivaTuner appears to prevent the GPU from starting a frame, keeping the driver queue full with a real frame. RadeonPro DFC appears to block buffer flips after the GPU has worked on a frame, causing 1 additional frame of input lag over RivaTuner, and 2 frames over a CPU-based framerate cap.
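If I've got the driver-buffer rule right, it could be sketched like this (function and parameter names are mine):

```python
def driver_buffer_wait_ms(frame_time_ms, cpu_time_ms, cpu_or_cap_limited):
    """Wait (ms) before the GPU accepts a finished CPU frame, per the
    rule above: zero when CPU-limited or framerate-capped, otherwise
    the slack left in the frame interval after CPU work. Sketch only."""
    if cpu_or_cap_limited:
        return 0.0
    return frame_time_ms - cpu_time_ms

driver_buffer_wait_ms(10.0, 4.0, False)  # GPU limited: 6 ms of waiting
driver_buffer_wait_ms(10.0, 4.0, True)   # CPU limited or capped: 0 ms
```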
Next the GPU starts working on the frame(the frame is stored in the back buffer during GPU calculations)*
[GPU calculations/back buffer](Calculation time is less than or equal to 1/framerate, depending on whether you're GPU limited.)
Next the back buffer becomes the front buffer, and the frame is scanned out to the display. (This is called a buffer flip. If you have v-sync on, this can only happen when the monitor is between refresh cycles. With VRR, a buffer flip causes the monitor to start a new refresh cycle.)
[v-sync delay, if applicable](round 1/framerate up to the next full multiple of 1/refresh rate)
[front buffer](1/current refresh rate or 1/max refresh rate, depending on whether you're in fixed or VRR mode)
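The v-sync rounding rule works out like this (a quick sketch; the numbers are just examples):

```python
import math

def vsync_frame_interval_ms(frame_time_ms, refresh_hz):
    """Effective frame interval with v-sync on: the frame time rounded
    up to the next whole multiple of the refresh period. Sketch of the
    rule described above, not a driver implementation."""
    period_ms = 1000.0 / refresh_hz
    return math.ceil(frame_time_ms / period_ms) * period_ms

vsync_frame_interval_ms(10.0, 60)  # ~16.67 ms: waits out one full refresh
vsync_frame_interval_ms(20.0, 60)  # ~33.33 ms: just missed, waits two
```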
Now the display starts to show the result of that input.
[display](processing time + pixel transition time, some reviewers measure this).
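Putting the chain together, here's a rough tally of all the stages above (every default number is an illustrative placeholder I made up, not a measurement):

```python
def total_input_lag_ms(
    poll_wait=0.5,      # average USB polling wait (1000 Hz mouse)
    input_buffer=3.5,   # average wait for the engine's input read (~144 fps)
    cpu_time=4.0,       # CPU calculations
    driver_buffer=0.0,  # 0 when CPU-limited or framerate-capped
    gpu_time=5.0,       # GPU calculations / back buffer
    vsync_wait=0.0,     # 0 with v-sync off or VRR
    scanout=6.9,        # front buffer: ~1/refresh rate (144 Hz)
    display=3.0,        # display processing + pixel transitions
):
    """Sum the pipeline stages from the walkthrough above.
    All figures are placeholders for illustration only."""
    return (poll_wait + input_buffer + cpu_time + driver_buffer
            + gpu_time + vsync_wait + scanout + display)
```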
So with an in-game cap you do see latency from the input buffer, but the input buffer doesn't 'wait' for the next frame; input just accumulates there until the game engine is ready for it.
*There is a second back buffer used for triple-buffered v-sync: if the buffer cannot be flipped when the GPU finishes a frame, the GPU switches to the second back buffer and keeps working instead of sitting idle until the next buffer flip. Without it, v-sync causes framerate to drop to half the refresh rate if you can't maintain framerate equal to refresh rate. The disadvantage is either dropped frames or an extra frame of input lag when you can pump out frames faster than your display refreshes.
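The half-refresh-rate effect without that second back buffer can be sketched like this (my own simplified model of double-buffered v-sync):

```python
import math

def double_buffered_fps(gpu_frame_ms, refresh_hz):
    """With double-buffered v-sync the GPU sits idle until the next
    flip, so framerate becomes the refresh rate divided by how many
    refresh periods each frame spans. Simplified model, not a
    simulation of any real driver."""
    period_ms = 1000.0 / refresh_hz
    refreshes_spanned = math.ceil(gpu_frame_ms / period_ms)
    return refresh_hz / refreshes_spanned

double_buffered_fps(16.0, 60)  # 60 fps: the frame fits in one refresh
double_buffered_fps(17.0, 60)  # 30 fps: just missed, waits a full refresh
```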