flagbender wrote: Then actually, this flies in the face of advice to cap framerate in order to keep input lag consistent for precise aiming. I don't really know how to feel about this. It means that A. even two machines running with an fps cap will have different average input lag depending on their power, and B. you can almost never get consistent input lag.
Consistent input lag is very hard to achieve. The only game I know of that lets you set your own frame time, and thus achieve a constant input delay, is ezQuake (a community Quake engine). ezQuake allows you to configure the frame cap in such a way that vsync on has almost the same input lag as vsync off.
That's basically the only game I know of that does this. With DX9 games, you can also use GeDoSaTo's frame capper and achieve a constant input delay. In theory, at least... I don't use GeDoSaTo myself.
I wonder if you could elaborate on how this relates to my original questions about frame buffering?
I don't think frame buffering has anything to do with this. If the GPU is maxed out, that just means it takes longer to render a frame. So if OW does frame time prediction in order to lower input lag, higher GPU usage means higher frame times and thus increased input lag. In games that do frame time prediction, the less load there is on the GPU, the lower input lag gets. In other words, the lower you set the graphics settings (less GPU load, shorter frame rendering time), the less input lag you get, regardless of whether you're below the frame cap or not.
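To make that concrete, here's a minimal sketch (with made-up numbers, not Overwatch's actual implementation) of how frame-time prediction ties input lag to render time: the engine predicts how long the next frame will take and delays input sampling so the frame finishes just as the capped interval ends, which means the remaining lag is basically the render time itself.

```python
def input_lag_ms(cap_hz: float, predicted_render_ms: float) -> float:
    """Input-to-frame-completion delay under frame-time prediction.

    Hypothetical model: the engine waits (interval - predicted render
    time) before sampling input, so whatever lag remains equals the
    time spent actually rendering.
    """
    frame_interval_ms = 1000.0 / cap_hz
    # Prediction can't help once rendering fills the whole interval.
    wait_ms = max(0.0, frame_interval_ms - predicted_render_ms)
    # Lag = time from input sample to frame completion.
    return frame_interval_ms - wait_ms

# Same 144 fps cap, two different GPU loads (illustrative values):
low_settings  = input_lag_ms(144, 3.0)   # light GPU load -> 3.0 ms lag
high_settings = input_lag_ms(144, 6.5)   # heavy GPU load -> 6.5 ms lag
print(low_settings, high_settings)
```

Note that both runs are under the same cap; only the render time differs, which is why lowering graphics settings reduces lag even when you're not GPU-bound.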
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.