TheLeetFly wrote: Now my consideration: Wouldn't it be better to set my refresh-rate to 125Hz and cap my fps to, for example, 250? This would sync the monitor and my fps with my mouse's refresh-rate, and all would have the common divisor of 125.
That won't be perfect, because 125Hz is often really 125.02317Hz or 125.0095Hz or 119.9918Hz, etc. You can see it when you load ToastyX or NVIDIA Custom Resolution -- the refresh rate is almost never exact; it's off by a tiny bit. And the 250fps cap may actually vary between 249-251fps. So there's no perfect sync. It may actually hurt input lag, because of the difference. e.g. 125.1Hz with a 250fps cap produces a 0.1Hz harmonic. That means every 10 seconds you will suddenly jump to the "next tick" upwards or downwards, with the attendant microstutter and/or sudden input lag change. i.e. your input lag will be slowly slewing upwards/downwards over a 10-second period during a 0.1Hz harmonic, and then suddenly 'reset' back, before slowly slewing again. Most people won't notice, but competitive gamers may. It's best to use a framerate cap that produces higher-frequency harmonics rather than low-frequency harmonics, e.g. fps_max 300 during 144Hz often feels better.
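To put rough numbers on that beat effect, here's a quick back-of-envelope sketch (my own Python illustration, not code from any game or driver; the function name and the slip model are my assumptions):

```python
# Estimate how long it takes the frame ticks to slip one full refresh
# cycle, modelling the beat as the cap's effective per-refresh rate
# (cap divided by the nearest whole multiple) drifting against the
# panel's TRUE refresh rate.

def beat_period_seconds(actual_refresh_hz: float, fps_cap: float) -> float:
    k = max(1, round(fps_cap / actual_refresh_hz))   # frames per refresh
    beat_hz = abs(fps_cap / k - actual_refresh_hz)   # slow slip frequency
    return float("inf") if beat_hz == 0 else 1.0 / beat_hz

print(beat_period_seconds(125.1, 250))    # -> 10.0s slew cycle (noticeable)
print(beat_period_seconds(143.993, 300))  # -> ~0.17s cycle (fast harmonic)
```

The second case is why the fast harmonic of fps_max 300 at 144Hz tends to be far less objectionable: the lag oscillation completes many times per second instead of slowly slewing over 10 seconds.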
On the other hand, there are the visual quality considerations. If that's more important than input lag, visual quality is often better when framerate is perfectly synchronized with refresh rate (a la framerate-maximized VSYNC ON with no framerate drops). But competitive gamers will tend not to do that, since it adds input lag in exchange for a lack of microstutter (in the situation of an overkill GPU on older games, where you can maintain max-rate VSYNC ON).
If you're running Source engine games on modern systems that run triple-digit framerates, it is best to just leave the framerate uncapped, or if you want a cap, cap at high numbers (300 or 500) for minimum input lag. The input lag caused by the GPU is the frame render time, so running at 500fps (on a single-card non-SLI system) will reduce the GPU's share of input lag to 1/500th of a second (2ms). The higher the framerate, the lower the input lag the GPU gives you.
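The arithmetic behind that claim is easy to sanity-check (a trivial sketch of my own, just dividing out the frame time):

```python
# GPU's rough share of input lag = one frame render time = 1/framerate
# (single-GPU assumption, as above; SLI frame-pipelining changes this).
for fps in (60, 125, 250, 300, 500):
    print(f"{fps:>3} fps -> ~{1000.0 / fps:.2f} ms GPU-side latency")
```

At 500fps that's ~2ms, versus ~16.7ms at 60fps, which is why uncapped or very high caps win for competitive play.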
There are exceptions. One exception is when the video game (e.g. Quake Live) intentionally synchronizes its frame generation to the mouse poll timing, but the vast majority of video games don't do this. Quake Live uses a 125 or 250 framerate limit because it is synchronized to the mouse poll. But setting fps_max 250 does not perfectly sync to the mouse polls in Source engine games.
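The difference between the two approaches looks roughly like this (a conceptual Python sketch with my own hypothetical names; wait_for_mouse_poll() is a stand-in for an engine's real input layer, not an actual API, and this only illustrates the behaviour described above, not Quake Live's actual source):

```python
import time

def wait_for_mouse_poll(poll_hz: float = 250.0) -> None:
    # Hypothetical stand-in: block until the next USB mouse poll event.
    time.sleep(1.0 / poll_hz)

def frame_cap_loop(fps_max: float, frames: int) -> None:
    # Typical engine (e.g. Source-style fps_max): sleep to a fixed frame
    # budget. The frame clock free-runs, drifting in/out of phase with polls.
    budget = 1.0 / fps_max
    for _ in range(frames):
        start = time.perf_counter()
        # ...read input, simulate, render...
        remaining = budget - (time.perf_counter() - start)
        if remaining > 0:
            time.sleep(remaining)

def poll_synced_loop(frames: int) -> None:
    # Quake Live-style behaviour as described above: each frame STARTS on
    # a mouse poll, so frame timing is locked to input timing by design.
    for _ in range(frames):
        wait_for_mouse_poll()
        # ...simulate and render with the freshest possible input...

frame_cap_loop(250.0, 5)
poll_synced_loop(5)
```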
Also: three-way *perfect* synchronization between mouse poll rate, refresh rate, and frame rate has historically never been done before GSYNC. With the arrival of GSYNC, it's now finally technologically possible to have 3-way sync between all Hz (mouse Hz, refresh Hz, and framerate), but I am not sure if any game has actually successfully done this yet.
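To make the idea concrete, here's a purely speculative skeleton of what such a 3-way synced loop could look like (my own sketch; as said, I'm not aware of any game confirmed to do this, and both function names are hypothetical stand-ins):

```python
import time

def wait_for_mouse_poll(poll_hz: float = 500.0) -> None:
    time.sleep(1.0 / poll_hz)  # stand-in for the next real USB poll event

def present_frame() -> None:
    pass  # stand-in for the buffer swap; with GSYNC the panel refreshes now

def three_way_synced_loop(frames: int) -> None:
    # The mouse poll paces each frame, and a variable-refresh display scans
    # out on present, so mouse Hz == frame Hz == refresh Hz falls out
    # naturally instead of beating against a fixed refresh clock.
    for _ in range(frames):
        wait_for_mouse_poll()
        # ...read input, simulate, render...
        present_frame()

three_way_synced_loop(5)
```

(Of course, the poll rate would need to sit within the display's variable-refresh range for this to hold.)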