Notty_PT wrote:
> So, what´s the drill? Well, a good 144hz monitor has 9ms to 10ms input lag at 60hz. You divide that by 2,25 (because of the 144hz) and you end up with around 4,2ms input lag.

It doesn't convert linearly. We've seen situations where the input lag change is asymmetric to refresh rate changes. Also, RTINGS measured 3.7ms for the BenQ XL2540, for example.
Also, lag jitter from refresh rate granularity is extremely important. 144Hz can add up to a 6.9ms lag-jitter error to your aiming, while 240Hz reduces that to a 4.2ms lag-jitter error, because the area at your crosshairs updates every ~4.2ms instead of every ~6.9ms.
Any game event (e.g. a server tick) rounds off to the next refresh cycle, giving you a 4.2ms-versus-6.9ms randomization in your ability to begin reacting to an event that occurred close to your crosshairs.
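Those round-off numbers fall straight out of the refresh interval. A minimal sketch (the helper name is mine, the arithmetic is just one-refresh-cycle worst case):

```python
# Worst-case extra delay from refresh-rate granularity: a game event
# lands at a random moment within the refresh cycle, so it can wait up
# to one full cycle before the next scanout can begin showing it.
def refresh_jitter_ms(hz: float) -> float:
    """Maximum round-off delay (one refresh cycle), in milliseconds."""
    return 1000.0 / hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz: up to {refresh_jitter_ms(hz):.1f} ms of lag jitter")
```

At 144Hz this prints ~6.9ms and at 240Hz ~4.2ms, matching the figures above.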
Notty_PT wrote:
> Your input lag is now higher than your 144hz monitor at 144fps... see what I mean?

Not according to our tests, at least at the moment, if you do an apples-vs-apples comparison (same monitor) of VSYNC OFF at the same frame rate but at different refresh rates.
As part of GSYNC tests, Blur Busters also did VSYNC OFF tests.
Scroll to "Beyond The Scanout" of 240Hz Input Lag Tests.
It showed that 1000fps@240Hz has less lag than 1000fps@60Hz, measured with a high-speed camera from mouse button click to first on-screen reaction anywhere.
50fps@240Hz has less lag jitter than 50fps@144Hz
300fps@240Hz has less lag jitter than 300fps@144Hz
This remains true regardless of whether the frame rate is above or below the refresh rate, since a faster scanout means what's at your crosshairs is updated sooner after a game event, within a tighter timespan. Even at 50fps, things show up sooner at your crosshairs at 240Hz than at 144Hz, due to the faster refresh-cycle scanouts.
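To illustrate the faster-scanout point: assuming a simple uniform top-to-bottom scanout at the refresh rate with no blanking interval (a simplification for illustration, not a claim about any specific panel), the screen centre where the crosshair sits is reached about halfway into each refresh cycle:

```python
# Time from the start of a refresh cycle until the scanout reaches a
# given vertical position (0.0 = top, 0.5 = crosshair, 1.0 = bottom).
# Simplified model: uniform top-to-bottom scan, no blanking interval.
def time_to_position_ms(hz: float, position: float = 0.5) -> float:
    return (1000.0 / hz) * position

for hz in (144, 240):
    print(f"{hz} Hz: crosshair area reached ~{time_to_position_ms(hz):.2f} ms into the scanout")
```

That's ~3.47ms at 144Hz versus ~2.08ms at 240Hz, so even at identical frame rates the crosshair-area pixels get refreshed sooner at 240Hz.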
That's high speed camera pointed at CS:GO.
Now, there may be some 240Hz monitors that do a terrible job at the moment (e.g. full frame buffering instead of line buffering). However, the huge reduction in lag randomization from refresh-rate-granularity round-off (a 1/240sec random window rather than a 1/144sec random window) more than compensates, even if the absolute lag were almost identical (i.e. a 240Hz monitor handicapped to 144Hz lag).
Notty_PT wrote:
> But is it really worth it to spend 500 bucks or more on something that has a 0,45ms advantage?

Don't forget to account for lag randomization from refresh rate granularity.
Even when the minimum lag is equal, the reduced lag randomness still causes 240Hz to win. The charts clearly show a much narrower MIN/MAX/AVERAGE spread for higher refresh rates at the same frame rates.
To me, the aiming-accuracy difference feels like jumping from a 125Hz mouse (8ms poll interval) to a 500Hz mouse (2ms). Not quite the 125Hz->1000Hz mouse jump, but it's a quite substantial reduction in the lag-randomization factor from refresh rate granularity.
From tests of Acer XB252Q running CS:GO
At 1000fps@60Hz VSYNC OFF, tests had a spread of 14ms-thru-27ms (a 13ms random-lag spread)
At 1000fps@144Hz VSYNC OFF, tests had a spread of 12ms-thru-18ms (a 6ms random-lag spread)
At 1000fps@240Hz VSYNC OFF, tests had a spread of 12ms-thru-14ms (a 2ms random-lag spread)
Averaged lag from 40 mouse button presses in CS:GO.
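The narrowing spreads are what you'd expect if you model the refresh round-off as a uniformly random wait. A toy Monte Carlo sketch (not the actual Blur Busters methodology; the fixed 12ms base latency and the uniform model are assumptions for illustration):

```python
import random

random.seed(1)  # reproducible toy run
BASE_MS = 12.0  # assumed fixed button-to-pixel pipeline latency

# Model each click's lag as the fixed base latency plus a uniformly
# random wait for the next refresh cycle, then report the MIN/MAX
# spread over 40 simulated presses (mirroring the 40-press test).
def simulate_spread(hz: float, presses: int = 40) -> tuple[float, float]:
    cycle_ms = 1000.0 / hz
    samples = [BASE_MS + random.uniform(0.0, cycle_ms) for _ in range(presses)]
    return min(samples), max(samples)

for hz in (60, 144, 240):
    lo, hi = simulate_spread(hz)
    print(f"{hz:>3} Hz: {lo:.1f} ms .. {hi:.1f} ms (spread {hi - lo:.1f} ms)")
```

The simulated spread shrinks toward the refresh interval (roughly 17ms, 7ms, and 4ms), the same pattern as the measured 13ms/6ms/2ms spreads above.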
True, 1000fps is an extreme case, but it actually biases toward tighter spreads. Lag spreads tend to be bigger at lower frame rates, because bigger rendertime and netcode fluctuations can amplify the lag-randomization effect.
For this test, that's far more passes than other websites use -- and this is real button-to-pixels latency in CS:GO rather than a synthetic lag test via a Leo Bodnar device. Sure, you get the same 12ms MIN lag (so in one sense you are right). BUT.... look at the MAX too: a 4ms lag savings over 144Hz. Tiny for a consumer, but big for eSports/competitive players.
mello wrote:
> This is completely inaccurate.

Just to clarify a minor semantic.... it's the MIN/MAX/AVERAGE that we need to look at.
Notty_PT had somewhat of a (weak) point when it comes to MIN lag.
We got 12ms vs 12ms for 144Hz-vs-240Hz for the min-lag measurement.
BUT, we must not fail to look at MIN/MAX/AVERAGE and the lag-randomization factor.
When you look at the lag spread (the lag-randomization factor) and the max lag, the real picture makes 240Hz a really big winner. Lag randomization for the win -- as long as the min is not worse (which it clearly is not).
Not everyone realizes that refresh rate granularity injects a lag-randomization factor into real games. The Leo Bodnar device is the SiSoft Sandra of lag tests: a synthetic benchmark that, albeit useful, still ignores real-world behaviour. Real lag testing is actual in-game CS:GO lag testing.