nimbulan wrote:With all this discussion about vsync on vs off input lag (250 fps off vs 125 fps on,) two questions keep coming to mind: How does the decreased input lag benefit the player when it is not visible?
Even if you can't see the input lag, the improved responsiveness can still be felt. It's very possible to feel the difference between 125fps and 250fps, in the form of better accuracy, snappier response, and fewer errors:
(1) Fast 180-degree flicks can involve a mouse moving at 4000 pixels/second (example). At 250fps, the aliasing error is 1/250th of that rather than 1/125th of that (e.g. a 16-pixel overshoot rather than a 32-pixel overshoot; see the arithmetic sketch after this list).
(2) Microstutters during VSYNC OFF are smaller at 250fps than at 125fps. Even during a slow screen panning speed of half a screenful per second (~1000 pixels/second), a 1/125 microstutter can mean up to 8 pixels of onscreen-object-versus-eye-tracking mispositioning (1000 / 125 = 8), while a 1/250 microstutter halves that to 4 pixels (1000 / 250 = 4). Reduced microstutter means better accuracy during fast-aiming maneuvers.
(3) The "snappier" factor: A human often gets used to aiming with a specific input lag. For example, flicking 180 degrees and then aiming crosshairs on a target quickly. You get used to a specific response speed. If you suddenly add or subtract 5ms of input lag, this can manifest itself as slight increase in aiming errors (e.g. overshooting by, say 16 pixels versus overshooting by 32 pixels, before re-aiming). You now need to get used to the equipment, so equipment changes mid-game can throw you off because your aiming got used to a specific input lag. It's much easier to adapt when input lag is falling, than when input lag is rising, however. Not all games increase snappiness during higher frame rates, but a lot of games do, especially when input reads are very close to the rendering.
(4) It's almost negligible, but the "feel" of better aiming is widely reported by huge numbers of competitive gamers during ultrahigh framerates. Personally, I have been able to determine that 300fps VSYNC OFF does feel "snappier". I overshoot my targets a little less when I flick 180 degrees and instantly aim right afterward. Such rapid maneuver sequences push the limits of "snappiness". 32 pixels (4000 pixels/sec divided by 125) on my monitor is the width of a pinky finger -- the difference between correctly aimed and not correctly aimed.
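For the arithmetic-minded, here is a minimal sketch of the numbers used in items (1) and (2) above. It assumes the worst-case positional error during VSYNC OFF is roughly one frametime's worth of motion; real-world error also depends on the game engine and input pipeline:

```python
# Worst-case motion error within a single frame (a simplification):
# error = motion speed divided by framerate.

def per_frame_error_px(speed_px_per_sec: float, framerate_fps: float) -> float:
    """Pixels of motion occurring within one frametime."""
    return speed_px_per_sec / framerate_fps

# (1) Fast 180-degree flick at 4000 pixels/second:
print(per_frame_error_px(4000, 125))  # 32.0 px potential overshoot
print(per_frame_error_px(4000, 250))  # 16.0 px potential overshoot

# (2) Slow panning at ~1000 pixels/second:
print(per_frame_error_px(1000, 125))  # 8.0 px microstutter range
print(per_frame_error_px(1000, 250))  # 4.0 px microstutter range
```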
Based on personal experience, and the experience of competitive gamers, Blur Busters completely believes in the scientific basis of "5ms does matter to some". The key word is "some". It usually does not matter to most, but it's perceptible, useful, and non-negligible. In the amateur leagues, differences in skill level and consistency vary so much that this is not measurable. Also, many game engines vary a lot in input lag, so milliseconds of input lag difference are lost in the noise floor of variability (skill, networking, etc).
Scientific tests are sorely needed, but public funding rarely pays for impartial video game tests -- how would you like your taxpayer funds to pay for scientific video game studies? Also, many websites don't yet have the scientific equipment to test these aspects (while Blur Busters is slowly figuring out ways to measure this over the coming months/years).
nimbulan wrote:That is, you still need to wait the entire screen refresh to see the result so my understanding is that the perceived input lag can never be reduced below the screen refresh rate.
Depends on when the input read occurs relative to current screen-scanout timing.
During the 1980's, sub-frame input lag was possible if you did input reads a few raster lines above a sprite (e.g. a karate game), then moved the sprite based on the input read, and the sprite would display at the new position within the same refresh. But mid-refresh input reads are not routinely done today (except in certain games, and only during VSYNC OFF, especially when framerates run hugely in excess of the refresh rate).
Displays refresh top-to-bottom (raster scanning behavior). Theoretically, you can do an input read while the display is refreshing Scan Line #200, have an ultrafast GPU finish rendering before the display finishes refreshing Scan Line #500, immediately begin transmitting the frame, and finally have the reaction occur where the crosshairs are (Scan Line #540 -- the vertical center of a 1080p screen), which is where the human game player is staring. That assumes a zero-buffered display (e.g. CRT or ultrafast TN gaming LCD).
So nothing stops game makers from achieving sub-frame input latencies in a highly optimized game engine, if input reads happen right at render time. In real life, other latency sources get in the way (e.g. mouse latency, game code latency, display cable transmission latency such as DisplayPort micropackets, LCD pixel response speed, etc), but sub-frame input lag remains achievable in principle, as the timing sketch below shows.
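To put rough numbers on the scanline example (a sketch assuming standard CEA-861 1080p60 signal timing of 1125 total scanlines per refresh, blanking included; actual timings vary by display mode):

```python
# How much time passes between an input read at Scan Line #200 and
# photons at Scan Line #540, for standard 1080p60 signal timing.

REFRESH_HZ = 60
TOTAL_SCANLINES = 1125   # 1080 visible + 45 blanking lines (CEA-861)
line_time_sec = 1.0 / (REFRESH_HZ * TOTAL_SCANLINES)   # ~14.8 microseconds

input_read_line = 200    # input read while display refreshes this line
reaction_line = 540      # crosshairs at the vertical center of 1080p

budget_ms = (reaction_line - input_read_line) * line_time_sec * 1000
print(f"Render + transmit budget: {budget_ms:.2f} ms")          # ~5.04 ms
print(f"Full refresh time:        {1000 / REFRESH_HZ:.2f} ms")  # 16.67 ms
```

Roughly 5ms from input read to photons hitting your eyes -- well under the 16.7ms full refresh, i.e. sub-frame latency.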
nimbulan wrote:Now I do understand that the game client could respond, so pressing the trigger could allow your gun to fire 1/250 faster, but that brings me to my second question:
nimbulan wrote:Why does that matter if the game server can't even utilize input that quickly? Game servers are generally designed to run at a lower framerate due to performance/bandwidth concerns. Even Counter-Strike has been historically played at a tickrate of 100 for competitive play, with the new Counter-Strike: Global Offensive capping out at 128 (though many servers run at 64.) I'd be very surprised if other games such as Battlefield run any higher than 64 ticks. Do Quake Live servers actually function at 250 ticks or is it just the client that is unlocked to 250?
Reacting 1ms faster can put your input read into the previous tick instead of the next tick. So even at 100 ticks per second (10ms per tick), every 1ms of improvement in input lag adds a 1-in-10 chance (for a randomly timed event) of rounding off to an earlier tick, as the simulation below illustrates.
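A minimal Monte Carlo sanity check of that probability, assuming event timing is uniformly random relative to tick boundaries:

```python
# Chance that shaving 1 ms off input lag lands an input in an earlier tick.
import random

TICK_MS = 10.0         # 100 ticks per second
IMPROVEMENT_MS = 1.0   # input lag reduced by 1 ms
TRIALS = 1_000_000

earlier = 0
for _ in range(TRIALS):
    t = random.uniform(0.0, TICK_MS)   # event offset within its tick
    if t - IMPROVEMENT_MS < 0.0:       # crosses into the previous tick
        earlier += 1

print(earlier / TRIALS)  # ~0.10, i.e. a 1-in-10 chance
```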
It's my understanding that Quake Live runs at 125 ticks per second, even at 250fps.
The delta between a 1/125sec frametime (8ms) and a 1/250sec frametime (4ms) is 4ms. I'm not sure what techniques Quake Live uses to level the playing field between 125fps and 250fps, but let's say they successfully levelled it. In that case, there's no input lag difference at the game server side. But faster hardware will bypass this, e.g. upgrading from a 60Hz to a 120Hz display, or even from a 120Hz to a 240Hz (cirthix-style) display, or upgrading your mouse from 125Hz to 1000Hz polling (8ms -> 1ms poll interval).
This affects your apparent reaction time independently of the game server, and you do still get increased chances of your input read being rounded off to the previous tick.
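Putting the intervals behind those upgrade examples side by side (simple arithmetic, using the rates mentioned above):

```python
# Interval (worst-case added latency) for each rate mentioned above.
def interval_ms(hz: float) -> float:
    return 1000.0 / hz

print(interval_ms(125) - interval_ms(250))                   # 4.0 ms frametime delta
print(interval_ms(60), interval_ms(120), interval_ms(240))   # ~16.7, ~8.3, ~4.2 ms refresh
print(interval_ms(125), interval_ms(1000))                   # 8.0 -> 1.0 ms mouse poll
```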
Even if Quake Live levels latency for 125fps vs 250fps (no game-world latency difference, via playing-field-levelling coding techniques), you still get an advantage from the local responsiveness improvements described above.