Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, Turbo240, ToastyX Strobelight, etc.
spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by spacediver » 13 Jan 2014, 00:16

Chief Blur Buster wrote: At 18ms, the added lag against a zero-buffered display such as the VG248QE in non-LightBoost mode (~3ms) is relatively large. I am of the camp that pro competitive gamers can notice latency differences. There's a "muscle memory" effect (aka preconditioned behavior) involved here.

Gamers get used to a specific amount of input lag. For example, doing fast flicks to aim crosshairs quickly on an enemy. If you are playing competitively and a fast 180-degree flick is 4000 pixels per second, that's mathematically 4 pixels per millisecond. A 10 millisecond latency difference can mean a 40 pixel aim overshoot or undershoot at a 4000 pixel per second aiming speed (two screen widths per second of panning, e.g. a fast 180-degree flick followed by aiming crosshairs on a small faraway target behind you). So once a pro gamer gets used to a specific gaming setup, they aim based on the preconditioned input lag. When they upgrade, it takes time getting used to the new latency delta. It's easier to get used to smaller latency than to downgrade to higher latency. I wholly believe competitive gamers when they say they can feel 5ms latency differences -- that's a 20 pixel overshoot/undershoot at a 4000 pixel/second aiming speed.
I'm trying to wrap my head around this.

Suppose there is absolutely no lag anywhere in the chain from mouse output (move, or button press) to visual rendition of the event.

Suppose that someone moves the mouse such that 1000 pixels are panned over in one second, and then fires at the end of this motion.

Now add a 100 ms delay between the time that the display receives the information, and the time that it renders it. Remember, as far as the "game world" is concerned, nothing has changed.

Now if someone moves their mouse in the same way (1000 pixels in one second, and then fires), everything will happen the same way in the game world, but the player will visually perceive these events occurring with a 100 ms offset relative to her mouse movements.

I would imagine that this delayed feedback would compromise performance where "feedback control" is required, rather than "feed forward" situations.

With feedforward (pre-programmed) movements, such as a quick 180 degree "muscle memory" flick, I don't see how display lag would affect anything.

With a feedback control situation, where one needs to adjust their movements in real time in response to incoming information, I can see how performance can suffer, and things like overshooting can occur.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Chief Blur Buster » 13 Jan 2014, 00:49

spacediver wrote:With a feedback control situation, where one needs to adjust their movements in real time in response to incoming information, I can see how performance can suffer, and things like overshooting can occur.
Very good points, but what I'm saying and what you are saying aren't mutually exclusive at all. There are so many different interaction, vision, and feel factors occurring simultaneously. Mine is just one example of many that attempts to educate the layman into understanding how on earth milliseconds could possibly matter -- once one fathoms how fast mouse 180-degree turns can be (e.g. 4000 pixels per second), it suddenly makes sense that a 10ms lead translates to 40 pixels, which can mean the difference between crosshairs on target and crosshairs off target. Obviously, many of my examples are hugely simplified.
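That pixels-per-millisecond arithmetic is easy to sanity-check with a few lines of Python (a back-of-envelope sketch; the flick speed and lag numbers are just the illustrative figures from above, not measurements):

```python
# Back-of-envelope: crosshair error caused by an unaccounted-for lag change
# during a fast flick. Numbers are the illustrative figures from this post.

def aim_error_pixels(flick_speed_px_per_s: float, lag_delta_ms: float) -> float:
    """Pixels the view keeps panning during the extra (or missing) lag."""
    return (flick_speed_px_per_s / 1000.0) * lag_delta_ms

speed = 4000.0  # px/s -- roughly two screen widths per second of panning
print(aim_error_pixels(speed, 10.0))  # 10ms delta -> 40.0 px overshoot/undershoot
print(aim_error_pixels(speed, 5.0))   # 5ms delta -> 20.0 px
```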

There are multiple factors at play, as you can see:

(1) You have visual feedback of things coming into view from the edge of the screen as you do a fast 180-degree flick. You know when to begin decelerating your mouse turn based on when things pop into view (landmarks, enemies, etc). If your flicks are fine-tuned to this, then changes to input lag will cause you to overshoot/undershoot more and require more corrective movements. There's definitely a memorized element to a mouse flick based on memorized mouse distance (try flicking with eyes closed and see how close you get to 180 degrees) -- but it's always more accurate with eyes open, so there's a visual element too: gamers compensate by accelerating/decelerating the mouse based on what they're seeing. And if there's a significant enough lag change, it's definitely going to interfere and cause you to overshoot.

-AND-

(2) Even corrective movements are fine-tuned to a habituated input lag. You rarely center crosshairs in one swipe; you often have to move back slightly to aim. The number of corrections you need to do slows you down from killing the enemy before the enemy shoots at you. You move the mouse back and forth until things are centered in the crosshairs. If you are used to a specific latency, you do these corrective behaviors a specific way. 5ms-10ms differences in input lag can increase swervy overshoot/undershoot behavior if your reactions aren't yet tuned to the new lag.

---

It's not too metaphorically different from steering wheel lag -- and swerving on a road, back and forth, back and forth -- witness the accidents that occur when you swerve. For example, while driving you suddenly see an obstacle; you successfully swerve, but then you go into a deadly, amplifying back-and-forth swerving until you crash. You over-corrected, often because of a rapid and unexpected change in steering response when the car reaches the very edge of its limits (e.g. change of surfaces, loss of traction, skidding, slipping, sliding, etc -- essentially a sudden change in response).

Twitch gaming is actually a millisecond-timescale manifestation of this behavior. Your mouse overcorrection/undercorrection takes slightly longer or shorter; e.g. a 5ms or 10ms change in input lag can cause a cascade of slower reactions as you try to quickly aim (and keep overshooting/undershooting, because of the lag change relative to what you're used to).

It happens at a micro scale: your usual reaction time can sometimes cascade much bigger (e.g. become 50ms slower) with the addition of a 10ms lag, because of the amplified mouse overshoots/undershoots.
BEFORE EXAMPLE: Aim fast -- you overshoot 60 pixels, move back, you overshoot 10 pixels, move back, you fire, blam.
AFTER EXAMPLE: Add 10ms lag and it wrecks your aiming. Now aim fast, and while aiming your crosshairs you end up overshooting 80 pixels, move back, overshoot 20 pixels, move back, overshoot 5 pixels, move back, you fire, blam.

The numbers are examples, but you can now understand the cascade effect. So one tiny +10ms lag caused a cascade of, say, +50ms (maybe +30ms, maybe even +100ms or +200ms) that one time, because you weren't used to the change in lag. You need to re-practice in order to get used to the latency change, especially if the change is in the worse direction (e.g. slower response).
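The overshoot-correct-overshoot cascade can be mocked up as a toy loop. Purely illustrative, not a model of real aiming: the per-swipe overshoot ratios are assumptions, standing in for "tuned to familiar lag" versus "thrown off by an unfamiliar +10ms":

```python
# Toy correction-cascade model: each corrective swipe flips past the target
# by some fraction of the remaining error. The ratios below are assumptions.

def swipes_to_land(initial_error_px: float, overshoot_ratio: float,
                   tolerance_px: float = 2.0) -> int:
    """Count corrective swipes until the crosshair is within tolerance."""
    error = initial_error_px
    swipes = 0
    while abs(error) > tolerance_px:
        error = -error * overshoot_ratio  # overshoot past the target again
        swipes += 1
    return swipes

print(swipes_to_land(60.0, 0.25))  # familiar lag: 3 swipes
print(swipes_to_land(60.0, 0.40))  # unfamiliar lag: 4 swipes
```

Each extra corrective swipe costs human reaction/movement time, which is how a 10ms lag change can balloon into a much larger effective delay.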

I have noticed cascade effects in my mouse aiming when I add a sudden +50ms. I still notice a sudden +10-20ms change in lag. I'm not a competitive player. My personal experience leads me to believe that 5ms-10ms makes a perceptible difference to an elite few, in slowing down their reaction time.

This all really needs to be scientifically measured, but there have been enough anecdotes from competitive gamers, and my rough napkin math shows quite a lot of logic in their complaints; but I imagine government-funded science doesn't like studying video-game-related stuff like VSYNC OFF, or how mouse crosshair overshoot/undershoot behavior changes over micro timescales, etc. Perhaps a few university/college students would like to take on a few theses based on what they read here on Blur Busters ;) Especially as part of an education targeted towards developing for the game industry...

I would bet, however, that 5ms and 10ms latency differences matter a huge deal to elite fast-twitch gamers in Quake Live / CS:GO type games.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by spacediver » 13 Jan 2014, 01:25

Chief Blur Buster wrote: It's not too metaphorically different from steering wheel lag -- and swerving on a road, back and forth, back and forth -- witness the accidents that occur when you swerve. For example, while driving you suddenly see an obstacle; you successfully swerve, but then you go into a deadly, amplifying back-and-forth swerving until you crash. You over-corrected, often because of a rapid and unexpected change in steering response when the car reaches the very edge of its limits (e.g. change of surfaces, loss of traction, skidding, slipping, sliding, etc -- essentially a sudden change in response).
This is exactly how I felt when I tried using the lightning gun in Quake Live with vsync on.

great post!

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Chief Blur Buster » 13 Jan 2014, 01:45

I think Blur Busters needs an "Input Lag" subforum; I think we will be adding one this year.

Arbaal
Posts: 9
Joined: 12 Jan 2014, 16:18

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Arbaal » 13 Jan 2014, 05:32

Chief Blur Buster wrote:Thanks for your feedback! Where did you get your EIZO?
I got my EIZO from Amazon Germany. It seems the monitor came from an EIZO distribution center in Düsseldorf and was sent by air from Japan before that.
Chief Blur Buster wrote:At 18ms, the added lag against a zero-buffered display such as the VG248QE in non-LightBoost mode (~3ms) is relatively large. I am of the camp that pro competitive gamers can notice latency differences. There's a "muscle memory" effect (aka preconditioned behavior) involved here.
Yes, I also think that there are people who can perceive input lag, even in the lower regions. But I find it very hard to believe that there are people who can perceive latency differences in the single-digit ms range. The human perception of "time" is rather tricky, since our cerebral cortex has input lag AND output lag of its own. I don't think that even conditioned reflexes can get to a reaction time below 100ms (and I'm not really sure that "gaming reflexes" are truly conditioned reflexes either).

I think there is little to no scientific data right now on whether AND how much motion-to-photon lag (thanks to Palmer Luckey for bringing this great term into the general discussion!) influences behavior in gaming/competitive e-sports. For now I would put players' stories in the "anecdotal evidence" category.

What I would say is that the EIZO FG2421 is definitely in the category of lag where the majority of even competitive gamers can't tell the difference from a CRT at 120 Hz anymore. Nonetheless, I acknowledge that some people might "feel" the lag of 18ms. A blind test would be a great idea... but pulling that off would be kinda hard, since educated people (and I include competitive gamers in this category) can surely tell displays and display technologies apart just by looking at the picture itself (which would bias the blind test). It might be best to run a blind test with a CRT and artificially increase the lag for testing purposes to get better data.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by RealNC » 13 Jan 2014, 07:14

The thing about input lag isn't that you feel it. It's that it hurts you even if you don't feel it at all. On LAN, where network latency is almost non-existent, the reaction times of the players are of paramount importance, and those reaction times are pretty close together. On average, the guy with less input lag is going to frag first, even if none of the players can feel any input lag. Those 5ms less mean that the player starts reacting sooner, even if no one can feel a 5ms difference.

Professional players are so obsessed with input lag for a good reason.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Arbaal
Posts: 9
Joined: 12 Jan 2014, 16:18

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Arbaal » 13 Jan 2014, 12:53

RealNC wrote:The thing about input lag isn't that you feel it. It's that it hurts you even if you don't feel it at all. On LAN, where network latency is almost non-existent, the reaction times of the players are of paramount importance, and those reaction times are pretty close together. On average, the guy with less input lag is going to frag first, even if none of the players can feel any input lag. Those 5ms less mean that the player starts reacting sooner, even if no one can feel a 5ms difference.

Professional players are so obsessed with input lag for a good reason.
Yes, sure, but you must also admit that this is purely anecdotal evidence. You can't really have a discussion based on this kind of argument. Reproducible data/showcases would be great for this, like our host did with this whole Blur Busters page (which showed me the importance of 120Hz and low-persistence displays).
Last edited by Arbaal on 13 Jan 2014, 13:16, edited 1 time in total.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Chief Blur Buster » 13 Jan 2014, 13:14

RealNC wrote:The thing about input lag isn't that you feel it. It's that it hurts you even if you don't feel it at all. On LAN, where network latency is almost non-existent, the reaction times of the players are of paramount importance, and those reaction times are pretty close together. On average, the guy with less input lag is going to frag first, even if none of the players can feel any input lag. Those 5ms less mean that the player starts reacting sooner, even if no one can feel a 5ms difference.

Professional players are so obsessed with input lag for a good reason.
Yep, I've mentioned this too. The "cross-the-finish-line-first" factor (the "shoot-first" factor): nearly equally matched players in identical environments. Like a 100-meter Olympic sprint, unfelt milliseconds can win a close race; sprinters don't even know who won until they see the scoreboard.

It's also been pointed out that the low tick rates found in some games should make a millisecond not matter, but even at a low game tick rate (e.g. 100Hz), a single millisecond of savings means a ~10% chance of rounding off to the previous game tick. A 5ms savings can mean a 50% chance of rounding your "fire" button press back to the previous game tick, if both players shoot simultaneously.
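The tick-rounding claim follows from a simple uniform-phase assumption: if your button press lands at a random point within a 10ms tick interval, shaving N ms off your latency moves the press into the previous tick with probability N/10. A hedged sketch (a simplification, not a model of any particular game's netcode):

```python
# Chance that a latency saving moves an input into an earlier game tick,
# assuming the input lands at a uniformly random phase within the tick
# interval. Assumption for illustration only.

def earlier_tick_probability(savings_ms: float, tick_rate_hz: float) -> float:
    tick_ms = 1000.0 / tick_rate_hz  # 100Hz tick rate -> 10ms per tick
    return min(savings_ms / tick_ms, 1.0)

print(earlier_tick_probability(1.0, 100.0))  # -> 0.1 (~10% at 100Hz ticks)
print(earlier_tick_probability(5.0, 100.0))  # -> 0.5 (50% at 100Hz ticks)
```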

Black Octagon
Posts: 216
Joined: 18 Dec 2013, 03:41

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by Black Octagon » 13 Jan 2014, 14:38

Mark, I really love this thread. But it's making the wait for your review all the more painful, lol :)

I don't wish to nag - far from it - but do you still intend to post a formal review of the Eizo? Something tells me that this screen is one of the best all-round choices for gamers right now

Sent from dumbphone (pls excuse typos and dumbness)

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Moving from CRT to Eizo FG2421 [EIZO's strobed monitor]

Post by spacediver » 13 Jan 2014, 15:44

Arbaal wrote: Yes, I also think that there are people who can perceive input lag, even in the lower regions. But I find it very hard to believe that there are people who can perceive latency differences in the single-digit ms range. The human perception of "time" is rather tricky, since our cerebral cortex has input lag AND output lag of its own. I don't think that even conditioned reflexes can get to a reaction time below 100ms (and I'm not really sure that "gaming reflexes" are truly conditioned reflexes either).
I'm not sure how relevant reaction time studies are to the ability to perceive temporal differences. Just because it takes about 100 ms for information to be processed by our visual system, and for a motor response to be generated, doesn't mean that we cannot detect a small amount of lag in the system.

For example, it has been shown that musicians can detect the difference between a 50 ms continuous tone and a 57.6 ms continuous tone (see Table 1: http://www.iapsych.com/iqclock2/LinkedD ... er2006.pdf ). That shows they are sensitive to differences as small as roughly 7-8 ms in that particular task.

Furthermore, even if one isn't able to consciously notice a difference, that doesn't mean performance is not impacted.

It should be clear that even a tiny amount of lag, at any point in the chain, can cause a deterioration of performance (with the degree of deterioration depending on the degree of lag).
