
Lag measurement standardization & disclosure ideas, as discussed above, are among topics discussed.
They are not the only ones I talk to. Blur Busters will be helping pave the way for improved disclosure of lag-testing methodology.
Very good comments. Lots of thoughts.

StrobeMaster wrote:
Yet, one should keep in mind that GtG10% or GtG50% usually refer to momentary luminance levels. It is not that, at the time the luminance curve crosses the 10% mark, we can actually perceive a 10% gray already; the overall luminance profile has to be taken into account as well. This becomes especially problematic when using the numbers to compare screens with very different luminance profiles (CRT vs. LCD/continuous backlight vs. LCD/strobed). Moreover, detecting some white digits on a black background is rather different from any real-life situation. All in all, it is very difficult to come up with a perceptually meaningful measure, which doesn't mean we shouldn't try (see also https://display-corner.epfl.ch/index.ph ... #Input_lag).

Chief Blur Buster wrote:
Nonetheless, GtG lag is sometimes highly subjective (human sensitivity varies), but the best industry standard for lag stopwatching is either GtG10% or GtG50%.
GtG10%, because the photons are visible by then: 10% of the way from black to white is a dark gray, which is still visible. Manufacturer GtG response measurements are also typically taken from the 10% point through the 90% point, so the GtG10% threshold lines up well with that convention.
GtG50%, because it is a fair midpoint and is usually extremely close to GtG10% on many monitors (within 1ms), so it stays reasonably close to human subjectivity.
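
To make the threshold part of this concrete, here is a minimal sketch (Python with NumPy; the function name and the synthetic transition curve are purely illustrative, not any lab's actual tooling) of how a GtG10% or GtG50% crossing time can be read off a photodiode capture of a black-to-white transition. A lag stopwatch would stop at that crossing time.

Code:
import numpy as np

def gtg_crossing_time(t_ms, luminance, threshold=0.10):
    """Return the time at which a black-to-white transition first crosses
    `threshold` (0.10 for GtG10%, 0.50 for GtG50%) of the normalized
    luminance swing. `t_ms` and `luminance` are photodiode samples."""
    t = np.asarray(t_ms, dtype=float)
    lum = np.asarray(luminance, dtype=float)

    # Normalize: 0.0 = settled black level, 1.0 = settled white level
    # (estimated here from the first and last few samples).
    lo, hi = lum[:10].mean(), lum[-10:].mean()
    norm = (lum - lo) / (hi - lo)

    if not np.any(norm >= threshold):
        return float("nan")          # transition never reaches the threshold
    idx = int(np.argmax(norm >= threshold))
    if idx == 0:
        return float(t[0])           # already above threshold at the first sample

    # Linearly interpolate back to the exact crossing time.
    frac = (threshold - norm[idx - 1]) / (norm[idx] - norm[idx - 1])
    return float(t[idx - 1] + frac * (t[idx] - t[idx - 1]))

# Example with a synthetic LCD-like curve sampled at 0.1 ms:
# black until t = 5 ms, then an exponential rise toward white.
t = np.arange(0.0, 25.0, 0.1)
lum = np.where(t < 5.0, 0.0, 1.0 - np.exp(-(t - 5.0) / 3.0))
t10 = gtg_crossing_time(t, lum, 0.10)
t50 = gtg_crossing_time(t, lum, 0.50)
print(f"GtG10% response: {t10 - 5.0:.2f} ms, GtG50% response: {t50 - 5.0:.2f} ms")

The same crossing routine is what makes GtG10% and GtG50% numbers comparable across monitors, provided the black and white settling levels are estimated consistently.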
Blur reduction, indeed, sometimes gains a competitive advantage that outweighs its input lag (in certain games, usually OTHER than CS:GO). Sometimes it does not. You just have to know how to use the blur reduction mode in a way that its benefit outweighs its tiny added lag.

dhaine wrote:
That's a good question. According to TFT Central, the Acer XB270HU would be better for a purely competitive gamer (so not taking blur reduction into account) at 144Hz than even 240Hz monitors? Well, the lower motion blur at 240Hz, even without a blur reduction mode, is a bonus that probably gains an edge over the 1.5ms difference in their chart (Acer XB270HU vs. PG258Q).
Thanks for sharing - made me change my mind.

hkngo007 wrote:
Hello,
Just wanted to share my personal experience on this matter as I have:
...
I'm not known as a proponent of the 1st-gen 240Hz panels (a bit of the opposite, in fact; it seems to me they've been rushed to market in a somewhat half-baked state). But I do have my reservations when it comes to people thinking they are noticeably affected by a 2ms or 3ms latency difference. There are other factors to consider: screen size, resolution, and image quality can all affect aiming.

StrobeMaster wrote:
Thanks for sharing - made me change my mind.

hkngo007 wrote:
Hello,
Just wanted to share my personal experience on this matter as I have:
...
Hmm. How tolerant are VT-tweakable 240Hz monitors regarding vertical front/back porch?

RealNC wrote:
If the differences are indeed down to a 3ms latency difference, then you should get the same results by using the same monitor but introducing an artificial 3ms penalty. If you then don't get the same results (and I doubt you will), then it's not the latency difference of the monitors that is to blame. I'm not sure how one could set up such a test, though, and do so using an ABX method (where you don't know whether you are currently playing with a 3ms disadvantage or not).
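
To make that suggestion concrete, here is a minimal Python sketch of the randomization and scoring side of such a blind test (strictly an A/not-A design rather than a full ABX). The play_round hook is purely hypothetical: actually injecting a real 3ms input penalty is the hard, system-specific part (input proxy, frame queue tweak, etc.) and is not shown here.

Code:
import random
from math import comb

# Hypothetical hook: wire this into your input pipeline so that, when
# extra_delay_ms > 0, every input event is delayed by that amount before
# the game sees it. This stub only marks where that would happen.
def play_round(extra_delay_ms: float) -> None:
    input("Play a round now, then press Enter when ready to guess... ")

def run_abx(trials: int = 20, penalty_ms: float = 3.0) -> None:
    correct = 0
    for _ in range(trials):
        delayed = random.random() < 0.5          # blind assignment per trial
        play_round(penalty_ms if delayed else 0.0)
        guess = input("Was the extra delay ON this round? [y/n] ").strip().lower() == "y"
        correct += (guess == delayed)
    # One-sided binomial p-value: chance of scoring at least this well by luck.
    p = sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials
    verdict = "likely detectable" if p < 0.05 else "consistent with guessing"
    print(f"{correct}/{trials} correct, p = {p:.3f} ({verdict})")

if __name__ == "__main__":
    run_abx()

With 20 trials, roughly 15 or more correct guesses would be needed before the result stops looking like chance, which is the point of doing it blind.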