joeyjojo123 wrote:My understanding is this:
[*]If you run a display in a non-native resolution, you'll encounter extra display processing lag and possibly display interpolation, leading to a larger overall input lag.
Not necessarily. You might, but scaling can also be done laglessly -- it depends on the monitor. AFAIK, BenQ/Zowie scales laglessly.
joeyjojo123 wrote:[*]This regularly happens even if the input's resolution is a perfect scale of the display's resolution. (ex: 1280x720 input on a native 2560x1440 display or 1920x1080 input on a native 3840x2160 display will encounter additional display processing lag despite integer scaling being possible.)
See above.
joeyjojo123 wrote:[*]If you run a display at a lower refresh rate than its native refresh rate, pixel response times will be higher (than pixel response times seen at its native refresh rate), leading to more motion blur.
There's no difference in GtG response, but MPRT response will go up. These are two completely different millisecond measurements. The majority of GtG (pixel transition time) is pretty much unchanged at all refresh rates on TN gaming monitors, but MPRT (pixel static time) gets longer.
Pixel static time (the sample-and-hold effect) is the bigger cause of motion blur nowadays; see http://www.testufo.com/eyetracking. GtG has far less effect on motion blur nowadays, since the majority of the GtG transition (10%->90%) is a tiny fraction of a refresh cycle, e.g. 1ms GtG in a 16.67ms refresh cycle (60Hz).
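The arithmetic behind "tiny fraction of a refresh cycle" is easy to check yourself. A minimal sketch (the 1ms GtG figure is just an illustrative assumption; real panels vary):

```python
# Sample-and-hold persistence (MPRT) vs GtG as a fraction of a refresh cycle.
# Assumes full-persistence (non-strobed) operation, where MPRT is roughly
# one full refresh cycle.

def refresh_cycle_ms(hz: float) -> float:
    """Length of one refresh cycle in milliseconds."""
    return 1000.0 / hz

def gtg_fraction(gtg_ms: float, hz: float) -> float:
    """GtG transition time as a fraction of one refresh cycle."""
    return gtg_ms / refresh_cycle_ms(hz)

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz: refresh cycle (~MPRT, sample-and-hold) = "
          f"{refresh_cycle_ms(hz):.2f} ms, "
          f"1 ms GtG = {gtg_fraction(1.0, hz):.1%} of the cycle")
```

At 60Hz, a 1ms GtG is only 6% of the refresh cycle, which is why pixel static time dominates the blur.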
joeyjojo123 wrote:[*]If you run a TV as opposed to a monitor, even with "Game Mode" (or the like) enabled, you'll still regularly see higher input lag than a PC monitor equivalent with the same native resolution and refresh rate.
Usually yes. But some of the faster panels are getting pretty good.
joeyjojo123 wrote:[*]If you run a display with anything but TN-based panel technology, pixel response times will be poorer (to an extent) due to nuances related to the panel technology, leading to marginally worse motion blur.
Yes, on VA and IPS displays, GtG becomes a much more significant percentage of a refresh cycle, and begins to contribute more to motion blur. e.g. 5ms GtG on an 8.3ms refresh cycle is quite significant (e.g. 120Hz IPS or VA). That said, MPRT still is the major part of motion blur compared to GtG.
Strobing-based blur reduction shortens pixel static/visibility time, leading to massively better MPRTs, even all the way to CRT-clarity leagues.
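The MPRT-to-blur relationship can be sketched with the common rule of thumb that perceived blur trail width is roughly persistence multiplied by eye-tracked motion speed. The 960 px/sec speed below is just an assumed example value:

```python
# Rule of thumb: perceived motion blur trail width (pixels) is roughly
# pixel visibility time (MPRT, in seconds) times tracked motion speed (px/sec).

def blur_width_px(mprt_ms: float, speed_px_per_sec: float) -> float:
    """Approximate blur trail width for eye-tracked motion."""
    return (mprt_ms / 1000.0) * speed_px_per_sec

speed = 960.0  # px/sec -- assumed example motion speed
print(blur_width_px(16.7, speed))  # ~16 px of blur: 60 Hz sample-and-hold
print(blur_width_px(1.0, speed))   # ~1 px of blur: 1 ms strobe flash
```

This is why shortening pixel visibility time via strobing collapses the blur trail, independently of GtG.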
joeyjojo123 wrote:My understanding would suggest that getting a TN-based 1920x1080 60hz PC monitor would be ideal, but with today's market resulting in only 1920x1080 120hz+ or higher resolution markets as premium, low latency displays, it makes it kind of tough to determine whether running a 1920x1080 240hz/120hz monitor at 60hz (with nominally worse pixel response times) or getting a natively poorer 1920x1080 60hz monitor would lead to a more ideal experience (in terms of pixel response, ghosting, input lag, etc.)
Not necessarily. Done properly, a monitor's ability to do 120Hz or 240Hz can make it a much better 60Hz monitor. For example, if the BenQ XL2540 supported single strobing, it could produce amazingly clear, near-zero-crosstalk images by permitting faster scanouts with longer pauses between refresh cycles. (See the Blur Busters Strobe Crosstalk FAQ for a better understanding of strobe crosstalk.)
A monitor capable of higher bandwidth can more easily do a fast scanout followed by a long pause, e.g. a 1/120sec scanout at 60Hz, creating a 1/120sec idle period between 60Hz refresh cycles. This is fantastic when used with a strobe backlight, because the humongously long blanking interval (e.g. 8.3ms) gives a 1ms GtG time to fully settle (0%-100% rather than 10%-90%) between refresh cycles, for much cleaner strobing. So in theory, a higher-Hz capability means much better-quality 60Hz strobing.
Unfortunately, in the real world, monitor manufacturers artificially limit your ability to enable low-frequency strobing because they think users will hate 60Hz strobing (many do), though others really like 60Hz single-strobing. It's possible to open up a monitor and hack the backlight controller to force it to single-strobe, but that's an exercise left to very advanced electronics hackers.
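The fast-scanout-plus-long-pause arithmetic can be sketched in a few lines (these numbers are illustrative, not tied to any specific monitor's actual timings):

```python
# Sketch: scan out each 60 Hz refresh cycle at a higher-Hz monitor's scanout
# speed, leaving a long idle (blanking) pause for GtG to settle before the
# strobe backlight flash.

def idle_time_ms(refresh_hz: float, scanout_hz: float) -> float:
    """Idle time left per refresh cycle when a panel scans out at
    scanout_hz speed but only refreshes refresh_hz times per second."""
    cycle_ms = 1000.0 / refresh_hz
    scanout_ms = 1000.0 / scanout_hz
    return cycle_ms - scanout_ms

print(idle_time_ms(60, 120))  # 1/120 sec scanout at 60 Hz -> ~8.3 ms pause
print(idle_time_ms(60, 240))  # 1/240 sec scanout at 60 Hz -> ~12.5 ms pause
```

The longer the idle pause, the more completely GtG finishes in the dark before the strobe flash, which is exactly what reduces strobe crosstalk.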
joeyjojo123 wrote:Additionally, are there even any monitors that support ULMB for 60hz inputs?
Yes. Currently they're the BenQ XL2420Z, XL2720Z, XL2411Z, and XL2420G (the pre-Zowie rebrands), and they work with the Blur Busters Strobe Utility.
joeyjojo123 wrote:I have found resources such as this:
https://displaylag.com/display-database/, but I cannot vouch for its accuracy. It reports IPS-type monitors as having 9ms of input lag, which while possible as IPS panels result in worse pixel response times
They only test lag at 60Hz because their lag tester only supports 1080p 60Hz. Lag is lower at higher refresh rates.
<ADVANCED LAG ERROR FACTORS>
There are many lag measurement methodologies:
- Lag of VSYNC ON
- Lag of VSYNC OFF
- Lag of GSYNC or FreeSync
- Lag of various refresh rates
- Lag of various frame rates
- Lag to top edge of screen (always lowest lag)
- Lag to center of screen (usually higher during VSYNC ON and VRR)
- Lag to bottom edge of screen (usually higher during VSYNC ON and VRR)
- Lag to beginning of GtG (a little bit of a cheat, not human-visible at this point)
- Lag to middle of GtG (preferred scientific compromise, for photodiode methods)
- Lag to end of GtG ("Better Safe Than Sorry" method; TFTCentral uses this; Add GtG to lag numbers)
- Differential lag (increase of lag relative to a CRT), ala SMTT
Different lag testers use different measurement methodologies, which makes lag comparisons between websites an apples-versus-oranges exercise.
As far as I know, DisplayLag measures to screen centre. During a 60Hz refresh cycle, it takes 16.67ms to scan out from top to bottom, so half of that is 8.3ms. Measuring 9ms of lag is therefore amazingly low for "IPS panel, Leo Bodnar method, screen centre", once you understand the variables they use. DisplayLag measurements are accurate, BUT they cannot be directly compared to TFTCentral numbers, because of the different lag measurement methods.
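The scanout portion of that number is simple to reconstruct (this simplification assumes scanout takes the full refresh cycle, i.e. it ignores the blanking interval):

```python
# Sketch: the part of measured lag attributable purely to scanout position,
# assuming a top-to-bottom scanout that takes one full refresh cycle
# (blanking interval ignored for simplicity).

def scanout_lag_ms(hz: float, screen_fraction: float) -> float:
    """Scanout delay to a point at screen_fraction
    (0.0 = top edge, 0.5 = centre, 1.0 = bottom edge)."""
    return (1000.0 / hz) * screen_fraction

print(scanout_lag_ms(60, 0.5))  # centre of a 60 Hz screen: ~8.3 ms
print(scanout_lag_ms(60, 1.0))  # bottom edge of a 60 Hz screen: ~16.7 ms
```

So a 9ms "screen centre" reading at 60Hz implies well under 1ms of lag beyond the unavoidable scanout delay.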
Bottom line: TFTCentral does a good job. DisplayLag does a good job. However, you must interpret their numbers slightly differently because of the above factors -- measuring via photodiode/Leo Bodnar is very hard to compare directly to measuring via SMTT 2.0.
SMTT lag numbers (ONLY at 60Hz, ONLY at 1080p) are very, very roughly (barely) comparable to Leo Bodnar TOP EDGE numbers only (but DisplayLag prominently publishes SCREEN MIDDLE measurements, Leo Bodnar method). Add roughly a +/-(1/2 GtG) error margin when comparing TFTCentral numbers to DisplayLag numbers (when they publish "screen top edge"). You cannot compare "screen middle" numbers to SMTT lag measurements. "Screen middle" measurements are popular because that's the location of the crosshairs, and most people want to know the lag where they stare at the crosshairs. On the other hand, VSYNC OFF can bypass the lag differential between the top edge and centre of the screen -- due to the way VSYNC OFF does scanout interruptions (tearlines), you can have identical lag for screen centre and screen top in an infinite-framerate VSYNC OFF situation. 1000fps VSYNC OFF can get rather close, to within a 1ms band (+/- 0.5ms) of difference.
Also, as good a device as it is, the Leo Bodnar is somewhat of a black box. I don't know whether it measures lag to the beginning of GtG, the end of GtG, or the middle of GtG, so throw in a GtG error margin (e.g. +/- GtG) when comparing Leo Bodnar numbers to SMTT numbers. As a black box, we also don't know whether the Leo Bodnar measures before or after the Vertical Back Porch (the blanking-interval padding above the top edge of the screen), which can fudge the lag numbers a bit.
If you're a Custom Resolution Utility user: large blanking intervals will noticeably affect lag -- they can increase or decrease lag depending on whether you used larger numbers for Vertical Back Porch, Vertical Front Porch, or Vertical Sync. The important variable is the horizontal scan rate. If the horizontal scan rate is 135KHz, one pixel row takes 1/135,000th of a second, so you can calculate the scanout velocity, as well as the time-length of the blanking interval (and the elements of a blanking interval -- the sync and the porch paddings -- which scan out in this order: BackPorch-Active-FrontPorch-Sync).
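That calculation can be sketched as follows. The vertical timing numbers below are hypothetical Custom Resolution Utility values, purely for illustration:

```python
# Sketch: derive scanout timing from a horizontal scan rate plus vertical
# timing numbers. All values are illustrative, not from any real monitor.

hscan_hz = 135_000           # 135 kHz: one pixel row every 1/135,000 sec
row_ms = 1000.0 / hscan_hz   # time per scanline, in milliseconds

# Hypothetical vertical timing, in scanlines:
v_active, v_front_porch, v_sync, v_back_porch = 1080, 3, 5, 36
v_total = v_active + v_front_porch + v_sync + v_back_porch

active_ms = v_active * row_ms                # visible scanout time
blanking_ms = (v_total - v_active) * row_ms  # blanking interval length
refresh_hz = hscan_hz / v_total              # resulting refresh rate

print(f"scanline: {row_ms * 1000:.2f} us, active: {active_ms:.2f} ms, "
      f"blanking: {blanking_ms:.3f} ms, refresh: {refresh_hz:.1f} Hz")
```

Inflating the porch/sync numbers grows `v_total`, which lengthens the blanking interval and lowers the refresh rate for the same scan rate -- exactly the lever a Custom Resolution Utility gives you.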
As the website that did the world's first-ever variable-refresh-rate (GSYNC) input lag measurements (GSYNC Preview #2, a few years ago), we understand input-lag gotchas/variables far more than most display reviewers. We'll be publishing some articles about input lag measurement inaccuracies/methodologies during the coming months -- stay tuned.
The bottom line is that the TFTCentral and DisplayLag methodologies are both quite fine, but one needs an advanced understanding of why they (correctly) produce very different numbers, and both have a 'black box factor' error margin (usually +/- 1-2ms), so any quoted decimal digits should usually be ignored.
</ADVANCED LAG ERROR FACTORS>