forii wrote: ↑13 Apr 2020, 04:07
Now every monitor is "1ms", but after you test it, its not. Its all marketing
To make heads explode, make sure to refer to the Pixel Response FAQ: GtG versus MPRT.
A single panel can have over 60,000 different GtG numbers -- every pair of starting and ending gray levels is its own transition with its own speed -- and one has to choose cutoff thresholds (90%) and averaging methods (fastest versus slowest GtG color combos).
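Where "over 60,000" comes from is just arithmetic. A quick back-of-envelope sketch (Python, purely illustrative):

[code]
# Every (start, end) pair of gray levels is its own transition with its
# own speed -- so the count of distinct GtG numbers is the pair count.
levels = 256                        # 8-bit grayscale
print(levels * levels - levels)     # 65280, excluding no-change pairs
# 10-bit panels push the count past a million:
print(1024 * 1024 - 1024)           # 1047552
[/code]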
<Advanced>
Also, temperature. A 1 degree difference in temperature changes ALL of the 60,000 different GtG numbers. Ever forgotten an LCD in a car in the middle of winter (phone, watch, screen)? They respond really slowly below zero, with GtG speeds measured in seconds. Even a 1 or 2 degree difference can still shift GtG by a few milliseconds for the worst colors on some panels. So most reviewers and display manufacturers use the standard industry temperature of 20 degrees C (always warm up a panel, especially in a cold room in the middle of winter).
Also, measuring equipment noise. The measuring equipment, such as a photodiode-plus-oscilloscope rig, has noise margins that need to be dealt with (one reason the 90% cutoff threshold exists).
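Here's a minimal sketch of how a GtG number gets pulled out of a noisy photodiode trace -- my own illustration, not any lab's actual script. The helper name, the smoothing width, and the synthetic 3ms transition are all assumptions for demonstration:

[code]
import numpy as np

def gtg_time_ms(t_ms, trace, v_start, v_end, cutoff=0.90):
    # Hypothetical helper: estimate GtG time from a sampled photodiode
    # trace using a VESA-style cutoff (time from 10% to 90% of the swing).
    # Normalize so any transition (rising or falling) runs 0 -> 1:
    norm = (np.asarray(trace, dtype=float) - v_start) / (v_end - v_start)
    # Light smoothing first: raw photodiode+scope noise would otherwise
    # trip the thresholds early. That same noise floor is why timing to
    # 100% completion is impractical and why a cutoff exists at all.
    norm = np.convolve(norm, np.ones(9) / 9, mode="same")
    start_idx = np.argmax(norm >= 1.0 - cutoff)   # first sample past 10%
    end_idx = np.argmax(norm >= cutoff)           # first sample past 90%
    return t_ms[end_idx] - t_ms[start_idx]

# Synthetic stand-in for a scope capture: an exponential ~3ms transition
# plus Gaussian noise mimicking the photodiode/oscilloscope noise floor.
t = np.linspace(0.0, 20.0, 2000)                      # 20ms window
clean = 1.0 - np.exp(-t / 3.0)
noisy = clean + np.random.normal(0.0, 0.02, t.size)
print(gtg_time_ms(t, noisy, v_start=0.0, v_end=1.0))  # ~6.6ms (10%..90%)
[/code]

Try timing it to 100% completion instead, and the answer never converges -- the tail of the transition is buried in the noise.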
Also, marketing, yes. Marketing is one reason, but finding ways to communicate specifications is hard. It's like contrast ratio specs to an extent -- or color gamut specs -- which can vary depending on many variables (e.g. modes, VRR, strobe, brightness, panel lottery, temperature, etc).
Also, very slight GtG inconsistency along the panel surface. A temperature difference along the panel surface -- like a power supply at the corner of a monitor -- will make the panel respond faster (or overdrive differently) at that corner. Or panel lottery differences. Or slightly faster GtG nearer the driven edge of the panel (the LVDS ribbon-cable area that injects electricity into the panel matrix). Often these are tiny, on the order of 0.1ms, but I've seen GtG surface inconsistencies become human-visible (e.g. modulations in ghosting/coronas as TestUFO Ghosting scrolled from left to right, from a colder to a hotter part of the panel).
Also, best case vs worst case. The worst-case actual honest measured GtG numbers can be more than 10x slower than the best-case actual honest measured GtG numbers -- for the same panel -- for the same temperature -- for the same overdrive setting -- and you start to realize why compromises are being made. The numbers aren't lies; it's the omission of measurement criteria. That's why I like industry standards, such as "UL-measured 1ms GtG" or other industry standard measurements -- at least I know they were measured at an industry standard temperature, industry standard colors, and industry standard cutoff points. We do often leave detailed GtG measurements (heatmaps) to vendors.
Vary all of them, and it's possible to create a situation where the worst-case GtG (cold temp, worst color combo, overdrive off) is 1000x+ slower than the best-case GtG (hot temp, best color combo, tweaked overdrive). Ouch!
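To make the spread concrete, here's a toy GtG heatmap with completely made-up numbers (no real panel measured) -- just to show how one honest dataset yields very different quotable specs:

[code]
import numpy as np

# Toy GtG heatmap in milliseconds: rows are starting gray levels, columns
# are ending gray levels. Values are invented for illustration; real
# heatmaps are 256x256 (or bigger) and come from photodiode measurements.
gtg = np.array([
    [ 0.0,  2.1,  3.5,  4.0,  4.2],
    [ 8.9,  0.0,  1.8,  2.5,  3.0],
    [12.4,  7.7,  0.0,  1.2,  1.9],
    [14.0,  9.3,  6.1,  0.0,  1.0],
    [15.2, 11.8,  8.4,  5.5,  0.0],
])
off_diag = gtg[~np.eye(gtg.shape[0], dtype=bool)]  # drop no-change cells
print(f"best-case : {off_diag.min():.1f} ms")   # the number on the box
print(f"average   : {off_diag.mean():.1f} ms")  # what a reviewer may quote
print(f"worst-case: {off_diag.max():.1f} ms")   # 15x the best, same panel
[/code]

All three numbers come from the same honest matrix -- only the choice of which cells to quote changes.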
</Advanced>
TL;DR: GtG is hard to simplify without compromises.