Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Advanced display talk for display hackers, advanced game programmers, scientists, display researchers, display manufacturers, and vision researchers, plus Advanced Display Articles on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 7860
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Post by Chief Blur Buster » 10 May 2019, 13:01

For those who are insanely curious about why there are two different pixel response benchmarks...

GtG stands for Grey-To-Grey.
MPRT stands for Moving Picture Response Time.

GtG represents how long it takes for a pixel to change between two colors.
MPRT represents how long a pixel remains continuously visible.
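
As a rough illustration (my own sketch, not an official formula), MPRT can be approximated from the refresh rate and the backlight duty cycle, since it tracks how long each frame's pixels stay lit:

```python
# MPRT (persistence) approximates how long each frame's pixels stay visible.
# For a sample-and-hold display, persistence is the full frame time;
# strobed backlights shorten it to the strobe pulse width.

def mprt_ms(refresh_hz: float, duty_cycle: float = 1.0) -> float:
    """Approximate MPRT in milliseconds.

    duty_cycle: fraction of the frame the pixel is lit
    (1.0 = sample-and-hold, <1.0 = strobed backlight).
    """
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * duty_cycle

# Sample-and-hold 60 Hz: ~16.7 ms of persistence
print(round(mprt_ms(60), 1))         # → 16.7
# 240 Hz sample-and-hold: ~4.2 ms
print(round(mprt_ms(240), 1))        # → 4.2
# 120 Hz strobed at 25% duty: ~2.1 ms
print(round(mprt_ms(120, 0.25), 1))  # → 2.1
```

This is why raising the refresh rate (or strobing) reduces motion blur even when GtG is already near zero.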

For a bigger explanation, with many diagrams, see:

GtG versus MPRT: Frequently Asked Questions About Display Pixel Response

Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors


Re: Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Post by Chief Blur Buster » 22 Apr 2020, 02:20

The Complexity of Testing GtG

Crossposting relevant stuff here about TESTING
RLCScontender wrote:
21 Apr 2020, 04:49
Here's an example. They had the ViewSonic Elite XG270QG at 5.2ms response time and the LG 27GL850-B at 4.17ms response time. Yet almost EVERY review company (TFT, Prad.de, Snowman, etc) easily had the ViewSonic faster than the LG. Another example is the LG 32GK850-F: TFT had it at 8ms response time while Hardware Unboxed had the same panel at 4.3ms response time. Two different reviewers, with vastly DIFFERENT G2G averages
GtG measurements are pretty tough to nail down to a single trusted number.

Also different parts of the curve can be faster than others.

1. GtG curve SHAPE differences on GtG graphs.
The start of the curve might be faster than a competitor's, but the end of the curve might be slower, and only one of the two might be visible in-game for a specific game. Dark games can make some parts of a GtG curve more important, while brighter games emphasize different parts. The famous VA slowness in darks is a well-known example, but it also affects TN and IPS to lesser extents.

2. Colors measured. Different colors have different GtG.
For an 8-bit panel, there are over 60,000 different GtG transitions -- 256x256 color combinations minus the 256 no-change pairs: 65536-256 = 65280 different GtG numbers! But most reviewers only test a 5x5 grid, which is too small to get accurate averages on VA panels.

3. Temperature differences. Even 1 degree colder can make some colors 1ms slower on some VA panels.
Ever used a frozen LCD in a freezing car in midwinter? GtG measured in seconds. Even 1 degree differences are important.

4. Warm-up differences. Receive a monitor by FedEx in winter versus summer, and GtG tests will be different.
You need 24 hours of warm-up to room temperature, followed by at least 30-60 minutes of full power-on time in a temperature-controlled room (20C).

5. Panel lottery factor. You know those black nonuniformities that can't be explained away as backlight bleed?
That, too, can affect GtG by fractions of milliseconds.

6. Sensor location factor. You know the cold corner of the LCD, and the hot corner where the power supply is?
That, too, can affect GtG speeds, just by the position of your GtG measuring sensor.

7. Aging differences.
Yup, yup. I've seen different numbers after 400 hours of breaking-in.

So... See, it's a bigger rabbit hole than you thought, eh?
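
The transition-count arithmetic from point 2 can be sketched in a few lines (the 5x5 reviewer grid here is the typical example, not a standard):

```python
# Counting distinct GtG transitions on an 8-bit panel, versus the tiny
# fraction a small reviewer test grid (e.g. 5x5 levels) actually samples.

levels = 256
all_transitions = levels * levels - levels   # exclude the 256 no-change pairs
print(all_transitions)                       # → 65280

grid = 5                                     # typical reviewer grid of levels
grid_transitions = grid * grid - grid        # e.g. 0/64/128/192/255
print(grid_transitions)                      # → 20

# Percentage of the full transition space actually measured
print(round(100 * grid_transitions / all_transitions, 3))  # → 0.031
```

Even a 9x9 grid (72 transitions) samples barely a tenth of a percent of the full space, which is why grids can miss the slow dark-level hotspots on VA panels.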


TL;DR Review Quality GtG Lab Analysis Preparation
  • If freshly received from Amazon, wait 24 hours until the monitor equalizes to the temperature of your home.
  • Get 2 accurate lab thermometers and place them 2 meters apart, on the left/right sides of the monitor.
    Two is preferred so they verify each other, and to verify the lack of room hotspots
  • Make sure no computer is nearby (keep hot computers far away from the monitor -- preferably under the desk or beside it, diagonally away from the monitor so its hot rising air misses the screen). This prevents temperature interference from hot GPUs etc.
  • Thermostat your room until 20C reads on both thermometers sitting a meter away from both sides of the monitor.
    There are different temperature standards, but I use 20C as a standardized room temperature.
    Make sure your computer desk is far away from heat vents, e.g. no baseboard heater or air conditioner near the monitor
  • Power up the monitor for an hour or so minimum
  • Measure a consistent position (i.e. screen dead centre). Never, never, never, never, never randomize sensor position.
  • Measure a grid, the biggest you can. For VA panels, even a 5x5 grid vs a 9x9 grid makes a big difference because of the small size of the GtG slowness hotspot in the dim area of the color gamut.
  • Disclose your GtG measurement standard. For example, use the VESA GtG cutoff points of 10% and 90% accurately. Make sure your oscilloscope is sensitive enough not to be noisy at GtG10%. Many cheap oscilloscopes are, and that will massively change your GtG number.
  • Average the numbers for GtG averages.
  • You can color-code it as a GtG heatmap (like TFTCentral, ApertureGrille, etc).
This is the TL;DR simplified version. GtG measurement standard can get more complex than the above. Caveat emptor. HUGE rabbit hole. Harder to get reliable GtG averages than measuring contrast ratios.
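
As a minimal sketch of the 10%/90% cutoff step above -- assuming a photodiode trace captured as (time, luminance) samples; the trace below is synthetic, not a real measurement:

```python
import math

def gtg_10_90(samples, start, end):
    """Time between crossing 10% and 90% of the start-to-end luminance
    swing (the VESA-style cutoff points mentioned in the checklist)."""
    lo = start + 0.10 * (end - start)
    hi = start + 0.90 * (end - start)
    rising = end > start
    t_lo = t_hi = None
    for t, lum in samples:
        crossed_lo = lum >= lo if rising else lum <= lo
        crossed_hi = lum >= hi if rising else lum <= hi
        if t_lo is None and crossed_lo:
            t_lo = t
        if t_hi is None and crossed_hi:
            t_hi = t
    if t_lo is None or t_hi is None:
        return None  # transition never completed within the capture
    return t_hi - t_lo

# Synthetic photodiode trace: exponential 0-to-255 rise, one sample per ms
trace = [(t, 255 * (1 - math.exp(-t / 2))) for t in range(21)]
print(gtg_10_90(trace, start=0, end=255))  # → 4 (ms, at this sample spacing)
```

Note how the first and last 10% of the swing are excluded entirely, which is exactly why a noisy oscilloscope near the GtG10% threshold can massively shift the reported number.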

See? Monitor manufacturers aren't necessarily "lying". They just are following imperfect standards of GtG measurements which are sometimes inconsistent with each other. GtG is hard to measure accurately at these accuracy levels.


TL;DR: Do a full disclosure of the limitations of your GtG measurements. Anything less than an attempt at perfection will guarantee even-more-different results.

RLCScontender
Posts: 494
Joined: 24 Mar 2020, 14:14

Re: Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Post by RLCScontender » 22 Apr 2020, 20:12

Chief Blur Buster wrote:
22 Apr 2020, 02:20
The Complexity of Testing GtG
[…]
Thanks, TFT uses the 90-10 and I'll probably follow the same criteria when I test out the Alienware AW2521HF on my $500 oscilloscope.

Hey Chief, do you seriously believe an IPS monitor is really capable of a true 1ms response time with no ghosting? Apparently, there have been claims that the Alienware AW2521HF has "no ghosting" on its Extreme overdrive setting (which allegedly claims 1ms).

From my recollection, I did see slight ghosting above its standard overdrive, and it's not even remotely as fast as the MSI even on its Extreme setting. I do, however, feel sorry for people who wanted the AW 240Hz monitor, since it's backordered to JUNE. Yesterday, it was available on Dell with ONE-week delivery.

That means plenty of people ordered, and I'm kinda apprehensive about showing everyone these measurements on the monitor they purchased.


Re: Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Post by Chief Blur Buster » 23 Apr 2020, 12:47

RLCScontender wrote:
22 Apr 2020, 20:12
From my recollection, I did see slight ghosting above its standard overdrive, and it's not even remotely as fast as the MSI even on its Extreme setting. I do, however, feel sorry for people who wanted the AW 240Hz monitor, since it's backordered to JUNE. Yesterday, it was available on Dell with ONE-week delivery.

That means plenty of people ordered, and I'm kinda apprehensive about showing everyone these measurements on the monitor they purchased.
The complexity of GtG measurement means it's necessary to explain when you simplify.

For hobbyists who just want to do one GtG measurement and show a chart, you can say: "I warmed up my monitor for 1 hour with my house thermostat at 20 degrees C on a best-effort basis. I measured GtG from black to white. According to this chart, GtG is X ms (I have not yet measured the GtG of other colors). Here's the image:"

That's honest and fair; it acknowledges the limitations of the measurement yet provides handy, useful information.

Hobbyists who go a step further can use Microsoft Excel and record a 5x5, 7x7, or 9x9 grid. Oscilloscope all of them, save the images (under filenames like GTGgraph0to255.png, to keep track of source-destination colors), then write the GtG numbers into the grid and average it all.
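
A minimal sketch of that grid-averaging step, with hypothetical GtG values (not real measurements of any monitor):

```python
# Collapsing a measured GtG grid into the single headline "average GtG"
# number, the way review sites do. grid[i][j] = measured GtG in ms for the
# transition from level i to level j; the diagonal (no-change) is excluded.
# All values below are hypothetical, for illustration only.

grid = [
    [None, 6.1, 5.4, 5.0, 4.8],
    [7.2, None, 4.9, 4.5, 4.3],
    [6.8, 5.1, None, 4.1, 4.0],
    [6.5, 5.0, 4.2, None, 3.8],
    [6.3, 4.9, 4.1, 3.7, None],
]

values = [v for row in grid for v in row if v is not None]
average = sum(values) / len(values)
print(round(average, 2))
```

The same rows and columns drop straight into an Excel sheet, and conditional formatting on the cells gives you the TFTCentral-style heatmap for free.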

You can use www.testufo.com/flicker -- I've added a GtG photodiode oscilloscope testing mode to it.

The important thing is: Fair disclosure of your testing process. We know GtG numbers can never be perfect because of the cesspool of variables ("As Seen On Blur Busters!"). Cue the "Thank you for dropping a Pandora Box in this Rabbit Hole" music. ;)


Re: Understanding GtG versus MPRT: Two Pixel Response Benchmarks

Post by Chief Blur Buster » 04 Jun 2020, 02:11

We Need 100-Level Overdrive Adjustments

Blur Busters is frequently used as raw study material by new researchers, so I'm going to keep this thread up to date.

Different Human Reaction Time Responses To GtG Pixel Response
Those familiar with the Pixel Response FAQ: GtG versus MPRT, as well as LCD Overdrive Artifacts, will know that different humans have very different reaction-time behaviours with different levels of overdrive.
--> Some users prefer no overdrive at all (vision gets distracted by coronas, slowing them down)
--> Some users want super-excess blatant overdrive (BenQ AMA Premium) because it's like a tracer-bullet assist feature
--> Freezing homes (arctic) will slow pixel response, requiring slightly higher overdrive
--> Hot homes (tropics) will speed pixel response, requiring slightly less overdrive
--> Some users have a preference to faster pixel response with slight coronas

So different humans have different reaction-time responses to different GtG/overdrive settings. An excessively fast 0.5ms GtG may actually slow a player down because coronas distract them, but may speed up other players because they're trained to treat coronas as a "highlight marker" for movement.

The website PROSETTINGS.NET shows that about 50% of esports players are using BenQ monitors (famous for AMA with exaggerated overdrive), and many of them use the AMA feature as a "motion highlight assist", similar to shadow boost and other esports features.

For a long time, Blur Busters has been disappointed by fixed overdrive (calibrated at 20C), which is why Blur Busters advocates a 100-Level Overdrive Gain Slider (like Brightness/Contrast): it should never be locked, and should be a User Defined option in main monitor menus. The same goes for User Defined Overdrive Lookup Tables (since Blur Busters can generate better LUTs than many scaler/TCON vendors), because there are over 60,000 GtG numbers on an LCD surface.
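
To make the idea concrete, here is a hypothetical sketch of a gain-scaled overdrive lookup -- not any vendor's actual algorithm; the 0.5 boost factor and the coarse level grid are assumptions for illustration only:

```python
# Hypothetical illustration of a user-adjustable overdrive: the drive value
# sent to the panel overshoots the target level, scaled by a 0-100 gain
# slider indexed by the (previous level, target level) transition.

def overdrive_value(prev: int, target: int, gain: int) -> int:
    """gain: 0-100 slider; 0 = no overdrive, 100 = maximum overshoot."""
    overshoot = (target - prev) * (gain / 100.0) * 0.5  # 0.5 max boost (assumed)
    return max(0, min(255, round(target + overshoot)))  # clamp to 8-bit range

# A full user-defined LUT would store one drive value per transition pair;
# here a coarse grid stands in for the full 256x256 table:
lut = {(p, t): overdrive_value(p, t, gain=40)
       for p in range(0, 256, 64) for t in range(0, 256, 64)}

print(overdrive_value(0, 128, 0))    # → 128 (no overdrive)
print(overdrive_value(0, 128, 100))  # → 192 (overshoot past target)
```

The clamp at 0 and 255 is also why overdrive runs out of headroom near the extremes of the gamut: there is no "whiter than white" value left to drive toward, so those transitions stay slow regardless of gain.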

In the past, manufacturers didn't want to add extra overdrive settings to monitor menus because they confuse laypeople. However, it should at least be a "User Defined" setting hidden in the same area as RGB adjustments or ULMB Pulse Width or other advanced adjustments (among other needed features such as 60Hz single-strobe, for MAME arcade-machine enthusiasts). Monitor manufacturers sadly limit flexibility to keep things easier for some users (but hurt the market for others). Often it's just a one-line firmware change to exaggerate overdrive, or to re-add features that expand the market sideways (even features that don't push the refresh rate race upwards).

Anyway, we've noticed how overdrive is an unexpected "esports assist" and why it's very popular on BenQ monitors. People don't believe Blur Busters until researchers test these things out and grudgingly say "Blur Busters Is Right", years ahead of schedule: it appears people react very differently (lagged reactions & accelerated reactions) to pixel response behaviours such as overdrive.

Eventually, We Desire User-Defined Overdrive Lookup Tables

Eventually we desire an API for user-defined overdrive lookup tables, so Blur Busters can provide better-calibrated overdrive: viewtopic.php?f=7&t=6739
