Gamma Corrected Pixel Response is a LIE

namcost
Posts: 21
Joined: 02 Dec 2021, 19:18

Gamma Corrected Pixel Response is a LIE

Post by namcost » 07 May 2022, 02:33

I wanted to start this thread about the huge controversy that is fake news. You can hate that term all you want; it accurately describes how absolutely wrong these new-age reviewers are with their "gamma corrected is the only proper way to test" bull.

However, gamma correction is a false term used to make them feel superior, and to discourage people from getting into testing their own monitors. Here is why...

Usually I test the simple set: 0% (black), 20% grey, 40% grey, 60% grey, 80% grey, and 100% (white). They claim this is "wrong". However, if I were to "gamma correct" that, all I have to do is give the RGB value. 20% grey is just "51-51-51", and if you look at Hardware Unboxed, 51 is part of their graph spec sheet. 40% grey is just 102, 60% grey is 153, 80% is 204, and 100% is 255. There is no "gamma correction"; that is just their way of lying to the consumers of their content.

HUB gives values of 0-26-51-77-102-128-153-179-204-230-255. Guess how that correlates to percentages? 0-10-20-30-40-50-60-70-80-90-100. This "gamma correction" is a flat-out lie. They never gamma corrected anything. All they are doing is giving you new numbers to confuse the consumers of their content.
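
For anyone following along, the arithmetic being described here is just a linear mapping of test percentages onto 8-bit code values. A small Python sketch (illustrative only, not anyone's actual test tooling) reproduces that list:

Code: Select all

# Illustrative sketch: mapping "percent grey" test levels onto 8-bit RGB
# code values. This mapping is linear in code values; whether those code
# values are linear in emitted light is exactly what this thread argues about.
def pct_to_rgb8(pct):
    # round half-up so 25.5 -> 26, 76.5 -> 77, etc.
    return int(pct * 255 / 100 + 0.5)

print([pct_to_rgb8(p) for p in range(0, 101, 10)])
# -> [0, 26, 51, 77, 102, 128, 153, 179, 204, 230, 255]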

TFT Central gives their bull excuses in their article, which is just laughable. 10-90 has nothing to do with perceived color. 10-90 of a measurement means you start 10% after the transition starts and end 10% before it actually finishes. Think of it like running a race: it takes you 10 minutes to run the race, but your 10-90 time would be 8 minutes, losing 10% at the beginning (1 minute) and 10% at the end (1 minute), for a total of 2 minutes taken away. Monitor reviewers confuse this idea and act like it has something to do with perceived color/shade/brightness. The issue here is their absolutely WRONG testing method. They shouldn't be testing 10-90 times; they should be testing 0-100, full start to stop. HUB and TFT Central both seem to agree that testing 3-97 is more appropriate to go along with "gamma correction", meaning you still lose 6% of the data, all because of "gamma correction", which is a laugh and a half. They keep dancing around the real issue. JUST TEST THE FULL 0-100. Then there won't be a need for gamma correction or any of the other extra bull they throw in.
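
To make the race analogy concrete, here is a minimal Python sketch (made-up data, arbitrary thresholds) of how a 10-90 time and a "full" settle time would each be read off the same sampled brightness trace:

Code: Select all

# Toy example only: a made-up exponential pixel transition, sampled at 0.1 ms.
import math

dt = 0.1                                  # ms per sample (assumed)
t = [i * dt for i in range(200)]
start, end, tau = 0.0, 1.0, 1.5           # normalized light output, 1.5 ms time constant
trace = [end + (start - end) * math.exp(-x / tau) for x in t]

def first_crossing(trace, t, level):
    # time of the first sample at or past 'level'
    for ti, v in zip(t, trace):
        if v >= level:
            return ti
    return None

swing = end - start
t10 = first_crossing(trace, t, start + 0.10 * swing)
t90 = first_crossing(trace, t, start + 0.90 * swing)

# "Full" time: first sample after which the trace stays within a small band
# of the final level. Some band is unavoidable in practice, because real
# traces approach the end value asymptotically and carry sensor noise --
# which is the measurement problem argued about later in this thread.
band = 0.02 * swing
t_full = next(ti for i, ti in enumerate(t)
              if all(abs(v - end) <= band for v in trace[i:]))

print(f"10-90 time: {t90 - t10:.2f} ms, 'full' settle time: {t_full:.2f} ms")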

The issue comes down to one simple fact: reviewers are given "free samples", usually to keep, as long as their review meets the criteria given by the manufacturer. Don't believe me? Find the old Nvidia Titan review (one of them) from Linus Tech Tips. He says "we got this GPU for free from Nvidia, and they gave us a PR spec sheet; we are only allowed to talk about what is on this PR sheet", meaning they cannot do any kind of real review. Monitor reviewers generally want to keep the 10-90 bull going so manufacturers can keep selling trash displays to consumers. God forbid they actually make better displays! HUB actually caught a little bit of heat from LG when they announced switching to 3-97 over 10-90. That's because LG absolutely wants their displays to look as good as possible. If you rate full 0-100 times, no one is going to buy display X over display Y, because display Y is actually superior. They are literally afraid of actual competition. It's kinda sad. ANYONE who claims reviewers aren't being "light-hearted" in reviews of free products is fooling themselves. I know people who review things (other products, not monitors) who absolutely are given guidelines for their reviews. IT'S COMMON PRACTICE. Any reviewer claiming otherwise is just trying to save face with their consumers (those who watch/read their content).

I, for one, am not on a leash. I don't get monitors for free, and when I do test, I always give absolutely accurate results. I honestly believe we should hold these monitor reviewers to higher standards. I don't care if 10-90 is the industry standard, and I don't care if they "meet halfway" and choose to do 3-97; full 0-100 times are all that matter. I myself was lazy a while back while going through health issues and ended up only testing 10-90 times for the ViewSonic XG2431. I should have done full 0-100 times; that's on me. My excuse isn't valid: if my health was a concern, I simply should have waited to do the testing.

Breaking that down into factual information, however: refresh rate and pixel response times go hand in hand. NO, not the 10-90 times, the full 0-100 transition time!

Let's say you have a display that is 60Hz. That means each refresh cycle lasts 16.66ms. The higher the refresh rate, the shorter each refresh cycle. Blur Busters himself has talked about this using the term "persistence" in terms of motion clarity. IF YOUR PIXEL RESPONSE IS SLOWER THAN YOUR REFRESH CYCLE, you will have motion blur. The limiting factor is always the slowest part of the chain.

You want an example? FINE. 240Hz monitors. Native persistence: 4.16ms. Meaning a PERFECT 240Hz display will have no pixel transition worse than 4.16ms. For ease of explaining, the slowest pixel change should be no slower than about 4ms. This would result in a perfect image at 240Hz. The problem is that most of these 240Hz displays are not capable of 4ms. The issue is exacerbated by reviewers using 10-90 or even 3-97 times, because they are flat-out misrepresenting a monitor's performance with excuses like "your human eye can't tell the difference." BULL. You absolutely can see motion blur caused by slow pixel response on fast displays.
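
For reference, the frame-time arithmetic behind this is just 1000 divided by the refresh rate; a quick sanity check:

Code: Select all

# frame time ("per-refresh persistence budget") for a few refresh rates
for hz in (60, 144, 240, 360):
    print(f"{hz} Hz -> {1000 / hz:.2f} ms per refresh")
# prints 16.67 ms for 60 Hz, 4.17 ms for 240 Hz, and so on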

A great example of that is when I did 10-90 times for the ViewSonic XG2431, posted right here on these very forums. Those 10-90 times for the XG2431 with the highest overdrive showed a rather nice 3.33ms as the slowest pixel response. HOWEVER, because it's 10-90, it is NOT taking into account the overshoot or undershoot from the pixel response changes, which can double or even triple the actual pixel response time. In fact, I might go back and do a proper 0-100 pixel response test to show proof, considering I am gearing up for the AW3423DW to hit my doorstep any day now.

We need to hold mainstream reviewers to higher standards. They literally get paid to lie to people, and it's flat-out wrong. The main issue is everyone who thinks they know "everything" because "the guy on YouTube who makes videos says so." That is not how it works. Just because someone makes a YouTube video doesn't mean they are 100% telling the truth. There are people out there making flat-earth videos; should we take everything they say as 100% fact because "they spent the time to make a video"? Let's start holding these monitor reviewers to higher standards. Boycott the living hell out of them. We need to move the industry forward. Lies like gamma correction, 3-97, and 10-90 times are only going to keep monitor makers from actually moving forward. How long has it taken to even get OLED into the gaming market? How long has it taken for monitor makers to improve pixel response times? We are in the SLOWEST period of growth in monitor technology we have ever seen. And that's a joke in itself.

User avatar
Discorz
VIP Member
Posts: 999
Joined: 06 Sep 2019, 02:39
Location: Europe, Croatia
Contact:

Re: Gamma Corrected Pixel Response is a LIE

Post by Discorz » 07 May 2022, 07:10

You must have misunderstood what gets gamma corrected. It's the curve. Applying a tolerance to a (non)linear (non-corrected) curve will put the crosspoints too high/low and therefore skip/cut a big chunk of the transition, because non-corrected curves do not have evenly spaced RGB values on the Y axis (voltage/counts).

Here's what I meant. Notice how unevenly spaced the RGB values are on the Y axis for the (non)linear curve.
[Attachment: Response Times Curve Conversion - (non)Linear to Gamma Corrected RGB.png]

Only after this conversion is done should we apply % tolerances; otherwise we get distorted data. Gamma correction with a 3% tolerance is much better, but it still has minor flaws due to the nature of percentages: as the transition size increases, the tolerance captures less of the transition, and as it decreases, it captures more of it. A fixed RGB tolerance solves this. A fixed RGB tolerance can even be used on a (non)linear curve; it just needs to be correctly positioned on the Y axis.
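
As a rough illustration of the conversion being described (assuming a pure 2.2 power law; real panels follow their own response curve), a Python sketch:

Code: Select all

# Minimal sketch: the photodiode sees linear light; converting each sample to
# its equivalent RGB code value before applying a percentage tolerance keeps
# the thresholds evenly spaced in RGB terms. Sample data is made up.
GAMMA = 2.2

def light_to_rgb(lum_fraction):
    # inverse of the display's gamma: fraction of full-white light -> 8-bit code
    return 255.0 * (max(lum_fraction, 0.0) ** (1.0 / GAMMA))

# made-up linear-light samples for an RGB 0 -> 255 transition
linear_trace = [0.0, 0.002, 0.01, 0.05, 0.15, 0.40, 0.75, 0.95, 1.0]
rgb_trace = [light_to_rgb(v) for v in linear_trace]

tol = 0.03                           # e.g. a HUB-style 3% tolerance
lo, hi = 255 * tol, 255 * (1 - tol)  # thresholds in gamma-corrected RGB units
print([round(v, 1) for v in rgb_trace], lo, hi)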

I recommend you watch Aperture Grille's video again, multiple times if needed, until it sinks in.
https://youtu.be/MbZUgKpzTA0 and this one https://youtu.be/YaLsFRZRQB0
You can also download all his charts on ApertureGrille.com to see how LCDs and overdrive work together. Amazing stuff.

I would agree with you in part about showing total response times only, without any tolerances. But I'd split this into "first response time" and "total response time", just to differentiate the target crosspoint from the over/undershoot. Even then, this still doesn't show how high it over/undershoots.

Ideally, none of these numbers is an accurate demonstration of what our eyes see, because a pixel response is actually a curve, not just a number. That is why we have Cumulative Absolute Deviation (CAD), which takes the response area (defined by the curve) into account. For this to work properly, responses must be gamma corrected.
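
Roughly, the idea behind a CAD-style number (this is a simplified sketch, not Aperture Grille's exact formula) is to accumulate the absolute difference between the measured curve and an ideal instant step, in gamma-corrected RGB units:

Code: Select all

# Rough sketch of a CAD-style metric. All data is made up for illustration.
dt = 0.1                                    # ms per sample (assumed)
ideal = [0] * 5 + [255] * 15                # ideal instant 0 -> 255 step
measured = [0, 0, 0, 0, 0, 40, 90, 140, 180, 210,
            235, 250, 265, 260, 256, 255, 255, 255, 255, 255]  # with overshoot

cad = sum(abs(m, ) if False else abs(m - i) for m, i in zip(measured, ideal)) * dt
print(f"cumulative absolute deviation ~ {cad:.1f} RGB*ms")
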
Compare UFOs | Do you use Blur Reduction? | Smooth Frog | Latency Split Test
Alienware AW2521H, Gigabyte M32Q, Asus VG279QM, Alienware AW2518HF, AOC C24G1, AOC G2790PX, Setup

namcost
Posts: 21
Joined: 02 Dec 2021, 19:18

Re: Gamma Corrected Pixel Response is a LIE

Post by namcost » 08 May 2022, 11:22

You must have misunderstood what gets gamma corrected. It's the curve.
I understand quite well. I literally have the gear to test monitors myself. They are making information up to feel important. The tools do not care about gamma. They read the raw light output and convert it to a voltage. I can set any level of brightness right now and you will 100% see a change in the voltage readings. Gamma correction is not required. It doesn't matter if the gap between two sets of data is uneven. As I stated, you see typical reviews (mostly older ones now) where people go with 0-20-40-60-80-100 in terms of brightness percentage, which corresponds to 0-51-102-153-204-255. IT DOES NOT MATTER if you gamma correct at all. Gamma correction is not going to change the brightness output! THEY MADE IT UP.
I recommend you watch Aperture Grille's video again, multiple times if needed, until it sinks in.
That's very ignorant of you. Presumptuous even. Just because someone makes a video doesn't make it 100% factual.

The AG video you linked stated exactly what I said: that 10-90 is wrong. Sure, he rambles through the whole video to explain his point, but he even shows you that gamma correction isn't needed for full 0-100 times. It's only there because of a non-existent issue, which is 10-90 times. Run tests properly by showing full 0-100 times, and gamma correction won't be needed. Plain and simple. So, as I stated, it's made-up bullcrap.

If you pulled the 10-90 trick in sales, you would get SUED for false advertising. So why do consumers accept it in reviews? Because they don't know any better. An ill-informed customer is a happy customer, because then they won't complain and demand more from a business. In the sense of "if a slave doesn't know they are a slave, they will never seek freedom": an ill-informed customer will never demand that a company do better, so that company can continue to "rip customers off". Which is what we've seen in the monitor market for so long. Hell, even televisions are generally ahead in terms of technological growth; that's just crazy to me. I get the difference: televisions are generally more about picture quality than speed/performance, whereas gaming monitors are more about speed/performance than color. Only in recent years have we seen color become a demand for gaming monitors. On top of that, reviewers are controlled by the people giving them the gear to review. I don't care how many times these reviewers claim "it's not real" and "this guy is crazy"; it's 100% real, because I have WORKED in the review industry in other product fields. They absolutely expect you to give them a favorable review, and you get to keep the product for your silence. I don't care if HUB posted their little disagreement with LG; in the end they came to a conclusion that only benefits LG, regardless of what HUB claims.

The second video says the same shit: that gamma correction is needed because of 10-90 times, the "VESA standard." He claims it's too hard to know exactly when a pixel response starts and when it stops. BULLCRAP. Firstly, there is no "noise" with my equipment, and my equipment is on the cheap end: $160 for the USB oscilloscope, $80 for the mounted photodiode, and $20 for an SM to BNC cable. As far as noise goes: non-existent. Not like the example he showed, where the readings were WAY off. That's an issue with his personal gear; that's what happens when you try to hack together your own kit without using quality parts. He then says that for gamma correction they give the transition times between the corrected thresholds, so instead of 0-255 it might show 26-230, and then mentions HUB's 3-97, which would be 8-247. BOTH ARE WRONG. 0-255 is where the transition is going. This idea of gamma correction is the dumbest thing I've ever seen, like trying to reinvent the wheel by making it square. Give the full 0-100 response time like they should, and gamma correction isn't required.

I really don't care for YouTubers all circle-jerking each other, agreeing on the same bullcrap premise. At the end of the day, 0-100 times are all that matter. In the actual sales world, if you lied about performance the way monitor reviewers do, you would end up getting sued. The only reason monitor brands get away with "1ms grey to grey" is because in one specific condition, with overdrive set a specific way, it does hit 1ms grey to grey. That doesn't mean the display is actually "good".
Ideally, none of these numbers is an accurate demonstration of what our eyes see, because a pixel response is actually a curve, not just a number. That is why we have Cumulative Absolute Deviation (CAD), which takes the response area (defined by the curve) into account. For this to work properly, responses must be gamma corrected.
No, that's just what these YouTubers are telling you. That doesn't make it true. Pixels don't stop changing because you present a new frame. If frame 1 takes 10 milliseconds to change but the refresh cycle is every 5ms, frame 1 and frame 2 will fight over the pixels. This is where you get motion blur. Ghosting is quite literally two frames blending together, causing a motion-blur effect. That is 100% visible, and yet you just tried to say users won't see a difference. LAUGHABLE. This idea that "full pixel response doesn't matter because our eyes can't see it" is an absolute lie. We can see it. You can even see color shift in some content when pixels are EXTREMELY slow relative to the refresh rate. It can get quite bad. Luckily, in modern times pixel response is generally only 2-3x worse than the refresh cycle in the worst case, so color shift doesn't really happen in content anymore. But we still get motion blur, which we do see, and which we all hate. Cumulative Deviation is another bullshit made-up metric that isn't required if they gave full 0-100 pixel response times. IT DOES NOT MATTER if there is overshoot or undershoot. The pixel stops changing when the light output stabilizes, full start to finish.

User avatar
Discorz
VIP Member
Posts: 999
Joined: 06 Sep 2019, 02:39
Location: Europe, Croatia
Contact:

Re: Gamma Corrected Pixel Response is a LIE

Post by Discorz » 08 May 2022, 14:39

I'm not sure what your point is here. We are here to try to change things for the better. More and more people are aware of false response-time advertising, the gaming community is growing, we demand more, and we might succeed in the future. It will take time, though.

Yes, if there is no % tolerance, then gamma correction is not needed. It was introduced under the assumption that a % tolerance would continue to be used. But a (non)linear curve still does not correspond to what our eyes see, and neither do total response times alone: they don't tell us whether the target over/undershoots and by how much. Which means we definitely need overshoot data, in either percentage or RGB, and extracting such data from a non-gamma-corrected curve is misleading. In my opinion we need something like this:
1. FIRST RESPONSE TIME (0% tolerance, first target crosspoint)
2. PEAK TIME (usually "overdrive update time", not necessarily refresh time)
3. OVER/UNDER/SHOOT (over/under/shoot size, in either RGB, or % but gamma corrected)
4. FULL RESPONSE TIME
And even these would still not be perfect, because a response is a curve, not a number (a rough sketch of these four numbers follows below).
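
For illustration, a minimal Python sketch of how those four numbers could be pulled from one made-up, gamma-corrected RGB trace (the data and the settle band are arbitrary choices, not a standard):

Code: Select all

# Hedged sketch: four numbers from a 0 -> 200 transition with overshoot.
dt = 0.1                    # ms per sample (assumed)
target = 200
trace = [0, 5, 30, 80, 140, 190, 215, 225, 218, 208, 203, 200, 200, 200, 200]

first_cross = next(i for i, v in enumerate(trace) if v >= target) * dt   # 1. first response time
peak_i = max(range(len(trace)), key=lambda i: trace[i])
peak_time = peak_i * dt                                                  # 2. peak time
overshoot_rgb = trace[peak_i] - target                                   # 3. overshoot size in RGB
band = 3                    # settle band in RGB steps (arbitrary choice)
full_time = next(i for i in range(len(trace))
                 if all(abs(v - target) <= band for v in trace[i:])) * dt  # 4. full response time

print(first_cross, peak_time, overshoot_rgb, full_time)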

So gamma correction is not a lie. Manufacturers giving us only a 1ms GtG average is the lie, by using a large 10% tolerance without overshoot or total response time data. Even if they did give us such data, gamma correction would still be needed for it not to be a lie.

Also, watch your language. I personally don't mind, but we need to stick to the forum rules.
Compare UFOs | Do you use Blur Reduction? | Smooth Frog | Latency Split Test
Alienware AW2521H, Gigabyte M32Q, Asus VG279QM, Alienware AW2518HF, AOC C24G1, AOC G2790PX, Setup

RestTarRr
Posts: 36
Joined: 08 Mar 2020, 07:08

Re: Gamma Corrected Pixel Response is a LIE

Post by RestTarRr » 16 May 2022, 10:12

namcost wrote:
07 May 2022, 02:33
It takes you 10 minutes to run the race, but your 10-90 time would be 8 minutes, losing 10% at the beginning (1 minute) and then 10% at the end (1 minute) for a total of 2 minutes taken away.
Why are you assuming that the pace is identical throughout the entire race? That's the entire point of the 10-90: it isn't identical, and you can't produce the other 20% out of thin air and claim that it's exactly 2 minutes, because the 10-90 can be 8 minutes, but it can also be 8:30, or 7:50, etc.

namcost
Posts: 21
Joined: 02 Dec 2021, 19:18

Re: Gamma Corrected Pixel Response is a LIE

Post by namcost » 16 May 2022, 12:28

RestTarRr wrote:
16 May 2022, 10:12
Why are you assuming that the pace is identical throughout the entire race? That's the entire point of the 10-90. It isn't identical and you can't produce the other 20% out of thin air and claim that it's exactly 2 minutes, because the 10-90 can be 8 minutes, it can also be 8:30, it can also be 7:50, etc.
I gave a basic example. It's not meant to be literal.

My point stands: 10-90 and 3-97 are bullshit metrics. The only metric that matters is the full start-to-stop time. You even proved that they are bullshit, because, as you stated, the missing time could be even longer than "2 minutes", which makes the 10-90/3-97 times even worse to use. Full start/stop is all that matters. Thanks for agreeing with me.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Gamma Corrected Pixel Response is a LIE

Post by Chief Blur Buster » 18 May 2022, 14:58

namcost wrote:
08 May 2022, 11:22
That's very ignorant of you. Presumptuous even. Just because someone makes a video doesn't make it 100% factual.
I need to warn you about forum rules around here.
(Even when criticizing my forum rules, if you dare reply/PM me. Move along.)

We don't like this intensity of blind, unscientific ranting without the nuanced acknowledgement of error margins that is currently missing from your rant.

VESA is the correct source of the 10%-to-90% standard.
But we have to understand WHY they had no choice but to choose this standard.
Do you know why they chose 10%-to-90%?
It's annoying, yes, but read on for the pick-your-poison, deal-with-the-devil, legitimate scientific reason.

Let's use more nuanced language -- and let's target our rants accurately.
You have many valid points that Blur Busters shares, but your rant is a little too anger-fueled and targets blame incorrectly.
Let's focus on science and physics, because Blur Busters is more science-minded.
namcost wrote:
08 May 2022, 11:22
If you were to pull the 10-90 in sales, you would get SUED for false advertisement. So why do consumers accept it for reviews? Because they don't know any better. An ill-informed customer is a happy customer.
I understand the marketing-misleading perspective (see Pixel Response FAQ: GtG Versus MPRT).

🡺 But why did VESA invent the 10%-to-90% standard!? 🢀

To figure out who to blame, we need to figure out the who and the why.

Technologically, there is a very good (full stop indisputable) historical scientific reason why 10%-90% was used.

Electronic equipment of all kinds worldwide has a noisefloor. Trying to measure GtG100% on common photosensor equipment less sensitive than the human eye creates random GtG numbers that look really bad, even with an expensive Tektronix oscilloscope. We see problems with a lot of legacy measuring equipment where the same color transition can randomize between GtG 0.5ms and GtG 10ms, because the noisefloor is so massively noisy (a blurry noisefloor boundary that's a few octaves thick).

Trying to accurately measure the GtG of RGB(0,0,0) transitioning to RGB(25,25,25) and then back to RGB(0,0,0) can be below the noise floor for a lot of GtG measurements. This is because the 10% RGB value, without gamma correction, emits only a percent or two of the light of a full-white pixel.

This is because most monitors are 2.2 gamma, and the gamma formulas are:

Code: Select all

encoded = ((original / 255) ^ (1 / gamma)) * 255

original = ((encoded / 255) ^ gamma) * 255
Divide RGB(255,255,255) by 10 and you might think you want to use RGB(25,25,25) as your GtG 10% cutoff threshold.
But see the problem yet? Run the gamma math.

Mathing it out in 24-bit colorspace with the gamma correction formula:
Thus, RGB(25,25,25) is only ~1.5% of the brightness (in photons) of an RGB(255,255,255) full-white pixel.
Thus, RGB(89,89,89) is ~10% of the brightness (in photons) of an RGB(255,255,255) full-white pixel.
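
Here is the same math in runnable form, using the formulas quoted above and a pure 2.2 power law (exact percentages shift a little under the sRGB curve or a real panel's response):

Code: Select all

# Sketch of the threshold math, using the post's gamma formulas (gamma 2.2).
GAMMA = 2.2

def decode(encoded):      # 8-bit code value -> fraction of full-white light
    return (encoded / 255.0) ** GAMMA

def encode(fraction):     # fraction of full-white light -> 8-bit code value
    return 255.0 * fraction ** (1.0 / GAMMA)

print(decode(26))    # the naive "10%" code value emits far less than 10% of the light
print(encode(0.10))  # the code value that actually emits 10% of the light: roughly 89, as above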

Let's look at the original source. The horse's mouth. The original blame: VESA.
The thresholds are supposed to be lumen-based rather than pixel-value-based.
Yes, the VESA GtG test standard has much to criticize, but...

Holy ****! Only 1.5 percent of the light of a white pixel!?

That's far below the noisefloor of many oscilloscope+photodiode combos if you wanted to measure 0% vs 1.5%.

That's what some GtG measuring equipment screws up: it actually measures down to GtG 1.5% rather than GtG 10% (from a lumens point of view). Some measuring equipment (professional and amateur) forgets to gamma-correct the GtG measurement, and that defective GtG measuring equipment was sold by measuring-equipment vendors to manufacturers, some of whom didn't know better. (Yes, sometimes Marketing is overeager, but remember, they're reading numbers off the meter they purchased for $5000 from the industrial GtG vendor at DisplayWeek, or whatever.) Things are better now, but many indies still forget to gamma-correct their GtGs.

If you're measuring GtG from black to white, the curve starts moving really fast at the beginning, so there are fewer errors (GtG 1.5% and GtG 10% timings can become nearly identical because the trace is shooting vertically out of the fuzzy noise floor), so the GtG-measuring-equipment miscalibration mistake isn't always noticed.

But when you're creating HardwareUnboxed-style GtG heatmaps, you're measuring a lot of GtG combos that are neither full black nor full white.

[GtG heatmap example image]

Often these heatmaps are measured in RGB space rather than lumens space, so you start veering into the error margins of the noise floor for all those 32s and 64s, and yes, sometimes even 96s. The way GtG values are now averaged from a heatmap instead of a single combo can also cause GtG numbers to distort massively, even with 1C or 2C temperature changes, which dramatically change some corners of the GtG heatmap (colder LCDs have slower GtGs and start wreaking havoc with GtGs below the noisefloor of the measuring equipment) -- especially if you're not gamma-correcting your GtGs up out of the noisefloor problem.

It's a complicated measurement issue that probably needs to be re-standardized, with new minimum accuracy requirements made mandatory for modern measuring equipment.

Now...

It can go both ways -- the reviewer side or the manufacturer side can have the wrong GtG thresholds. Sure, yes, some manufacturers cheat, but some are just parroting the exact number off the commercial measuring equipment they used -- and if that equipment was correctly purchased and calibrated to VESA specifications, marketing is at least parroting a number that matches "VESA spec" more closely than certain reviewers are, who end up amplifying the "badness of real-world GtG numbers" via an incorrect threshold.

Remember that on a 100-nit monitor (old panels with dimmer backlights, or a Photoshop-style 120-nit calibration), 1.5% means roughly 1.5 nits. Even a $400 Thorlabs photodiode isn't as good as the human eye for some specific use cases. You can reduce the sample rate to reduce the noisefloor, but with fast GtG and dark shades it's still hard to get accurate GtG curves that match the artifacts the human eye saw.

Measuring equipment overlooks lots of things human vision can see (e.g. spatial FRC or inversion noise in the darks of an LCD, which is unseen by a single-pixel sensor such as a photodiode). A good photodiode is just the de facto equivalent of a high-framerate single-pixel camera running at a really high Hz (the sample rate of the oscilloscope). But that's still not human eyes either, which see a composite of all kinds of artifacts (GtG, MPRT, spatial+temporal combos, etc.). So inventing a new measurement standard is a Pandora's box, and so is continuing to use an old standard in the era of much faster pixel response, which now creates more noisefloor problems: a higher sample rate is needed to accurately measure the GtG behaviour of a 1ms-GtG monitor than a 33ms-GtG monitor, and more measurements are now done (e.g. GtG heatmapping rather than a single GtG transition). Increasing the sample rate proportionally has major implications for the noisefloor, everything else unchanged.

Now consider calibration: whether you measure at the monitor's full brightness or at a calibrated SDR brightness.

Especially when a reviewer pre-calibrates color first to classical Photoshop SDR standards (120 nits): now you're measuring GtG from less than 2 nits, which means you get random GtG numbers that jitter a lot because of the noisefloor. Even mundane things such as a 5-degree temperature difference (cold 17C rooms in winter, warm 22C rooms in summer) can severely affect the GtG 1.5% number, while GtG 10% is much steadier across all temperatures.

At 1.5% of the lumens of a white pixel, lots of error margins come into play (from laboratory fluorescent light flicker leaking into the measuring cage unnoticed within the noise floor, to power supply noise being injected into long photodiode wires and making the noisefloor blurrier, etc.).

Loading up the decades-old VESA doc, it seems that VESA specifies the 10% threshold from the photons point of view, not the RGB point of view. But many measurement-equipment vendors make the mistake of not gamma-correcting GtG.

Sure, you can get better measuring equipment. But VESA is the worldwide standards organization, and a small outfit in Taiwan may be forced by its bosses to use old oscilloscopes and crappy photodiodes, where the only way to get accurate GtG measurements is GtG 10%-90%.

It is a two-way street: (reviewer with a bad GtG threshold unbeknownst to them, manufacturer with a correct GtG threshold) versus (reviewer with a good GtG threshold, manufacturer with an incorrect GtG threshold unbeknownst to them).

Or maybe the manufacturer did it correctly and the reviewer did it incorrectly (increasing the "exaggeration chasm" where the manufacturer's lower GtG number is more honest if you're just going "by the VESA book").

None of this dismisses the fact that GtG 10% is human-visible, as human eyes are more sensitive than 90% of the junk measuring equipment lying around out there. I agree with you there. But we need to understand why the "VESA standard" exists, understand what the hell is happening, and not assign blind blame.
namcost wrote:
07 May 2022, 02:33
JUST TEST FULL 0-100
Unfortunately, it isn't as simple as that.

Modify an Arduino tester to record a photodiode trace for the darkest non-gamma-corrected colors of your GtG heatmap, like 0-20 and 20-0. Notice how noisy it is. Now try testing again at 15C, 18C, 20C, 23C and 26C. The GtG numbers can vary by 2x-10x for certain squares of the GtG heatmap, merely from temperature and panel lottery, if you're testing transitions between two very dark colors -- especially with the bigger GtG heatmaps that some sites now use.

Sure, it's not too much of a problem when measuring a full white-to-black or black-to-white transition. But as you already know, an LCD has 65,280 different pixel response speeds in 24-bit colorspace (256*255, i.e. 65,536 minus the 256 non-changes), and that's why many reviewers have begun to measure GtG heatmaps, which is a great step. GtG is not a single number.

Future standards? There's room for accurate equipment to measure down to GtG2%, and thereabouts. We can also define criteria for a GtG100% measurement, since we don't care about GtG 0.25% vs GtG 0%, as long as the noisefloor can be pushed below the threshold -- and that is not easy, especially if you pre-calibrate color before testing lag or GtGs.

And this is relevant to lag tests too -- RTINGS uses GtG2% as their stopwatch-stop for their input-lag tests -- this is roughly the beginning of human-visible change anyway (more or less), although I prefer the GtG10% or GtG50% cutoff for latency stopwatching.

I have a legit beef about reviewer websites: They don't always publish exact test methodology (e.g. latency test stopwatch start/stop triggers).

It's fine to be someone who just built a photodiode tester and then discovers discrepancies with reviewers. But I bet you didn't know that TOP < CENTER < BOTTOM latency can invert to TOP > CENTER > BOTTOM latency when monitor OSD settings are changed? And sometimes even TOP < CENTER > BOTTOM, as well as TOP > CENTER < BOTTOM, when measuring the lag of the top edge, center, and bottom edge. See, monitor settings influence it. Winky wink. It's a big rabbit hole.

I hope you learned something new today.
I also hate this mess too.
But at least I understand why.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!
