Gamma Corrected Pixel Response is a LIE
Posted: 07 May 2022, 02:33
I wanted to start this thread about the huge controversy that is fake news. You can hate that term all you want. It accurately describes how absolutely wrong these new age reviewers are with their "gamma corrected is the only proper way to test" bull.
Gamma correction is a false term used to make them feel superior, and to keep people from getting into testing their own monitors. Here is why....
Usually, I test the simple 0% aka black, 20% grey, 40% grey, 60% grey, 80% grey, and 100% aka white. They claim this is "wrong". However, if I were to "gamma correct" that, all I have to do is give the RGB value. 20% grey is just "51-51-51", and if you look at Hardware Unboxed, 51 is part of their graph spec sheet. 40% grey is just 102, 60% grey is 153, 80% is 204, and 100% is 255. There is no "gamma correction"; that is just their way of lying to the consumers of their content.
HUB gives values of 0-26-51-77-102-128-153-179-204-230-255.... guess how that correlates to percentages? 0-10-20-30-40-50-60-70-80-90-100. This "gamma correction" is a flat-out lie. They never gamma corrected anything. All they are doing is giving you new numbers to confuse the consumers of their content.
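For reference, the mapping I'm describing is just linear scaling of a percentage into an 8-bit value. Here's a quick Python sketch showing it (the helper name is my own, and I'm assuming half-up rounding):

```python
# Map a grey percentage (0-100) to an 8-bit value by plain linear scaling.
def pct_to_rgb(pct):
    # half-up rounding so 30% -> 77 and 50% -> 128
    return int(pct * 255 / 100 + 0.5)

steps = [pct_to_rgb(p) for p in range(0, 101, 10)]
print(steps)  # [0, 26, 51, 77, 102, 128, 153, 179, 204, 230, 255]
```

Run that and you get the exact 0-26-51-77-...-255 list from their graphs.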
TFT Central gives their bull excuses in their article, which is just laughable. 10-90 has nothing to do with perceived color. 10-90 of a measurement means you start 10% after the transition starts and end 10% before it actually finishes. Think of it like running a race. It takes you 10 minutes to run the race, but your 10-90 time would be 8 minutes: you lose 10% at the beginning (1 minute) and 10% at the end (1 minute), for a total of 2 minutes taken away. Monitor reviewers confuse this idea and think it has something to do with perceived color/shade/brightness. The issue here is their absolutely WRONG testing method. They shouldn't be testing 10-90 times. They should be testing 0-100, aka full start to stop. HUB and TFT Central both seem to agree that testing 3-97 is more appropriate to go along with "gamma correction", meaning you still lose 6% of the data, all because of "gamma correction", which is just a laugh and a half. They keep dancing around the real issue. JUST TEST FULL 0-100. Then there won't be a need for gamma correction or any other extra bull they throw in.
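To make the race analogy concrete in code, here's a toy Python sketch. It simulates an idealized exponential pixel transition (a made-up curve, not real panel data — real transitions are messier) and measures how much time different threshold windows actually capture:

```python
import math

def threshold_time(lo, hi, tau_ms=1.5, step_ms=0.001):
    """Time spent between crossing lo and hi fractions of the full swing."""
    t, t_lo = 0.0, None
    while True:
        level = 1.0 - math.exp(-t / tau_ms)  # idealized 0 -> 1 rise
        if t_lo is None and level >= lo:
            t_lo = t
        if level >= hi:
            return t - t_lo
        t += step_ms

print(round(threshold_time(0.10, 0.90), 2))  # the "10-90" window, ms
print(round(threshold_time(0.03, 0.97), 2))  # the "3-97" window, ms
```

Even on this toy curve, the 3-97 window comes out noticeably longer than 10-90, which is the whole point: the tighter you cut the thresholds, the more of the real transition gets thrown away.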
The issue comes down to one simple fact. Reviewers are given "free samples", usually to keep, as long as their review meets the criteria given by the manufacturer. Don't believe me? Find the old Nvidia Titan review (one of them) from Linus Tech Tips. He says "we got this GPU for free from Nvidia, and they gave us a PR spec sheet. We are only allowed to talk about what is on this PR sheet", meaning they cannot do any kind of real review. Monitor reviewers generally want to keep the 10-90 bull going so manufacturers can keep selling trash displays to consumers. God forbid they actually make better displays! HUB actually got into a little bit of heat with LG when they announced switching to 3-97 over 10-90. That's because LG absolutely wants their displays to look as good as possible. If you rate full 0-100 times, no one is going to buy display X over display Y, because display Y is actually superior. They are literally afraid of actual competition. It's kinda sad. ANYONE who claims reviewers aren't being "light hearted" in reviews of free products is fooling themselves. I know people who review things (other products, not monitors) who absolutely are given guidelines for their reviews. IT'S COMMON PRACTICE. Any reviewer claiming otherwise is just trying to save face with their consumers (those who watch/read their content).
I, for one, am not on a leash. I don't get monitors for free, and when I do test, I always give absolutely accurate results. I honestly believe we should hold these monitor reviewers to higher standards. I don't care if 10-90 is the industry standard, and I don't care if they "meet halfway" and choose to do 3-97. Full 0-100 times are all that matters.... I myself was lazy quite a while back as I was going through health issues, and ended up only testing 10-90 times for the ViewSonic XG2431.... I should have done full 0-100 times.... that's on me. My excuse isn't valid; if my health was a concern, I simply should have waited to do the testing....
Breaking this down into factual information, however: refresh rate and pixel response times go hand in hand. NO, not the 10-90 times, the full 0-100 transition time!
Let's say you have a display that is 60Hz. That means the screen refreshes every 16.66ms. The higher the refresh rate, the shorter each refresh cycle. Blur Busters himself has talked about this using the term "persistence" in terms of motion clarity. IF YOUR PIXEL RESPONSE IS SLOWER THAN YOUR REFRESH CYCLE, you will have motion blur. The limiting factor is always the slowest unit measured.
You want an example? FINE. 240Hz monitors. Native persistence: 4.16ms. Meaning a PERFECT 240Hz display will have no pixel transition worse than 4.16ms. So, for ease of explaining, the slowest pixel change should be no slower than 4ms. This would result in a perfect image at 240Hz. The problem is that most of these displays at 240Hz are not capable of 4ms. The issue is exacerbated by reviewers using 10-90 or even 3-97 times, because they are flat out lying about a monitor's performance using excuses like "your human eye can't tell the difference". BULL. You absolutely can see motion blur caused by slow pixel response on fast displays.
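The per-refresh budget is just 1000 divided by the refresh rate. A quick sketch of that math (the helper names are my own):

```python
# Per-refresh time budget in milliseconds for common refresh rates.
def frame_time_ms(hz):
    return 1000.0 / hz

for hz in (60, 144, 240, 360):
    print(hz, "Hz ->", round(frame_time_ms(hz), 2), "ms per refresh")

# A response slower than the refresh interval means the pixel is still
# transitioning when the next frame arrives -> visible smearing on top
# of normal sample-and-hold persistence.
def fits_in_refresh(response_ms, hz):
    return response_ms <= frame_time_ms(hz)

print(fits_in_refresh(4.0, 240))   # True: 4.0ms fits inside 4.17ms
print(fits_in_refresh(3.33, 360))  # False: 3.33ms > 2.78ms
```

Notice how a response time that looks fine at 240Hz already blows the budget at 360Hz.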
A great example of that is when I did 10-90 times for the ViewSonic XG2431, posted right here on these very forums. Those 10-90 times I have for the XG2431 with the highest overdrive showed a rather nice 3.33ms as the slowest pixel response. HOWEVER, because it's 10-90, it is NOT taking into account the overshoot or undershoot from pixel response changes, which can double or even triple the actual pixel response time. In fact, I just might go back and do a proper 0-100 pixel response test to show said proof.... considering I am ramping up, getting ready for the AW3423DW to hit my doorstep any day now.
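To illustrate how overshoot can hide behind a 10-90 number, here's a toy Python simulation. The curve is an idealized textbook underdamped step response, with parameter values I made up for illustration — it is NOT measured XG2431 data:

```python
import math

def overdriven_level(t_ms, wn=4.0, zeta=0.3):
    """Idealized overdriven pixel: fast rise, then overshoot and ringing."""
    wd = wn * math.sqrt(1 - zeta ** 2)
    decay = math.exp(-zeta * wn * t_ms)
    return 1.0 - decay * (math.cos(wd * t_ms)
                          + zeta / math.sqrt(1 - zeta ** 2) * math.sin(wd * t_ms))

def time_10_90(step=0.001):
    """Classic 10-90 time: stops counting at the 90% crossing."""
    t, t10 = 0.0, None
    while True:
        level = overdriven_level(t)
        if t10 is None and level >= 0.10:
            t10 = t
        if level >= 0.90:
            return t - t10
        t += step

def settling_time(tol=0.02, t_max_ms=10.0, step=0.001):
    """Last moment the level is outside +/- tol of the target (full settle)."""
    t, last_outside = 0.0, 0.0
    while t <= t_max_ms:
        if abs(overdriven_level(t) - 1.0) > tol:
            last_outside = t
        t += step
    return last_outside

print(round(time_10_90(), 2), "ms 10-90")
print(round(settling_time(), 2), "ms to actually settle within 2%")
```

On this toy curve the 10-90 number looks great while the pixel keeps ringing above and below its target for several times longer. That tail is exactly what a threshold measurement sweeps under the rug.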
We need to hold mainstream reviewers to higher standards. They literally get paid to lie to people, and it's flat out wrong. The main issue is everyone who thinks they know "everything" because "the guy on YouTube who makes videos says so." That is not how society works. Just because they make a YouTube video doesn't mean they are 100% telling the truth. There are people out there making flat earth videos; should we take everything they say as 100% fact because "they spent the time to make a video"?!?!? Let's start holding these monitor reviewers to higher standards. Boycott the living hell out of them. We need to move the industry forward. Lies like gamma correction, 3-97, and 10-90 times are only going to keep monitor makers from actually moving forward. How long has it taken to even get OLED into the gaming market? How long has it taken for monitor makers to improve pixel response times? We are in the SLOWEST growth of monitor technology we have ever seen. And that's a joke in itself.