Chief Blur Buster wrote:
> Yes, 10-bit and 12-bit are supersets of 8-bit. It's simply a much larger color palette. 10-bit has 4x the number of colors (per channel) of 8-bit, and 12-bit has 16x the number of colors of 8-bit.

Thanks for that explanation. So what happens when you feed a true 10-bit panel an 8-bit source? Nothing, and the display just shows it as-is? Or, in the case of something like an Xbox One, does the GPU dither the source up, creating FRC? Then again, GPU dithering may behave differently from display dithering.
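For the record, here is a minimal sketch of the two conversions being discussed. This is illustrative only, not any vendor's actual pipeline: bit replication is one common way to pad an 8-bit source up to a 10-bit panel without dithering, and FRC-style temporal dithering is how an 8-bit panel can approximate a 10-bit level by alternating between adjacent 8-bit levels over a short frame cycle.

```python
def expand_8_to_10(v8):
    """Bit replication: map 0..255 onto 0..1023 so that 0 -> 0 and
    255 -> 1023 exactly. No dithering involved; no flicker."""
    return (v8 << 2) | (v8 >> 6)

def temporal_dither_10_to_8(v10, frame):
    """FRC-style temporal dithering: approximate a 10-bit level on an
    8-bit panel by alternating two adjacent 8-bit levels over a
    4-frame cycle, so the time-average equals the 10-bit value."""
    base, frac = divmod(v10, 4)  # frac (0..3) is the lost 2 bits
    return min(base + (1 if frame % 4 < frac else 0), 255)
```

For example, the 10-bit level 514 averages to 514/4 = 128.5 over a 4-frame cycle (two frames at 129, two at 128), which is exactly the in-between shade an 8-bit panel cannot show statically — that temporal alternation is the "noise" FRC-sensitive viewers report.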
Chief Blur Buster wrote:
> I've never, ever heard of anybody ever having vision problems with inversion/FRC in an HDTV. There are many causes of eyestrain, and while I am no doctor, I would bet a whole house mortgage (at this point, with the huge number of clues you've given, including plasma) that your eye problems are unrelated to inversion/FRC. Thus, at this point, I feel the discussion is useless for a person who hasn't had problems with plasma.

Honestly, I am not looking for tips on reducing eye strain from this forum. I'm simply trying to understand what is going on here and how the tech is behaving.
Chief Blur Buster wrote:
> For the same resolution (1080p), the angular resolution of a 50" HDTV ten feet away via a sofa is lower than a 24" monitor from a computer chair two feet in front. And if you put a 50" HDTV (even the world's most flicker-free one) on a computer desk only 2 feet in front of your face, you will get far more eyestrain from the size than even a flickerier 24" monitor. Fixing one problem creates bigger problems.

LOL, I wasn't suggesting using a 50-inch TV 2 feet from my face. I was thinking a 50-inch TV at a distance of 8 ft or so might do a better job of "hiding" the flicker/noise of FRC or inversion than a monitor at 2 to 3 ft. But then again, it is a bigger display, so I don't know.
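One way to sanity-check the viewing-geometry question is to compute how much visual angle each setup spans and how many pixels land in each degree of that angle. A quick sketch, assuming both displays are 1080p 16:9 panels (the distances and sizes below are just the figures from the discussion, not measurements):

```python
import math

def pixels_per_degree(diagonal_in, distance_in, h_pixels=1920, aspect=(16, 9)):
    """Pixels per degree of visual angle along the horizontal axis,
    for a flat panel viewed on-axis."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)          # physical width
    angle_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / angle_deg

tv  = pixels_per_degree(50, 10 * 12)   # 50" HDTV viewed from 10 ft
mon = pixels_per_degree(24, 2 * 12)    # 24" monitor viewed from 2 ft
```

Under these assumptions the sofa setup packs roughly twice as many pixels into each degree of vision as the desk setup (each TV pixel subtends a smaller angle from 10 ft away), which is the geometric reason per-pixel artifacts like FRC and inversion noise are harder to resolve from the sofa.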