Post
by Chief Blur Buster » 02 Jun 2014, 10:13
Both of you have a point, but I need to bring up an important heads-up: it depends on a lot of variables.
I used to work in the home theater industry, and I can see banding issues even in 24-bit color, non-zoomed, in slow gradients (e.g. blue sky, grey smoke, dark scenes) on ultra-high-contrast-ratio, wide-dynamic-range displays such as laser projection (perfect blacks and blinding whites). Banding in some parts of the 24-bit colorspace is much easier to see when the contrast ratio is 10x or 100x higher than that of TN panels.
Adding noise or temporal dithering eliminates this, but subtle computer-generated gradients (and subtle gradients in recordings from very low-noise camera sensors) easily show banding without any zooming. There is no difference between zooming into a small section of a high-contrast gradient and viewing a low-contrast gradient unzoomed.
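To illustrate the dithering point, here is a minimal sketch (my own Python illustration, not code from any particular driver or panel) that quantizes a smooth low-contrast ramp to 8-bit with and without half an LSB of noise added first. The noise replaces a handful of hard step edges with fine grain that the eye averages away:

[code]
import numpy as np

rng = np.random.default_rng(0)
width = 1920

# A smooth low-contrast ramp: only about 4 grey levels across the whole width.
smooth = np.linspace(100.0, 104.0, width)

# Plain quantization to 8-bit: a few hard steps, i.e. visible bands.
banded = np.round(smooth).astype(np.uint8)

# Same ramp with +/- 0.5 LSB of noise added before rounding: the average level
# still follows the ramp, but the hard steps are broken up into fine grain.
dithered = np.round(smooth + rng.uniform(-0.5, 0.5, width)).astype(np.uint8)

print((np.diff(banded.astype(int)) != 0).sum())    # only ~4 step edges (bands)
print((np.diff(dithered.astype(int)) != 0).sum())  # hundreds of tiny transitions (grain)
[/code]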
RealNC, zooming is not important - what matters is how the colors are spread out, e.g. left edge dark grey and right edge slightly darker grey. It could simply be a low-contrast, non-zoomed gradient such as smoke in the dark, rather than the zoomed-in dark section of a high-contrast gradient. The two cases are mathematically identical at the pixel level and create exactly the same visual problem on panels, and a 6-bit FRC panel cannot tell the two situations apart at all, so its dithered approximation of the 8-bit signal is identical, with exactly the same banding visibility at the panel level. Lots of low-contrast gradients exist in real life, such as blue sky and grey smoke, where the colors get spread really thin - enough to show banding.
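As a quick sanity check of the "mathematically identical" claim, here is a toy Python sketch (my own illustration; the width and grey levels are arbitrary) that builds a low-contrast full-width ramp and a cropped slice of a full-range ramp, and confirms they are byte-for-byte the same:

[code]
import numpy as np

width = 1920

# Case A: a low-contrast gradient covering only 8 grey levels (40..47)
# across the full screen width - e.g. dim smoke in a dark scene.
low_contrast = (40 + np.arange(width) // 240).astype(np.uint8)

# Case B: a full-range 0..255 gradient across a much wider virtual image,
# cropped ("zoomed") to the slice that covers those same 8 levels.
full_range = (np.arange(width * 32) // 240).astype(np.uint8)
zoom_start = 40 * 240                              # first pixel whose value is 40
zoomed_slice = full_range[zoom_start : zoom_start + width]

print(np.array_equal(low_contrast, zoomed_slice))  # True: identical bytes
[/code]

The panel (6-bit FRC or otherwise) receives the same bytes in both cases, so it cannot possibly render one with less banding than the other.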
8 bits per channel is not enough to pass a theoretical Holodeck Turing test ("Wow, I didn't know I was standing in a Holodeck, I thought I was in real life.") once you map it over a wide dynamic range instead of a modest bottom-end and top-end brightness. It is much harder to see the 8-bit limitations between adjacent colors at a 1000:1 contrast ratio than at a 100,000:1 contrast ratio. At 100x the contrast ratio, gradients become roughly 100x more visible during dark scenes, enough to bring 8-bit limitations in real-world material easily within human limits for some parts of the colorspace. The effect of contrast ratio on making gradient banding more visible is well known amongst videophile display engineers, and humans can actually distinguish over a billion colors (albeit not all simultaneously in the same scene) when including luminance, if you expand over the whole dynamic range from the dimmest light in a totally dark room through the brightest blinding whites of the midday sun. Today's displays do not have even one-thousandth of that dynamic range, but hundred-thousand-dollar displays exist, which I have seen, where 24-bit looks like a color-by-the-numbers cartoon.
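To put rough numbers on that (my own back-of-envelope model, assuming a plain gamma-2.2 display with a 100-nit peak white; real displays differ, but the trend is the same), compare the relative luminance jump between two adjacent dark 8-bit codes on a ~1000:1 panel versus a ~100,000:1 display:

[code]
# Back-of-envelope sketch: same 8-bit signal, same 100-nit peak white, but two
# different black levels (0.1 nit for ~1000:1 vs 0.001 nit for ~100,000:1).
# Assumes a simple gamma-2.2 mapping; real displays differ, but the trend holds.

def luminance(code, black, white=100.0, gamma=2.2):
    return black + (white - black) * (code / 255) ** gamma

for black in (0.1, 0.001):                 # ~1000:1 vs ~100,000:1
    lo, hi = luminance(2, black), luminance(3, black)
    step = (hi - lo) / lo                  # Weber fraction of one code step
    print(f"black = {black} nit: code 2 -> 3 is a {step:.0%} jump")

# Roughly a 3% jump on the 1000:1 panel versus roughly a 100% jump (close to a
# doubling of brightness) on the 100,000:1 display, against a ~1% just-noticeable
# difference for adjacent patches. The deeper the blacks, the more one code step
# sticks out.
[/code]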
An extra variable also exists: the momentary dynamic range the human eye is currently adapted to (a.k.a. dark adaptation). On any display, it is harder to tell the difference between two dim grey squares on a white background than on a black background. Human vision only has a momentary (instantaneous) effective dynamic range of closer to 100:1, so all very dark greys tend to round off to completely black while we are viewing a bright scene, even on a 1000:1 contrast ratio display. So humans can only simultaneously tell apart several thousand colors in the same view, for a given average scene brightness (and a given iris / adaptation state). Most tell-the-colors-apart tests show two colors side by side without compensating the surrounding view to maintain the average scene brightness and iris size. When such tests do compensate for a fixed dynamic range (preventing changes in iris size and maintaining brightness adaptation), suddenly humans can only tell apart several thousand colors rather than millions! The way science counts human-distinguishable colors needs to define the environmental variables (dynamic range), which is why some studies claim humans can only tell apart a few thousand colors while other studies claim billions of colors. Both kinds of studies are correct. (Surprised, eh?)
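For a rough sense of scale (assumed numbers: a ~1% Weber just-noticeable difference, the ~100:1 instantaneous range mentioned above, and a very rough 1,000,000,000:1 fully adapted range), you can count the luminance levels the eye can separate without re-adapting:

[code]
import math

jnd = 0.01                     # assumed ~1% just-noticeable luminance difference
instant_range = 100            # instantaneous range within one adaptation state
adapted_range = 1_000_000_000  # very rough order of the fully adapted range

steps_instant = math.log(instant_range) / math.log(1 + jnd)  # ~460 levels
steps_adapted = math.log(adapted_range) / math.log(1 + jnd)  # ~2000 levels
print(round(steps_instant), round(steps_adapted))

# A few hundred luminance levels times the distinguishable hue/saturation
# combinations at each level lands in the "thousands of colors" camp for a
# single adaptation state; let the eye re-adapt across the whole range and the
# count explodes, which is how both kinds of studies can be right at once.
[/code]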