Banding in 8-bit per channel color

User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why do we need refresh rates at all?

Post by RealNC » 02 Jun 2014, 05:54

flood wrote: I'm just saying that it is possible to see banding on an 8-bit display when you have an 8-bit image of a gradient. Obviously the effect would be milder than on a 6-bit display, but that doesn't mean that 8-bit images on a good 8-bit display are immune from banding. To use your terminology: 8-bit images themselves often intrinsically have visible banding.
We both agree on that :) I was only talking about banding added by the display, which can happen on 6-bit panels, even with dithering.
flood wrote: I'm not sure what you meant by how zooming in is wrong...
I meant that by zooming, you get to see the inherent banding of the source, not the banding produced by the inaccuracy of the monitor's panel.
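
An illustrative sketch of that distinction (arbitrary example values, NumPy assumed): quantizing a smooth ramp to 8 bits already bakes the steps into the source, and zooming only widens those existing steps rather than adding new ones.

# Minimal sketch: banding that is intrinsic to an 8-bit source image.
# Assumes NumPy; the width is an arbitrary example.
import numpy as np

width = 3840
ideal = np.linspace(0.0, 1.0, width)                 # continuous intensity ramp, 0..1
source = np.round(ideal * 255).astype(np.uint8)      # the 8-bit gradient image itself

levels = np.unique(source)
print(len(levels), "distinct code values in the source")      # at most 256
print("each band is about", width // len(levels), "pixels wide")

# "Zooming in" (nearest-neighbour) repeats pixels: it adds no new levels,
# it only makes the steps that were already in the source easier to see.
zoomed = np.repeat(source[: width // 8], 8)
print(len(np.unique(zoomed)), "distinct code values after zooming")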

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Banding in 8-bit per channel color

Post by Chief Blur Buster » 02 Jun 2014, 10:13

Both of you have a point, but I need to bring up an important heads-up: it depends on a lot of variables.

I used to work in the home theater industry, and I can see banding issues even in 24-bit color, non-zoomed, in slow gradients (e.g. blue sky, grey smoke, dark scenes) on ultra-high-contrast-ratio, wide-dynamic-range displays such as laser projection (perfect blacks and blinding whites). Banding in some parts of the 24-bit colorspace is much easier to see when the contrast ratio is 10x or 100x higher than that of TN panels.

Adding noise and temporal dithering eliminates this, but subtle computer-generated gradients (and subtle gradients in very low-camera-sensor-noise recordings) easily show banding without zooming. There is no difference between zooming into a small section of a high-contrast gradient and viewing a low-contrast gradient without zooming.
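
An illustrative sketch of the noise/dithering point (made-up gradient values, NumPy assumed): roughly one code value of noise before quantization turns the hard band edge into fine grain that averages back to the smooth ramp.

# Sketch: noise added before quantization trades a hard band edge for fine grain.
# Assumes NumPy; the gradient values are made up (a smoke-in-the-dark style ramp).
import numpy as np

rng = np.random.default_rng(0)
width = 4000
ideal = np.linspace(20.2, 21.4, width)               # spans barely two 8-bit codes

hard = np.round(ideal)                               # plain quantization: one visible step
noisy = np.round(ideal + rng.uniform(-0.5, 0.5, width))   # ~1 LSB of noise first

print(np.unique(hard))    # just 20 and 21: a single hard band edge
print(np.unique(noisy))   # still only whole codes, but mixed along the ramp
# Averaging small neighbourhoods of the noisy version recovers a gradual climb:
print(noisy[:2000].reshape(-1, 100).mean(axis=1))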

RealNC, zooming is not what matters; what matters is how the colors are spread out, e.g. dark grey at the left edge and slightly darker grey at the right edge. That could simply be a low-contrast, non-zoomed gradient such as smoke in the dark, rather than a zoomed-in view of the dark section of a high-contrast gradient. The two are mathematically identical at the pixel level, and they create exactly the same visual problem on panels. A 6-bit FRC panel cannot tell these two situations apart at all, so its conversion to simulated 8-bit is identical, with exactly the same banding visibility at the panel level. Lots of low-contrast gradients exist in real life, such as blue sky and grey smoke, where the colors are spread thinly enough to show banding.
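
An illustrative sketch of the "mathematically identical" point (assumed example numbers): spreading codes 10..25 across the whole screen, or zooming into the 10..25 region of a full-range gradient, delivers the same code values to the panel either way.

# Sketch with assumed numbers: a low-contrast full-screen gradient vs. a zoomed-in
# dark slice of a high-contrast gradient hit the panel with the same code values.
import numpy as np

width = 1920

# Case A: smoke-in-the-dark style gradient, codes 10..25 spread across the screen.
low_contrast = np.round(np.linspace(10, 25, width)).astype(np.uint8)

# Case B: a full-range 0..255 gradient rendered 16x wider than the screen,
# of which only the dark 10..25 portion is on screen ("zoomed in").
huge = np.round(np.linspace(0, 255, width * 16)).astype(np.uint8)
zoom_window = huge[(huge >= 10) & (huge <= 25)][:width]

print(np.unique(low_contrast))    # 10 .. 25
print(np.unique(zoom_window))     # 10 .. 25: same codes, same band count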

8 bits per channel is not enough to pass a theoretical Holodeck Turing test ("Wow, I didn't know I was standing in a Holodeck, I thought I was in real life.") once you map it over a wide dynamic range instead of between a modest bottom-end and top-end brightness. It is harder to see 8-bit limitations between adjacent colors at a 1000:1 contrast ratio than at a 100,000:1 contrast ratio. At 100x the contrast ratio, gradient steps become up to 100x more visible during dark scenes, enough to bring 8-bit limitations in real-world material easily within human visual limits for some parts of the colorspace. The effect of contrast ratio on the visibility of gradient banding is well known amongst videophile display engineers, and humans can actually distinguish over a billion colors (albeit not all simultaneously in the same scene) when luminance is included, if you expand over the whole dynamic range from the dimmest light in a totally dark room through the brightest blinding whites of the midday sun. Today's displays do not have even one-thousandth of that dynamic range, but hundred-thousand-dollar displays exist, which I have seen, on which 24-bit color looks like a color-by-numbers cartoon.
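
A back-of-the-envelope sketch of the contrast-ratio effect (assumed model: a plain gamma-2.2 curve scaled between the display's black and white, a nominal 100-nit white, and roughly 1% taken as the visibility threshold for a luminance step):

# Rough sketch: relative luminance jump between adjacent 8-bit codes on displays
# of different contrast ratio. Assumes a gamma-2.2 curve scaled between the
# display's black and white levels; ~1% is used as a rough visibility threshold.
def luminance(code, white, black, gamma=2.2):
    return black + (white - black) * (code / 255.0) ** gamma

for contrast in (1000, 100_000):               # e.g. a TN panel vs. a laser projector
    white = 100.0                              # nits (arbitrary)
    black = white / contrast
    worst = max(
        (luminance(c + 1, white, black) - luminance(c, white, black))
        / luminance(c, white, black)
        for c in range(1, 255)
    )
    print(f"{contrast}:1  worst adjacent-code step = {worst:.0%} (banding if well above ~1%)")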

An extra variable also exists: the momentary dynamic range the human eye is currently adapted to (a.k.a. dark-adapted vision). It is harder to tell the difference between two dim grey squares on a white background than on a black background, on any display. Human vision only has a momentary (instantaneous) effective dynamic range of roughly 100:1, so very dark greys tend to round off to complete black while we are viewing a bright scene, even on a 1000:1 contrast ratio display. As a result, humans can only simultaneously tell apart several thousand colors in the same view, for a given average scene brightness (and a given iris / vision adaptation state). Most tell-colors-apart tests show two colors side by side without controlling the surrounding view to maintain the average scene brightness and iris size. When such tests do control for a fixed dynamic range (preventing variations in iris size and maintaining the brightness adaptation of vision), suddenly humans can only tell apart several thousand colors rather than millions! The way science counts the colors human vision can distinguish needs to define the environmental variables (the dynamic range), which is why some studies claim humans can only tell apart a few thousand colors while others claim billions of colors. Both kinds of study are correct. (Surprised, eh?)
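
A rough numerical sketch of the adaptation point (assumed model: gamma-2.2 encoding and a ~100:1 instantaneous window below the scene's white):

# Sketch: how many 8-bit grey codes survive inside the eye's momentary ~100:1 window
# while adapted to a bright scene, assuming a gamma-2.2 display curve.
gamma = 2.2
floor = 1 / 100                          # anything dimmer than 1% of white reads as black
visible = sum(1 for c in range(256) if (c / 255) ** gamma >= floor)
print(visible, "of 256 grey codes fall inside the momentary window")
print(256 - visible, "of the darkest codes all collapse toward black in a bright scene")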

User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Banding in 8-bit per channel color

Post by RealNC » 02 Jun 2014, 11:39

The million dollar question here is: is a 10-bit or 16-bit panel going to help anything when playing games, compared to an 8-bit panel? It's not like color depth can be upscaled; AFAIK, that doesn't even make sense for colors.

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 02 Jun 2014, 12:17

RealNC wrote:The million dollar question here is: is a 10-bit or 16-bit panel going to help anything when playing games, compared to an 8-bit panel? It's not like color depth can be upscaled; AFAIK, that doesn't even make sense for colors.
Won't make you more competitive, but will allow for better rendering, assuming the source has a high bit depth also. Of course, it's gonna be a while until the entire display chain moves up from 8 bits.

A 10 or 12 bit panel does have the advantage of being able to work with higher precision LUTs, however, which can help eliminate banding artifacts that occur when modifying things like gamma.
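
An illustrative sketch of the LUT point (hypothetical small gamma tweak): pushing a calibration curve through an 8-bit table collapses some input codes onto the same output, which is exactly what reintroduces banding; a 12-bit pipeline has room to keep all 256 inputs distinct.

# Sketch: an 8-bit LUT loses levels when a calibration curve is applied,
# while a higher-precision LUT/panel can keep all 256 inputs distinct.
# The 1.1 gamma tweak is a made-up calibration adjustment.
import numpy as np

codes = np.arange(256)
adjust = 1.1

lut8 = np.round(255 * (codes / 255) ** adjust).astype(np.uint8)
lut12 = np.round(4095 * (codes / 255) ** adjust).astype(np.uint16)

print(len(np.unique(lut8)), "distinct outputs through an 8-bit LUT")    # fewer than 256
print(len(np.unique(lut12)), "distinct outputs through a 12-bit LUT")   # all 256 kept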

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Banding in 8-bit per channel color

Post by flood » 02 Jun 2014, 13:00

RealNC wrote:The million dollar question here is: is a 10-bit or 16-bit panel going to help anything when playing games, compared to an 8-bit panel? It's not like color depth can be upscaled; AFAIK, that doesn't even make sense for colors.
I think it's not something to worry about too much for now.
But as the industry outgrows the standard sRGB gamut, banding (intrinsic to the source) could become more prominent, and >=10-bit panels will help with that.

It would be nice to have linear 16-bit color become the standard in the future. Then we'd no longer have to worry about gamma in image processing.
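
An illustrative sketch of the gamma headache (a plain 2.2 power curve stands in for the sRGB transfer function): blending gamma-encoded values directly gives a physically wrong result, which is why processing wants to happen in linear light, and with more than 8 bits so the conversions don't round back into banding.

# Sketch: averaging two pixels in gamma-encoded 8-bit space vs. in linear light.
# A plain 2.2 power curve stands in for the sRGB transfer function.
def to_linear(v, gamma=2.2):
    return (v / 255.0) ** gamma

def to_encoded(lin, gamma=2.2):
    return round(255.0 * lin ** (1.0 / gamma))

a, b = 0, 255                              # a black pixel next to a white pixel

naive = (a + b) // 2                       # blend done on the encoded values: 127
linear = to_encoded((to_linear(a) + to_linear(b)) / 2)   # blend done in linear light

print(naive)    # 127: displays far too dark for a true 50% mix of the two lights
print(linear)   # about 186: the physically correct half-way point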

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 02 Jun 2014, 13:39

flood wrote: It would be nice to have linear 16-bit color become the standard in the future. Then we'd no longer have to worry about gamma in image processing.
Agreed, and it would also make it easier to integrate visual effects (which, from what I understand, typically operate in a 16-bit linear-light framework).

User avatar
Alcazar
Posts: 9
Joined: 31 May 2014, 10:10
Location: Silicon Valley, CA
Contact:

Re: Banding in 8-bit per channel color

Post by Alcazar » 02 Jun 2014, 14:59

spacediver wrote:
RealNC wrote:The million dollar question here is: is a 10-bit or 16-bit panel going to help anything when playing games, compared to an 8-bit panel? It's not like color depth can be upscaled; AFAIK, that doesn't even make sense for colors.
Won't make you more competitive, but will allow for better rendering, assuming the source has a high bit depth also. Of course, it's gonna be a while until the entire display chain moves up from 8 bits.

A 10 or 12 bit panel does have the advantage of being able to work with higher precision LUTs, however, which can help eliminate banding artifacts that occur when modifying things like gamma.
It would allow game developers to make darker game environments (think id Software's Doom sequel) that can freak you out at night, because fog and shadow effects with very gradual color gradients will create banding on an 8-bit panel, and the artifacts are plainly obvious in dark scenes. With a 10-bit panel (I have one, by the way: http://www.zdnet.com/dell-ultrasharp-32 ... 000027376/), I think you *can* gain a competitive advantage over other gamers when trying to see details in dark environments.
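
A rough illustration of why dark scenes are where the extra bits pay off (assumed numbers: a shadow region spanning 1% to 5% of full brightness, gamma-2.2 encoding):

# Sketch: how many distinct grey levels land inside a dark fog/shadow region
# at 8 vs. 10 bits, assuming gamma-2.2 encoding and a 1%..5% brightness range.
gamma = 2.2
lo, hi = 0.01, 0.05

def codes_in_shadow(bits):
    levels = 2 ** bits
    return sum(1 for c in range(levels) if lo <= (c / (levels - 1)) ** gamma <= hi)

print(codes_in_shadow(8), "steps at 8 bits")    # a few dozen
print(codes_in_shadow(10), "steps at 10 bits")  # roughly four times as many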

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 02 Jun 2014, 15:50

Alcazar wrote: It would allow game developers to make darker game environments (think id Software's Doom sequel) that can freak you out at night, because fog and shadow effects with very gradual color gradients will create banding on an 8-bit panel, and the artifacts are plainly obvious in dark scenes. With a 10-bit panel (I have one, by the way: http://www.zdnet.com/dell-ultrasharp-32 ... 000027376/), I think you *can* gain a competitive advantage over other gamers when trying to see details in dark environments.
Right, but only if the content/application is 10-bit and the video card is 10-bit. You could have a 16-bit or higher panel (a CRT, for example), and you'll still only get 8-bit color depth in almost all situations.

User avatar
Alcazar
Posts: 9
Joined: 31 May 2014, 10:10
Location: Silicon Valley, CA
Contact:

Re: Banding in 8-bit per channel color

Post by Alcazar » 04 Jun 2014, 10:33

spacediver wrote:
Alcazar wrote: It would allow game developers to make darker game environments (think id Software's Doom sequel) that can freak you out at night, because fog and shadow effects with very gradual color gradients will create banding on an 8-bit panel, and the artifacts are plainly obvious in dark scenes. With a 10-bit panel (I have one, by the way: http://www.zdnet.com/dell-ultrasharp-32 ... 000027376/), I think you *can* gain a competitive advantage over other gamers when trying to see details in dark environments.
Right, but only if the content/application is 10-bit and the video card is 10-bit. You could have a 16-bit or higher panel (a CRT, for example), and you'll still only get 8-bit color depth in almost all situations.
I think every OS, application or game, and GPU I use is usually doing 10-bit color, but maybe this needs further debate :)

User avatar
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Banding in 8-bit per channel color

Post by RealNC » 04 Jun 2014, 10:57

Alcazar wrote: I think every OS, application or game, and GPU I use is usually doing 10-bit color, but maybe this needs further debate :)
Games use a 32-bit ARGB format (in both Direct3D and OpenGL), where each color component uses 8 bits, plus an 8-bit alpha channel. AFAIK, only workstation-class GPUs can actually deal with 10-bit color components; NVIDIA Quadro cards, for example. The GeForce cards do not support it.

I'm not 100% sure, though, whether mainstream cards can actually handle 10-bit color internally in OpenGL/D3D and simply don't support outputting it to the monitor.

But if games were to use 10 bits per color within a 32-bit format, that would leave them with only 2 bits for the alpha channel. So I don't think 10-bit color is even useful for games.
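
For reference, 10-bit-per-channel 32-bit formats do exist (GL_RGB10_A2 in OpenGL, DXGI_FORMAT_R10G10B10A2_UNORM in Direct3D), and the 2-bit alpha is exactly the trade-off described above. An illustrative sketch of the two bit layouts (the packing order here is arbitrary, not any particular API's):

# Sketch of the two 32-bit layouts being discussed: 8:8:8:8 vs. 10:10:10:2.
def pack_8888(r, g, b, a):
    # four 8-bit fields: 256 levels per channel, full 8-bit alpha
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_1010102(r, g, b, a):
    # three 10-bit fields plus 2-bit alpha: 1024 levels per channel, alpha only 0..3
    assert r < 1024 and g < 1024 and b < 1024 and a < 4
    return (a << 30) | (r << 20) | (g << 10) | b

print(hex(pack_8888(255, 128, 0, 255)))      # 0xffff8000
print(hex(pack_1010102(1023, 512, 0, 3)))    # 0xfff80000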
