Banding in 8-bit per channel color

silikone
Posts: 57
Joined: 02 Aug 2014, 12:27

Re: Banding in 8-bit per channel color

Post by silikone » 13 Sep 2014, 11:57

If bit depth increases further, would the extra bits fill in the bands, or would the contrast expand to "whiter than white"? For example, going from 8 to 10 bits could make conventional white just 25% of the total maximum brightness. This would allow bloom effects from bright lights and reflections.
It's likely that this would cause eye strain under normal viewing conditions, though.
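
To put rough numbers on what I mean by the two options (just a quick C++ sketch, nothing display-specific; the 25% figure is simply 255/1023):

#include <cstdio>

int main() {
    const int white8 = 255;  // conventional 8-bit white

    // Option 1: rescale so white stays white; the extra codes land between
    // the old steps and fill in the bands.
    int white10   = white8 * 1023 / 255;   // -> 1023
    double step8  = 1.0 / 255.0;           // relative size of one 8-bit step
    double step10 = 1.0 / 1023.0;          // roughly 4x finer

    // Option 2: keep the same code value and treat everything above it as
    // headroom, so conventional white sits at ~25% of the new maximum.
    double whiteFraction = double(white8) / 1023.0;   // ~0.249

    printf("option 1: white -> %d, step shrinks from %.5f to %.5f\n",
           white10, step8, step10);
    printf("option 2: white sits at %.1f%% of the 10-bit range\n",
           whiteFraction * 100.0);
    return 0;
}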

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 13 Sep 2014, 12:31

Changing 8 bit to 10 bit is not going to magically give your display a higher dynamic range.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Banding in 8-bit per channel color

Post by flood » 13 Sep 2014, 16:36

silikone wrote:If bit depth increases further, would the extra bits fill in the bands, or would the contrast expand to "whiter than white"? For example, going from 8 to 10 bits could make conventional white just 25% of the total maximum brightness. This would allow bloom effects from bright lights and reflections.
It's likely that this would cause eye strain under normal viewing conditions, though.
Both are possible, but note that for LCDs the maximum white level is limited by the backlight setting.

Haste
Posts: 326
Joined: 22 Dec 2013, 09:03

Re: Banding in 8-bit per channel color

Post by Haste » 13 Sep 2014, 23:39

I can see a lot of banding in many games on the ROG SWIFT, which is an 8-bit TN panel.
Monitor: Gigabyte M27Q X

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Banding in 8-bit per channel color

Post by RLBURNSIDE » 02 Nov 2015, 15:27

Banding in skies / foggy areas / spotlights / godrays is very much an issue.

I've been experimenting at work with increasing the swap chain on Xbox One and PS4 to 10-bit, which is then passed to my 8-bit + FRC television/monitor via HDMI Deep Color (30-bit), but I haven't compared the results yet. One big problem with this is the fact that most 1080p TVs, even if they can do 10-bit via 8-bit + FRC or 10-bit natively (many UHD TVs now do 10-bit natively, in 2015), don't have the input bandwidth to accept Deep Color at 1080p/60, since until very recently 165 MHz HDMI 1.4a input chips were used, which max out at around 1080p / 60 / 4:4:4 / 8 bpc.
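
Rough math on that bandwidth wall, assuming the standard 148.5 MHz CEA-861 pixel clock for 1080p/60 and the HDMI rule that the TMDS clock scales with the Deep Color bit depth (actual receiver chips may vary):

#include <cstdio>

int main() {
    const double pixelClock = 148.5;   // MHz, standard CEA-861 timing for 1080p/60
    const double rxLimit    = 165.0;   // MHz, typical HDMI 1.4a input chip

    // In HDMI Deep Color modes the TMDS clock scales with bits per channel.
    const int depths[] = { 8, 10, 12 };
    for (int bpc : depths) {
        double tmds = pixelClock * bpc / 8.0;
        printf("1080p/60 4:4:4 at %2d bpc -> %.3f MHz TMDS (%s)\n",
               bpc, tmds,
               tmds <= rxLimit ? "fits a 165 MHz receiver"
                               : "exceeds a 165 MHz receiver");
    }
    return 0;
}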

That said, I am working on a cheap DIY solution to use a yellow notch filter on my BenQ w1070 projector (which can do 1.07 billion colors and can accept 10-bit Deep Color over HDMI 1.4a) to expand the color gamut from Rec. 709 to DCI P3, in order to be compatible with UHD Blu-rays. I also realized that my projector's VGA input, being analog, is one way to avoid having to drop to 4:2:2 chroma subsampling: the projector doesn't have the input bandwidth over HDMI for 1080p / 60 / 10-bit / 4:4:4, but it can do it via VGA, which is only limited by SNR from the analog step.

The HD Fury IV has an HDMI 2.0a input port and can output 1080p / 10-bit / 4:4:4 to its VGA / component outputs via a 10- or 11-bit RAMDAC. Either that, or I just pick up an AMD graphics card and let Windows do it. I haven't checked what bit depth the RAMDACs are on modern video cards, but I assume / hope they're at least 10-bit capable. In theory, VGA has arbitrary bit depth since it's analog; it only depends on your RAMDAC's capabilities (and your app / game / Windows desktop settings), cable length, and SNR. You can get around having a long analog cable by running HDMI all the way to the TV and then using VGA only for the last step, or by using a DVI-D to VGA conversion box right at the end.

I'm sharing my results for the yellow notch filter DIY project over at AVS for anyone who might be interested. In theory you could also make passive 2D YNF filter glasses and watch P3 video on your Rec. 709 PC monitor or HDTV as well, although I doubt many would want to. Anyway, as a proof of concept, it's pretty cool to be able to expand your color gamut, and it's one of the most fun things about owning a projector. Expanding the gamut from Rec. 709 to P3 costs about 30% in light output, but to play back UHD Blu-rays on a cheap 600-dollar projector, I think buying and installing a cheap filter is a worthy hack. And you definitely need 10-bit input / processing to pull it off, since expanded color gamuts only make banding worse, not better.

Of course I still have to figure out the right blend of luminance range mapping / EOTF -> gamma 2.6 functions to use. The commercial Christie cinema DLP projectors use yellow notch filters + 2.6 gamma to expand from Rec. 709 to P3; that's where I got the idea.

Increasing the swap chain to 10-bit to reduce quantization errors for both SDR and HDR is coming to gaming, trust me. Like Chief Blur Buster says, it will take a while before the 8-bit to 10-bit migration reaches peak adoption, but the swap chain bit depth is something that can be overridden by the console manufacturers and/or by PC users in borderless windowed mode.
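
On the PC side the override basically amounts to asking for a 10-bit back buffer. Something like this minimal D3D11/DXGI sketch (not any particular engine's code; the helper name is made up, and device/factory creation and error handling are left out):

#include <d3d11.h>
#include <dxgi1_2.h>

// Fill out a swap chain description with a 10-bit back buffer instead of the
// usual 8-bit DXGI_FORMAT_R8G8B8A8_UNORM. Only the format choice matters here.
DXGI_SWAP_CHAIN_DESC1 Make10BitSwapChainDesc()
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Width            = 1920;
    desc.Height           = 1080;
    desc.Format           = DXGI_FORMAT_R10G10B10A2_UNORM;  // 10 bits per color channel
    desc.SampleDesc.Count = 1;
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 2;
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL;
    return desc;
}

// Then pass the description to IDXGIFactory2::CreateSwapChainForHwnd as usual.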

It's too bad CRTs are a thing of the past, though; bit depth is another way they were clearly superior to most displays out now. Although beyond 10-bit or 12-bit the benefits are much bigger if you use PQ gamma instead of regular gamma. PQ gamma is actually beneficial even at 8-bit to lower perceptual banding, but it becomes a necessity at 10- or 12-bit when combined with HDR10 or HDR12 (Dolby Vision) to keep everything below the Barten threshold (the curve below which banding at a given dynamic range and bit depth becomes imperceptible).
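
If anyone wants to see why PQ helps, the curve (SMPTE ST 2084) is just a fixed formula, so you can check where it spends its codes. Quick sketch using the published constants, purely as an illustration:

#include <cmath>
#include <cstdio>

// SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0,1] signal.
double PqEncode(double nits)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double y  = nits / 10000.0;          // PQ is defined against a 10,000-nit ceiling
    double yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0 + c3 * yp), m2);
}

int main()
{
    // PQ spends most of its 10-bit codes on the darker end, where the eye is
    // most sensitive to banding.
    const double levels[] = { 0.1, 1.0, 10.0, 100.0, 1000.0, 10000.0 };
    for (double nits : levels)
        printf("%8.1f nits -> code %4.0f of 1023\n", nits, PqEncode(nits) * 1023.0);
    return 0;
}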
Last edited by RLBURNSIDE on 02 Nov 2015, 15:32, edited 2 times in total.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Banding in 8-bit per channel color

Post by RLBURNSIDE » 02 Nov 2015, 15:29

Haste wrote:I can see a lot of banding in many games on the ROG SWIFT which is an 8bit TN panel.
For 600 bucks you can buy a BenQ w1070, which supports 10-bit / 1.07 billion colors through 8-bit + FRC (probably; I doubt it's native). Anyway, another reason to buy a projector. But make sure any HDTV or projector you buy from now on supports 10-bit input, either natively or not. Heck, if you get a TV that has a 10-bit panel, it's still dumb if it doesn't support 12-bit through 10-bit + FRC; that's just throwing away quality for nothing.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: Banding in 8-bit per channel color

Post by Glide » 02 Nov 2015, 16:44

Alien: Isolation will do 10-bit color, and it definitely reduces banding on a 10-bit LCD (photos from HardForum).

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Banding in 8-bit per channel color

Post by RLBURNSIDE » 28 Nov 2015, 09:38

spacediver wrote:Changing 8 bit to 10 bit is not going to magically give your display a higher dynamic range.
Well, no, but did you know that standard 8-bit / Rec 709 is meant to represent 0-100 nits dynamic range? And that most LCD monitors go up to 400 if not much more? What do you think they do with those extra nits in the display? That's right, they artificially expand the dynamic range of the incoming signal, making it washed out.

If you pass in an HDR signal (and that sort of requires 10-bit to avoid banding), either from a game or a movie or a Netflix show that has HDR, then you can definitely take advantage of that 400-nit peak white capability. The highest brightness peaks are only used for the shiny reflective parts of the scene, and yes, some overall increase of average luminance too.
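
To put numbers on that, here's a toy sketch of the step size in nits between adjacent codes, assuming a plain 2.2 gamma stretched over the panel's peak (real HDR uses PQ, so treat this as illustration only): stretching 8-bit over 400 nits makes each step roughly four times coarser, and 10-bit brings it back down.

#include <cmath>
#include <cstdio>

// Size of one quantization step in nits around a given code value, assuming a
// simple 2.2 gamma transfer stretched over the panel's peak luminance.
double StepNits(int code, int maxCode, double peakNits)
{
    double lo = peakNits * std::pow(double(code)     / maxCode, 2.2);
    double hi = peakNits * std::pow(double(code + 1) / maxCode, 2.2);
    return hi - lo;
}

int main()
{
    // Same 75% grey signal, three scenarios:
    printf("8-bit over 100 nits:   %.3f nits per step\n", StepNits(191, 255, 100.0));
    printf("8-bit over 400 nits:   %.3f nits per step\n", StepNits(191, 255, 400.0));
    printf("10-bit over 400 nits:  %.3f nits per step\n", StepNits(767, 1023, 400.0));
    return 0;
}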

Even if displays don't natively support 10-bit, many do have 8-bit + FRC and in that case it's definitely worth it to send a 10-bit signal to the monitor if you can. And not one that's merely an 8-bit value plugged into 10-bits, but real, actual 10-bit signals.

In the next couple years we should see a lot more games and game engines switching to 10-bit as they try to take advantage of HDR, which represents a massive improvement in visual quality and "pop", even on older LCD monitors that have way more than 100 nits to play with and have for years and years.

Falkentyne
Posts: 2793
Joined: 26 Mar 2014, 07:23

Re: Banding in 8-bit per channel color

Post by Falkentyne » 28 Nov 2015, 09:56

RLBURNSIDE wrote:
spacediver wrote:Changing 8 bit to 10 bit is not going to magically give your display a higher dynamic range.
Well, no, but did you know that standard 8-bit / Rec 709 is meant to represent 0-100 nits dynamic range? And that most LCD monitors go up to 400 if not much more? What do you think they do with those extra nits in the display? That's right, they artificially expand the dynamic range of the incoming signal, making it washed out.

If you pass in an HDR signal (and that sort of requires 10-bit to avoid banding), either from a game or a movie or a Netflix show that has HDR, then you can definitely take advantage of that 400-nit peak white capability. The highest brightness peaks are only used for the shiny reflective parts of the scene, and yes, some overall increase of average luminance too.

Even if displays don't natively support 10-bit, many do have 8-bit + FRC and in that case it's definitely worth it to send a 10-bit signal to the monitor if you can. And not one that's merely an 8-bit value plugged into 10-bits, but real, actual 10-bit signals.

In the next couple years we should see a lot more games and game engines switching to 10-bit as they try to take advantage of HDR, which represents a massive improvement in visual quality and "pop", even on older LCD monitors that have way more than 100 nits to play with and have for years and years.
How will this affect 8-bit (6-bit + Hi-FRC) monitors like the XL2720Z?

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 28 Nov 2015, 12:10

RLBURNSIDE wrote: Well, no, but did you know that standard 8-bit / Rec 709 is meant to represent 0-100 nits dynamic range? And that most LCD monitors go up to 400 if not much more? What do you think they do with those extra nits in the display? That's right, they artificially expand the dynamic range of the incoming signal, making it washed out.
Yep, I'm aware of the relationship between bit depth and dynamic range. My post was just emphasizing that bit depth and contrast are orthogonal parameters.
