Banding in 8-bit per channel color

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Banding in 8-bit per channel color

Post by RLBURNSIDE » 29 Dec 2015, 16:31

Falkentyne wrote:
RLBURNSIDE wrote:
spacediver wrote: Changing 8 bit to 10 bit is not going to magically give your display a higher dynamic range.
Well, no, but did you know that standard 8-bit / Rec 709 is meant to represent 0-100 nits dynamic range? And that most LCD monitors go up to 400 if not much more? What do you think they do with those extra nits in the display? That's right, they artificially expand the dynamic range of the incoming signal, making it washed out.

If you pass in an HDR signal (and that sort of requires 10-bit to avoid banding), either from a game or a movie or a Netflix show that has HDR, then you can definitely take advantage of that 400 nits peak white capability. The highest brightness peaks are only used for the shiny reflective parts of the scene, and yes, some overall increase of average luminance too.

Even if displays don't natively support 10-bit, many do have 8-bit + FRC and in that case it's definitely worth it to send a 10-bit signal to the monitor if you can. And not one that's merely an 8-bit value plugged into 10-bits, but real, actual 10-bit signals.

In the next couple years we should see a lot more games and game engines switching to 10-bit as they try to take advantage of HDR, which represents a massive improvement in visual quality and "pop", even on older LCD monitors that have way more than 100 nits to play with and have for years and years.
How will this affect 8-bit (6-bit + Hi-FRC) monitors like the XL2720Z?
You can do any dynamic range at any bit depth, but there's this thing called the Barten threshold, a contrast-sensitivity curve: once the quantization steps in your signal exceed it, you see banding. If you expand the dynamic range, you need more bits to stay under that threshold. Put another way, at the same bit depth (8, for typical LCDs), increasing the dynamic range of the signal only makes banding worse. Banding sucks, and you can spot it in pretty much any video or game once you know what it looks like, typically in skies, spotlights and other smooth gradients.

Temporal dithering, aka FRC, is better than nothing because it oscillates quickly between neighbouring code values, letting the rods and cones in your eyes average out the intermediate levels and giving you a smoother gradient. FRC is effective, apparently nearly as effective as a true bit-depth increase, which is why 6-bit + FRC became so popular: many people can't tell the difference, at least not at first glance.
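
To make the FRC idea concrete, here's a minimal sketch (my own illustration, not anything taken from a real scaler): a 10-bit ramp is shown on a hypothetical 8-bit panel by alternating between the two nearest 8-bit codes over a 4-frame cycle, and averaging the frames recovers the in-between levels your eye would perceive.

import numpy as np

def frc_frame(signal_10bit, frame_index, num_frames=4):
    """Quantize a 10-bit signal (0..1023) to 8-bit (0..255) for one frame.

    The fractional remainder decides on how many of the num_frames the
    pixel gets bumped up by one code, which is the 2-bit FRC idea in time.
    """
    scaled = signal_10bit / 4.0            # 10-bit -> 8-bit scale
    base = np.floor(scaled)                # lower neighbouring 8-bit code
    frac = scaled - base                   # 0, 0.25, 0.5 or 0.75 for 10-bit input
    bump = (frame_index % num_frames) < np.round(frac * num_frames)
    return np.clip(base + bump, 0, 255)

ramp = np.arange(0, 1024)                  # full 10-bit gradient
frames = [frc_frame(ramp, i) for i in range(4)]
perceived = np.mean(frames, axis=0)        # what temporal averaging yields

print(np.unique(np.diff(perceived)))       # mostly even 0.25-code steps (plus 0 where the ramp clips)
print(np.unique(np.diff(frames[0])))       # a single frame only has full 8-bit steps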

What's dumb, of course, is that when display manufacturers moved up to native 8-bit panels, many of them didn't keep FRC active on top of that to simulate 10-bit input; instead they just quantize a 10-bit signal to the nearest 8-bit code, which is 100% equivalent to plain 8-bit.

At any given bit depth you should always add FRC, at least until you reach 16-bit. There's also the issue of which gamma curve you're using. With Perceptual Quantizer (PQ) gamma, the new standard for HDR, the same number of bits is encoded more efficiently; it's used at 10 bits for HDR10 (the UHD Blu-ray base standard) and at 12 bits for Dolby Vision, to guarantee no perceptible banding over a 0-1000 nit or 0-10000 nit range respectively.

But there's no real reason you couldn't pair an existing 8-bit signal with a more efficient encoding curve. The problem, of course, is that the old CRT-era gamma standard is just a relative curve; it doesn't specify an absolute peak nit value, so it's a terrible standard. If you could update the firmware on your LCD monitor to support PQ gamma, you could probably cover 0-100 nits or even more with zero banding, thanks to the more efficient encoding. Gamma is apparently decent in the 0-100 nit range and maps fairly closely to human vision at those brightness levels, but it's still inferior to PQ, which is based on better and more recent studies of our perceptual system, hence the name.
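
To show what PQ actually does with its bits, here's a minimal sketch of the SMPTE ST 2084 curve using its published constants; the 10-bit quantization and the handful of test luminances are my own illustrative choices, not anything from the HDR10 spec beyond the curve itself.

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance in cd/m^2 (0..10000) -> PQ signal in 0..1."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def pq_decode(signal):
    """PQ signal in 0..1 -> absolute luminance in cd/m^2."""
    p = signal ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

# Quantize a few luminance levels to 10-bit PQ code values and see how much
# brighter the next code is -- the relative step stays a few percent or less
# across the whole range, which is what keeps banding near or below threshold.
for nits in (0.1, 1, 10, 100, 1000, 5000):
    code = round(pq_encode(nits) * 1023)
    step = pq_decode((code + 1) / 1023) - pq_decode(code / 1023)
    print(f"{nits:>6} nits -> 10-bit code {code:4d}, next code is ~{100 * step / nits:.2f}% brighter")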

There are some really good articles on the topic over at the AVS Forum. It's really too bad this forum isn't very active; I learned a lot here about motion blur, but it's been pretty dead.

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Banding in 8-bit per channel color

Post by spacediver » 30 Dec 2015, 10:56

RLBURNSIDE wrote: But there's no real reason you couldn't pair an existing 8-bit signal with a more efficient encoding curve. The problem, of course, is that the old CRT-era gamma standard is just a relative curve; it doesn't specify an absolute peak nit value, so it's a terrible standard.
Well, yes and no. The introduction of BT.1886 went some way towards standardizing this curve. While a peak luminance was never explicitly stated, the de facto value was probably somewhere between 80 and 100 nits. Another important aspect is that the electro-optical transfer function of a calibrated CRT was actually fairly perceptually uniform (something that Poynton referred to as an "amazing coincidence"), although, as you say, not as uniform as PQ.
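
For reference, here's a minimal sketch of that BT.1886 EOTF, straight from the Rec. ITU-R BT.1886 formula; the 100-nit white and 0.1-nit black levels are example values I picked, not anything the standard mandates.

def bt1886_eotf(v, white_nits=100.0, black_nits=0.1, gamma=2.4):
    """Normalized video signal v (0..1) -> screen luminance in cd/m^2."""
    lw = white_nits ** (1.0 / gamma)
    lb = black_nits ** (1.0 / gamma)
    a = (lw - lb) ** gamma               # overall gain
    b = lb / (lw - lb)                   # black-level lift
    return a * max(v + b, 0.0) ** gamma

# Roughly how a calibrated display would interpret a handful of 8-bit
# code values (full-range 0..255 here, purely for illustration):
for code in (0, 16, 64, 128, 192, 255):
    print(f"code {code:3d} -> {bt1886_eotf(code / 255):7.3f} nits")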
RLBURNSIDE wrote:
If you could update the firmware on your LCD monitor to support PQ gamma, you could probably cover 0-100 nits or even more with zero banding, thanks to the more efficient encoding. Gamma is apparently decent in the 0-100 nit range and maps fairly closely to human vision at those brightness levels, but it's still inferior to PQ, which is based on better and more recent studies of our perceptual system, hence the name.
Yes, you could do this, but if you're watching HD content that was mastered with BT.1886, then you might be missing out on artistic intent. Still, for regular use, if you can achieve a PQ function without introducing quantization artifacts, this could be a great thing :)
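
As a rough sanity check of that last point, here's a self-contained sketch (my own assumptions throughout: a 100-nit / 0.1-nit BT.1886 display, full-range 8-bit input, 10-bit PQ output) that builds the 8-bit-to-PQ lookup table such a firmware might apply and counts whether any adjacent input levels collapse to the same output code, which is one way re-quantization artifacts would show up; the panel's own internal precision is a separate question.

GAMMA = 2.4
LW, LB = 100.0, 0.1                       # example white / black luminance
A = (LW ** (1 / GAMMA) - LB ** (1 / GAMMA)) ** GAMMA
B = LB ** (1 / GAMMA) / (LW ** (1 / GAMMA) - LB ** (1 / GAMMA))

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def bt1886_to_nits(v):
    """Normalized BT.1886 signal v (0..1) -> luminance in cd/m^2."""
    return A * max(v + B, 0.0) ** GAMMA

def nits_to_pq(nits):
    """Luminance in cd/m^2 -> PQ (ST 2084) signal in 0..1."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

# One 10-bit PQ code per incoming 8-bit gamma code:
lut = [round(nits_to_pq(bt1886_to_nits(code / 255)) * 1023) for code in range(256)]

collisions = sum(1 for lo, hi in zip(lut, lut[1:]) if lo == hi)
print(f"8-bit in -> 10-bit PQ out: {len(set(lut))} distinct output codes, "
      f"{collisions} adjacent input levels collapsed")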
