Banding in 8-bit per channel color

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Banding in 8-bit per channel color

Post by RealNC » 01 Jun 2014, 11:35

Alcazar wrote:(From the original thread: http://forums.blurbusters.com/viewtopic ... t=10#p6781)

I heard from my friend at nvidia that G-SYNC is only capable of 8bit color right now, did you guys know that?
I didn't. But it doesn't sound like much of a limitation, at least at this time. All TN panels are 6-bit. ASUS has announced a G-Sync monitor with an 8-bit TN panel (the first of its kind, AFAIK.)

IMO, "24-bit RGB ought to be enough for anybody" :mrgreen: Only half-kidding here, even. I can't imagine 30-bit RGB (10-bit panels) being anywhere near to even remotely useful for playing video games (which is what G-Sync is used for.)
Steam · GitHub · Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Why do we need refresh rates at all?

Post by Chief Blur Buster » 01 Jun 2014, 18:36

RealNC wrote:IMO, "24-bit RGB ought to be enough for anybody" :mrgreen: Only half-kidding here, even. I can't imagine 30-bit RGB (10-bit panels) being anywhere near to even remotely useful for playing video games (which is what G-Sync is used for.)
That said, 10-bit color has the advantage of allowing a lot of color processing to occur before banding appears. For example, with lots of translucent layers, fog layers, and murky dark layers, banding won't show up as easily after heavy processing. However, I don't see 10-bit becoming widespread in high-performance gaming for a long time...

That said, 10-bit is good for watching movies with videophile picture quality, though not all stages in the video chain support it (e.g. Netflix streams, playing back typical MKV files, etc.)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why do we need refresh rates at all?

Post by RealNC » 02 Jun 2014, 02:35

Chief Blur Buster wrote:That said, 10-bit color has the advantage of allowing a lot of color processing to occur before banding appears. For example, with lots of translucent layers, fog layers, and murky dark layers, banding won't show up as easily after heavy processing.
I think this can be done with 8-bit panels too. Just like oversampling, "overcoloring" would mean that games internally render at 10 bits per channel; at the final step, the reduction to 8 bits is done.

It's still oversampling, of course. It removes banding that would otherwise occur, the same way resolution oversampling ("supersampling") removes aliasing, and the same way audio processing that mixes effects internally in floating point (or 24-bit/32-bit integer) and only reduces to 16 bits on output avoids audio aliasing.
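That final-step reduction can be sketched in a few lines. This is an illustrative example, not from the post: the gamma value, sample count, and the choice of a gamma round-trip as the "processing" are all arbitrary assumptions; the point is only that quantizing to 8 bits between processing passes loses codes (i.e. creates bands), while keeping precision internally and quantizing once does not.

```python
import numpy as np

# A gradient processed through a gamma pass and its inverse (net identity),
# quantized to 8 bits either once at the end or between every pass.
x = np.linspace(0.0, 1.0, 4096)

# High-precision path: keep float all the way, quantize once at output.
hp = np.round((x ** 2.2) ** (1 / 2.2) * 255)

# Naive path: quantize to 8 bits between the two processing passes.
mid = np.round(x ** 2.2 * 255) / 255
lp = np.round(mid ** (1 / 2.2) * 255)

print(len(np.unique(hp)))  # 256 -> the full gradient survives
print(len(np.unique(lp)))  # fewer -> codes lost in the shadows, i.e. banding
```

The same effect is why repeated 8-bit color adjustments produce "combed" histograms in image editors.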

I would be really, really surprised to see banding on any display that can show 16.7 million distinct colors (which is what 8-bits-per-channel displays can show.) Unless the color space is off, of course. But then, that can happen with any number of bits per channel, so that doesn't count.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Why do we need refresh rates at all?

Post by flood » 02 Jun 2014, 04:06

http://www.lagom.nl/lcd-test/gradient.php
zoom in and take a look.
I didn't count, but I can easily see >200 of the 255 "bands" on my iPhone 5

on a 3x8-bit display, banding is for sure visible in gentle gradients. Our eyes can easily tell the difference between adjacent colors in colorspace. But on a well-calibrated 8-bit display, bands in gradients shouldn't be too harsh on the eyes.

oh, and on an 8-bit display, dithering can make banding essentially invisible. By dithering, I mean something like the algorithm in forum.doom9.org/showthread.php?s=749945078702bed8a9a4ed1d959c76ce&t=160038
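A minimal noise-dithering sketch (illustrative only, not the doom9 algorithm linked above; the gradient range, block size, and uniform noise are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
# A smooth high-precision gradient spanning only a few 8-bit codes.
true = np.linspace(100.0, 104.0, 8000)   # fractional 8-bit code values

hard = np.round(true)                                       # plain quantization: bands
dith = np.round(true + rng.uniform(-0.5, 0.5, true.size))   # dithered quantization

# Averaged over small neighborhoods (roughly what the eye does), the
# dithered signal tracks the gradient; the hard-quantized one sits on
# each band, up to half a code away from the true value.
def block(a):
    return a.reshape(-1, 100).mean(axis=1)

err_hard = np.abs(block(hard) - block(true)).max()
err_dith = np.abs(block(dith) - block(true)).max()
print(err_hard, err_dith)  # err_dith comes out much smaller
```

This trades banding for a little noise, which is usually far less objectionable.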

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why do we need refresh rates at all?

Post by RealNC » 02 Jun 2014, 04:17

flood wrote:http://www.lagom.nl/lcd-test/gradient.php
zoom in and take a look.
I didn't count, but I can easily see >200 of the 255 "bands" on my iPhone 5
Don't zoom in! If you zoom in, you're effectively doing the same as zooming in on an image on a 300 DPI monitor and then complaining that you can make out pixels.

Even if you had a 12-bit or 16-bit display, you would always see banding when you zoom in. Hell, you could even have a 10000-bit panel, but when you zoom in, there will be banding.

On that picture you linked to, I cannot see any banding on my 6-bit TN monitor.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Why do we need refresh rates at all?

Post by flood » 02 Jun 2014, 04:31

dude... do you even understand where banding comes from...
it's not related to how large the region of pixels with a constant color is; it's the fact that adjacent colors in colorspace, like (10,10,10) and (11,11,11), are easily distinguishable by our eyes.

on a 16-bit display showing a 16-bit image of a gradient, you will not see banding, assuming a reasonable gamma curve.
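The arithmetic behind that claim can be sketched roughly. This assumes a simple gamma-2.2 display and a ~1% Weber threshold for luminance discrimination; both are simplifying assumptions, not facts from the thread:

```python
GAMMA = 2.2

def luminance(code, levels):
    """Relative luminance (0..1) of a digital code on a gamma-2.2 display."""
    return (code / (levels - 1)) ** GAMMA

def weber_step(code, levels):
    """Luminance step to the next code, as a fraction of the current level."""
    lo, hi = luminance(code, levels), luminance(code + 1, levels)
    return (hi - lo) / lo

# Mid-gray on an 8-bit display: a ~1.7% step, around/above threshold,
# so adjacent codes in a clean gradient can still be told apart.
print(weber_step(128, 256))
# The same gray on a 16-bit display: a ~0.007% step, far below threshold.
print(weber_step(32768, 65536))
```

With 65536 levels per channel, neighboring codes fall well under any plausible discrimination threshold, which is why 16-bit gradients look continuous.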

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why do we need refresh rates at all?

Post by RealNC » 02 Jun 2014, 04:36

flood wrote:dude... do you even understand where banding comes from...
I do. But since you zoomed-in rather than keeping it 1:1, I assume that you don't :P

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Why do we need refresh rates at all?

Post by flood » 02 Jun 2014, 04:43

omg I don't even

the image has color values of
0011223344...
if i zoom in it becomes
0000011111222223333344444

that's exactly what a softer gradient looks like
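That widening can be shown directly with a trivial snippet (illustrative numbers; nearest-neighbor zoom assumed):

```python
import numpy as np

# Nearest-neighbor zoom turns a one-pixel-per-code gradient into exactly
# the run-length pattern written out above: same code steps, wider bands.
grad = np.arange(0, 5, dtype=np.uint8)   # codes 0 1 2 3 4
zoomed = np.repeat(grad, 5)              # each code becomes a 5-pixel band
print(zoomed.tolist())
```

The set of adjacent-code jumps is unchanged; zooming only stretches each band, which is flood's point about soft gradients.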

RealNC
Site Admin
Posts: 3741
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why do we need refresh rates at all?

Post by RealNC » 02 Jun 2014, 05:26

flood wrote:omg i dont even
Acting a bit friendlier would help a ton in getting a point across, you know. Just saying.

When you zoom in on a grayscale gradient, your display is not applying any kind of dithering to the result. Of course you are going to see banding. A 10-bit gradient would have 1024 individual steps; you'd still be able to make out steps when zooming in without dithering on a 10-bit panel. A 16-bit gradient would have 65536 steps. There's a good chance you'd still be able to make out the difference between two adjacent steps, even with 65536 shades on a 16-bit panel. But you'd have to ask someone who knows about human vision.

But that's not what happens in real scenarios. The display does apply dithering. 6-bit panels are able to prevent banding by dithering the 8-bit source. Games use 8 bits per channel. A good 8-bit display will not produce any banding that isn't already in the source.
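The 6-bit-panel trick referred to here is temporal dithering (FRC): the panel alternates the two nearest codes across frames so the time-average lands on the in-between level. A hedged sketch, with purely illustrative numbers:

```python
# Approximate 8-bit level 130 on a hypothetical 6-bit panel.
target8 = 130                # desired 8-bit level
target6 = target8 / 4        # 32.5 in 6-bit units: between codes 32 and 33
lo, hi = int(target6), int(target6) + 1
frac = target6 - lo          # fraction of frames to show the higher code

# Error-accumulator pattern over 16 frames: emit 'hi' just often enough
# that the running average converges on the fractional target.
frames, acc = [], 0.0
for _ in range(16):
    acc += frac
    if acc >= 1.0:
        frames.append(hi)
        acc -= 1.0
    else:
        frames.append(lo)

print(sum(frames) / len(frames) * 4)  # time-averaged level in 8-bit units: 130.0
```

Real FRC also varies the pattern spatially per pixel to avoid visible flicker, which this sketch omits.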

Maybe we're talking about different things, here. My point is that an 8-bit panel is all you need to see the source material as-is, without introducing any banding that isn't already there in the source material itself.

Once you have source material with better color resolution than 8 bits, then the 8-bit panel can also dither. But there's no sign of games using more than 24-bit RGB any time soon.

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: Why do we need refresh rates at all?

Post by flood » 02 Jun 2014, 05:46

RealNC wrote: I would be really, really surprised to see banding on any display that can show 16.7 million distinct colors (which is what 8-bits-per-channel displays can show.) Unless the color space is off, of course. But then, that can happen with any number of bits per channel, so that doesn't count.
I'm just saying that it is possible to see banding on an 8-bit display when you have an 8-bit image of a gradient. Obviously the effect is milder than on a 6-bit display, but that doesn't mean 8-bit images on a good 8-bit display are immune to banding. To use your terminology: 8-bit images themselves often intrinsically have visible banding.

I'm not sure what you meant about zooming in being wrong...

as for games, I believe we are stuck on 3x8-bit because GPUs are designed for 8-bit operations.
