flood wrote:omg i dont even
Acting a bit friendlier would help a ton in getting a point across, you know. Just saying.
When you zoom in on a grayscale gradient, your display is not applying any kind of dithering to the result, so of course you are going to see banding. A 10-bit gradient would have 1024 individual steps; without dithering, you'd still be able to make out steps on a 10-bit panel when zooming in. A 16-bit gradient would have 65536 steps, and there's a good chance you could still distinguish two adjacent steps even then, but you'd have to ask someone who knows about human vision.
But that's not what happens in real scenarios. The display does apply dithering. Even 6-bit panels can prevent banding by dithering the 8-bit source. Games use 8 bits per channel, and a good 8-bit display will not produce any banding that isn't already in the source.
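To make the dithering point concrete, here's a minimal sketch (pure Python, not any specific panel's algorithm) of ordered (Bayer) dithering: a 6-bit panel showing an 8-bit value picks between the two nearest 6-bit levels per pixel, so the spatial average lands closer to the true value than plain truncation, which is what breaks up banding edges. The function names and the 4x4 matrix choice are illustrative assumptions.

```python
# Illustrative sketch: how a 6-bit panel can approximate an 8-bit source.
# Ordered (Bayer) dithering spreads quantization error spatially so that
# the average over a small neighborhood approximates the original value.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def quantize_6bit(value_8bit):
    """Truncate an 8-bit value (0-255) to 6 bits, rescaled back to 0-255."""
    level = value_8bit >> 2          # keep the top 6 bits (0-63)
    return level * 255 // 63         # rescale so values are comparable

def dither_6bit(value_8bit, x, y):
    """Quantize with a Bayer threshold so nearby pixels alternate between
    the two nearest 6-bit levels in the right proportion."""
    threshold = BAYER_4X4[y % 4][x % 4]      # 0..15
    # One 6-bit step spans 4 units of 8-bit input; the threshold supplies
    # a position-dependent bias of 0..3 before truncation.
    biased = min(255, value_8bit + (threshold >> 2))
    return quantize_6bit(biased)

if __name__ == "__main__":
    v = 131  # an 8-bit value that falls between two 6-bit levels
    plain = quantize_6bit(v)
    tile = [dither_6bit(v, x, y) for y in range(4) for x in range(4)]
    avg = sum(tile) / len(tile)
    print(f"plain truncation: {plain} (error {abs(plain - v)})")
    print(f"dithered average: {avg} (error {abs(avg - v)})")
```

Running this, plain truncation lands on one band for every pixel, while the dithered tile mixes the two adjacent levels so its average is closer to the original 8-bit value, which is why a dithered 6-bit panel can show an 8-bit gradient without visible bands.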
Maybe we're talking about different things here. My point is that an 8-bit panel is all you need to see the source material as-is, without introducing any banding that isn't already there in the source material itself.
Once you have source material with more than 8 bits of color resolution per channel, then the 8-bit panel can dither as well. But there's no sign of games using more than 24-bit RGB any time soon.
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.