Aldagar wrote: ↑19 Nov 2020, 09:48
Does it effectively reduce bit color depth? If an 8 bit panel can process values from 0 to 255, but we reduce the color gamut coverage by clamping the subpixels' ceiling, then we are effectively reducing that range, increasing color inaccuracy and banding.
It can.
However, in practice it generally isn't visible when done properly.
There is often 10-bit temporal dithering (FRC) on 8-bit panels, and NVIDIA graphics cards can do GPU-side temporal dithering too. The result is that we often don't see any difference. Also, a panel's color processing will remap values (contrast, color temperature, gamma settings, etc.), so 10-bit internal processing is pretty darn nigh mandatory anyway for an 8-bit GPU signal going to an 8-bit display.
But sRGB settings on wide-gamut panels are often badly gimped, to the point where it's sometimes better to create an sRGB emulation via an .icc profile instead. There are some generic sRGB profiles, though it is often better to use a colorimeter to create an sRGB profile in one go (at your preferred gamma, color temperature, etc.) so that it becomes a single color-processing step rather than multiple steps that accumulate rounding errors. In this case, the GPU side will often temporally dither as necessary to keep the 8-bit look.
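To see why one combined step beats a chain of separately quantized steps, here's a quick sketch. The three adjustment functions and their numbers are made up for illustration, not any real monitor's processing; the point is only that re-rounding to 8-bit between steps merges levels that a single combined step would keep distinct.

```python
import numpy as np

# Hypothetical illustration: several 8-bit color adjustments applied in
# sequence (quantizing back to 8-bit between each) versus the same
# adjustments combined into one step with a single final quantize.
levels = np.arange(256, dtype=np.float64)

def to8(x):
    """Quantize back to 8-bit, as a panel's pipeline stage would."""
    return np.clip(np.round(x), 0, 255)

# Made-up example adjustments: a gamut clamp, a gamma tweak, and a
# color-temperature scale on one channel.
def gamut_clamp(x): return x * 0.87
def gamma_tweak(x): return 255.0 * (x / 255.0) ** 1.05
def temp_scale(x):  return x * 0.96

stepwise = to8(temp_scale(to8(gamma_tweak(to8(gamut_clamp(levels))))))
combined = to8(temp_scale(gamma_tweak(gamut_clamp(levels))))  # one step

# Fewer distinct surviving output levels means coarser steps = more banding.
print("distinct outputs, stepwise:", len(np.unique(stepwise)))
print("distinct outputs, combined:", len(np.unique(combined)))
print("inputs where pipelines disagree:", int(np.sum(stepwise != combined)))
```

Both pipelines compute "the same" color transform; the stepwise one just loses a little precision at every intermediate rounding, which is the rounding-error accumulation a one-shot colorimeter profile avoids.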
Fortunately, temporal dithering from 8-bit to 10-bit (or more bits), whether GPU-side or monitor-side FRC, has no visible flicker: it only flickers between two adjacent pixel values, a 1/256th intensity change, further smoothed by slow LCD GtG speeds. That's unlike 6-bit-to-8-bit FRC, whose 1/64th intensity flicker cycles can sometimes become visible.
This is all done automagically, and other kinds of artifacts (inversion artifacts, or imperfectly DC-filtered PWM-free backlights) can have WAY bigger flicker amplitudes on oscilloscopes than 8-bit-to-10-bit FRC. Even incandescent light bulbs have bigger flicker amplitudes. At 240Hz with 8-bit output, you're looking at 120 flicker cycles per second of 1/256th intensity changes for 8-to-10-bit FRC/temporal dithering... beyond both the flicker-intensity faintness detection threshold AND the flicker fusion threshold.
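The mechanics above can be sketched in a few lines. This is a minimal error-accumulation dither, not any vendor's actual FRC algorithm: a 10-bit target is shown on an 8-bit panel by alternating between the two adjacent 8-bit levels, so the time-average recovers the 10-bit value while the per-frame swing stays at one 8-bit step (1/256th of full scale).

```python
# Minimal sketch of 8-bit-to-10-bit temporal dithering (FRC), assuming a
# simple error-accumulation schedule; real implementations are fancier.
def dither_frames(target_10bit, num_frames):
    """Return per-frame 8-bit values whose average approximates a 10-bit level."""
    lo = target_10bit // 4          # nearest 8-bit level below (one 8-bit step = four 10-bit steps)
    frac = (target_10bit % 4) / 4   # fraction of frames that should show lo+1
    frames, acc = [], 0.0
    for _ in range(num_frames):
        acc += frac
        if acc >= 1.0:              # time to emit the brighter adjacent level
            acc -= 1.0
            frames.append(min(lo + 1, 255))
        else:
            frames.append(lo)
    return frames

frames = dither_frames(513, 240)    # 10-bit level 513 over 240 frames (1 second at 240 Hz)
avg_10bit = sum(frames) / len(frames) * 4
print(avg_10bit)                    # → 513.0: time-average matches the 10-bit target
print(max(frames) - min(frames))    # → 1: flicker swings only one 8-bit step (1/256th)
```

Note the flicker is only ever between two adjacent codes, which is why the amplitude is so tiny compared to 6-bit FRC, inversion artifacts, or PWM.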
Aldagar wrote: ↑19 Nov 2020, 09:48
Similarly, does it reduce contrast ratio, since we are limiting the maximum brightness level of the subpixels without actually dimming the backlight?
Properly done, no, as long as you're using the native white point (i.e. the same color temperature in both modes). Sometimes brightness has to go down when changing color temperature, because you have to reduce the brightness of certain color channels to achieve the preferred color temperature.
But peak brightness (nits) can be identical for sRGB versus wide-gamut, depending on how you calibrate. And you can get much brighter if you fudge sRGB to achieve specific goals, such as a brighter picture that is slightly shifted in color temperature toward the native backlight's color temperature. That can also greatly reduce banding, because you're using a larger portion of the panel's digital dynamic range.
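The dynamic-range point is easy to quantify. Using a hypothetical clamp that caps a channel's ceiling at 220/255 (an arbitrary number for illustration), you can count how many distinct 8-bit output codes survive versus mapping the same range across the full 0-255 output:

```python
import math

# Hypothetical gamut clamp: ceiling lowered from 255 to 220 on one channel.
srgb_in = range(256)
clamped   = sorted({round(v * 220 / 255) for v in srgb_in})  # clamped output codes
fullrange = sorted(set(srgb_in))                             # full-range output codes

print(len(clamped))     # 221 distinct levels survive the clamp
print(len(fullrange))   # 256 distinct levels at full range
print(round(math.log2(len(fullrange) / len(clamped)), 2))   # ≈ 0.21 bits of depth lost
```

A fraction of a bit lost to the clamp is exactly the gap that GPU-side or panel-side temporal dithering papers over, which is why a properly dithered sRGB clamp usually shows no extra banding in practice.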