sRGB emulation on Wide Color Gamut monitors

Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

sRGB emulation on Wide Color Gamut monitors

Post by Aldagar » 19 Nov 2020, 09:48

It seems there is a trend nowadays towards launching new monitors with Wide Color Gamut coverage. These monitors use special layers to filter the blue LED backlight and produce a more even spectral distribution across the RGB channels, but this usually means that regular sRGB content will look oversaturated, and many people rely on sRGB modes to make it look natural again.

Leaving aside the fact that on some models the sRGB mode is unreliable, I have some concerns regarding the way sRGB emulation works:
  1. Does it effectively reduce color bit depth? If an 8-bit panel can process values from 0 to 255, but we reduce the color gamut coverage by clamping the subpixels' ceiling, then we are effectively reducing that range, increasing color inaccuracy and banding (rough numeric sketch below).
  2. Similarly, does it reduce contrast ratio, since we are limiting the maximum brightness level of the subpixels without actually dimming the backlight?
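
To make concern #1 concrete, here is a rough numeric sketch in Python (the 80% clamp factor is a made-up illustrative number, not a measurement of any particular panel):

# Rough sketch: how a naive 8-bit gamut clamp can collapse distinct code values.
clamp_factor = 0.80  # hypothetical: suppose sRGB red only needs ~80% of the native red drive

inputs = range(256)                                   # 8-bit sRGB input codes
outputs = [round(v * clamp_factor) for v in inputs]   # naive 8-bit remap, no dithering

print(len(set(outputs)))  # 205 distinct panel codes remain out of the original 256
# Some adjacent input codes now land on the same output code, so gradients lose steps
# (banding) unless the remap is done at higher precision and dithered back down.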

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: sRGB emulation on Wide Color Gamut monitors

Post by Chief Blur Buster » 20 Nov 2020, 14:32

Aldagar wrote:
19 Nov 2020, 09:48
Does it effectively reduce color bit depth? If an 8-bit panel can process values from 0 to 255, but we reduce the color gamut coverage by clamping the subpixels' ceiling, then we are effectively reducing that range, increasing color inaccuracy and banding.
It can.

However, in practice it generally isn't visible if being done properly.

There is often 10-bit temporal dithering (FRC) on 8-bit panels, and NVIDIA graphics cards can do GPU-side temporal dithering too. The result is that we often don't see any difference. Also, color processing on a panel will remap values (e.g. contrast, color temperature, gamma setting, etc.), so 10-bit processing is pretty darn nigh mandatory anyway when going from an 8-bit GPU signal to an 8-bit display.
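
For intuition, here's a minimal sketch of the FRC idea in Python (real panels and GPU drivers use more sophisticated spatio-temporal patterns; this only shows the concept):

# Minimal sketch of temporal dithering (FRC): showing a 10-bit value on an 8-bit panel
# by alternating between two adjacent 8-bit codes over a repeating frame sequence.
def frc_frames(value_10bit, num_frames=4):
    base, frac = divmod(value_10bit, 4)        # 10-bit value = 8-bit code + 2-bit remainder
    # Out of every 4 frames, show the higher code 'frac' times and the lower code the rest.
    return [min(base + 1, 255) if i < frac else base for i in range(num_frames)]

print(frc_frames(513))   # [129, 128, 128, 128] -> averages to 128.25, i.e. 513/4
# The frame-to-frame step is only 1/256 of full scale, which is why this is far less
# visible than 6-bit-to-8-bit FRC, where the step is 1/64 of full scale.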

But sRGB settings on wide-gamut panels are often badly gimped, to the point where it's sometimes better to create an sRGB emulation via an .icc profile instead. There are some generic sRGB profiles, though it is often better to use a colorimeter to create an sRGB profile in one go (at your preferred gamma, color temperature, etc.) so that it becomes only one color processing step rather than multiple steps (fewer rounding errors). In this case, the GPU side will often temporally dither as necessary to keep the 8-bit look.
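
As a rough illustration of the "only one color processing step" idea (the 3x3 matrix below is a made-up placeholder, not a real panel measurement, and real ICC/LUT pipelines are more involved):

# Sketch: do the gamut remap (a 3x3 matrix in linear light), gamma, etc. in floating
# point, and only quantize back to 8-bit once, with a dither, instead of chaining
# several 8-bit remaps that each add rounding error.
import numpy as np

SRGB_TO_PANEL = np.array([    # hypothetical sRGB -> native-gamut mapping (rows sum to 1)
    [0.80, 0.15, 0.05],
    [0.05, 0.90, 0.05],
    [0.02, 0.08, 0.90],
])

def process(rgb8, rng=np.random.default_rng(0)):
    lin = (rgb8 / 255.0) ** 2.2                  # decode gamma (approximate 2.2)
    lin = SRGB_TO_PANEL @ lin                    # gamut remap, one matrix step
    out = np.clip(lin, 0, 1) ** (1 / 2.2)        # re-encode gamma
    return np.floor(out * 255 + rng.random(3)).astype(int)   # dither instead of rounding

print(process(np.array([200, 120, 60])))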

Fortunately, temporal dithering from 8-bit to 10-bit (or more bits) -- whether GPU-side or monitor-side FRC -- produces no visible flicker, because it's only 1/256th-intensity flickering at slow LCD GtG speeds, flickering between two adjacent pixel color values. That's unlike 6-bit to 8-bit FRC, which can produce 1/64th-intensity flicker cycles that sometimes become visible.

This is all done automagically, and other kinds of artifacts (inversion artifacts, or imperfectly DC-filtered PWM-free backlights) can have WAY bigger flicker amplitudes on oscilloscopes than 8-bit-to-10-bit FRC. Even incandescent light bulbs have bigger flicker amplitudes. At 240Hz 8-bit, you're looking at 120 flicker cycles per second of 1/256th-intensity changes for 8-to-10-bit FRC/temporal dithering... beyond the flicker-intensity faintness detection threshold AND beyond the flicker fusion threshold.
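
The back-of-envelope numbers behind that (assuming the worst case of a pixel alternating every single frame):

refresh_hz = 240
flicker_hz = refresh_hz / 2      # alternating every frame -> 120 flicker cycles per second
amp_8_to_10 = 1 / 256            # one 8-bit step, roughly 0.4% of full scale
amp_6_to_8  = 1 / 64             # one 6-bit step, roughly 1.6% of full scale
print(flicker_hz, amp_8_to_10, amp_6_to_8)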
Aldagar wrote:
19 Nov 2020, 09:48
Similarly, does it reduce contrast ratio, since we are limiting the maximum brightness level of the subpixels without actually dimming the backlight?
Properly done, no. As long as you're using the native white point (i.e. the same color temperature in both modes). Sometimes brightness has to go down if you're changing color temperature, because you have to reduce the brightness of certain color channels to achieve the preferred color temperature.

But nits can be identical for sRGB versus wide-gamut, depending on how you calibrate. And you can get much brighter when you fudge sRGB to achieve goals (such as a brighter picture that is slightly shifted in color temperature, closer to the native backlight color temperature). That can also greatly reduce banding, because you're using a larger amount of the panel's digital dynamic range.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Aldagar
Posts: 33
Joined: 12 Mar 2020, 14:27

Re: sRGB emulation on Wide Color Gamut monitors

Post by Aldagar » 23 Nov 2020, 15:57

I suspected FRC dithering played a role in sRGB emulation. My LG 27GL850 locks the response time setting to "Fast" when using the sRGB mode, and I wonder if it's purposely designed like that in order to correctly emulate sRGB, or if it was an oversight from LG. With the "Fast" overdrive setting there is some overshoot, especially at lower refresh rates (not extreme, but it's there), and I've even considered returning the monitor because of that. But from what I've seen, LG's sRGB calibration and quality control are quite good, so I think I will have to live with that compromise (that, and the low contrast ratio).

silikone
Posts: 57
Joined: 02 Aug 2014, 12:27

Re: sRGB emulation on Wide Color Gamut monitors

Post by silikone » 25 Nov 2020, 07:43

Not even native sRGB monitors always get it right. Some models forgo the linear portion of the EOTF and instead approximate a pure 2.2 gamma curve. While this reduces perceived banding, it also has a tendency to crush shadow detail on screens with typical contrast ratios.
To be fair, it was never really clear which approach is the "correct" one, but then the sRGB standard also uses a lousy 80 nits as its reference white, so who cares what it says at the end of the day.
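
For reference, a quick numeric comparison of the two transfer functions near black, where they diverge the most (minimal Python sketch):

def srgb_eotf(v):                 # encoded [0,1] -> linear light, piecewise sRGB curve
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v):              # pure 2.2 power curve
    return v ** 2.2

for code in (1, 2, 5, 10, 20, 40):
    v = code / 255
    print(f"{code:>3}/255   sRGB: {srgb_eotf(v):.6f}   2.2: {gamma22_eotf(v):.6f}")
# Near black the pure 2.2 curve outputs much less light than the piecewise sRGB curve,
# so on a panel with a typical contrast ratio the darkest steps can blend together --
# the crushed details mentioned above.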
