What is the best way to configure a Wide Color Gamut monitor without a colorimeter?

When I got my LG 27GL850 (98% DCI-P3 coverage), the first thing I wanted to do was configure it so that it didn't oversaturate colors, since I want content to be displayed the way it was intended, with natural colors. I don't have a colorimeter, nor do I intend to get one in the near future, but I discovered there are some ways to achieve this without calibrating the display or creating an ICC profile:

- Using the monitor's sRGB mode. I suppose this is the most accurate method, since it's a factory calibration (not a perfect one, but a calibration nonetheless). Unfortunately, this locks many other settings, including the overdrive, which is stuck on "Fast" and produces very noticeable overshoot, especially at low refresh rates. Many other monitors lock even the brightness setting.
- Enabling HDR under "Windows HD Color settings". This seems to remap SDR content to the monitor's gamut. I'm not sure how reliable this method is or whether it works with all applications, but from what I tested it works on the desktop, in Firefox, in VLC, and in the games I tried. Sadly, this also locks many settings, though it does let me change the overdrive. Also, some games look noticeably darker, and I'm not sure if that's gamma related or because HDR on this monitor isn't very good.
- In the AMD Radeon software, under display settings, enabling "Custom Color" and disabling "Color Temperature Control". This seems to clamp the gamut to sRGB, and it works at the GPU driver level, but I believe it's the least accurate method. It's very unintuitive, but an AMD staff member explains it here: https://community.amd.com/t5/drivers-so ... td-p/92613
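Conceptually, all of these gamut clamps boil down to the same linear algebra: re-express colors between the panel's wide-gamut primaries and sRGB with a 3x3 matrix, and clip whatever falls outside. Here is a minimal sketch of that math, assuming Display P3 primaries (roughly what a 98% DCI-P3 panel covers) and ideal linear-light values; this is an illustration of the principle, not the actual code of any driver or tool:

```python
import numpy as np

# Linear-light RGB -> CIE XYZ matrices, both with a D65 white point.
# sRGB primaries (IEC 61966-2-1):
SRGB_TO_XYZ = np.array([
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
])
# Display P3 primaries (DCI-P3 primaries, D65 white):
P3_TO_XYZ = np.array([
    [0.4866, 0.2657, 0.1982],
    [0.2290, 0.6917, 0.0793],
    [0.0000, 0.0451, 1.0439],
])

# Matrix that re-expresses a linear P3 color in linear sRGB coordinates.
P3_TO_SRGB = np.linalg.inv(SRGB_TO_XYZ) @ P3_TO_XYZ

def clamp_to_srgb(rgb_p3_linear):
    """Map a linear-light P3 color to sRGB and clip out-of-gamut values."""
    rgb = P3_TO_SRGB @ np.asarray(rgb_p3_linear, dtype=float)
    return np.clip(rgb, 0.0, 1.0)  # components outside [0, 1] have no sRGB representation

# The fully saturated P3 red lies outside sRGB, so before clipping its
# sRGB coordinates fall outside [0, 1] -- this is exactly the extra
# saturation you see when sRGB content is shown unclamped on a WCG panel.
raw_red = P3_TO_SRGB @ np.array([1.0, 0.0, 0.0])
```

Because both spaces share the D65 white point, white maps to white; only saturated colors near the gamut boundary get clipped, which is why a matrix clamp looks close to the factory sRGB mode for most content.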
Re: What is the best way to configure a Wide Color Gamut monitor without a colorimeter?
I just bought the LG 32GP850-B, which is a Wide Color Gamut monitor as well.
As you noted, I think the theoretically best way would be to let Windows know, via the HDR setting, that your monitor is a WCG one and do all the color calculations and mappings according to the input color space, but for now HDR seems buggy and needs optimization.
Also, controls being locked on the OSD while in HDR is indeed an unwelcome limitation.
This is why I find the third option to be your best bet until the others get better.
Since I have an Nvidia card, I tried an sRGB clamp tool that has the same functionality as disabling "Color Temperature Control" on AMD cards, and it seems to be working great:
https://github.com/ledoge/novideo_srgb
I was able to get almost identical results using novideo_srgb to the monitor's OSD sRGB setting, without locking those valuable OSD controls. So you get really close to sRGB color accuracy while still retaining customizability.
Let's hope color calibration for WCG displays in Windows and GPU drivers gets better in the near future, because in their current state they clearly lack solid support.
