VG248QE, how to tune


Re: VG248QE, how to tune

Post by Chief Blur Buster » 22 Jan 2014, 15:40

spacediver wrote: Why bother with PowerStrip when you can achieve the same modification to the LUT by using the gamma slider in the Nvidia Control Panel / ATI Catalyst Control Center?
It gets lost when you launch a game.
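
What happens, roughly: the driver's gamma LUT is a single shared table, and a game with its own brightness slider typically rewrites it through the same Windows API. A minimal sketch of saving and restoring that ramp via the Win32 GDI calls (illustration only, not how any particular utility does it):

[code]
/* Sketch: save and restore the GPU driver's gamma ramp via Win32 GDI.
 * Games that adjust brightness typically call SetDeviceGammaRamp with
 * their own table, which is why a control-panel setting "gets lost".
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    WORD saved[3][256];           /* R, G, B ramps; 256 x 16-bit entries */
    HDC screen = GetDC(NULL);     /* device context for the whole screen */

    if (!GetDeviceGammaRamp(screen, saved)) {
        fprintf(stderr, "GetDeviceGammaRamp failed\n");
        return 1;
    }

    /* ...a game launched here may install its own ramp, silently
     * replacing whatever the driver control panel had loaded... */

    /* Re-apply the saved ramp to undo the override. */
    SetDeviceGammaRamp(screen, saved);
    ReleaseDC(NULL, screen);
    return 0;
}
[/code]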

Re: VG248QE, how to tune

Post by spacediver » 22 Jan 2014, 16:08

Does PowerStrip continuously reinforce the LUT?


Re: VG248QE, how to tune

Post by Chief Blur Buster » 22 Jan 2014, 17:13

Don't forget that multiple LUTs can be involved.

With the monitor's internal LUT, you don't need to continuously reinforce anything; the setting persists in the monitor itself.

With the graphics card driver's LUT, your adjustment can get lost when you launch a game, as the game can effectively override it with its own LUT.

I am not sure whether PowerStrip controls the LUT inside the monitor, but I definitely want to figure out which LUT is being modified by which color calibration utility.
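
If a utility does "continuously reinforce" the driver LUT, all it really has to do is re-apply its ramp on a timer, so a game's override only survives until the next tick. A sketch of that idea; the gamma value and the interval are arbitrary assumptions:

[code]
/* Sketch of "continuous reinforcement": rebuild a simple gamma ramp
 * and re-apply it periodically, so a game's own gamma-ramp call gets
 * overwritten again shortly afterwards. Gamma 2.2 and the 2-second
 * interval are arbitrary illustration choices.
 */
#include <windows.h>
#include <math.h>

int main(void)
{
    WORD ramp[3][256];
    const double gamma = 2.2;     /* assumed target gamma */
    HDC screen = GetDC(NULL);

    for (int i = 0; i < 256; i++) {
        /* 16-bit ramp entry for each 8-bit index through a gamma curve */
        WORD v = (WORD)(65535.0 * pow(i / 255.0, 1.0 / gamma) + 0.5);
        ramp[0][i] = ramp[1][i] = ramp[2][i] = v;
    }

    for (;;) {                               /* run until killed */
        SetDeviceGammaRamp(screen, ramp);    /* reinforce the driver LUT */
        Sleep(2000);
    }
}
[/code]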

Re: VG248QE, how to tune

Post by spacediver » 23 Jan 2014, 02:16

Interesting. Do most monitors have internal LUTs? I'm pretty sure mine don't, but they're CRTs.

Another cool thing about LUTs is that they can be specified with up to 16-bit precision, and I believe Nvidia at least supports 10-bit LUT precision in their DACs. That means that on a 10-bit display you can take advantage of 10-bit LUT precision, which gives you a lot of flexibility in defining your luminance function ("gamma") without banding artifacts. I've tested my FW900 (which doesn't have an internal LUT) and it appears to support 10 bits (I can't test higher bit depths, as the Nvidia LUT only supports up to 10 bits).
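
To put rough numbers on the banding point: if a gamma-corrected ramp has to be quantized back to 8 bits, some of the 256 input levels collapse onto the same output level, while at 10-bit output precision they can all stay distinct. A little counting sketch (the 2.2 gamma is assumed purely for illustration):

[code]
/* Rough arithmetic behind the banding argument: count how many of the
 * 256 input levels stay distinct when a gamma-2.2 correction ramp is
 * quantized to 8-bit vs 10-bit LUT output precision. Illustration only.
 */
#include <stdio.h>
#include <math.h>

static int distinct_levels(int out_bits)
{
    int max_out = (1 << out_bits) - 1;
    int count = 0, prev = -1;
    for (int i = 0; i < 256; i++) {
        int q = (int)lround(max_out * pow(i / 255.0, 1.0 / 2.2));
        if (q != prev)             /* the ramp is monotonic, so counting */
            count++;               /* changes counts distinct outputs   */
        prev = q;
    }
    return count;
}

int main(void)
{
    printf(" 8-bit LUT output: %3d of 256 levels survive\n", distinct_levels(8));
    printf("10-bit LUT output: %3d of 256 levels survive\n", distinct_levels(10));
    return 0;
}
[/code]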

Do display manufacturers release information about their internal monitor LUTs?


Re: VG248QE, how to tune

Post by MonarchX » 21 Jul 2014, 18:55

spacediver wrote: ...I believe Nvidia at least supports 10-bit LUT precision in their DACs...
I know that with AMD cards you can force internal 10-bit LUT processing with output to an 8-bit screen, which results in better gradation. madVR does something similar, but I think it goes up to 16-bit or 14-bit processing. Can you use 10-bit internal LUT processing with Nvidia cards and then output back to 8-bit?

My Eizo Foris FG2421 uses 8-bit+FRC, but that doesn't fully get rid of the banding created by LUTs, even ArgyllCMS LUTs. Maybe the upcoming ArgyllCMS 1.7.0 will improve on that, but it's just as difficult to calibrate 8-bit+FRC monitors as it is to calibrate 6-bit+FRC monitors, although 6-bit+FRC monitors are far worse at showing smooth gradation. I think it's the dithering effect itself that makes it difficult for the probes to read grayscale levels, since they sort of spread into neighboring grayscale levels.

Is DisplayPort required for 8-bit+FRC? The Eizo Foris FG2421 only supports 1.07 billion colors (8-bit+FRC) when used with DisplayPort, or so it says in the specs.
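
For what "high-precision LUT processing with 8-bit output" amounts to in sketch form: do the LUT math at high precision, then dither the rounding error away instead of truncating it. Here is one spatial variant of the idea, using a 4x4 Bayer matrix; real drivers and panels will differ:

[code]
/* Sketch of the general idea behind "10-bit internal LUT, 8-bit out":
 * run the LUT math at high precision, then spread the quantization
 * error with a dither pattern instead of truncating, so shallow
 * gradients don't band. The 4x4 Bayer ordered dither here is purely
 * for illustration.
 */
#include <stdio.h>
#include <math.h>

static const int bayer4[4][4] = {
    { 0,  8,  2, 10},
    {12,  4, 14,  6},
    { 3, 11,  1,  9},
    {15,  7, 13,  5},
};

/* Map an 8-bit input through a gamma LUT at floating precision, then
 * quantize to 8 bits with a position-dependent dither threshold. */
static unsigned char lut_dither(int in, int x, int y)
{
    double hi   = 255.0 * pow(in / 255.0, 1.0 / 2.2); /* high-precision LUT */
    double frac = hi - floor(hi);
    double threshold = (bayer4[y % 4][x % 4] + 0.5) / 16.0;
    int out = (int)floor(hi) + (frac > threshold ? 1 : 0);
    return (unsigned char)(out > 255 ? 255 : out);
}

int main(void)
{
    /* A shallow gradient: without dithering, neighbouring inputs would
     * collapse to identical outputs and band together. */
    for (int y = 0; y < 4; y++) {
        for (int x = 0; x < 16; x++)
            printf("%4d", lut_dither(128 + x / 4, x, y));
        printf("\n");
    }
    return 0;
}
[/code]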


Re: VG248QE, how to tune

Post by spacediver » 21 Jul 2014, 19:43

Yea, these are good questions, and unfortunately hard to find answers to. I'm doing some more research and will report back if I discover anything. madVR uses temporal dithering (which is basically FRC) to simulate higher bit depths.
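
FRC itself is simple in sketch form: alternate between the two nearest 8-bit levels across frames so the time-average lands on the in-between value. Under the assumption of a 4-frame cycle:

[code]
/* Sketch of temporal dithering (FRC): approximate a 10-bit level on an
 * 8-bit panel by alternating between the two nearest 8-bit values over
 * a 4-frame cycle, so the eye averages them. Illustration only.
 */
#include <stdio.h>

/* level10: desired 10-bit value (0..1023); frame: running frame count.
 * Returns the 8-bit value to display on this frame. */
static unsigned char frc_8bit(int level10, int frame)
{
    int base = level10 >> 2;     /* nearest-below 8-bit level    */
    int frac = level10 & 3;      /* remainder in quarter-steps   */
    if (base >= 255)             /* avoid overflow at full white */
        return 255;
    /* Show base+1 on 'frac' of every 4 frames, base on the rest. */
    return (unsigned char)(base + ((frame & 3) < frac ? 1 : 0));
}

int main(void)
{
    int level10 = 514;           /* a grey between 8-bit 128 and 129 */
    for (int frame = 0; frame < 8; frame++)
        printf("frame %d: level %d\n", frame, frc_8bit(level10, frame));
    /* Averages to 128.5 = 514/4 over each 4-frame cycle. */
    return 0;
}
[/code]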
