spacediver wrote: Why bother with PowerStrip when you can achieve the same modification to the LUT by using the gamma slider in NVIDIA Control Panel / ATI Catalyst Control Center?
It gets lost when you launch a game.
VG248QE, how to tune
- Chief Blur Buster
- Site Admin
- Posts: 11653
- Joined: 05 Dec 2013, 15:44
- Location: Toronto / Hamilton, Ontario, Canada
- Contact:
Re: VG248QE, how to tune
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter
Forum Rules wrote: 1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
2. Please report rule violations If you see a post that violates forum rules, then report the post.
3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!
- spacediver
- Posts: 505
- Joined: 18 Dec 2013, 23:51
Re: VG248QE, how to tune
Does PowerStrip continuously reinforce the LUT?
- Chief Blur Buster
Re: VG248QE, how to tune
Don't forget that multiple LUTs can become involved.
With the monitor's LUT, you don't need to continuously reinforce it.
With the graphics card driver's LUT, it can get lost when you launch a game, since the game can effectively override it with its own LUT.
I am not sure whether PowerStrip controls the LUT inside the monitor, but I definitely want to figure out which LUT is being modified by which color calibration utility.
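As a rough illustration of the driver-side LUT being discussed (assuming the Windows path, where the gamma table that games and calibration utilities overwrite is a 3 × 256 array of 16-bit values per channel, the format taken by GDI's SetDeviceGammaRamp()), here is a minimal Python sketch of building one channel of such a ramp next to the identity ramp a game typically resets the driver to:

```python
# Sketch of one channel of the 3 x 256 gamma ramp that Windows'
# SetDeviceGammaRamp() expects: 256 entries of 16-bit values (0..65535).
# A game that loads its own ramp simply overwrites this table, which is
# why driver-level calibration "gets lost"; reapplying the calibrated
# ramp restores it.

def build_gamma_ramp(gamma=2.2):
    """Build one channel of a gamma-correction ramp (256 x 16-bit)."""
    ramp = []
    for i in range(256):
        # Normalize to 0..1, apply inverse gamma, scale back to 16-bit.
        v = (i / 255.0) ** (1.0 / gamma)
        ramp.append(min(65535, int(round(v * 65535))))
    return ramp

def identity_ramp():
    """The linear ramp a game typically resets the driver LUT to."""
    return [i * 257 for i in range(256)]  # 0, 257, ..., 65535

if __name__ == "__main__":
    calibrated = build_gamma_ramp(2.2)
    default = identity_ramp()
    # Mid-gray is lifted by the gamma ramp relative to the identity ramp.
    print(default[128], calibrated[128])
```

This only shows the data format; actually loading it into the driver requires the platform API, and monitor-internal LUTs are programmed over a completely separate channel (typically DDC/CI), which is why they survive game launches.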
- spacediver
Re: VG248QE, how to tune
Interesting. Do most monitors have internal LUTs? I'm pretty sure mine don't, but they're CRTs.
Another cool thing about LUTs is that they can be specified with up to 16-bit precision, and I believe Nvidia at least supports 10-bit LUT precision in their DACs. That means that if you have a 10-bit display, you can take advantage of 10-bit LUT precision, which gives you a lot of flexibility in defining your luminance function ("gamma") without banding artifacts. I've tested my FW900 (which doesn't have an internal LUT) and it appears to support 10 bits (I can't test higher bit depths, as the Nvidia LUT only supports up to 10 bits).
Do display manufacturers release information about their internal monitor LUTs?
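The banding argument above can be made concrete with a small sketch (not from the thread, just an illustration): quantize the same gamma-2.2 correction curve at 8-bit and at 10-bit output precision and count how many distinct output codes survive. Where adjacent input levels collapse to the same code, a gradient shows a band.

```python
# Quantize a gamma-2.2 correction curve for 256 input levels at 8-bit
# vs 10-bit output precision. Fewer distinct output codes means adjacent
# input levels collapse together, which appears as banded gradients.

def quantized_lut(bits, gamma=2.2, levels=256):
    """Gamma-correction LUT for `levels` inputs at `bits` output precision."""
    max_code = (1 << bits) - 1
    return [round(((i / (levels - 1)) ** (1.0 / gamma)) * max_code)
            for i in range(levels)]

if __name__ == "__main__":
    lut8 = quantized_lut(8)
    lut10 = quantized_lut(10)
    print("distinct codes at 8-bit: ", len(set(lut8)))   # < 256: banding
    print("distinct codes at 10-bit:", len(set(lut10)))  # all 256 preserved
```

At 10 bits the output range is four times finer, so even the shallow top end of the curve (slope below 1 in 8-bit code units) still maps every input level to its own code.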
Re: VG248QE, how to tune
spacediver wrote: Interesting. Do most monitors have internal LUTs? I'm pretty sure mine don't, but they're CRTs.
I know with AMD cards you can force internal 10-bit LUT processing and output to an 8-bit screen, which results in better gradation. madVR does something similar, but I think it goes up to 16-bit or 14-bit. Can you use 10-bit internal LUT processing with Nvidia cards and then output back to 8-bit? My Eizo Foris FG2421 uses 8-bit+FRC, but that doesn't fully get rid of banding created by LUTs, even ArgyllCMS LUTs. Maybe the upcoming ArgyllCMS 1.7.0 will improve on that, but it's just as difficult to calibrate 8-bit+FRC monitors as it is to calibrate 6-bit+FRC monitors, although 6-bit+FRC monitors are far worse at showing smooth gradation. I think it's the dithering effect itself that makes it difficult for probes to read grayscale levels, which kind of spread into other grayscale levels. Is DisplayPort required for 8-bit+FRC? The Eizo Foris FG2421 only supports 1.07 billion colors (8-bit+FRC) when used with DisplayPort, or so it says in the specs.
- spacediver
Re: VG248QE, how to tune
Yeah, these are good questions, and unfortunately the answers are hard to find. I'm doing some more research and will report back if I discover anything. madVR uses temporal dithering (which is basically FRC) to simulate higher bit depths.
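The FRC/temporal-dithering idea mentioned here can be sketched in a few lines (an illustration of the principle only; real panels and madVR add noise or error diffusion on top): a 10-bit gray level is shown on an 8-bit panel by alternating between the two nearest 8-bit codes over a short cycle of frames so that the time-average lands between them.

```python
# Minimal sketch of temporal dithering (FRC): approximate a 10-bit gray
# level on an 8-bit panel by alternating between the two nearest 8-bit
# codes across frames so the average over the cycle approximates the
# 10-bit target. This is the basic idea behind 8-bit+FRC panels.

def frc_frames(level10, cycle=4):
    """Emit `cycle` 8-bit frames whose mean approximates level10 / 4."""
    base, frac = divmod(level10, 4)   # 10-bit value = 8-bit code + 2-bit fraction
    hi = min(base + 1, 255)           # clamp at the panel's maximum code
    # Show the higher code in `frac` of every 4 frames, the lower one otherwise.
    return [hi if f < frac else base for f in range(cycle)]

if __name__ == "__main__":
    frames = frc_frames(513)          # 10-bit level 513 = 8-bit 128.25
    print(frames, sum(frames) / len(frames))
```

This flicker between adjacent codes is also a plausible reason probes struggle on FRC panels: within the probe's integration window the level genuinely spreads across neighboring codes.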