
Can the ambient light output by a display decrease the perceived color vibrancy?

Posted: 20 Aug 2023, 00:45
by jman54
I recently used a CRT at varying refresh rates and noticed a few things that have made me question how the brain perceives colors on a display. First, my CRT appears to have more color vibrancy than my LCD in a near-black room, even though the LCD has a higher contrast ratio by a factor of nearly 5x. I'm talking about an image with no dark content either, so it isn't the CRT's black levels. I hypothesized that maybe it's because the human eye is less sensitive to light when there is a lot of ambient light in the room, and conversely more sensitive when there is less.

Logically, the light from the display itself should force this eye adaptation too. The CRT is only around half as bright as the LCD, if not less, and therefore my eye should be more sensitive to the light it puts out. I decided to test the idea by lowering the refresh rate of my CRT to see if it would appear even brighter. (A CRT draws a frame and the phosphor immediately starts to fade, so there is actually a period of black between frames; LCDs don't do this, they swap one image out for the next with no black in between.) The result was that the image looked even brighter at 48 Hz than at higher refresh rates!
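To put rough numbers on that (the 2 ms glow time below is just an assumption; real phosphor decay varies a lot between tubes), here's a quick Python sketch of how long each spot on the screen actually spends dark at the two refresh rates:

# Rough sketch: how long a given spot on a CRT stays dark each frame.
# GLOW_MS is an assumed figure for how long the phosphor stays visibly lit.
GLOW_MS = 2.0

for hz in (48, 60):
    frame_ms = 1000.0 / hz        # time between refreshes of the same spot
    dark_ms = frame_ms - GLOW_MS  # time that spot spends dark each frame
    dark_fraction = dark_ms / frame_ms
    print(f"{hz} Hz: frame {frame_ms:.1f} ms, dark {dark_ms:.1f} ms "
          f"({dark_fraction:.0%} of the frame)")

Under that assumption each spot is dark roughly 90% of the time at either setting, but the uninterrupted dark gap between flashes is a couple of milliseconds longer at 48 Hz.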

Granted, there was slightly visible black between the images at 48 Hz, but the image still appeared even brighter than at a higher refresh rate. I know the brightness did not actually increase, because I discussed this with a friend who noticed the same thing, and he measured the same luminance at either refresh rate, which should mean each frame has the same brightness regardless of refresh rate. This suggests that the longer black interval in my persistence of vision, i.e. a noticeably flickering light, can actually give you higher perceived brightness than a steady light (and therefore higher perceived contrast? Contrast is hard to gauge since I cannot compare the refresh rates side by side; a change in brightness is easier to notice when swapping between settings.) This seems almost obvious when you consider that staring at a flickering light strains your eyes more than a steady one.

I know human perception of contrast can exceed a display's objective contrast because of this: https://en.wikipedia.org/wiki/Checker_shadow_illusion

This means a display that could only output the colors of the checkerboard would still produce a third perceived color in the brain, one the display is physically incapable of showing. That establishes that a display's perceived contrast can be greater than its objective contrast, and that the difference depends on the light of the neighboring pixels. I believe something similar might be happening with my CRT.

Now here is where I reach a crossroads. My assumption that the ambient light from the display itself affects perceived contrast seems reasonable, but what the checker shadow illusion shows means I could just as well attribute the increase in perceived contrast to the neighboring pixels being a different color in my persistence of vision.

That is, the neighboring pixels are black in my persistence of vision when the refresh rate is lower. Do I attribute the increase in color vibrancy at 48 Hz to the longer duration of black in my persistence of vision, which would let my eyes adapt as if to a darker environment, or to the fact that the neighboring pixels are a different, darker color in my persistence of vision and are causing some kind of illusion like the checker shadow one?

Saying the neighboring pixels are darker in my persistence of vision might sound confusing, because I also said the image was brighter at the lower refresh rate. But remember that a CRT actually draws each pixel sequentially within every frame, unlike an LCD, which swaps them all at once. That means the persistent image in my vision is changing much faster and looks more like a gradient: I can see the dark parts in my persistence of vision, but the newest part of the image appears brighter at the lower refresh rate than at the higher one. Since the persistence of vision updates in step with the pixels being drawn, the entire image appears brighter even though I can see the dark parts.
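To make that gradient idea concrete, here's a toy Python model (the 10 scanline bands, the exponential fade, and the 3 ms fade constant are all just assumptions for illustration) of what the retained image might look like the instant the last line of a 48 Hz frame finishes drawing:

# Toy model: relative brightness of each band of the screen at the instant
# the last scanline band of a 48 Hz frame has just been drawn.
# LINES and DECAY_MS are assumptions, not measurements.
import math

HZ = 48
LINES = 10                 # screen split into 10 bands drawn top to bottom
FRAME_MS = 1000.0 / HZ
DECAY_MS = 3.0             # assumed fade time constant for phosphor + afterimage

for line in range(LINES):
    age_ms = (LINES - 1 - line) * FRAME_MS / LINES  # how long ago this band was drawn
    brightness = math.exp(-age_ms / DECAY_MS)       # 1.0 = just drawn, near 0 = faded
    print(f"band {line}: drawn {age_ms:4.1f} ms ago -> relative brightness {brightness:.2f}")

The numbers aren't meant to be accurate, just to show that at any instant the retained image is a ramp from freshly drawn and bright down to almost fully faded, and that the faded end of that ramp hangs around longer per frame at 48 Hz.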

It seems crazy that the human eye could make dark adaptations quickly enough to be noticeable at 48 Hz vs 60 Hz, but consider that the display is actually dark for much longer than it appears to be; it only seems otherwise because of image retention in the brain. The two things above are my reasoning for why the CRT looks more vibrant than my LCD, and while I don't have objective evidence to back it up, it's the most logical conclusion I can come up with.

Now, technically speaking, my CRT could actually have had higher contrast; I get conflicting answers from Google on whether CRTs have higher contrast than LCDs, and I have no tool to measure it. But that still wouldn't explain the brighter image I perceived when it was flickering more.