jman5000
- TL;DR Summary
- Ambient light affects perceived contrast because the eye adapts to the raised light level. A display's own light output raises the ambient light as well. Therefore, a dimmer display can deliver perceived contrast that exceeds its static contrast by a greater margin than a brighter display can.
Bear with me, this will be long, but I feel everything here is necessary to understand the context.
I recently used a CRT at varying refresh rates and noticed a few things that have made me question how the brain perceives colors on a display. First, my CRT appears to have more color vibrancy than my LCD in a near-black room, even though the LCD has a higher contrast ratio by a factor of nearly 5x. I hypothesized that this might be because the human eye becomes less sensitive to light when there is a lot of ambient light in the room, and more sensitive when there is less.
Logically, the light from the display itself should force this eye adaptation as well. The CRT is at most half as bright as the LCD, if not dimmer, so my eye should be more sensitive to the light it puts out. I decided to test my idea by lowering the refresh rate of my CRT to see if it would appear even brighter. (A CRT draws a frame and that frame immediately starts to fade, so there is actually a period of blackness between frames; LCDs don't do this, they simply swap one image for the next with no black between them.) The result was that the picture looked even brighter at 48 Hz than at higher refresh rates!
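If it helps to see the timing difference concretely, here is a minimal sketch of what I mean by the CRT impulse versus the LCD hold. The refresh rate, phosphor decay constant, and luminance levels are all made-up illustrative numbers, not measurements of my actual displays:

```python
import numpy as np
import matplotlib.pyplot as plt

# Assumed parameters -- illustrative only, not measured from real hardware.
refresh_hz = 48          # CRT refresh rate being simulated
phosphor_tau = 0.002     # assumed phosphor decay time constant (seconds)
peak = 1.0               # normalized peak luminance of the CRT flash
hold_level = 0.5         # normalized steady luminance of the LCD pixel

t = np.linspace(0, 3 / refresh_hz, 3000)   # three frame periods
frame_time = t % (1 / refresh_hz)          # time since the last refresh

# CRT: each refresh is a brief flash that decays away almost immediately.
crt = peak * np.exp(-frame_time / phosphor_tau)

# LCD: the pixel simply holds its value until the next frame replaces it.
lcd = np.full_like(t, hold_level)

print("CRT time-averaged luminance:", crt.mean())
print("LCD time-averaged luminance:", lcd.mean())

plt.plot(t * 1000, crt, label="CRT (impulse + phosphor decay)")
plt.plot(t * 1000, lcd, label="LCD (sample-and-hold)")
plt.xlabel("time (ms)")
plt.ylabel("normalized luminance")
plt.legend()
plt.show()
```

The point of the plot is just that the CRT spends most of each frame period dark even though a light meter averaging over time reads a steady value.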
Yet there was a faintly visible black interval between images, and even so the image appeared brighter than at the higher refresh rate. I know the brightness did not actually increase, because I measured the same luminance at either refresh rate. This suggests that the longer black interval in my persistence of vision, i.e. a noticeable flickered light, can actually produce higher perceived brightness than a static light (and therefore higher perceived contrast? Contrast is hard to gauge because I cannot compare the two refresh rates side by side; a change in brightness is easier to notice while swapping between settings.)
I know human perception of contrast can exceed a display's objective contrast because of this: https://en.wikipedia.org/wiki/Checker_shadow_illusion
This means a display that could only output the colors of tiles A and B would still produce a third color, perceived by the brain, that the display cannot physically show. It establishes that a display's perceived contrast can exceed its objective contrast, and that the difference depends on the light from neighboring pixels. I believe something similar might be happening with my CRT.
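A quick way to see the neighboring-pixel effect for yourself, without the checkerboard, is to render the same gray value on a dark and a light surround; the two patches are physically identical, but most people see the one on the dark surround as brighter. A minimal sketch (the patch value, surround values, and sizes are arbitrary choices, not anything special):

```python
import numpy as np
import matplotlib.pyplot as plt

# Two surrounds, each with the SAME central gray patch (value 0.5).
def panel(surround, patch=0.5, size=200, patch_size=60):
    img = np.full((size, size), surround)
    lo = (size - patch_size) // 2
    img[lo:lo + patch_size, lo:lo + patch_size] = patch
    return img

# Dark surround on the left, light surround on the right.
both = np.hstack([panel(0.1), panel(0.9)])

plt.imshow(both, cmap="gray", vmin=0.0, vmax=1.0)
plt.axis("off")
plt.title("Identical 0.5 gray patches on different surrounds")
plt.show()
```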
Now here is where I run into a crossroads. My assumption that the ambient light from the display itself affects perceived contrast seems reasonable, but what the checker shadow illusion shows means I could just as well attribute the increase in perceived contrast to the neighboring pixels being a different color in my persistence of vision.
That is, the neighboring pixels are black in my persistence of vision when the refresh rate is lower. Do I attribute the increase in color vibrancy at 48 Hz to the longer duration of black in my persistence of vision, which amounts to a darker-environment eye adaptation, or to the fact that the neighboring pixels are a different, darker color in my persistence of vision?
Saying the neighboring pixels are darker in my persistence of vision might be confusing, because I said the image was brighter at the lower refresh rate. But remember that a CRT actually draws each pixel sequentially within every frame, unlike an LCD, which swaps them all at once. That means the persistent image in my vision is changing much faster, more like a gradient. I can see the dark parts in my persistence of vision, yet the newest part of the image appears brighter at the lower refresh rate than at the higher one. Since the persistence of vision updates in step with the pixels being drawn, the entire image appears brighter even though I can see the dark parts.
It seems crazy that the human eye could make dark adaptations quickly enough to be noticeable at 48 Hz versus 60 Hz, but consider that the display is actually dark more of the time than it is lit up; it only appears the other way around because of persistence of vision. The two points above are my reasoning for why the CRT looks more vibrant than my LCD, and while I don't have objective evidence to back it up, it is the most logical conclusion I can come up with.
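To put rough numbers on "dark more of the time than lit up": if the phosphor has effectively finished glowing after a couple of milliseconds (an assumption, real phosphor persistence varies a lot), then each pixel is lit for only a small slice of every frame period. A back-of-the-envelope sketch:

```python
# Rough duty-cycle estimate for a CRT pixel. 'lit_time_ms' is an assumed
# effective phosphor glow duration, not a measured value.
lit_time_ms = 2.0

for refresh_hz in (48, 60, 85):
    frame_period_ms = 1000.0 / refresh_hz
    lit_fraction = lit_time_ms / frame_period_ms
    print(f"{refresh_hz} Hz: frame period {frame_period_ms:.1f} ms, "
          f"lit ~{lit_fraction:.0%} of the time, dark ~{1 - lit_fraction:.0%}")
```

Under those assumed numbers, a 48 Hz pixel is dark roughly 90% of the time, versus roughly 88% at 60 Hz, so the lower refresh rate does mean more total darkness per second.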
Let me know if you disagree with any of the assumptions or reasoning! I am working from a lot of assumptions with little proof beyond my own experience, but I'm not sure how else to explain the increased perceived brightness and color vibrancy the CRT has compared to my LCD.
My question is whether there is any research to support this conclusion. Alternatively, can anyone suggest additional reading that explores how flickered light or neighboring light affects perceived color? I know this technique could never match a display that natively outputs high contrast while maintaining a low average light level, like an OLED, but I am curious what the limits of it might be. It seems logical that the higher your average display brightness, the less effective this flicker would be at increasing perceived contrast, since your eye would be adapted to a brighter image. Not to mention that flickering on a dim CRT is fine, but would probably hurt the eyes at higher brightness, say 600 nits.
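One way to frame the "less effective at higher brightness" intuition is Weber's law: the just-noticeable luminance increment grows roughly in proportion to the level you are adapted to, so the same fixed flash is worth fewer "just-noticeable steps" on a brighter display. A toy sketch, where the Weber fraction and flash size are arbitrary illustrative numbers rather than anything measured:

```python
# Weber's-law style sketch: sensitivity to a fixed luminance increment falls
# as the adaptation level rises. All numbers are arbitrary and illustrative.
weber_fraction = 0.02          # assumed ~2% just-noticeable difference
flash_nits = 50.0              # same absolute flash added in every case

for adaptation_nits in (50, 100, 300, 600):   # assumed average picture levels
    jnd = weber_fraction * adaptation_nits    # just-noticeable increment here
    print(f"adapted to {adaptation_nits:>3} nits: JND ~{jnd:4.1f} nits, "
          f"a {flash_nits:.0f}-nit flash is ~{flash_nits / jnd:.0f} JNDs")
```

By this crude model the same flash is dozens of noticeable steps on a dim display but only a handful on a 600-nit one, which matches the intuition that the trick should fade as average brightness goes up.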
On a side note, unrelated to the science of this, the 48 Hz flickering actually adds some cool effects to parts of video. For example, ceiling lights have a subtle flicker that gives them more presence in a scene, fire gets an actual subtle flicker, a magic portal has a subtle pulsing, and lightsabers and lasers pulse as well. Really neat, except that it also does this to things that shouldn't flicker, like lab coats or anything bright compared to the rest of the scene.