peter.ell
First, I am curious why CCDs need color filters in the first place. The usual explanation is that digital camera sensors can only sense the intensity of light, not its wavelength (color), which makes sense initially... except that CCD sensors operate via the photoelectric effect, right? And isn't the photoelectric effect wavelength dependent? If so, then why would digital camera sensors require color filters over their pixels in order to detect and differentiate colors?
With that out of the way, here's the real question:
I know that digital camera sensors capture only three colors per pixel (red, green, and blue), just like the corresponding subpixels on any computer monitor. Purple is not a spectral color, so it can easily be captured by a camera sensor and displayed on a screen using red and blue pixels. But violet is a spectral color, with its own wavelength shorter than blue's. If a digital camera photographs something violet, then in theory one of two things should happen, since the sensor cannot pick up and distinguish violet from blue: violet objects would look black, because the sensor does not respond to violet light at all, or violet objects would look blue, because the blue-sensitive pixels pick up the violet light. Yet neither seems to be the case.
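To make the second branch of my reasoning concrete, here is a rough sketch of how a naive RGB sensor would respond to spectral violet. The Gaussian sensitivity curves and their peak wavelengths are made-up illustrative assumptions, not data from any real sensor:

```python
import math

# Hypothetical Gaussian spectral sensitivities for the three filtered
# channels of an RGB sensor; peak wavelengths and widths are
# illustrative assumptions, not measured sensor data.
CHANNELS = {"R": (600, 50), "G": (550, 50), "B": (450, 50)}

def sensitivity(wavelength_nm, peak_nm, width_nm):
    """Relative response of one channel to monochromatic light."""
    return math.exp(-((wavelength_nm - peak_nm) / width_nm) ** 2)

def channel_response(wavelength_nm):
    """Per-channel response to a single spectral line."""
    return {name: sensitivity(wavelength_nm, peak, width)
            for name, (peak, width) in CHANNELS.items()}

violet = channel_response(410)  # spectral violet, roughly 410 nm
# Under this naive model the blue channel dominates while red is
# essentially zero, so violet would be recorded, and displayed, as
# plain blue: exactly the puzzle I am describing.
print(violet)
```

Under this toy model there is no red response at all, so the red-plus-blue rendering that real cameras produce for violet should be impossible, which is what confuses me.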
So what's going on to allow digital cameras to capture violet light in a way that allows displays to represent it?
Thank you so much!