fog37
TL;DR Summary: spectrometer sensitivity at different wavelengths does not give the correct relative irradiance.
Hello Everyone,
I am trying to better understand how a spectrometer must be used to measure the wavelength content of the radiation from a specific source.
All spectrometers report irradiance over a wavelength range (e.g. UV-VIS), but the sensitivity (counts per photon) is not the same at all wavelengths. This means that even if the radiation carries the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, the spectrometer will show the wavelength with the higher sensitivity as having a higher irradiance, even though that is not what is actually going on...
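To make this concrete, here is a small Python sketch with made-up response values (the numbers are purely illustrative): the same true irradiance at two wavelengths produces different raw counts, and dividing the counts by the known response recovers the true ratio.

```python
# Hypothetical instrument response (counts per unit irradiance) at two wavelengths.
# These numbers are invented purely to illustrate the point.
response = {"lambda_1": 0.8, "lambda_2": 0.4}   # counts per (uW/cm^2)

# Suppose the source truly delivers the SAME irradiance at both wavelengths.
true_irradiance = {"lambda_1": 100.0, "lambda_2": 100.0}  # uW/cm^2

# Raw counts reported by the detector: irradiance * response.
raw_counts = {k: true_irradiance[k] * response[k] for k in response}
print(raw_counts)   # {'lambda_1': 80.0, 'lambda_2': 40.0} -- lambda_2 *looks* weaker

# Dividing by the wavelength-dependent response recovers the true relative irradiance.
corrected = {k: raw_counts[k] / response[k] for k in raw_counts}
print(corrected)    # {'lambda_1': 100.0, 'lambda_2': 100.0}
```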
Does that mean that spectrometers cannot provide a reliable relative irradiance when we compare different wavelengths? How do we correct for that?
Thanks!