Spectrometer sensitivity (photons/count) at different wavelengths

In summary, a spectrometer measures irradiance over a wavelength range, but its sensitivity (photons/count) is not the same at all wavelengths. This means that even if the radiation contains the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, the spectrometer will show the wavelength with the lower sensitivity as having a higher irradiance. To resolve this, a source that radiates a known broadband spectrum is needed, against which the sensitivity of the spectrometer can be calibrated.
  • #1
fog37
TL;DR Summary
spectrometer sensitivity at different wavelengths does not provide correct relative irradiance....
Hello Everyone,

I am trying to better understand how a spectrometer should be used to measure the wavelength content of the radiation from a specific source.
All spectrometers measure irradiance over a wavelength range (e.g., UV-VIS), but the sensitivity (photons/count) is not the same for all wavelengths. This means that even if the radiation contains the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, the spectrometer will show the wavelength with the lower sensitivity as having a higher irradiance, even though that is not what is actually going on...
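To make the effect concrete, here is a minimal worked example with made-up numbers, reading the sensitivity as the photons-per-count figure. Suppose the detector receives the same number of photons, ##N = 10^6##, at both wavelengths, but the sensitivity is ##S(\lambda_1) = 50## photons/count and ##S(\lambda_2) = 100## photons/count. The recorded signals are ##N/S(\lambda_1) = 20000## counts and ##N/S(\lambda_2) = 10000## counts, so the raw spectrum makes ##\lambda_1## look twice as bright even though the incident radiation is the same.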

Does that mean that spectrometers cannot provide reliable relative irradiance when we compare different wavelengths? How do we correct for that?

Thanks!
 
  • #2
How you resolve this issue (called the responsivity curve in the language of IR detectors and photodiodes, probably called other things in other contexts) depends on what kind of spectrometer you are dealing with. For FTIR, you can calibrate the spectrum by taking a reference spectrum off a very non-dispersive material (gold in the IR, for example). (This procedure also deals with the spectrum of the broadband light source.) There isn't a one-size-fits-all procedure.
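As a rough sketch of that ratioing step (not anything from the thread itself): assume the single-beam reference scan and the single-beam sample scan are already available as NumPy arrays on a common wavenumber grid; every array name and number below is a hypothetical placeholder.

    import numpy as np

    # Hypothetical single-beam spectra on a common wavenumber grid; in practice
    # these would come from the FTIR software, not be generated like this.
    wavenumbers = np.linspace(400.0, 4000.0, 1800)                 # cm^-1
    reference = np.random.uniform(0.5, 1.0, wavenumbers.size)      # scan off the gold reference
    sample = 0.8 * reference                                       # scan with the sample in place

    # Ratioing the two scans cancels both the source spectrum and the
    # instrument's responsivity, leaving the sample's transmittance.
    eps = 1e-12                                                    # guard against division by zero
    transmittance = sample / np.maximum(reference, eps)
    absorbance = -np.log10(np.maximum(transmittance, eps))

    print(transmittance[:3], absorbance[:3])

The same idea carries over to dispersive instruments: anything that multiplies every spectrum the same way (source shape, instrument response) drops out of the ratio.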
 
  • #3
fog37 said:
This means that even if the radiation contains the same energy at two different wavelengths ##\lambda_1## and ##\lambda_2##, ...
It also assumes that the aperture of the effective slit is the same width at both wavelengths.

You need to find a source that radiates a known broadband spectrum. Then you can calibrate the sensitivity of your system.

Knowing the predictable characteristics of the sensor employed can resolve the problem. Some sensors, such as bolometers, register thermal energy independent of wavelength.
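To illustrate what that calibration looks like numerically, here is a minimal sketch, assuming you have the known source's certified spectral irradiance and your instrument's counts for it on the same wavelength grid; every array name and number below is a hypothetical placeholder.

    import numpy as np

    # Hypothetical wavelength grid and data; in practice the known irradiance
    # comes from the lamp's calibration certificate and the counts from your own scans.
    wavelength_nm = np.linspace(350.0, 800.0, 451)
    lamp_known_irradiance = np.ones_like(wavelength_nm)                      # certified W/(m^2 nm), placeholder
    lamp_measured_counts = np.random.uniform(1e3, 5e3, wavelength_nm.size)   # counts recorded for the lamp
    test_measured_counts = np.random.uniform(1e3, 5e3, wavelength_nm.size)   # counts recorded for the unknown source

    # Wavelength-dependent correction factor: irradiance per count.
    correction = lamp_known_irradiance / np.maximum(lamp_measured_counts, 1e-12)

    # Applying the same factor to the unknown source gives its relative
    # (and, with matched geometry, absolute) spectral irradiance.
    test_irradiance = test_measured_counts * correction
    print(test_irradiance[:3])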
 
  • #4
There are two levels of concern here, depending upon the system and your needs. The most direct calibration is to set up the system and compare the test sample directly to a known sample. Usually I run: known source spectrum -- test source scan -- known source spectrum. The happy news is that silicon photodetectors are very linear in response, so that is all you really need.
Sometimes you don't have the luxury of calibrating in situ at the time of the test. Then you need to calibrate beforehand and also know a priori how changes in the optical setup (calibration vs. test) may affect your result. Be aware that optics (slit width, f-number, etc.) and temperature can be important. It is easy to get lost in the minutiae.
Spectrometers are exceedingly clever and precise when well used.
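A minimal sketch of that bracketing procedure, under the assumption that the two reference scans and the test scan share a wavelength grid; the arrays and the 1% drift figure below are made-up placeholders.

    import numpy as np

    # Hypothetical scans: a known (reference) source measured before and after
    # the test source, bracketing the measurement as described above.
    ref_before = np.random.uniform(1e3, 5e3, 451)
    ref_after = ref_before * 1.01                    # pretend 1% drift during the run
    test_counts = np.random.uniform(1e3, 5e3, 451)
    known_irradiance = np.ones(451)                  # certified spectrum of the reference source (placeholder)

    # Simple stability check: how much did things change between the two reference scans?
    drift = np.max(np.abs(ref_after - ref_before) / ref_before)
    print(f"worst-case drift between reference scans: {drift:.1%}")

    # Calibrate against the average of the bracketing scans; detector linearity
    # (e.g. silicon photodiodes) is what makes this simple ratio meaningful.
    ref_mean = 0.5 * (ref_before + ref_after)
    test_irradiance = test_counts * known_irradiance / ref_mean
    print(test_irradiance[:3])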
 
  • #5
I see. So calibration is the solution.

I thought that calibration would only convert the vertical axis, which measures counts, to a correct irradiance value in ##W/m^2##. But I guess calibration can also provide a correction factor that takes care of the different sensitivity at different wavelengths, which would otherwise cause the relative-irradiance issues I am describing...
 
  • #6
And of course there is an entirely different method for calibrating the wavelength accuracy, which involves sources that emit at known frequencies. These again depend upon your exact requirements but are not particularly arcane or complicated. Wonderful instruments.
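For completeness, a minimal sketch of that kind of wavelength calibration, assuming you have already located the pixel positions of a few known emission lines; the pixel indices and line wavelengths below are placeholder values, not real lamp data.

    import numpy as np

    # Hypothetical data: pixel indices where known emission lines were observed,
    # and the wavelengths those lines should have (taken from a line table).
    line_pixels = np.array([112.0, 480.0, 901.0, 1523.0])
    line_wavelengths_nm = np.array([404.7, 435.8, 546.1, 632.8])   # placeholder values

    # Fit a low-order polynomial mapping pixel index -> wavelength.
    coeffs = np.polyfit(line_pixels, line_wavelengths_nm, deg=2)

    # Apply the fit to every pixel to obtain the instrument's wavelength axis.
    pixels = np.arange(2048)
    wavelength_axis_nm = np.polyval(coeffs, pixels)

    # Residuals at the calibration lines indicate the wavelength accuracy achieved.
    residuals = np.polyval(coeffs, line_pixels) - line_wavelengths_nm
    print(wavelength_axis_nm[:3], residuals)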
 

FAQ: Spectrometer sensitivity (photons/count) at different wavelengths

What is a spectrometer and what does it measure?

A spectrometer is a scientific instrument used to measure the intensity of light at different wavelengths. It is used to analyze the composition of materials by measuring the amount of light they absorb or emit.

What is meant by "sensitivity" in a spectrometer?

Sensitivity refers to the ability of a spectrometer to detect and measure very small changes in light intensity. It is typically quoted in units of photons per count, which represents the number of incident photons required to produce one recorded count at a given wavelength.
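As a rough worked relation using that convention (the numbers are made up for illustration): if the sensitivity at some wavelength is ##S(\lambda) = 100## photons/count and a pixel records ##2000## counts, the detected photon number is ##2000 \times 100 = 2 \times 10^5## photons, and the corresponding radiant energy is ##E = N_{ph}\,hc/\lambda##, which for ##\lambda = 500\,\mathrm{nm}## (photon energy ##\approx 3.97\times10^{-19}\,\mathrm{J}##) is about ##8\times10^{-14}\,\mathrm{J}##.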

How does the sensitivity of a spectrometer vary at different wavelengths?

The sensitivity of a spectrometer can vary at different wavelengths due to the design and components of the instrument. Some spectrometers may be more sensitive to certain wavelengths than others, and this can also be affected by factors such as the type of detector used.

Why is it important to consider the sensitivity of a spectrometer at different wavelengths?

The sensitivity of a spectrometer at different wavelengths is important because it can affect the accuracy and precision of measurements. If a spectrometer is not sensitive enough at a certain wavelength, it may not be able to detect and measure small changes in light intensity, leading to inaccurate results.

How can the sensitivity of a spectrometer be improved at different wavelengths?

The sensitivity of a spectrometer at different wavelengths can be improved by using a more sensitive detector, optimizing the instrument's design and components, and reducing sources of noise and interference. Additionally, choosing a detector and grating whose response covers the wavelength range of interest, or increasing the integration time, can help at wavelengths where the response is weak.
