Anthony_
Crystal Oscillators -- accuracy versus temperature
I have done quite a lot of desk research and spoken to a few manufacturers, but I am still not clear about the specifications for crystal oscillators and I hope someone can help me out. It is a multi-part question.
1) The accuracy of an XO is usually specified as a plus-or-minus tolerance at a given temperature. I assume that a specific oscillator, if measured, would show a specific accuracy at a specific temperature on a specific occasion. This plus or minus might refer to the manufacturing tolerance across different oscillators. Or it might refer to changes in accuracy (retrace) of a single oscillator tested on different occasions. Or both. Any idea which it is?
2) The accuracy of an oscillator varies with temperature, following a parabolic curve characterized by a temperature coefficient. I have usually seen this coefficient shown as negative: the oscillator loses frequency as the temperature moves below or above the turnover temperature, and never gains against its nominal frequency. Is that correct? Does a crystal oscillator only lose, and never gain, frequency as the temperature moves away from the turnover temperature?
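To make the parabolic behaviour concrete, here is a minimal sketch of the model described above. It assumes a tuning-fork-style crystal with a turnover temperature of 25 degrees C and a coefficient of about -0.034 ppm/degC^2 (a typical 32.768 kHz watch-crystal figure used here purely for illustration; your part's datasheet value will differ):

```python
def deviation_ppm(temp_c, k=-0.034, turnover_c=25.0):
    """Frequency deviation (ppm) from nominal at temp_c, assuming a
    parabolic frequency-temperature curve centred on the turnover
    temperature. k is negative, so the deviation is zero at turnover
    and increasingly negative on either side of it."""
    return k * (temp_c - turnover_c) ** 2

print(deviation_ppm(0))   # -21.25 ppm at 0 degrees C
print(deviation_ppm(25))  # 0.0 ppm at the turnover temperature
print(deviation_ppm(40))  # -7.65 ppm at 40 degrees C
```

With a negative k the parabola opens downward, which is exactly the "only loses, never gains" behaviour asked about: the deviation is zero at turnover and negative everywhere else.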
3) If we put sample variation and temperature variation together, we would have, for example, ±20 ppm at 25 degrees C, and -100 (±20) ppm over 0-40 degrees C, i.e. between -80 and -120 ppm. The parabolic coefficient also has its own plus-or-minus range, so with, say, a further ±10 ppm of coefficient spread we might have -70 to -130 ppm as the total range of variation.
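The worst-case stack-up in the example above can be sketched as simple interval arithmetic. The figures (-100 ppm parabolic deviation, ±20 ppm sample tolerance, ±10 ppm coefficient spread) are the illustrative numbers from the question, not values from any particular datasheet:

```python
def worst_case_ppm(parabolic_dev, sample_tol, coeff_spread):
    """Combine a nominal parabolic deviation (ppm, typically negative)
    with a symmetric sample tolerance and a symmetric coefficient
    spread (both ppm) by straight worst-case addition of the bounds."""
    lo = parabolic_dev - sample_tol - coeff_spread
    hi = parabolic_dev + sample_tol + coeff_spread
    return lo, hi

print(worst_case_ppm(-100.0, 20.0, 10.0))  # (-130.0, -70.0)
```

Note this simply sums the bounds, which is the pessimistic reading of the spec; whether the datasheet figures are meant to be stacked this way, or whether the over-temperature figure already includes the sample tolerance, is precisely the question being asked.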
4) Or the manufacturing sample tolerance might itself change at different temperatures. Samples might be ±20 ppm at 25 degrees C but ±100 ppm over 0-40 degrees C. Does the sample tolerance stay the same at different temperatures, or does it widen with temperature? Does anyone know?
5) I am aware that TCXOs change this behaviour. I am not asking how to achieve better accuracy; I am trying to understand the behaviour of a standard XO with changes in temperature.
Thanks