JamesGoh
For an exam I am studying for, I have to understand the effect of measurement tolerance, calibration accuracy and time-related drift on frequency measurements.
I'm aware that time-related drift is due to the oscillator frequency becoming less accurate over time (e.g. through continuous crystal vibration), but I am not sure about measurement tolerance and calibration accuracy.
Does measurement tolerance refer to a tolerated range around the ideal frequency that a measurement may fall within? (e.g. if I wanted an ideal reading of 9 Hz and instead got 8 Hz, I would accept 8 Hz)
I'm guessing that calibration accuracy refers to how well the equipment has been calibrated to generate or measure a frequency (e.g. how precisely a crystal has been cut to produce its resonant frequency)?
Thanks in advance.
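To make the three terms concrete, here is a minimal sketch of how they are often combined in an error budget. All of the spec values below are made-up, illustrative numbers, not from any real datasheet: the initial tolerance bounds the frequency as shipped, calibration replaces that with a (usually smaller) residual error, and drift then accumulates on top over time.

```python
# Hypothetical oscillator spec values (illustrative only, not a real part):
F_NOMINAL = 10e6          # nominal frequency, Hz
TOLERANCE_PPM = 20.0      # initial frequency tolerance, +/- ppm (as shipped)
CAL_PPM = 2.0             # residual error after calibration, +/- ppm
AGING_PPM_PER_YEAR = 3.0  # time-related drift (crystal ageing), ppm/year

def worst_case_error_hz(years_since_cal):
    """Worst-case frequency error in Hz, `years_since_cal` years
    after calibration.

    After calibration the initial tolerance is superseded by the
    calibration accuracy; drift then accumulates linearly on top.
    """
    total_ppm = CAL_PPM + AGING_PPM_PER_YEAR * years_since_cal
    return F_NOMINAL * total_ppm / 1e6

print(worst_case_error_hz(0))  # right after calibration -> 20.0 Hz
print(worst_case_error_hz(1))  # one year later -> 50.0 Hz
```

The design point the sketch illustrates: tolerance and calibration accuracy are fixed bounds, while drift grows with time, which is why instruments carry a calibration interval.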