Number2Pencil
Hello,
I am trying to develop a software control algorithm to compensate for oscillator imperfections/frequency drift. I have an NTP server from which I can get a pretty good estimate (at 1 Hz) of the "true" time, which I compare to my system's time. I can differentiate the offset error to calculate the time drift (which is related to the frequency error), and then update my local real-time clock (RTC) hardware with the new frequency; the RTC performs clock compensation using that value. Once I drive the time-drift error to 0, my local clock matches the "true" time very closely. If the oscillator drifts (due to a temperature change or something), the control algorithm should detect it and compensate.
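To make the setup concrete, here is a minimal sketch of the measurement path in C. The NTP query and the RTC trim write are simulated with a fixed 12 ppm oscillator error so it runs stand-alone; the function and variable names are just placeholders, not my real API:

```c
#include <stdio.h>

/* Simulated stand-ins for the real NTP query and RTC trim register.
 * A fixed 12 ppm oscillator error is assumed so the example runs on its own. */
static const double true_drift_ppm = 12.0;  /* oscillator runs 12 ppm fast        */
static double sim_offset_s = 0.0;           /* accumulated offset vs. "true" time */

/* Returns the current offset between local time and NTP time, sampled at 1 Hz.
 * The offset grows by the uncompensated error each second (1 ppm = 1 us/s). */
static double read_ntp_offset_s(double trim_ppm)
{
    sim_offset_s += (true_drift_ppm + trim_ppm) * 1e-6;
    return sim_offset_s;
}

int main(void)
{
    double trim_ppm = 0.0;       /* value "written" to the RTC compensation register */
    double prev_offset_s = 0.0;

    for (int t = 0; t < 10; t++) {
        double offset_s = read_ntp_offset_s(trim_ppm);

        /* Differentiate successive 1 Hz offset samples to get the drift rate,
         * i.e. the frequency error in ppm. */
        double freq_err_ppm = (offset_s - prev_offset_s) * 1e6;
        prev_offset_s = offset_s;

        /* Naive one-shot correction: no filtering and no PID yet. */
        trim_ppm -= freq_err_ppm;

        printf("t=%2d  offset=%9.6f s  freq_err=%7.3f ppm  trim=%8.3f ppm\n",
               t, offset_s, freq_err_ppm, trim_ppm);
    }
    return 0;
}
```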
I've got this working fairly well with a software PID controller, but the noise in the frequency-error measurement is enough to make the output swing around too much. I need to do some filtering/averaging to get a good gauge of the true oscillator frequency, but that filtering introduces delay. What I'm noticing is that the PID controller makes a very small change to the control signal, but because of the delay it doesn't see the effect for several samples. On the very next sample, still seeing no difference, it pushes the control signal further, and this gradual change starts to snowball. The control system then oscillates badly because it ends up compensating for its own previous over-compensation. It's actually worse than not averaging at all...
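Structurally, the loop looks something like the sketch below: a moving average over the raw frequency error, and a PID that adjusts the RTC trim to drive the averaged error to zero. The gains, window length, and simulated 12 ppm error are just illustrative numbers, not my real values; the point is that the PID only sees the effect of each correction after the averaging window has caught up.

```c
#include <stdio.h>

#define AVG_WINDOW 8   /* moving-average length; adds roughly AVG_WINDOW/2 samples of delay */

struct pid {
    double kp, ki, kd;
    double integral;
    double prev_err;
};

/* One PID update: returns the correction for the current error sample. */
static double pid_step(struct pid *c, double err)
{
    c->integral += err;
    double deriv = err - c->prev_err;
    c->prev_err  = err;
    return c->kp * err + c->ki * c->integral + c->kd * deriv;
}

int main(void)
{
    /* Illustration-only values: oscillator 12 ppm fast, loosely chosen gains. */
    const double true_drift_ppm = 12.0;
    struct pid c = { .kp = 0.4, .ki = 0.05, .kd = 0.0 };

    double trim_ppm = 0.0;
    double offset_s = 0.0, prev_offset_s = 0.0;
    double window[AVG_WINDOW] = { 0 };
    int idx = 0;

    for (int t = 0; t < 60; t++) {
        /* Simulated plant: the offset grows by the uncompensated error each second. */
        offset_s += (true_drift_ppm + trim_ppm) * 1e-6;

        /* Raw frequency-error estimate from differentiating the 1 Hz offset. */
        double raw_err_ppm = (offset_s - prev_offset_s) * 1e6;
        prev_offset_s = offset_s;

        /* Moving average to suppress measurement noise; this is the source of
         * the delay, since the PID sees its own corrections only several
         * samples later. */
        window[idx] = raw_err_ppm;
        idx = (idx + 1) % AVG_WINDOW;
        double avg_err_ppm = 0.0;
        for (int i = 0; i < AVG_WINDOW; i++)
            avg_err_ppm += window[i];
        avg_err_ppm /= AVG_WINDOW;

        /* Drive the averaged frequency error toward zero via the RTC trim. */
        trim_ppm -= pid_step(&c, avg_err_ppm);

        printf("t=%2d  avg_err=%7.3f ppm  trim=%8.3f ppm\n",
               t, avg_err_ppm, trim_ppm);
    }
    return 0;
}
```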
I can reduce the effect by slowing down the controller dynamics, but I have to slow it down to a crawl before it's worthwhile. At that point the controller is sluggish and takes far too long to converge.
I was wondering if anyone has any suggestions for dealing with this?