Topher925
I have a project where I'm measuring the phase of a signal relative to a reference signal using an SR830 lock-in amplifier. My problem is that the amplitude of the input signal periodically changes over time, and this causes a false shift in the measured phase of the input signal. I can't figure out what would cause this, since the phase measurement should be constant and independent of amplitude, shouldn't it? Keep in mind that this signal has a very good SNR.
The reference signal is a 50/50 square wave of constant amplitude and frequency. No matter what filter settings I use, if the amplitude of the input signal decreases by 50%, the phase measurement changes by roughly 2 degrees. What is going on?
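To illustrate why the phase reading shouldn't depend on amplitude in the ideal case, here is a minimal sketch of dual-phase (X/Y) demodulation, which is the principle a lock-in like the SR830 uses. It assumes a clean sinusoidal signal and reference and ideal mixing; the variable names and numbers are illustrative, not the SR830's internals.

```python
import numpy as np

fs = 100_000.0            # sample rate, Hz (assumed)
f_ref = 1_000.0           # reference frequency, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)          # 100 full reference cycles
phase_true = np.deg2rad(30.0)          # phase we expect to recover

def lockin_phase(amplitude):
    sig = amplitude * np.sin(2 * np.pi * f_ref * t + phase_true)
    # Mix with in-phase and quadrature references, then low-pass (here: mean).
    X = 2 * np.mean(sig * np.sin(2 * np.pi * f_ref * t))   # ~ A*cos(phi)
    Y = 2 * np.mean(sig * np.cos(2 * np.pi * f_ref * t))   # ~ A*sin(phi)
    return np.rad2deg(np.arctan2(Y, X))

# Both X and Y scale by the same factor, so theta = atan2(Y, X) is unchanged
# when the amplitude is halved -- in the ideal case.
print(lockin_phase(1.0))   # ~30.0 deg
print(lockin_phase(0.5))   # ~30.0 deg
```

In this idealized model, halving the amplitude leaves the recovered phase untouched, which is why the ~2 degree shift in the real measurement is puzzling.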