Darkmisc
I'm writing up a prac that involved LCR circuits.
One part of the prac involved measuring the phase difference between the voltage across the capacitor (V_C) and the inductor (V_L), while the frequency of the input signal was varied.
This was done by measuring the horizontal (time) separation between the peaks of V_C and V_L on an oscilloscope.
I got varying values for the phase difference, with a phase difference of zero when the input frequency matched the resonant frequency.
V_C may be expressed as $-\frac{I}{\omega C}\cos(\omega t)$.
V_L may be expressed as $\omega L I \cos(\omega t)$.
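For reference, both expressions follow from taking the series current to be $i(t) = I\sin(\omega t)$ (that form of $i(t)$ is my assumption, but it's consistent with both voltages):

```latex
\begin{aligned}
V_C &= \frac{1}{C}\int i\,dt
     = \frac{I}{C}\int \sin(\omega t)\,dt
     = -\frac{I}{\omega C}\cos(\omega t) \\
V_L &= L\,\frac{di}{dt}
     = L\,\frac{d}{dt}\bigl(I\sin(\omega t)\bigr)
     = \omega L\,I\cos(\omega t)
\end{aligned}
```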
When looking at these equations, I don't understand why the phase difference depended on the signal's frequency. The way I interpret them, V_C and V_L are permanently 180 degrees out of phase (both are cosines of $\omega t$, one with a negative coefficient and the other positive). I can understand that their amplitudes won't always be the same, but it seems to me they should always be 180 degrees out of phase.
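To make my reading concrete, here's a quick Python sketch that plots the two expressions at a single frequency (the component values L = 10 mH, C = 100 nF, I = 1 A are made up for illustration, not from the prac). However I pick $\omega$, the two traces peak half a cycle apart, which is why I expect a constant 180-degree difference:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative values only (not measured in the prac)
L = 10e-3   # inductance, henries
C = 100e-9  # capacitance, farads
I = 1.0     # current amplitude, amperes

w = 2 * np.pi * 3000                          # angular frequency, rad/s
t = np.linspace(0, 3 * 2 * np.pi / w, 1000)   # three full periods

# The two expressions from above
V_C = -(I / (w * C)) * np.cos(w * t)
V_L = (w * L) * I * np.cos(w * t)

plt.plot(t * 1e3, V_C, label="V_C")
plt.plot(t * 1e3, V_L, label="V_L")
plt.xlabel("time (ms)")
plt.ylabel("voltage (V)")
plt.legend()
plt.show()
```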
Where have I gone wrong in my understanding?
Thanks