Tasell
I have a circuit with a 1.95 V power supply and a 100 ohm resistor. To measure the current through the resistor, I use an analogue multimeter, which reads 1.9x10^-2 A. But from the calculation using I = V/R, it should be 1.95x10^-2 A. Shouldn't an analogue multimeter be accurate for measuring current because of its low resistance? Or is 1.95 V too high for the multimeter?
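Here is a quick sketch of that first comparison in Python, just to lay out the numbers I'm working with (the variable names are my own, not anything from the meter):

```python
# Restating my setup: expected current from Ohm's law vs. what the meter reads.
V = 1.95            # supply voltage in volts
R = 100.0           # resistor in ohms
I_measured = 1.9e-2 # current read off the analogue multimeter, in amps

I_expected = V / R  # Ohm's law, ignoring the meter's own resistance
print(f"expected current: {I_expected:.4f} A")  # 0.0195 A
print(f"measured current: {I_measured:.4f} A")  # 0.0190 A
```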
And, to calculate the internal resistance of the AMM, I use a digital multimeter to measure the voltage dropped across it, which turns out to be 0.08 V. Plugging this into r = V/I = 0.08 V / 1.9x10^-2 A gives 4.21 ohms. But from the theoretical calculation, r = (V/I) - R = (1.95 / 1.9x10^-2) - 100 = 2.63 ohms. This is very different from the measured value. Can anyone explain to me why this is so?
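The internal-resistance numbers laid out the same way (again, these are just my measured values plugged into the two formulas):

```python
# Internal resistance of the analogue meter, worked both ways.
V = 1.95        # supply voltage in volts
R = 100.0       # resistor in ohms
I = 1.9e-2      # measured current in amps
V_meter = 0.08  # voltage drop across the meter, from the digital multimeter

r_measured = V_meter / I  # from the measured drop across the meter
r_theory = V / I - R      # from the total loop voltage minus the resistor
print(f"r from measured drop: {r_measured:.2f} ohm")  # 4.21
print(f"r from V/I - R:       {r_theory:.2f} ohm")    # 2.63
```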