cissey
In the circuit, the LED's rated current is 20 milliamperes. With a voltage drop of 1.5 volts across the LED, calculate the value of the series resistance R1 required if the supply voltage is 12 volts DC.
Here is how I answered it; I am not sure if I am right.
R1 = E_R1 / I_R1
R1 = 10.5 V / 0.020 A
R1 = 525 Ω
Can someone help me, please? This problem has me confused.
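As a sanity check on the arithmetic, here is a small Python sketch of the same calculation (the variable names are just illustrative; the values come straight from the problem statement):

```python
# Series resistor for an LED, via Ohm's law:
# the resistor drops whatever voltage the LED does not,
# so R1 = (V_supply - V_led) / I_led.
v_supply = 12.0  # supply voltage, volts DC
v_led = 1.5      # LED forward voltage drop, volts
i_led = 0.020    # LED rated current, amperes (20 mA)

v_r1 = v_supply - v_led   # voltage across R1: 10.5 V
r1 = v_r1 / i_led         # series resistance in ohms

print(r1)  # 525.0
```

This reproduces the steps in the attempt above: 10.5 V across the resistor divided by 0.020 A gives 525 Ω.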