TL;DR Summary: How to wire an LED switch based on rated voltage and current.
Could someone please help? I need to connect a 12 V DC motor, which can draw 16 A, to a switch that has an LED on it. The LED is rated at 3 V, 20 mA. The power supply I am using is rated to put out 12 V DC at 30 A.
I thought of adding a resistor in series with the positive leg of the LED. For this I saw people using Ohm's law:

R = (V_source − V_required) / I_required

In this case I get (12 − 3) / 0.02 = 450 Ω. Is this right?
Since the current rating of the power source is not factored into this calculation at all, would I then use a 450 Ω resistor for any 12 V source, regardless of the rated current output of the supply? I.e., whether the power supply is rated for 20 A, 30 A, or 100 A, would I still simply attach a 450 Ω resistor to the positive terminal of the LED?
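Here is a small Python sketch of the calculation I did above, just to make the arithmetic explicit (the function and variable names are my own labels, not from any library). Note that the supply's rated current never appears as an input to the formula:

```python
def led_series_resistor(v_source: float, v_led: float, i_led: float) -> float:
    """Series resistor for an LED: R = (V_source - V_led) / I_led.

    The only inputs are the source voltage, the LED's rated voltage,
    and the LED's rated current; the supply's rated current output
    (20 A, 30 A, 100 A, ...) does not appear anywhere in this formula.
    """
    return (v_source - v_led) / i_led

# Values from my setup: 12 V supply, LED rated 3 V at 20 mA.
r = led_series_resistor(v_source=12.0, v_led=3.0, i_led=0.020)
print(f"R = {r:.0f} ohms")  # prints: R = 450 ohms
```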