Limiting amps and volts from a battery

In summary: the poster wants to run a 3 V, 0.3 A motor from a 9.6 V, 1600 mA battery, aiming for about 2.5 V and 0.25 A to stay safely under the ratings. He starts from the voltage-divider formula Resistor 2 = [(Voltage out)(Resistor 1)]/(Voltage in − Voltage out) with an assumed 10 kΩ first resistor; the replies explain why a divider won't work for a motor and point to a voltage regulator instead.
  • #1
JoeSalerno
I have a project that involves a small motor, and the battery powering it would blow the motor in an instant without proper use of resistors. The battery I'm using is rated at 9.6 volts and 1600 mA. The motor, however, is rated at 3 volts and 0.3 amps. I'd like to stay just under these ratings to be safe, so I want to provide 2.5 volts and 0.25 amps. I've found a formula based on a voltage divider circuit: Resistor 2 = [(Voltage out)(Resistor 1)]/(Voltage in − Voltage out). Using this formula (assuming my first resistor was 10k ohms) I got [(2.5)(10,000)]/(9.6 − 2.5) ≈ 3,500 ohms. This should provide the right amount of resistance to bring the voltage down to 2.5 volts. For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10. If the math was correct and I used the right formula, that should be 10 ohms, but that seems abysmal and I'm pretty sure it's not right. If anyone knows how to build this circuit more efficiently, or can double-check that I'm using the right formulas, it would be greatly appreciated. By the way, I know the easiest answer is to just use a smaller battery, but this is part of a larger circuit, so I'm trying to use the battery that's already in there. Thanks in advance.
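Here's how I checked the divider arithmetic in a few lines of Python (the 10 kΩ is just the R1 value I assumed):

```python
# Sanity check of the voltage-divider arithmetic from the post.
# R1 = 10 kOhm is the assumed value; Vin and Vout are from the post.
v_in = 9.6    # battery voltage (V)
v_out = 2.5   # target motor voltage (V)
r1 = 10_000   # assumed top resistor (ohms)

# Unloaded divider: Vout = Vin * R2 / (R1 + R2)  =>  R2 = Vout * R1 / (Vin - Vout)
r2 = (v_out * r1) / (v_in - v_out)
print(f"R2 = {r2:.0f} ohms")  # -> R2 = 3521 ohms (~3.5 k)
```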
 
  • #2
JoeSalerno said:
I'd like to stay just under these ratings to be safe, so I want to provide 2.5 volts and 0.25 amps. I've found a formula based on a voltage divider circuit: Resistor 2 = [(Voltage out)(Resistor 1)]/(Voltage in − Voltage out). Using this formula (assuming my first resistor was 10k ohms) I got [(2.5)(10,000)]/(9.6 − 2.5) ≈ 3,500 ohms. This should provide the right amount of resistance to bring the voltage down to 2.5 volts. For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10.

This idea would be sort of OK if the load (the motor) were purely resistive, but it isn't.
It's an inductive load, and as such it requires a different approach.

Getting the 3 V for the motor is the easy part ... a voltage regulator, e.g. an LM317 (adjustable regulator) set for ~3 to 3.5 V.

The second bit, the current ... at normal running speed the motor may well draw only 300 mA (0.3 A), but at startup that could easily spike to more than 1 amp, and under load it would still require more than 300 mA.

I have seen people have issues using linear voltage regulators (LM317 etc.) as motor drivers because they cannot supply that initial spike in required current. One way around that is to put a reasonably sized electrolytic capacitor on the output of the regulator to supply the current spike.
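As a rough way to size that capacitor you can use C = I·Δt/ΔV; the spike current, duration, and allowed sag below are assumed example figures, not measurements:

```python
# Back-of-the-envelope sizing for the regulator's output capacitor,
# using C = I * dt / dV (charge supplied during the spike vs. allowed sag).
# All three inputs are assumed example figures, not measured values.
i_spike = 1.0   # extra current the motor demands at startup (A), assumed
dt = 0.01       # duration of the spike (s), assumed
dv = 0.5        # sag we can tolerate on the ~3 V rail (V), assumed

c_farads = i_spike * dt / dv
print(f"C ≈ {c_farads * 1e6:.0f} µF")  # -> 20000 µF; shorter spikes need proportionally less
```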
The other way is to use a switching regulator, e.g. ...

http://www.ebay.com.au/itm/LM2596-Voltage-Regulator-DC-DC-Buck-Converter-Adjustable-Step-Down-Module-/161934554531
cheers
Dave
 
  • #3
What you really want is a DC/DC converter, if efficiency is of any concern.

Your divider of 10 kΩ / 3.5 kΩ will certainly give you the voltage you want while unloaded, but once you attach your motor it will not regulate the voltage well at all. For steady state, consider the effect of the motor as a resistor attached in parallel with the 3.5 kΩ leg. Also consider the inrush current at startup.
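To put numbers on that: treating the running motor crudely as Vrated/Irated = 3 V / 0.3 A = 10 Ω, a few lines of Python (a sketch using the values from this thread) show how far the divider output collapses:

```python
# What the 10 kOhm / 3.5 kOhm divider actually delivers once the motor loads it.
# The motor is modelled crudely as a steady-state resistance of V/I = 3/0.3 = 10 ohms.
v_in, r1, r2 = 9.6, 10_000, 3_500
r_motor = 3.0 / 0.3  # ~10 ohms, a rough steady-state model

r_parallel = (r2 * r_motor) / (r2 + r_motor)   # motor in parallel with the lower leg R2
v_out = v_in * r_parallel / (r1 + r_parallel)
print(f"Loaded output: {v_out:.4f} V")  # -> ~0.0096 V: the divider collapses under load
```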

JoeSalerno said:
For calculating resistance for amperage, I used Resistance = Voltage/Current, and found 9.6/1600 = 10. If the math was correct and I used the right formula, that should be 10 ohms, but that seems abysmal and I'm pretty sure it's not right.

9.6 V / 1.6 A is actually 6 Ω.

A resistance is only abysmal in context. 100 MΩ is abysmal if it's a load; 0 Ω is a short circuit (not negligible at all; it can start fires).

Not really sure what you are trying to do with that result anyhow.

[edit]: davenn did a better job of addressing startup/inrush effects.
 
  • #4
davenn said:
This idea would be sort of OK if the load (the motor) were purely resistive, but it isn't.
It's an inductive load, and as such it requires a different approach.

Getting the 3 V for the motor is the easy part ... a voltage regulator, e.g. an LM317 (adjustable regulator) set for ~3 to 3.5 V.

The second bit, the current ... at normal running speed the motor may well draw only 300 mA (0.3 A), but at startup that could easily spike to more than 1 amp, and under load it would still require more than 300 mA.

I have seen people have issues using linear voltage regulators (LM317 etc.) as motor drivers because they cannot supply that initial spike in required current. One way around that is to put a reasonably sized electrolytic capacitor on the output of the regulator to supply the current spike.
The other way is to use a switching regulator, e.g. ...

http://www.ebay.com.au/itm/LM2596-Voltage-Regulator-DC-DC-Buck-Converter-Adjustable-Step-Down-Module-/161934554531
cheers
Dave
So, to get this straight: I can either build a circuit containing the voltage regulator you mentioned, plus an appropriate capacitor as a stabilizer, or I can just use the module you linked, since it has all of those components in it already? If I were to use that module, it says it outputs anywhere from 1.25 to 35 volts and can supply 2 to 3 amps. To my understanding, volts are "pushed" through a circuit, while amps are drawn by the loads in the circuit. How would I control how many volts come out of the regulator, and would the motor draw a large amount of current initially and then drop to around its 0.3 A rating?
 
  • #5
JoeSalerno said:
So, to get this straight: I can either build a circuit containing the voltage regulator you mentioned, plus an appropriate capacitor as a stabilizer

yes, you would have to build this one up from scratch

JoeSalerno said:
Or I can just use the module you linked, since it has all of those components in it already? If I were to use that module, it says it outputs anywhere from 1.25 to 35 volts and can supply 2 to 3 amps.

it is the much easier way to go

JoeSalerno said:
To my understanding, volts are "pushed" through a circuit, while amps are drawn by the loads in the circuit. How would I control how many volts come out of the regulator, and would the motor draw a large amount of current initially and then drop to around its 0.3 A rating?

As long as you set the regulator to output 3 to 3.5 V, the current drawn will be whatever the motor requires ... you don't need to do any current limiting.

Dave
 
  • #6
JoeSalerno said:
How would I control how many volts come out of the regulator
The regulator in the link looks like it has a big blue potentiometer for setting the output to a specific value.
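For the from-scratch LM317 option, there is no potentiometer; the output is set by two feedback resistors via the well-known relationship Vout ≈ 1.25 V × (1 + R2/R1). A quick sketch (the 240 Ω value for R1 is a common datasheet choice, assumed here):

```python
# Choosing LM317 feedback resistors for roughly 3.3 V out.
# Vout ≈ Vref * (1 + R2/R1) with Vref = 1.25 V; the small Iadj term is ignored.
v_ref = 1.25     # LM317 reference voltage (V)
r1 = 240         # ohms, a common datasheet value (assumed choice)
v_target = 3.3   # a little above the motor's 3 V rating, per the thread

r2 = r1 * (v_target / v_ref - 1)
print(f"R2 ≈ {r2:.0f} ohms")  # -> ~394 ohms; the nearest standard 390 gives about 3.28 V
```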
 
  • #7
davenn said:
yes, you would have to build this one up from scratch
it is the much easier way to go
As long as you set the regulator to output 3 to 3.5 V, the current drawn will be whatever the motor requires ... you don't need to do any current limiting.
Dave
Thank you very much for your help. I never would have known to use components other than resistors. Thanks again!
 
  • #8
No problem 🙂

Report back when you have it all running, or if you have any other questions.
 

FAQ: Limiting amps and volts from a battery

1. How do I limit the amps and volts from a battery?

To limit the amps and volts from a battery, you can use a voltage regulator: a device that holds the battery's output at a desired voltage (and, in current-limited designs, a maximum current) so it never exceeds a set limit. For simple loads, series resistors or diodes can also be used to drop voltage or cap current, as in the sketch below.
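For example, a series resistor can cap the current into a simple load; a sketch with assumed numbers (a 9 V supply and a load that should see about 2 V at no more than 20 mA):

```python
# Sizing a series resistor to cap the current into a simple load.
# The supply, load voltage, and current ceiling are assumed example numbers.
v_supply = 9.0   # battery voltage (V)
v_load = 2.0     # voltage the load needs across it (V)
i_max = 0.020    # maximum allowed current (A)

r_series = (v_supply - v_load) / i_max      # Ohm's law on the dropped voltage
p_resistor = (v_supply - v_load) * i_max    # power the resistor must dissipate
print(f"R = {r_series:.0f} ohms, dissipating {p_resistor * 1000:.0f} mW")  # -> 350 ohms, 140 mW
```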

2. Why is it important to limit the amps and volts from a battery?

Limiting the amps and volts from a battery is important because it can prevent damage to your electronic devices. If the battery is not properly regulated, it can supply too much current or voltage, which can cause overheating and potentially lead to a fire or explosion.

3. What are the consequences of not limiting the amps and volts from a battery?

If the amps and volts from a battery are not limited, the excess current and voltage can overheat and damage your electronic devices, leading to equipment failure and, in the worst case, fire or explosion.

4. Can I limit the amps and volts from any type of battery?

Yes, you can limit the amps and volts from most types of batteries. However, the methods and devices used may vary depending on the type of battery. For example, lead-acid batteries may require different methods compared to lithium-ion batteries.

5. Are there any risks involved in limiting the amps and volts from a battery?

There are some risks involved in limiting the amps and volts from a battery, such as reducing the battery's usable capacity or damaging the battery itself if it is done incorrectly. It is important to research and follow proper procedures so that the limiting is done safely and effectively.
