Understanding Charger Circuits and Diode Voltage Drop

In summary: A bare diode between a 5 V supply and a battery will not reliably stop the charge at a safe voltage, because the diode's forward drop falls as the current tapers off, so a small current keeps trickling into the cell. Chemistries such as lithium-ion are sensitive to overcharging and need a charger circuit that terminates the charge at a precise cutoff voltage (typically constant current, then constant voltage until the current falls to a few percent of the maximum); this protects the cell and prolongs its life.
  • #1
ramonegumpert
Dear Experts

I just connected a diode to a battery rated at 4.2 volts.

The diode drops about 0.8 V (I thought it should be 0.7 V, but the measurement says 0.8 V), and the battery is currently at 3.12 V.

The supply voltage is 5 V DC.

After some time, if the battery reaches 4.2 V, would current stop flowing into the battery, given that the supply voltage = battery voltage + diode voltage?

If so, why do we need a charger circuit to limit the voltage level of the battery from overcharging?
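For concreteness, here is a minimal sketch of the arithmetic behind this question, using the figures above and assuming (for the moment) that the diode drop stays fixed at the measured 0.8 V:

Python:
# Minimal sketch of the voltage arithmetic in the question. It assumes the
# diode drop stays fixed at the measured 0.8 V, which in practice it does not.
V_SUPPLY = 5.0    # V, supply voltage
V_DIODE = 0.8     # V, measured diode drop at the present charge current
V_BATT = 3.12     # V, present battery voltage

headroom = V_SUPPLY - V_DIODE - V_BATT   # voltage left to drive charge current
v_stop = V_SUPPLY - V_DIODE              # battery voltage at which current would stop

print(f"Voltage available to drive charge current now: {headroom:.2f} V")
print(f"Battery voltage at which current would stop:   {v_stop:.2f} V")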

Thanks for reading.

Best regards
Ramone
 
  • #2
vk6kro
There are battery chargers that work exactly like that.

There are problems with them, though. They have to have some resistive current limiting for when the battery is very flat, and this means that, as the battery voltage approaches the supply voltage, the charging current becomes very small.
So, it may take a very long time to fully charge the battery.

A charger that charges at a fairly constant rate and then switches off will charge much more quickly than the type described here.

Incidentally, diode voltages are not independent of current. A diode will conduct with around 0.5 volts across it. The current may be small, but if the battery is sensitive to overcharging, it could eventually be overcharged by this small current. At this low current, the voltage across the battery in your example would be 4.5 volts, and this may be enough to damage it.
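To put rough numbers on this, here is an illustrative sketch; the series resistance and the diode model parameters below are assumptions for illustration, not measured values:

Python:
import math

# Illustrative sketch of why the charge current tapers off and why the diode
# keeps conducting at a low current. R_SERIES, I_S and N_VT are assumed values.
V_SUPPLY = 5.0     # V, supply voltage
R_SERIES = 10.0    # ohm, assumed resistive current limiting
I_S = 1e-9         # A, assumed diode saturation current
N_VT = 0.045       # V, assumed emission coefficient times thermal voltage

def diode_drop(i_amps):
    """Approximate forward voltage of the diode at a given current."""
    return N_VT * math.log(i_amps / I_S + 1.0)

def charge_current(v_batt):
    """Find I such that V_SUPPLY = v_batt + diode_drop(I) + I*R_SERIES (bisection)."""
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if v_batt + diode_drop(mid) + mid * R_SERIES < V_SUPPLY:
            lo = mid
        else:
            hi = mid
    return lo

for v_batt in (3.1, 3.8, 4.2, 4.5):
    i = charge_current(v_batt)
    print(f"battery {v_batt:.1f} V: current {i * 1000:8.3f} mA, "
          f"diode drop {diode_drop(i):.2f} V")

Even with the battery sitting above the "supply minus 0.7 V" point, a small current still flows, which is the overcharging risk described above.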
 
  • #3
NascentOxygen
ramonegumpert said:
If so, why do we need a charger circuit to limit the voltage level of the battery from overcharging?
Hello again, RG. :wink: Did you get your solar charger sorted out?

Some battery chemistries are very sensitive to overcharging. A difference of 0.1 V may mean the difference between the cell being fully charged and it being dangerously overcharged, so you need the charging circuit to switch off at a very precise point. Further, the voltage of the cell may be temperature dependent, so the point at which charging should be terminated needs to take temperature into account. Also, the voltage delivered by a basic charger may change according to the size of the cell, the mains voltage, etc., so some sort of regulation needs to be built into the charging/protection circuit associated with each individual cell to ensure optimal charging (and to maximize the life) of the cell.
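As a sketch of what such a termination check might look like, with the cutoff voltage, temperature coefficient and temperature window all assumed purely for illustration (a real design would take them from the cell's datasheet):

Python:
# Hypothetical charge-termination check with temperature compensation.
# The cutoff voltage, temperature coefficient and temperature window are
# illustrative assumptions, not datasheet values.
CUTOFF_25C = 4.20        # V, assumed full-charge voltage at 25 degC
TEMP_COEFF = -0.003      # V per degC, assumed compensation slope
TEMP_MIN_C, TEMP_MAX_C = 0.0, 45.0   # degC, assumed safe charging window

def should_stop_charging(cell_voltage, cell_temp_c):
    """Return True if charging should be terminated."""
    if not (TEMP_MIN_C <= cell_temp_c <= TEMP_MAX_C):
        return True                                   # outside the safe window
    cutoff = CUTOFF_25C + TEMP_COEFF * (cell_temp_c - 25.0)
    return cell_voltage >= cutoff                     # 0.1 V matters, so be precise

print(should_stop_charging(4.18, 25.0))   # False: keep charging
print(should_stop_charging(4.21, 25.0))   # True: cutoff reached
print(should_stop_charging(4.10, 50.0))   # True: too hot to charge safely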
 
  • #4
sophiecentaur
You can damage batteries by charging them the 'wrong way'. Some are best with 'constant voltage' charging and others with 'constant current' charging. The lead-acid car battery is pretty good-natured and a simple transformer-rectifier system is fine, but most other types are much more fussy.
Take advice about the requirements of the specific battery you need to charge or you may spoil it (or yourself).
 
  • #5
gnurf
sophiecentaur said:
You can damage batteries by charging them the 'wrong way'. Some are best with 'constant voltage' charging and others with 'constant current' charging. [...] Take advice about the requirements of the specific battery you need to charge or you may spoil it (or yourself).
If I'm not mistaken, the OP is using lithium-ion cells, which need to be charged in two steps: constant current until the cell reaches 4.2 V, followed by constant voltage until the charge current drops to around 5% of the maximum current. If charging to full capacity is not a concern, the second step can be omitted. I agree that the OP should take care and avoid the potential hazard of overcharging this type of cell, but one can only assume he is aware of the safety issues. Right?

http://www.youtube.com/watch?v=SMy2_qNO2Y0#t=1m55s
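A rough simulation of that two-step CC/CV profile, with the cell modelled as an open-circuit voltage (rising with state of charge) behind an internal resistance; the capacity, currents, resistance and OCV curve are all illustrative assumptions:

Python:
# Rough simulation of the CC/CV profile described above. The cell is modelled
# as an open-circuit voltage rising with state of charge behind an internal
# resistance; every number here is an illustrative assumption.
CAPACITY_AH = 1.0          # assumed cell capacity
I_CC = 0.5                 # A, constant-current setpoint
V_CV = 4.20                # V, constant-voltage setpoint
I_TERM = 0.05 * I_CC       # terminate when current falls to ~5% of I_CC
R_INT = 0.1                # ohm, assumed internal resistance
DT_H = 0.01                # simulation time step, hours

charge_ah, t_h, phase = 0.4, 0.0, "CC"     # start at 40% charge
while phase != "DONE":
    soc = min(charge_ah / CAPACITY_AH, 1.0)
    ocv = 3.5 + 0.7 * soc                  # crude open-circuit-voltage model
    if phase == "CC":
        i = I_CC                           # push the full current
        if ocv + i * R_INT >= V_CV:
            phase = "CV"                   # terminal voltage has reached 4.20 V
    else:                                  # CV: hold 4.20 V, current tapers
        i = max((V_CV - ocv) / R_INT, 0.0)
        if i <= I_TERM:
            phase = "DONE"                 # cell is (nearly) full
    charge_ah += i * DT_H
    t_h += DT_H

print(f"charged to {charge_ah:.2f} Ah in {t_h:.2f} h")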
 
  • #6
Dear Vk6kro

Thank you for the enlightening explanation.
:)

Best regards
RG
 
  • #7
Dear NascentOxygen

I stopped working on the solar charger for a while. It's a weekend hobby.

Thanks for your tips.
:)
 
  • #8
Thanks Sophiecentaur and Gnurf :)

I am aware that it's dangerous to overcharge lithium batteries.

Now I will try to make a circuit to ensure precise charging.

Is there a standard circuit for this purpose?

Thanks.
 

FAQ: Understanding Charger Circuits and Diode Voltage Drop

What is a charger circuit?

A charger circuit is an electronic circuit that is used to charge a battery or power a device. It typically consists of a power source, such as a wall outlet or USB port, and a series of components that regulate the flow of electricity to the battery or device being charged.

What is the purpose of a diode in a charger circuit?

A diode is used in a charger circuit to regulate the flow of electricity and prevent reverse current from damaging the battery. It acts as a one-way valve, allowing current to flow in only one direction.

What is diode voltage drop?

Diode voltage drop is the amount of voltage that is lost when current flows through a diode. It is typically around 0.7 volts for silicon diodes and 0.3 volts for germanium diodes.

How does diode voltage drop affect the charging process?

The diode voltage drop reduces the voltage that reaches the battery: the highest voltage the battery can be pushed to is roughly the supply voltage minus the diode's forward drop. This can result in slower charging or a lower final charge level for the battery.

How can I calculate diode voltage drop in a charger circuit?

Diode voltage drop (the forward voltage) is the difference between the voltage on the diode's anode and the voltage on its cathode while current is flowing, so it can be found by measuring both sides with a multimeter and subtracting. Many multimeters also have a diode-test mode that reads the forward voltage directly, and the diode's datasheet specifies the typical forward voltage at a given current.
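For example, with assumed meter readings similar to those in the opening post:

Python:
# Example of the subtraction described above, with assumed meter readings
# similar to those in the opening post (not actual measurements).
V_ANODE = 5.00      # V, measured on the supply side of the conducting diode
V_CATHODE = 4.20    # V, measured on the battery side of the conducting diode

v_drop = V_ANODE - V_CATHODE    # forward voltage drop across the diode
print(f"Measured diode drop: {v_drop:.2f} V")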
