# How can the charging time of a cell phone battery be calculated?

Suppose I have a 20 AmpHour battery at 4V (as in, the actual capacity of the battery is 80 Watt hours). And I'm using a 10V 2A charger to charge it (this is a 20 Watt charger). How can the charge time be calculated?

Is it 4 hours?

Note: The charge time is definitely not 20 Ah / 2 A = 10 h, as a lot of people state, since these numbers are close to the actual values for most fast-charging smartphones today, and the charge time should be on the order of 1 hour.

Note 2: Please explain using the fundamentals of physics, instead of obscure formulae.

Since the charger is at 10V and the battery is at 4V, how does this affect the charging?

Reply

You don't tell us what voltage is required to charge your 4 V battery, so let's assume it's 5 V.

Linear regulator / charger

• If we use a linear voltage regulator or charger we can drop the 10 V down to 5 V and draw 2 A from the power supply. Power into the battery will be 5 V × 2 A = 10 W. Power dissipated in the regulator will be (10 V − 5 V) × 2 A = 10 W. Efficiency will be 50%.
• Charge time would be \$\frac {Ah~capacity}{current} = \frac {20~Ah}{2~A} = 10~h \$ if the battery could store all the energy. Since the battery gets warm during charging we know that some of the input power is being lost, so 12 h may be a better estimate.
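The linear-regulator arithmetic above can be sketched in a few lines (a minimal sketch, assuming the 5 V charging voltage stated earlier):

```python
# Linear regulator: the charger's 10 V is dropped to 5 V; the 2 A
# charge current flows through both the regulator and the battery.
V_in, I_in = 10.0, 2.0      # charger: 10 V, 2 A
V_chg = 5.0                 # assumed battery charging voltage
capacity_Ah = 20.0          # battery capacity in amp-hours

P_in = V_in * I_in                 # 20 W drawn from the charger
P_batt = V_chg * I_in              # 10 W delivered to the battery
P_loss = (V_in - V_chg) * I_in     # 10 W wasted as heat in the regulator
efficiency = P_batt / P_in         # 0.5, i.e. 50%

t_ideal_h = capacity_Ah / I_in     # 10 h, ignoring charging losses
print(P_batt, P_loss, efficiency, t_ideal_h)
```

Note that the ideal charge time depends only on the amp-hour capacity and the charge current; the regulator's inefficiency wastes input power but doesn't change the current into the battery.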

Buck charger

Using a buck voltage converter is more efficient, and when it reduces the voltage the output current can be increased. Let's assume we could get one with 85% efficiency.

$$V_O \cdot I_O = \eta V_I \cdot I_I$$ where \$\eta \$ is the efficiency. From this we can work out that the maximum output current is

$$I_O = \eta \frac {V_I \cdot I_I}{V_O} = 0.85 \frac {10 \cdot 2}{5} = 3.4~A$$

Ideal charge time will be \$\frac {Ah~capacity}{current} = \frac {20Ah}{3.4A} = 5.9~h \$ so say 7 hours allowing for heat losses.
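The buck-converter case follows the same pattern (a sketch using the 85% efficiency and 5 V output assumed above):

```python
# Buck converter: input power (times efficiency) is conserved, so
# stepping 10 V down to 5 V lets the output current exceed 2 A.
V_in, I_in = 10.0, 2.0      # charger: 10 V, 2 A
V_out = 5.0                 # assumed battery charging voltage
eta = 0.85                  # assumed converter efficiency
capacity_Ah = 20.0

I_out = eta * V_in * I_in / V_out   # 3.4 A into the battery
t_ideal_h = capacity_Ah / I_out     # ~5.9 h, ignoring charging losses
print(I_out, t_ideal_h)
```

Doubling the current relative to the linear regulator (3.4 A vs 2 A, less the efficiency penalty) is what cuts the ideal charge time from 10 h to about 5.9 h.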

I hope the formulae weren't too obscure. ;^)

Category: batteries Time: 2016-07-30