As the battery degrades, does it just lose charging capacity or does it lose charge efficiency?

So as an example, we start with a 75 kWh battery. For simplicity's sake, let's say charging from 0 to 75 kWh consumes 75 kWh of energy from the wall (assuming 100% efficiency here to keep the math simple).

Many recharge cycles later, the battery has degraded 10% and has the equivalent of 67.5 kWh of capacity. If I charge this battery from 0 to 67.5 kWh, does it take 67.5 kWh from the wall, or does it take 75 kWh from the wall (or maybe some value in between 67.5 and 75)?

Said another way: as the battery degrades, is the car losing efficiency (it costs more per mile to drive), or is it simply losing capacity (cost per mile stays the same, just with less storage)?

Thanks!
 

The battery's internal resistance increases with time.

Degradation comes both from loss of cyclable lithium, which means less storage capacity, and from increased internal resistance, which means more heat loss during discharge (and during charging).

Loss of cyclable lithium means less capacity to charge, so a 10% loss means about 10% less energy to charge as well.

Increased internal resistance means more is lost as heat during both charging and discharging.

The main part is the loss of cyclable lithium.

Charging the new 75 kWh pack from empty might cost ~85 kWh (~12% losses, including the car's own energy use during the charge).

Charging the 67.5 kWh, 10% degraded pack might see ~13% losses, so 77 kWh or so.
These are just examples, grabbed from thin air. The losses are very dependent on the charging power, since resistive losses scale with the square of the current.
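
A quick back-of-envelope version of those numbers in Python (same made-up loss fractions as above, not measured values):

```python
# Wall energy needed to fill a pack from empty, given an assumed total
# charging-loss fraction. Numbers are the illustrative ones from this post.

def wall_energy_kwh(pack_kwh, loss_fraction):
    """Pack energy divided by (1 - losses) = energy drawn from the wall."""
    return pack_kwh / (1.0 - loss_fraction)

print(wall_energy_kwh(75.0, 0.12))   # new 75 kWh pack -> ~85.2 kWh
print(wall_energy_kwh(67.5, 0.13))   # degraded pack   -> ~77.6 kWh

# Resistive loss is P = I^2 * R, so it scales with the square of the
# current: halving the charging current quarters the resistive loss.
def resistive_loss_w(current_a, internal_resistance_ohm):
    return current_a ** 2 * internal_resistance_ohm
```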
 

Thanks for the info! So since the main effect is the loss of cyclable lithium, the cost per mile to drive shouldn't be affected too much by the extra resistance.

BTW, do you know how the Battery Management System knows to stop at 67.5 kWh in the 10% degraded battery instead of trying to shove 75 kWh into it? As far as I know, the battery "fullness" is detected via the voltage of the cells. So if 4.167 volts per cell is "full" on the new battery, is it also 4.167 volts per cell to be "full" for the degraded battery? That would mean the voltage would climb faster in the degraded battery with less cyclable lithium than in the brand new battery, given the same input current? I guess in that case one could estimate the battery degradation by how fast the voltage gets to 4.167 during charging.
 
SOC is measured via cell voltage; a precise measurement is made when the battery is at rest (car sleeping).
2.5V is true 0% and 4.20V is true 100%.
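
As a rough illustration of that rest-voltage-to-SOC idea, here is a toy lookup (the curve points are invented for the sketch; a real OCV curve is nonlinear and cell-specific):

```python
# Toy open-circuit-voltage -> SOC lookup: a resting cell voltage between
# the 2.50V (0%) and 4.20V (100%) anchors maps to a state of charge.
# The intermediate points are made up for illustration.
import bisect

OCV_POINTS = [(2.50, 0.00), (3.00, 0.05), (3.30, 0.20),
              (3.60, 0.50), (3.90, 0.80), (4.20, 1.00)]

def soc_from_rest_voltage(v):
    volts = [p[0] for p in OCV_POINTS]
    socs = [p[1] for p in OCV_POINTS]
    if v <= volts[0]:
        return socs[0]
    if v >= volts[-1]:
        return socs[-1]
    i = bisect.bisect_right(volts, v)
    # Linear interpolation between the two nearest curve points
    f = (v - volts[i - 1]) / (volts[i] - volts[i - 1])
    return socs[i - 1] + f * (socs[i] - socs[i - 1])

print(soc_from_rest_voltage(4.167))  # ~0.98 with these made-up points
```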

When charging, the SOC cannot be measured, as we are pushing energy in by raising the voltage.
So the BMS uses energy counting.
Knowing the capacity (for example 67.5 kWh) and that the charge setting means, say, 30% needs to be added (which could be SOC = 50% and a charge setting of 80%), the battery needs to be filled with 0.3 × 67.5 = 20.25 kWh. This is the net value, so the BMS knows the total energy needed to put in 20.25 kWh net.

This calculated value is present in the BMS data we can see.

When charging, the charge runs until this amount of energy has been added.
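
In code form, that energy-counting target looks roughly like this (a minimal sketch using the example numbers from this post; the efficiency figure is my assumption):

```python
# Net energy the pack needs, per the example above, plus a gross figure
# including an assumed charging efficiency (illustrative, not a Tesla value).

capacity_kwh = 67.5     # BMS's current usable-capacity estimate
soc_now = 0.50          # current state of charge
charge_limit = 0.80     # user's charge setting

net_kwh = (charge_limit - soc_now) * capacity_kwh   # 0.30 * 67.5 = 20.25 kWh
charging_efficiency = 0.90                          # assumed
gross_kwh = net_kwh / charging_efficiency           # energy counted in

print(f"Net energy to add:  {net_kwh:.2f} kWh")
print(f"Gross, with losses: {gross_kwh:.2f} kWh")
```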

After the charge, when the battery comes to rest (if you are not driving immediately), the resting voltage is measured and the real SOC is determined.

For a full charge, 4.20V is the maximum supply voltage allowed. 100% SOC is reached when the charge current has dropped below a certain value during the 4.20V supply phase.

After the 100% charge stops, the voltage drops slightly (maybe to 4.19V or so at rest). An old cell will show a larger drop.
When you check the voltage on your car, the cells are not at rest, as the car is awake and using energy.
As for estimating degradation from how fast the voltage climbs: since you need to push the current in with increased voltage, you never see the real cell voltage during a charging session.
At the end of a 100% charge you reach 4.20V, which cannot be exceeded, so 4.20V is held until either the calculated energy (for a non-100% charge, as above) has been delivered or the current has dropped to the set level for a 100% charge.

After a charge the cell voltage drops slowly, and it takes an hour or two to reach the true open-circuit voltage.
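
Putting the two stop conditions together, the termination logic might look roughly like this (purely illustrative thresholds, not anything from Tesla's firmware):

```python
# Toy version of the two stop conditions described above: a partial charge
# stops when the counted energy target is met; a 100% charge holds the
# 4.20V ceiling and stops when the taper current falls below a cutoff.

V_MAX = 4.20           # maximum allowed cell voltage
TAPER_CUTOFF_A = 2.0   # assumed taper-current cutoff for a 100% charge

def charge_done(charge_to_full, energy_added_kwh, energy_target_kwh,
                cell_voltage_v, current_a):
    if charge_to_full:
        # CV phase: voltage pinned at the ceiling, wait for current to taper
        return cell_voltage_v >= V_MAX and current_a <= TAPER_CUTOFF_A
    # Partial charge: stop once the counted net energy reaches the target
    return energy_added_kwh >= energy_target_kwh
```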
 

So in order for the BMS to stop the charge based on the net energy it has pushed in, it needs an accurate assessment of the battery degradation? Otherwise, if it didn't realize the 75 kWh battery had degraded to 67.5 kWh, it would not be able to accurately calculate how many kWh it needs to push in to reach full (or whatever target the user has selected).

Interesting stuff! Thanks for the info! Also amused with the 4.20 max charge lol.
 
The BMS is quite good at following the battery's degradation and is most often quite close to the real capacity. (There are a few tricks to check that…)

If the BMS overestimates the capacity, the charge will overshoot the set level.
(I do not remember if I wrote that.)
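
For example, reusing the numbers from this thread (a hypothetical worked case):

```python
# If the BMS still believes the pack is 75 kWh when it is really 67.5 kWh,
# an energy-counted charge from 50% to an 80% setting overshoots.

believed_kwh = 75.0    # stale capacity estimate
true_kwh = 67.5        # actual degraded capacity
soc_now, charge_limit = 0.50, 0.80

energy_added = (charge_limit - soc_now) * believed_kwh  # 22.5 kWh counted in
actual_soc = soc_now + energy_added / true_kwh
print(f"Actual SOC after charge: {actual_soc:.1%}")     # ~83.3%, not 80%
```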
 