Battery Degradation Scientifically Explained

@EV-Tech Exp

I did a couple of searches and briefly skimmed this thread before asking; apologies if I missed the answer:

Regarding the maximum discharge rate compared to the maximum charge rate for a battery - for the Tesla Model 3 battery chemistry specifically, any idea what the relationship is? I imagine they correlate (though higher discharge rates appear to be allowed at higher SoC - the inverse of what happens with charging rates), but it looks like discharge is allowed to be faster than charging (or it may be this way only because of the likely short duration of the high discharge rate... which makes me wonder about the Model 3 packs that are being tracked at close to maximum power for extended periods...).

It looks like the max discharge rate is somewhere around 350 kW right now from the 75 kWh battery, so about 4.7C. Are there any conclusions about the maximum possible discharge rate that we can draw from the 250 kW (3.3C) maximum charge rate allowed by the battery?
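
For anyone who wants to check the arithmetic, C-rate here is just power divided by nominal pack energy; a quick Python sketch (assuming the nominal 75 kWh figure above):

```python
# C-rate = power (kW) / nominal pack energy (kWh); 1C fills or empties the pack in one hour.
PACK_ENERGY_KWH = 75.0  # nominal pack energy assumed above

def c_rate(power_kw: float, energy_kwh: float = PACK_ENERGY_KWH) -> float:
    return power_kw / energy_kwh

print(f"Discharge: 350 kW -> {c_rate(350):.1f}C")  # ~4.7C
print(f"Charge:    250 kW -> {c_rate(250):.1f}C")  # ~3.3C
```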

Just curious what the likely output power limit of the Model 3 Performance pack is (perhaps the power is limited by the motor and not the pack?).

I would guess that a lot more power might be possible, but Tesla might be opting not to allow it because an elevated power output would have to be pulled back quite quickly (not necessarily due to the battery, but due to cooling requirements elsewhere). One of the advantages of the Model 3 over prior models is that it tends to handle this quite gracefully.
 
@EV-Tech Exp
I'm curious - I had a debate with someone who said that high C-rate outputs used under max acceleration contribute a large chunk of battery degradation.
He gave this as reference:
Calculating the Battery Runtime - Battery University
(Figure 4)
I see that source is really outdated; are there any new studies on the effects of short, high C-rate outputs?

On the assumption that this person is correct and our love for acceleration really does significantly decrease the lifespan of the battery, it would make sense that, instead of a completely new battery pack from Maxwell tech, they make some special 2 kWh capacitor so the battery never has to take those peaks in power from discharge or regen. That would even allow regen in winter.
 
it would make sense that, instead of a completely new battery pack from Maxwell tech, they make some special 2 kWh capacitor so the battery never has to take those peaks in power from discharge or regen. That would even allow regen in winter.

Yeah, I always pictured the capacitor as a cache for the battery, like RAM is a cache for the hard drive, and faster L2 cache is for slower RAM, etc.
 
Could someone clarify whether my current daily cycling of 70% -> 55% -> 70% (i.e. my charging is set to stop at 70% and my round-trip daily commute takes <20%) is a bad idea because it passes through 60% every time? Should I switch to charging to 60%? I have a once-a-week 40-45% use day and live in South Florida.
 
Thus, the conclusion was that if max SoC is taken below unstable levels (~90% SoC in the study), the difference that reducing max SoC further makes is negligible in comparison to the additional structural degradation induced by the regular phase changes of the cathode. The upper SoC stability level will be highly impacted by temperature, however, and will differ for every cell.

Sorry, but in plain English, are you saying that reducing the daily charge limit below 90% has a negligible effect on reducing battery degradation? If that's the case, should I just continue to charge to 90% every day?
 
The first 20 minutes of this video show that this guy knows a thing or two about Li-ion batteries. The final few minutes are the takeaways for minimizing battery degradation. His conclusions reinforce the credible advice I have seen previously.

This one video isn't Model 3-specific, but his other videos cover Model 3 battery technology, and his next video is about V3 Supercharging.
Thank you for the great information. I have a question, though, about frequency of charging... let's say someone charges every day for two hours (not reaching 80%), and someone else charges less frequently (let's say to the same max level). Would a battery charged three times a week get better longevity than a battery charged 7 days a week?
 

Personally, I have no idea what the answer to your question is. For sure you shouldn't leave your battery at a very low charge level... so if charging infrequently caused that, it would be bad.

There are a lot of variables to control for in your scenario, though. To control for some of them, let's assume that the two users above are: 1) driving the same small number of miles each week, and 2) never going below 40% between charges...

Then you have to wonder about the following - which I asked about before but didn't get an answer to:

An interesting piece of academic literature (I can't recall the title but will update the post later with the reference) found, for particular cells (which presumably contain specific degradation-suppressing additives), little to no increased degradation for cells cycled up to a maximum SoC of between ~60% and ~90%. Below a ~60% max SoC, degradation was reduced, and ~60% SoC corresponds to the approximate SoC at which the cathode structural phase change occurs for an NMC or NCA cathode.

Thus, the conclusion was that if max SoC is taken below unstable levels (~90% SoC in the study), the difference that reducing max SoC further makes is negligible in comparison to the additional structural degradation induced by the regular phase changes of the cathode.

So it seems like (in some battery chemistries) you want to avoid regular phase changes of the cathode. If charging every day happens to transition you between two charge states which cross that boundary, would it be worse than just crossing that boundary once a week? It depends on exactly where the boundary is (in the above hypothetical, say it was at 65%: the first user crossed it every day, driving from 80% down to 60%, whereas the second user drove from 80% down to 40% over the course of a week, and so only crossed it once every 2 days)...
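
Just to make the counting concrete, here is a toy Python sketch of the two hypothetical users above (the 65% boundary and both usage patterns are purely made-up numbers from this hypothetical, not anything measured):

```python
# Count boundary crossings per week for the two hypothetical usage patterns above.
def crossings(soc_points, boundary=65):
    """Count how many times consecutive SoC points straddle the boundary (%)."""
    return sum(1 for a, b in zip(soc_points, soc_points[1:])
               if (a - boundary) * (b - boundary) < 0)

daily_80_to_60 = [80, 60] * 7 + [80]      # charges to 80% nightly, drains to 60% daily
every_2_days_80_to_40 = [80, 40] * 3 + [80]  # drains 80% -> 40% over ~2 days, then recharges

print("Daily 80-60 user:        ", crossings(daily_80_to_60), "crossings/week")
print("Every-2-days 80-40 user: ", crossings(every_2_days_80_to_40), "crossings/week")
```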

I have no idea what the answer is, or how significant this structural phase change is. Remember the initial post was about STORAGE conditions and the optimum % for that - not the optimum for regular use conditions. When you're changing the battery state and transitioning it, the optimal % to charge to may not be the same as the optimal storage % (I have no idea).

All the above being said: personally, I don't worry about it. I charge to 80% or 90% as frequently as possible according to my needs, and assume it will be fine (since Tesla says that is cool). I find that 90% makes me more comfortable as it reduces the chance I'll ever be operating at a particularly low SoC.
 
@EV-Tech Exp

I did a couple of searches and briefly skimmed this thread before asking; apologies if I missed the answer:

Regarding the maximum discharge rate compared to the maximum charge rate for a battery - for the Tesla Model 3 battery chemistry specifically, any idea what the relationship is? I imagine they correlate (though higher discharge rates appear to be allowed at higher SoC - the inverse of what happens with charging rates), but it looks like discharge is allowed to be faster than charging (or it may be this way only because of the likely short duration of the high discharge rate... which makes me wonder about the Model 3 packs that are being tracked at close to maximum power for extended periods...).

It looks like the max discharge rate is somewhere around 350 kW right now from the 75 kWh battery, so about 4.7C. Are there any conclusions about the maximum possible discharge rate that we can draw from the 250 kW (3.3C) maximum charge rate allowed by the battery?

Just curious what the likely output power limit of the Model 3 Performance pack is (perhaps the power is limited by the motor and not the pack?).

I would guess that a lot more power might be possible, but Tesla might be opting not to allow it because an elevated power output would have to be pulled back quite quickly (not necessarily due to the battery, but due to cooling requirements elsewhere). One of the advantages of the Model 3 over prior models is that it tends to handle this quite gracefully.

For cells with a graphite anode, as per most lithium-ion batteries in the world, the stated charge rate is typically lower than the discharge rate, and this is typically limited by the rate of diffusion in the graphite, (from the core to the surface of a particle or vice versa). As the system is discharged, lithium ions leave the graphite anode, whereas when the system is charged, lithium ions are pushed into the graphite anode.

If the cell is discharged at an excessive rate, the voltage and thus the available power will rapidly drop, but no great harm is done. The charge phase, however, is deliberately limited to prevent lithium plating from occurring, which would be a safety concern. Consequently, it is possible to push much closer to the limit of performance on discharge than on charge. Furthermore, excessive power during charge causes much greater degradation than excessive power during discharge.

There may also be a difference in intercalation energy between the charge and discharge phases (exhibited as charge-transfer resistance) - I'm not sure though, and will need to look into it further.

So in summary, the charge rates are typically lower than discharge rates to ensure safety and maintain acceptable cycle life.
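
To visualise the asymmetry, here is a purely illustrative sketch - none of these numbers are Tesla's, they just show the shape of the argument: discharge is allowed to run hard across most of the SoC range, while the charge limit tapers as SoC rises to stay clear of plating:

```python
# Purely illustrative charge/discharge power caps vs SoC - invented numbers, not Tesla's BMS.
def discharge_limit_kw(soc: float) -> float:
    """Discharge can run near full power; it only rolls off close to empty (voltage sag)."""
    return 350.0 * soc / 10.0 if soc < 10 else 350.0

def charge_limit_kw(soc: float) -> float:
    """Charge power tapers with rising SoC to keep clear of lithium plating."""
    return 250.0 if soc < 50 else max(250.0 * (100.0 - soc) / 50.0, 5.0)

for soc in (5, 20, 50, 70, 90):
    print(f"SoC {soc:3d}%:  discharge <= {discharge_limit_kw(soc):5.0f} kW,"
          f"  charge <= {charge_limit_kw(soc):5.0f} kW")
```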


@EV-Tech Exp
I'm curious - I had a debate with someone who said that high C-rate outputs used under max acceleration contribute a large chunk of battery degradation.
He gave this as reference:
Calculating the Battery Runtime - Battery University
(Figure 4)
I see that source is really outdated; are there any new studies on the effects of short, high C-rate outputs?

On the assumption that this person is correct and our love for acceleration really does significantly decrease the lifespan of the battery, it would make sense that, instead of a completely new battery pack from Maxwell tech, they make some special 2 kWh capacitor so the battery never has to take those peaks in power from discharge or regen. That would even allow regen in winter.

High output rates can cause localised heating and mechanical stress, which can accelerate degradation. The impact of these will partially depend upon how you define 'short'; to build up sufficient mechanical stress in the anode particles for them to crack, you would need to build up a high concentration gradient of lithium between the particle surface and the particle core. You would probably struggle to do this with a couple of high-power bursts; I'd be more concerned about max-speed runs where you're at max power for prolonged periods of time. Short 0-60 events shouldn't have a huge impact, but I don't recall seeing a study that definitively proves that out. Of course, it also depends upon how close you are getting to the limit of capability of the cell.
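
As a very rough illustration of the "gradient needs time to build" point, here is a toy two-box model of an anode particle (surface + core). Every constant is invented; it is not a validated electrochemical model, it just shows that a sustained draw builds a far larger surface-to-core gradient than a short burst at the same power:

```python
# Toy two-box particle model: discharge pulls lithium out of the "surface" box, and the
# "core" box replenishes it at a rate proportional to the concentration difference.
def peak_gradient(draw_per_s: float, duration_s: float,
                  diffusion_per_s: float = 0.02, dt: float = 0.1) -> float:
    surface = core = 0.8          # start both boxes at 80% lithiation (arbitrary)
    peak, t = 0.0, 0.0
    while t < duration_s:
        surface -= draw_per_s * dt                       # discharge removes Li at the surface
        transfer = diffusion_per_s * (core - surface) * dt
        surface += transfer                              # core -> surface diffusion
        core -= transfer
        peak = max(peak, core - surface)                 # stress proxy: concentration gradient
        t += dt
    return peak

print(f"5 s burst at full power   : peak gradient ~ {peak_gradient(0.002, 5):.3f}")
print(f"120 s sustained full power: peak gradient ~ {peak_gradient(0.002, 120):.3f}")
```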

Yeah, I always pictured the capacitor as a cache for the battery, like RAM is a cache for the hard drive, and faster L2 cache is for slower RAM, etc.

If you see some of the other videos on my YouTube channel, you'll see I'm big into supercapacitors; however, a 2 kWh supercapacitor pack would be too large and heavy. It is a nice concept, but unfortunately the numbers don't quite work.

Roughly 10Wh/kg, hence 2kWh = 200kg of cells, which realistically would be about 300kg when converted to a pack.

You could use a smaller pack to get the required power - roughly 7kW/kg, hence 360kW would require about 50kg of cells, and 75kg of pack, excluding the additional mass that comes with a more complex energy storage system.

An additional 75kg, with space requirements which would probably take the entire frunk, to slightly reduce battery degradation? Not really worth it....
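
For reference, the rough sums behind those figures (the specific energy and power values are ballpark supercapacitor numbers, and the 1.5x cell-to-pack factor is an assumption):

```python
# Ballpark supercapacitor sizing used above: ~10 Wh/kg, ~7 kW/kg, ~1.5x cell-to-pack overhead.
WH_PER_KG, KW_PER_KG, PACK_FACTOR = 10.0, 7.0, 1.5

energy_sized_cells = 2000 / WH_PER_KG        # 2 kWh buffer sized by energy
power_sized_cells = 360 / KW_PER_KG          # buffer sized only to handle 360 kW

print(f"2 kWh buffer : ~{energy_sized_cells:.0f} kg of cells, "
      f"~{energy_sized_cells * PACK_FACTOR:.0f} kg as a pack")
print(f"360 kW buffer: ~{power_sized_cells:.0f} kg of cells, "
      f"~{power_sized_cells * PACK_FACTOR:.0f} kg as a pack")
```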
 
@EV-Tech Exp
Thanks.

We've noticed that when the battery is somewhat cool (far from frozen - say 50-60 degrees F), the car will limit regeneration more and more as you regenerate. Regen starts out limited, but then it becomes MUCH more limited. (This might happen over a 1-2% change in state of charge, say going from 70% to 72%.) The additional regen limiting is significant, way more than you would expect for a 2% change in SoC, especially considering the battery is presumably warming up a bit as energy sloshes into it and the motors generate a bit of waste heat which gets into the coolant.

Any idea why Tesla might pull back regen the more regen you do? Is it easier to stuff lithium ions into the anode initially, but then the battery gets tired of doing it?

Once the battery is really fully up to temp, this no longer seems to occur, of course. But it takes a long time!
 
Could someone clarify whether my current daily cycling of 70% -> 55% -> 70% (i.e. my charging is set to stop at 70% and my round-trip daily commute takes <20%) is a bad idea because it passes through 60% every time? Should I switch to charging to 60%? I have a once-a-week 40-45% use day and live in South Florida.

What you're currently doing is probably fine. Charging to 75% and staying above 60%, to avoid going through the cathode phase transition, *might* be beneficial, though it might not. When we observe degradation of the battery, we are observing the weakest link, which is usually the anode, so reducing cathode degradation may have absolutely no impact.

The honest answer is that I don't know the specific answer for this cell; however, I think what you're currently doing is fine, and I wouldn't worry about it too much, as this shouldn't be the largest degradation factor you have to worry about, given that you live in a hot climate.

Keep doing what you're doing!

Sorry, but in plain English, are you saying that reducing the daily charge limit below 90% has a negligible effect on reducing battery degradation? If that's the case, should I just continue to charge to 90% every day?

That was the case for the cells in the study I referred to; however, it is not universally true, as some cells are more limited than others by electrochemical stability at high potentials.

Given that Tesla seem to suggest 80% for their battery system as the sweet spot, that is what I'd recommend.

Although the principles of battery degradation are universally true, the relative effects cannot always be carried across different cells.

Thank you for the great information. I have a question, though, about frequency of charging... let's say someone charges every day for two hours (not reaching 80%), and someone else charges less frequently (let's say to the same max level). Would a battery charged three times a week get better longevity than a battery charged 7 days a week?

Usually, maintaining a smaller depth of discharge, i.e. charging more often (though not to higher than 80%), will result in better cycle life. Plugging in 7 days a week and going between 60-80% will be better than 3 times a week and going between 10-80% (assuming you're not in a hot climate - if you are, this may not be true). If it is within the approximately 35-80% window, it probably won't make too much of a difference, but when you allow the minimum SoC to get very low, it will have an impact.

As a rule of thumb, staying at 50 +/- 20% is a good place to be.
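
If it helps, here's a throwaway helper that scores a charging habit against that rule of thumb (the ~35% floor comes from the 35-80% window mentioned above; the two example patterns are the ones from the question):

```python
# Compare charging habits: depth of discharge, average SoC, and whether the minimum SoC
# dips below ~35% (the lower edge of the ~35-80% window discussed above).
def check_habit(name: str, soc_high: float, soc_low: float, low_floor: float = 35.0) -> None:
    dod = soc_high - soc_low
    avg = (soc_high + soc_low) / 2
    verdict = "fine" if soc_low >= low_floor else "minimum SoC getting quite low"
    print(f"{name}: {soc_low:.0f}-{soc_high:.0f}% (DoD {dod:.0f}%, avg SoC {avg:.0f}%) -> {verdict}")

check_habit("Plug in 7 days/week", soc_high=80, soc_low=60)
check_habit("Plug in 3 days/week", soc_high=80, soc_low=10)
```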
 
If you see some of the other videos on my YouTube channel, you'll see I'm big into supercapacitors; however, a 2 kWh supercapacitor pack would be too large and heavy. It is a nice concept, but unfortunately the numbers don't quite work.

Roughly 10Wh/kg, hence 2kWh = 200kg of cells, which realistically would be about 300kg when converted to a pack.

You could use a smaller pack to get the required power - roughly 7kW/kg, hence 360kW would require about 50kg of cells, and 75kg of pack, excluding the additional mass that comes with a more complex energy storage system.

An additional 75kg, with space requirements which would probably take the entire frunk, to slightly reduce battery degradation? Not really worth it....

Is that 7kW of power deliverable per kg, but stored at 0.01kWh of energy per kg?

So 10kg could handle 70kW of regen, but be full in 5s of regen at that rate?

Hmm... even that small amount of buffer might increase the overall pack safety by avoiding lots of these small momentary regen events where the energy charged gets quickly used in the next few seconds?

This still fits my computer cache analogy, as L2 cache can be 1-20 thousand times smaller than memory. Say 2 MB of cache for 8 GB of RAM.

That would be like an 18 Wh capacitor "cache" for a 75 kWh pack. I think even something as small as a 100 Wh capacitor could be useful.
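
Checking my own numbers (using the rough 10 Wh/kg and 7 kW/kg figures from above):

```python
# Sanity-check the small-buffer numbers and the cache-ratio analogy above.
buffer_kg = 10
wh_per_kg, kw_per_kg = 10, 7                       # rough supercapacitor figures from above

power_kw = buffer_kg * kw_per_kg                   # 70 kW of regen the buffer could absorb
energy_wh = buffer_kg * wh_per_kg                  # 100 Wh it could hold
fill_seconds = energy_wh * 3600 / (power_kw * 1000)
print(f"{buffer_kg} kg buffer: {power_kw} kW, {energy_wh} Wh, full in ~{fill_seconds:.0f} s at max regen")

cache_ratio = (8 * 1024) / 2                       # 8 GB RAM vs 2 MB cache ~= 4096x
print(f"75 kWh pack scaled by 1/{cache_ratio:.0f} ~= {75_000 / cache_ratio:.0f} Wh capacitor 'cache'")
```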
 
Personally, I have no idea what the answer to your question is. For sure you shouldn't leave your battery at a very low charge level... so if charging infrequently caused that, it would be bad.

There are a lot of variables to control for in your scenario, though. To control for some of them, let's assume that the two users above are: 1) driving the same small number of miles each week, and 2) never going below 40% between charges...

Then you have to wonder about the following - which I asked about before but didn't get an answer to:



So it seems like (in some battery chemistries) you want to avoid regular phase changes of the cathode. If charging every day happens to transition you between two charge states which cross that boundary, would it be worse than just crossing that boundary once a week? It depends on exactly where the boundary is (in the above hypothetical, say it was at 65%: the first user crossed it every day, driving from 80% down to 60%, whereas the second user drove from 80% down to 40% over the course of a week, and so only crossed it once every 2 days)...

I have no idea what the answer is, or how significant this structural phase change is. Remember the initial post was about STORAGE conditions and the optimum % for that - not the optimum for regular use conditions. When you're changing the battery state and transitioning it, the optimal % to charge to may not be the same as the optimal storage % (I have no idea).

All the above being said: personally, I don't worry about it. I charge to 80% or 90% as frequently as possible according to my needs, and assume it will be fine (since Tesla says that is cool). I find that 90% makes me more comfortable as it reduces the chance I'll ever be operating at a particularly low SoC.

The magnitude of the impact made by the cathode phase transition is small in comparison to that of SEI growth on the anode.

I've recently taken delivery of my car and plan to charge to 70-90% daily as my needs dictate and not worry about things too much!
 
@EV-Tech Exp
Thanks.

We've noticed that when the battery is somewhat cool (far from frozen - say 50-60 degrees F), the car will limit regeneration more and more as you regenerate. Regen starts out limited, but then it becomes MUCH more limited. (This might happen over a 1-2% change in state of charge, say going from 70% to 72%.) The additional regen limiting is significant, way more than you would expect for a 2% change in SoC, especially considering the battery is presumably warming up a bit as energy sloshes into it and the motors generate a bit of waste heat which gets into the coolant.

Any idea why Tesla might pull back regen the more regen you do? Is it easier to stuff lithium ions into the anode initially, but then the battery gets tired of doing it?

Yes - to prevent lithium plating.

You're delivering more and more lithium ions to the surface of a graphite anode particle, building up a big difference in lithium concentration between the surface and the core of the particle.

If the rate at which the lithium ions arrive exceeds the rate at which they naturally diffuse into the core of the particle, the lithium will form dendrites at the surface.

If you do some regen, lithium comes to the surface of a particle and then either diffuses into the core or gets used during a discharge. During a continued charge, the surface concentration continues to build, and Tesla limit the rate of regen to prevent this from getting to the point where lithium plating occurs.

This is more evident as the temperature drops, because the rate of diffusion drops.

Hope that is clear, though I suspect it may not entirely be!
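
If it helps, here's a toy "leaky bucket" sketch of the idea - regen fills a surface-lithium budget faster than diffusion drains it, so the allowed power shrinks as the budget fills. To be clear, this is not Tesla's actual control logic; every constant is invented:

```python
# Toy regen limiter: a "surface budget" fills with granted regen and drains slowly (diffusion).
# As the budget fills, the allowed regen power shrinks. Invented constants, illustration only.
def step(requested_kw, budget, max_kw=70.0, fill_rate=0.2, drain_per_s=0.5, dt=1.0):
    headroom = max(0.0, 1.0 - budget / 100.0)            # less headroom as the budget fills
    granted = min(requested_kw, max_kw * headroom)
    budget = max(0.0, budget + granted * fill_rate * dt - drain_per_s * dt)
    return granted, budget

budget = 0.0
for second in range(60):                                  # one minute of sustained 60 kW regen
    granted, budget = step(requested_kw=60.0, budget=budget)
    if second % 10 == 0:
        print(f"t={second:2d}s  allowed regen {granted:5.1f} kW  (budget {budget:5.1f})")
```

A colder cell would correspond to a smaller drain rate (slower diffusion), which is why the same trimming shows up sooner when the pack is cool.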



Is that 7kW of power deliverable per kg, but stored at 0.01kWh of energy per kg?

So 10kg could handle 70kW of regen, but be full in 5s of regen at that rate?

Hmm... even that small amount of buffer might increase the overall pack safety by avoiding lots of these small momentary regen events where the energy charged gets quickly used in the next few seconds?

That's right in terms of capability, however it is a lot of additional complexity and cost for the capability you receive...

Combined systems will be coming in the future, but they'll be battery/battery combinations rather than battery/supercapacitor....
 
That's right in terms of capability, however it is a lot of additional complexity and cost for the capability you receive...

Combined systems will be coming in the future, but they'll be battery/battery combinations rather than battery/supercapacitor....

Well, just like we have CPU, L1, L2, L3, L4, RAM, SSD, Hard-drive, cloud storage ...
We could eventually see capacitor/battery/battery systems.

If the added cost reduces the risk of spontaneous combustion by a factor of 10 or 100 I suspect it would be worth it to pursue.

Tesla isn’t telling us whose batteries are being selected for range reduction based on what criteria with the recent updates ... what if high frequent or total lifetime regen is a factor, or the factor?
 
Well, just like we have CPU, L1, L2, L3, L4, RAM, SSD, Hard-drive, cloud storage ...
We could eventually see capacitor/battery/battery systems.

If the added cost reduces the risk of spontaneous combustion by a factor of 10 or 100 I suspect it would be worth it to pursue.

Potentially - let's see where the technology takes us and what the best engineering solution for the available technology is!
 
Hope that is clear, though I suspect it may not entirely be!

Makes perfect sense. Sounds like a completely physical explanation for that mystery, which is quite non-intuitive. Normally people expect regen to become less limited as they drive (battery warms up!), but in fact, sometimes (in particular when doing a lot of regen!), it actually becomes MORE limited the more you drive.
 
Potentially - let's see where the technology takes us and what the best engineering solution for the available technology is!

Yes indeed.

I see a point in the future where they look back at our huge battery packs and laugh at them, like we do at computers of the past that were the size of rooms and less powerful than our phones are today. I mean, we only need on the order of a few pounds* of electrons to move us hundreds of kilometres. It's just a simple matter of "storing" them and making them go where we want ;)

*Too lazy to search for the post that had the math... it was either a few grams or pounds. Something really small anyways :)

That should be the minimum size of some distant future super-advanced battery technology :)
 
I've recently taken delivery of my car
Excellent - I hope you enjoy it. I'm sure you'll notice battery characteristics that are imperceptible to us laymen.
Makes perfect sense. Sounds like a completely physical explanation for that mystery, which is quite non-intuitive. Normally people expect regen to become less limited as they drive (battery warms up!), but in fact, sometimes (in particular when doing a lot of regen!), it actually becomes MORE limited the more you drive.
I don’t understand why long duration regen is limited when that regen is 10’s of kW (10-50+), whereas Supercharging is much higher. @EV-Tech Exp What allows the continuous Supercharging?
 
I don’t understand why long duration regen is limited when that regen is 10’s of kW (10-50+), whereas Supercharging is much higher. @EV-Tech Exp What allows the continuous Supercharging?

It might follow the supercharging profile curve or something a bit more conservative for some reason?

@AlanSubie4Life have you experienced limited regen after a long-duration regen when you are at a lower SoC like 20-30%, or only when you are up in the 70's SoC?

One thought ... it could have to do with the ramp up. SC-ing does a slower ramp, not an instantaneous start.
Since regen is like a one-second ramp every time, they might have to limit it more than SC-ing?
 