Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Does charging at 11 kW reduce battery life, vs 3-5 kW?

So it's conventional wisdom that supercharging will degrade your battery life, and that, as a rule of thumb, the slower you charge, the better it is for the battery.

There must be a point of diminishing returns. For instance, is charging at 1kw going to be twice as good for your battery as charging at 2kw? I doubt it.

But, I haven't been able to find any credible sources that have done actual research on this. Is there anything that shows the difference in battery degradation at charging rates of 11 kW and below?

In my particular scenario, I'm wondering if I should avoid charging my M3 at 11 kW (48 amps) if I'm charging overnight and could still get back to ~85% charge if I instead charged at 4 kW or 5 kW. Or, if there is no difference, I might as well leave the charging system at 11 kW at all times, and not bother "throttling down."

My electric company does not have time-of-use charging, so my only consideration is impact on battery life.
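A quick sanity check on the overnight scenario: the numbers below are assumptions for illustration (a ~75 kWh pack, recharging from 50% back up to 85%), not figures from this thread, so plug in your own pack size and daily usage.

```python
# Rough overnight-feasibility check. Assumed numbers, not from the thread:
# a ~75 kWh pack, charging from 50% back up to an 85% limit.
PACK_KWH = 75
needed_kwh = PACK_KWH * (0.85 - 0.50)   # ~26 kWh to add overnight

for rate_kw in (4, 5, 11):
    hours = needed_kwh / rate_kw        # time to reach the charge limit
    print(f"{rate_kw:>2} kW -> {hours:.1f} h")
```

Even at 4 kW the session comes in under 7 hours, so under these assumptions any of the rates in question fits a typical overnight window.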
 
Where you might actually hit diminishing returns is if you lower the charging rate so much that you cross into the region where a larger percentage of the power going into the car is running the computers rather than going into the battery. At 4 kW (roughly a 16 A charge rate at 240 V), maybe 1/15th of that power (~250 W) is eaten up by the car not sleeping. You would spend slightly more time (i.e., money) getting to your charge limit, since it would take a few more kWh. As @davewill says, the charging rates you are talking about are much, much lower than what you would see at a Supercharger.
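To put numbers on that overhead effect: the sketch below assumes a constant ~250 W draw while the car is awake and charging (a round guess consistent with the post above, not a measured figure), and shows how the fixed overhead eats a larger fraction of the energy at slower rates.

```python
# Sketch of the fixed-overhead effect: slower charging means more hours
# awake, so more energy lost to the car's electronics.
OVERHEAD_KW = 0.25       # assumed draw while the car is awake (a guess)
ENERGY_KWH = 30          # energy you actually want in the pack

for rate_kw in (1.4, 4, 7, 11):
    hours = ENERGY_KWH / rate_kw
    overhead_kwh = OVERHEAD_KW * hours              # wasted while awake
    efficiency = ENERGY_KWH / (ENERGY_KWH + overhead_kwh)
    print(f"{rate_kw:>4} kW: {hours:5.1f} h, wasted {overhead_kwh:4.1f} kWh, "
          f"efficiency {efficiency:.0%}")
```

Under these assumptions, 4 kW still lands around 94% efficiency, while dropping to L1-style 1.4 kW costs noticeably more; the loss only becomes significant at very low rates.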

I have been (almost) exclusively DC charging my car for just under 4 years now; there have been maybe one or two L2 sessions in that time. I had L2 charging where I worked for the first year I had my car, but I retired in Sept. 2019 and have put on about 20K miles since then on my 2018 LR RWD. Currently my projected 100% range is 300 miles; it was 315 or so when new. How is there only ~5% degradation with DC charging? I use a CHAdeMO adapter, which is design-limited to a maximum of 50 kW; the most I ever see out of it is about 45 kW. I also Supercharge about 5% of the time. Around town that's at an Urban (72 kW) Supercharger site, so the most I see there is 60-65 kW, and only when I let my battery get below 35-40% (which isn't all that much). I tend to start recharging at 50% and go up to 90%.
 
There must be a point of diminishing returns.
Diminishing returns, and at the very low end it eventually inverts: an extremely low charging rate is actually worse. I found a paper on this some years ago, and the reason is that simply being in the charging state is slightly damaging. If you turn the power down so low that the battery is charging around the clock, that constant low-level damage goes on for a very long time. It's better for the battery to finish charging at a medium/slow rate and then rest, not charging, for a long period in between sessions.
 
My electric company does not have time-of-use charging, so my only consideration is impact on battery life.
If you want to keep it that way, do your part and avoid adding to utility load for no good reason.

But, I haven't been able to find any credible sources that have done actual research on this.
Huh.
I settled on 40 A in order to finish charging my Chevy Bolt before 6pm. There are always trade-offs, and I chose the 6pm-10pm window as my priority time to not charge. I also try not to wait until after 10pm, because charging that late leaves the pack hot. During the winter I reduce the max charging rate to 24 A to reduce the risk of Li plating.

I think the tl;dr narrow answer is this: likely inconsequential in non-freezing weather. Winter is less clear, since good studies show Li plating at a 0.2 C-rate with SoC over 80% and, IIRC, a 20°F pack temperature.
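For context on how home AC rates compare to that 0.2 C figure: pack-level C-rate is just charging power divided by pack energy capacity. The sketch below assumes a 75 kWh pack (ballpark for a Model 3 Long Range; a Bolt's pack is smaller, so its C-rates run a bit higher) and a 240 V circuit.

```python
# Pack-level C-rate = charging power (kW) / pack capacity (kWh).
# Assumes a 75 kWh pack and a 240 V AC circuit (illustrative values).
PACK_KWH = 75

def c_rate(power_kw: float) -> float:
    """Approximate pack-level C-rate for a given charging power."""
    return power_kw / PACK_KWH

for amps in (48, 40, 24):
    kw = 240 * amps / 1000               # AC power at 240 V
    print(f"{amps} A ({kw:.1f} kW): {c_rate(kw):.2f} C")
```

Under these assumptions even 48 A (~11.5 kW) works out to roughly 0.15 C, below the 0.2 C rate cited for plating, though cold-pack and high-SoC conditions still matter.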
 
I find 11 kW ideal for home charging. It's fast enough that it's not inconvenient, and slow enough that it won't affect the battery in the long run.

In an extreme situation, a friend of mine went north during a harsh winter with the mobile charger plugged into a 220 V wall socket and the car parked outside. He couldn't charge at all: all the power was going to heating the battery up to a temperature at which it could accept charge. He had to tow the car to a fast charger; an 11 kW connection would have handled that situation fine.
 
In an extreme situation, a friend of mine went north during a harsh winter with the mobile charger plugged into a 220 V wall socket and the car parked outside. He couldn't charge at all: all the power was going to heating the battery up to a temperature at which it could accept charge. He had to tow the car to a fast charger; an 11 kW connection would have handled that situation fine.

I've seen my Tesla not put any charge into the battery at 40 A -- it all went to battery heating for the better part of 10 minutes. That is normal when the pack is cold.
 
This is an interesting article and exactly the kind of research I've been looking for:

While it's only comparing DC fast charging to L1/L2 charging, it seems a reasonable extrapolation that if DC fast charging causes "no statistically significant difference in range degradation" compared to L1/L2 charging, then there would be no statistically significant difference between, say, 10 kW and 5 kW charging.
 