
Lower Charging Rate / Battery Longevity vs. Efficiency

My past experience with lithium-polymer-powered hobby helicopters taught me that lower charging rates helped battery longevity and possibly battery cell balancing.

Does this hold true with the Tesla battery? If I am not in a hurry, or want to charge directly from my solar panels, I lower the charging rate to 15-25 amps. By doing so, charging efficiency drops from the low 90% range to the low 70% range.

Anyone have opinions on why efficiency drops with lower charging rates, and whether the lower rates help with battery health or balancing?
 
How are you measuring efficiency? What have you lowered the charging rates from? And are those the amps the AC charger is pulling from the house, or the amps into the battery?

Charging at higher rates does cause higher temperatures and more degradation in the cells, but even with the 72 A charger the charge rate is about 0.18 C, which is quite low. There are lots of high-mileage cars that have shown very little loss of range, so I wouldn't worry about charging current too much.
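For reference, here is that C-rate arithmetic as a quick sketch. The 240 V supply and ~90 kWh pack size are illustrative assumptions, not Tesla specs; with those numbers the 72 A case lands near the 0.18 C figure above.

```python
# Rough C-rate check for AC charging -- illustrative numbers, not Tesla specs.
# C-rate = charge power / pack capacity; assumes 240 V and a ~90 kWh pack.

def c_rate(amps, volts=240.0, pack_kwh=90.0):
    """Approximate C-rate for a given AC charging current."""
    power_kw = amps * volts / 1000.0
    return power_kw / pack_kwh

for amps in (15, 25, 40, 48, 72):
    print(f"{amps:>2} A  ->  {amps * 240 / 1000:5.2f} kW  ->  {c_rate(amps):.2f} C")
```

Even the 72 A case is an order of magnitude below Supercharging rates.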
 
TeslaFi shows charging efficiency (among many other metrics). My maximum home charging is 48 A, which gives me around 91% charging efficiency. If I drop it down to 15-25 A, the charging efficiency drops to the low 70% range.

I also have an amp meter that keeps track of how many kilowatts are being used by the HPWC, which shows similar results.
 
So are you comparing the kWh out of the house vs. the kWh that are stored in the battery?
I suspect what you are seeing is the effect of the constant drain on the high-voltage battery: the lower your charge rate, the higher the percentage of the input this constant drain represents.

If I sit in my car with the air conditioning on LO, the miles-per-hour added drops from the usual 27 to 16 with my 40 amp charger.
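A quick back-of-the-envelope sketch of that effect. The 240 V supply is an assumption; the 27 and 16 mi/h figures are the ones reported above.

```python
# Sketch: how a fixed parasitic draw eats into the charge rate.
# Assumed for illustration: 240 V supply, 40 A charging.

supply_kw = 240 * 40 / 1000     # 9.6 kW from the wall
rate_full = 27                  # mi/h added with nothing else running
rate_with_ac = 16               # mi/h added with the A/C on LO

# Fraction of input power diverted away from the battery:
diverted = 1 - rate_with_ac / rate_full
print(f"Implied parasitic draw: {diverted:.0%} of {supply_kw} kW "
      f"= {diverted * supply_kw:.1f} kW")   # ~41%, about 3.9 kW
```

The smaller the input power, the larger the share any such constant draw takes.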
 
I doubt the battery will notice much difference between the charging rates you describe. Remember that other battery-powered consumer devices charge fast relative to their small battery sizes: a laptop might be fully charged in 1-2 hours, which is comparable to Supercharging on the Tesla scale (~1.5 C). Normally you don't Supercharge every time.

Charging from normal wall power ranges from around 0.05 C (3-4 kW) to 0.20 C (18 kW). Even though the power levels sound large, the battery is also big, so by comparison all wall charging is slow.

As for efficiency, I haven't read much about it, nor performed any experiments. If you live in a cold environment, charging slowly will also increase the time the battery has to stay warm, and thus increase energy consumption.

At home I'd go for the fastest option feasible with your electrical installation and not worry too much about efficiency. It's not like power is very expensive compared to petrol anyway :) (at least not in Norway, with expensive gas and just $0.10-0.15 per kWh). I charge at 7 kW (32 A on 230 V) and am happy with that.
 
Anyone have opinions on why efficiency drops with lower charging rates, and whether the lower rates help with battery health or balancing?

I believe the lower efficiency at low charging rates is due to charging overhead making up a larger proportion of the incoming power.
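A minimal sketch of that idea, assuming a fixed overhead draw (pumps, electronics) on top of the charger's conversion loss. The 500 W overhead and 95% conversion figures are illustrative assumptions, not measured Tesla values.

```python
# Sketch: wall-to-battery efficiency with a fixed overhead draw.
# Assumed for illustration: 240 V supply, ~95% AC-to-DC conversion,
# ~500 W of pumps/electronics running for the whole charge.

def charge_efficiency(amps, volts=240.0, conversion=0.95, overhead_kw=0.5):
    """Fraction of wall energy that ends up in the pack."""
    wall_kw = amps * volts / 1000.0
    battery_kw = wall_kw * conversion - overhead_kw
    return max(battery_kw, 0.0) / wall_kw

for amps in (12, 15, 25, 40, 48):
    print(f"{amps:>2} A  ->  {charge_efficiency(amps):.0%}")
```

With these made-up constants the curve reproduces the ~91% figure at 48 A; the reported low-70s at 15-25 A would imply a larger overhead draw than 500 W, but the shape of the effect is the same: the fixed draw matters more as input power shrinks.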

You'll need a bit of imagination for the next part.

I use a 120 V 15 A circuit to charge at home. I get 3-4 miles of range for every hour it charges. That's fine for me; I'm retired. Anyway, my use of electricity isn't as effective as that of those with 240 V charging connections, because a larger proportion of the charge power is used for overhead, pumps, etc.

I think the slow charge will be gentler on the batteries than slamming power in as I do at Superchargers. I went to visit someone who had a 14-50 plug put in for me; there I got about 33 miles of range for each hour spent charging. Until that 14-50 went in, I charged with a 30 A clothes-dryer plug adapter and a heavy 14-50 extension cord.

The 30 A 240 V would have provided 12-16 miles of range per hour of charging if it were at the same efficiency as the 120 V 15 A. I got 23 miles of range as I recall, so a much more efficient use of power. With the 14-50, the miles of range showed even more efficiency: going from 30 to 40 amps is a 33% increase, so I would have gotten about 30 miles if the efficiency were the same as at 30 A, and the 33 miles shows even better use of power at 40 A.
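That scaling argument written out as a sketch; the 120/240 V supply levels are assumptions, and the miles-per-hour figures are the ones reported above.

```python
# Sketch of the proportionality argument: miles added per kWh drawn
# would be constant if efficiency didn't change with charge rate.

setups = {
    "120 V 15 A": (120 * 15 / 1000, 3.5),   # ~1.8 kW, 3-4 mi/h reported
    "240 V 30 A": (240 * 30 / 1000, 23.0),  # 7.2 kW, 23 mi/h reported
    "240 V 40 A": (240 * 40 / 1000, 33.0),  # 9.6 kW, 33 mi/h reported
}

for name, (kw, mi_per_h) in setups.items():
    print(f"{name}: {mi_per_h / kw:.1f} miles of range per kWh drawn")
```

The ratio climbs with charge rate, i.e. higher rates make more efficient use of wall power, consistent with a fixed overhead draw.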

I don't know whether the gentle 120 V 15 A will extend battery life appreciably; I don't think there is any way to tell. Still, it makes sense to me that charging gently is probably better for the battery. It is more expensive, though: I'll pay quite a bit more for the range I get while charging at 120 V 15 A. Environmentally, I know I'm damaging the planet just a bit more with the slow charging. I am retired, so I don't drive much anyway. I have to find reasons to drive.

It's a great car, so quiet, so powerful, so smooth. Oh, yes, that imagination part. Imagine me driving, I'm smiling.

Best,
David
 
It will probably take an electrical engineer to confirm, but I think I read that chargers are more efficient as they approach their maximum capacity. If that's the case, it would explain the drop in efficiency as you lower the charge rate from maximum.

Psychologically, I like that I can charge directly from the sun, but my system is limited to 8 kilowatts or so. Given the roughly 20% drop in efficiency, it is cheaper to charge at my full 12 kilowatts and "pay the power company back" through their net-metering program.
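The arithmetic behind that trade-off as a sketch; the 91% and 72% efficiencies come from the numbers reported earlier in the thread, and the 10 kWh target is just an example.

```python
# Sketch: wall energy needed to put 10 kWh into the pack at the two
# efficiency levels reported above.

target_kwh = 10.0
for label, eff in [("full rate (~48 A)", 0.91), ("solar-limited (15-25 A)", 0.72)]:
    print(f"{label}: {target_kwh / eff:.1f} kWh drawn from the wall")
# ~11.0 kWh vs ~13.9 kWh -- the slow charge costs roughly a quarter
# more energy, which is why net metering at full rate can come out cheaper.
```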
 
There was a thread showing that, statistically, people who charge faster have slightly lower battery degradation. A possible explanation was that faster charging keeps the battery at elevated temperature (even despite the battery HVAC) for a shorter time overall.
 
Jeff Dahn confirmed this point in a lecture: increased charging time negatively impacts battery life. I understand that this is only one factor among many; just noting that it is a factor.

I do wonder what the optimal charge rate is. I would like to charge on a 110 V outlet if it does not negatively impact battery life.
 

Pretty sure that it's a small factor. Qualitatively, slower charging is worse for the battery, but quantitatively I expect the difference to be smaller than the difference between charging to 80% instead of 90% daily (which is itself much smaller than the difference between 90% and 100%).

If it otherwise fit my needs and usage, I wouldn't hesitate to charge on 120V daily.
 
It will probably take an electrical engineer to confirm, but I think I read that chargers are more efficient as they approach their maximum capacity. If that's the case, it would explain the drop in efficiency as you lower the charge rate from maximum.
The charger in my car can supply 48 amps. I charge at 40 amps, and it's 90 percent efficient. Not all of the charger's output goes into the battery, and as you charge at lower current, the power that isn't going to the battery becomes a higher percentage of what the charger is supplying.
 
Interesting topic. In the winter (when it is around freezing), I have always timed my charge at about 35-40 A (40 A is my HPWC max rate) so that it finishes just before I head off in the morning. In the summer, I tried to stretch out my charging at 15-20 A, thinking that my resistive losses (I²R heating) would be minimized. I never looked at the charging rate in terms of efficiency.

Ultimately, I don't think it makes any difference over the life of the car.
 
As far as efficiency goes, charging at a lower rate is less efficient because there is overhead wattage to run the battery cooler, computer, etc. while charging (approximately 500 W, from what I've read). So the faster you charge, the fewer kWh go to this overhead.
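To put that 500 W figure in perspective, here is a sketch of the total overhead energy for one charging session. The supply levels and the 30 kWh session size are illustrative assumptions.

```python
# Sketch: energy lost to a ~500 W overhead draw while adding 30 kWh
# to the pack at different charge rates (conversion losses ignored).

overhead_kw = 0.5
session_kwh = 30.0

for label, kw in [("120 V 12 A", 1.44), ("240 V 24 A", 5.76), ("240 V 48 A", 11.52)]:
    hours = session_kwh / kw
    print(f"{label}: {hours:4.1f} h  ->  {overhead_kw * hours:4.1f} kWh of overhead")
# The same 30 kWh session burns ~10 kWh of overhead at 120 V
# but only ~1.3 kWh at 48 A.
```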

The only thing I've read about health is that frequent Supercharging/DC fast charging wears down the battery faster. I wouldn't think any reduction below 40 A/240 V would give any significant health benefit.
 
When charging, the battery is kept at a higher voltage than when not charging at the same SOC, and battery degradation is influenced by the average voltage over the battery's lifetime. AC charging on a Model S is slow charging (less than 0.5 C), so it is best to charge as fast as possible on AC, to keep the battery at the elevated voltage for as little of its lifetime as possible.
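A sketch of why the pack sits above its resting voltage while charging: terminal voltage is roughly open-circuit voltage plus I·R across the pack's internal resistance. The pack voltage and resistance below are illustrative assumptions, not Tesla specs.

```python
# Sketch: terminal voltage rise during charging, V = V_oc + I * R.
# Assumed illustrative values: ~350 V pack at this SOC, ~0.08 ohm
# internal resistance, ~17 kW of charging power reaching the pack.

v_oc = 350.0         # open-circuit pack voltage (assumed)
r_internal = 0.08    # pack internal resistance in ohms (assumed)
charge_kw = 17.0     # DC power into the pack (assumed)

current = charge_kw * 1000 / v_oc    # ~49 A into the pack
v_rise = current * r_internal        # ~3.9 V above rest
print(f"{current:.0f} A -> pack held ~{v_rise:.1f} V above its resting voltage")
```

The offset itself is modest; the point above is about time: the longer the charge takes, the longer the cells sit above their resting voltage.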
 
Inverters have a bit of a bell curve in efficiency; I know this from my solar installs (home and work). There is a sweet spot for the inverter, usually around 80% of capacity. This is one of the reasons 110 V charging is so inefficient, along with the overhead discussed above.

As for the battery, I think you are splitting hairs between 20 and 48 amps when discussing battery degradation. Whenever I don't need the speed, I charge at 30 amps, on the theory that it will make the onboard charger last longer, especially in the summer, because it is not at max capacity and probably runs a bit cooler.
 
After reading these threads and looking at the arguments, I have moved from slow charging (15 A) to charging at 30 A. I can go to 40 A, but I noticed my cut-off switch accumulated a lot of carbon and stopped working when I was charging constantly at 40 A. I used a Dremel tool to brush off the carbon, and the switch is as good as new. However, to make that part last longer, I think I should not push it to 40 A very often unless I need a fast charge rate (which is very rare in real life).
 
Lots of discussions about this. To the battery it makes essentially no difference whether you charge at 20, 30, 40, or 72 amps. A higher rate causes more stress, but it also reduces the charge time, so the effects roughly even out. Same with temperature: a higher charge rate brings up battery temperature, but since the charge time is shorter, the cells cool down sooner. And so on. There is nothing to worry about. If the electrical system in your house is limited, maybe it's a good idea to play it safe and reduce the charge rate. But in terms of battery life, don't worry one way or another. Charging at CHAdeMO stations or Superchargers is a different story; Tesla has said it does affect the battery slightly.