
Ideal Charge Rate??


Perfect; Thanks!

I guess to prove that no good deed goes unpunished, I've got some questions:
  • Were these measurements all taken in the same temperature range?
  • What was that range?

I got my MS in December and have been charging it in my garage where temps have often been below zero.
I don't have a dedicated meter for the charger, so my data is pretty bad, but I'm seeing lower efficiencies and thinking temperature may be a factor.

Looking forward to spring!
 
All of that data was gathered in summer/early fall with mostly warm weather. It was rarely hot enough that the car's cooling fans would run. A couple of the low-efficiency outlier points were cases where the cooling fans did turn on during charging because it was fairly hot inside the garage. I didn't take any winter measurements, certainly none where the battery heater would be active, which would dramatically lower efficiency.

I didn't log the date or the temperature. The Google sheet does have the full history, so it should be possible to figure out the date of each entry and look up the temperature for Clarksville, MD on that date. I probably won't go to that trouble, though; if the history/dates aren't available to you and you want to do this, I could give you edit privileges so you could do it, just PM me.
 
I got my MS in December and have been charging it in my garage where temps have often been below zero.
I don't have a dedicated meter for the charger, so my data is pretty bad, but I'm seeing lower efficiencies and thinking temperature may be a factor.

Looking forward to spring!
Cold weather strategy for my wife’s S100D, kept on the driveway. Near Philadelphia, we get stretches of below-freezing weather.

She typically does only local driving, wants comfort, not range. Time of Use metering not available.

Objective is to have a warm battery pack when she unplugs the car and sets out at some unpredictable time between 08:00 and 10:00.

Strategy is to set Max charge rate and start time so that car is still charging when she unplugs, yet finishes the day with at least 90 miles of range. Wife is uncomfortable with lower charge level. She never let her previous ICE cars go below 1/4 tank.

Set Max charge level about 260 miles. Set charge rate about 30 Amps. Start charging at 05:30.

If car is approaching Max charge, skip charging for a day or two. If it is well below freezing in the morning, start preheat at 07:00.

Could charge up to 72 Amps from HPWC on 100 Amp circuit. Only do that when aiming for full charge before a long drive. 2-gauge copper cable to main panel gets warm at full-rate charge.
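
For anyone adapting this, a rough sketch of the arithmetic (the 240 V figure and the ~3 rated miles per kWh are my ballpark assumptions, not measurements):

```python
# Rated miles added between charge start and unplug time.
def miles_added(amps: float, hours: float,
                volts: float = 240.0, mi_per_kwh: float = 3.0) -> float:
    kw = amps * volts / 1000.0       # charging power in kW
    return kw * hours * mi_per_kwh   # rated miles gained

# Start at 05:30, unplug at 08:00 (2.5 h) at 30 A:
print(round(miles_added(30, 2.5)))   # ~54 rated miles, car still charging
```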

BTW - resistance heat loss is pretty much the same for a given kWh charge, regardless of charge rate. You can get your wires pretty warm for a short time with high Amps, or leave them cool to the touch for a long time with low Amps. Power lost to resistance (heat) is Amps (amount of current) times Ohms (resistance). Unless you’re dealing with superconductors or extreme temperatures, resistance doesn’t vary with temperature.

Thus, cutting Amps in half means you cut power loss in half.

Multiply power by time to find energy loss.

Cut your Amps in half, you’ll have to charge twice as long to pack the same number of miles into your car’s battery.

Thus, lose half as much power in the wire, for twice as long, and you’ve lost the same amount of energy.

Thicker (lower gauge) cable to your charge point does save some money in the long run by lowering resistance. Less power lost in the cable, more delivered to the charger.
 
...Power lost to resistance (heat) is Amps (amount of current) times Ohms (resistance). Unless you’re dealing with superconductors or extreme temperatures, resistance doesn’t vary with temperature.

Thus, cutting Amps in half means you cut power loss in half.

Multiply power by time to find energy loss.

Cut your Amps in half, you’ll have to charge twice as long to pack the same number of miles into your car’s battery.

Thus, lose half as much power in the wire, for twice as long, and you’ve lost the same amount of energy.

...

This is totally wrong. Power in a resistance is proportional to the SQUARE of current. Thus a higher current causes dramatically higher power loss in the resistance. Fortunately, the power lost in the wires and cables is small for home charging, so even dramatically increasing the current does not make those losses large in absolute terms.
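
A quick worked example of the square law (the wire resistance here is just an illustrative number):

```python
# P = I^2 * R: halving the current quarters the power lost in the wire.
R = 0.05                      # assumed wire resistance in ohms (illustrative)
for amps in (40, 20):
    print(f"{amps} A -> {amps**2 * R:.0f} W lost in the wire")
# 40 A -> 80 W; 20 A -> 20 W (one quarter, not one half)
```

And since halving the current doubles the charge time, the energy lost in the wire is still halved (a quarter of the power for twice as long), so lower current does win on wire losses.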
 
Strategy is to set Max charge rate and start time so that car is still charging when she unplugs, yet finishes the day with at least 90 miles of range. Wife is uncomfortable with lower charge level. She never let her previous ICE cars go below 1/4 tank.

Set Max charge level about 260 miles. Set charge rate about 30 Amps. Start charging at 05:30.

I have noticed at temps near or below freezing that the battery level will drop off 2-3% overnight, so the early-morning charge strategy makes sense for the highest charge when she leaves. But I think it would make sense to start it early enough so it's not still charging when she unplugs.

BTW, hacer's got the power vs. amps right. Just google Ohm's Law for more details on that ;)

I need to get my plug metered so I can get some good data on all the losses, multiple things affecting it obviously, and I'm wondering what effect the temperature has vs. charge current.

One issue I just found is that the "added kWh" ramps up with charging, which makes total sense, but then drops as vampire losses add up over time. It doesn't stop dropping until the car is unplugged, then it holds the last value. So that can throw off the efficiency calculations depending on when you collect that kWh added value.
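
If you script your data collection, one way around that is to capture the value at the moment charging ends. A minimal sketch (the `get_charge_state` helper is a placeholder for whatever REST API wrapper you use):

```python
import time

def final_energy_added(get_charge_state, poll_s: float = 30.0) -> float:
    """Return charge_energy_added captured when charging_state leaves
    "Charging", before post-charge vampire drain erodes the value."""
    last = None
    while True:
        state = get_charge_state()   # charge_state dict from the car's API
        if state["charging_state"] == "Charging":
            last = state["charge_energy_added"]
        elif last is not None:
            return last              # value as of the last charging sample
        time.sleep(poll_s)
```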
 
Willing to stop charging short of the target.

- My wife doesn't use much power on normal days.

- When she presses the button on the HPWC cable to disconnect, charging seems to shut down gracefully before the connector is unlocked. Similar to a Supercharger when we stop charging before 100% on a highway trip.

I accept the correction on my resistance calculation. Looks like lower charging current stuffs higher percent of the energy into the battery, ignoring:
- Any battery heating that may be required because charge current is insufficient to keep battery warm.
- Possible lower AC/DC converter efficiency at low power levels.
 
NO!

There has been some misinformation about this for a while now (at least regarding the refreshed on-board chargers). I have been gathering data (still in progress) that clearly shows that charging efficiency is better at reduced current. The efficiency here is measured as the ratio of "charge_energy_added" as reported from the car's REST API divided by the reading of the meter connected to my HPWC.

I have also verified, for 2 charging cycles, that the meter reading agrees with a time-integration of the product of "charger_voltage" and "charger_actual_current" from the REST API sampled every 3 seconds over the course of the charge. (This last part means that anybody can replicate these experiments entirely with the REST API, even without a utility meter dedicated to the charger.)

Below doesn't look like many points, but in fact there are a few points lying on top of each other (at least at this scale). I'll be publishing my full spreadsheet once I've completed my experiments. I provide this preliminary sneak preview because it clearly contradicts the mantra of "highest power is most efficient". I haven't yet tested below 25A, but I suspect 25A is near peak efficiency.
[Chart: measured charging efficiency vs. charge current]
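
For anyone replicating the time-integration check without a dedicated meter, a minimal sketch of the idea (the `get_charge_state` helper is a placeholder; "charger_voltage", "charger_actual_current", and "charging_state" are the actual charge_state fields):

```python
import time

def integrate_ac_energy(get_charge_state, sample_s: float = 3.0) -> float:
    """Rectangle-rule integration of charger_voltage * charger_actual_current
    while charging; returns AC energy in kWh to compare against a meter.
    Start it once charging is underway."""
    kwh = 0.0
    while True:
        s = get_charge_state()
        if s["charging_state"] != "Charging":
            return kwh
        watts = s["charger_voltage"] * s["charger_actual_current"]
        kwh += watts * sample_s / 3.6e6   # W*s -> kWh
        time.sleep(sample_s)
```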
I'm assuming this is at 240 V? Because if you are on a 208 V source you may find a higher current is more efficient... basically we should look at kW instead of amps.
25 A × 240 V = 6 kW, or about 60% of the charger rating (at least my charger).
208 V often drops to 200 V while charging, so 6 kW / 200 V ≈ 30 A.

208 V is common at destination chargers.
 
I don't think it's as simple as just looking at kW. Heat from resistive elements (wiring and charging components upstream of the battery) goes by the square of the current (I²R), so 10 kW at 208 V will generate more waste heat than 10 kW at 240 V, leaving less net energy for charging the battery. This is why the electric grid is run at such high voltage: to lower the wasted heat.
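
Rough numbers for that, assuming the same fixed resistance upstream of the battery in both cases:

```python
# Same 10 kW drawn at two voltages through the same fixed resistance.
R = 0.05                          # assumed upstream ohms (illustrative)
for volts in (240, 208):
    amps = 10_000 / volts         # I = P / V
    print(f"{volts} V: {amps:.1f} A, {amps**2 * R:.0f} W of I^2*R heat")
# 240 V: 41.7 A, ~87 W;  208 V: 48.1 A, ~116 W -> (240/208)^2 = ~1.33x more
```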
 

That's true, but the power (and current) on the secondary side is the same... thus I would guess you can charge at a slightly higher current at low voltage more efficiently.
 
Great thread, lots of good info. Losses in the onboard charger aside...

With a Tesla you can easily calculate ‘line losses’. Watch the voltage before it starts charging, then watch the voltage it drops to once it settles at the set current. Loss equals voltage drop times current. So if you drop 10 volts and are charging at 32 amps, you are losing 320 watts (single-phase system). If it were a three-phase system (Europe etc.) you would be losing 960 watts. As mentioned already, half the current will be one quarter the loss.
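
As a tiny sketch (the voltages below are just example numbers):

```python
# Line loss estimated from the voltage sag the car reports while charging.
def line_loss_watts(v_before: float, v_charging: float, amps: float,
                    phases: int = 1) -> float:
    # P_loss = delta_V * I per phase; multiply by phase count for 3-phase.
    return (v_before - v_charging) * amps * phases

print(line_loss_watts(245, 235, 32))             # 320 W, single phase
print(line_loss_watts(245, 235, 32, phases=3))   # 960 W, three phase
```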

I’m a Tesla novice, but I am keeping my current low(ish) because of this ‘current squared’ loss effect as current goes up.
 
I calculate losses by comparing the kWh reported by my energy meter with the values reported by the Tesla BMS via an OBD2 adapter and ScanMyTesla.

These 3 sessions are at 3-phase 230 V (400 V) at 24 A, meaning ~17 kW.

[Attachment: table of three charging sessions]


I'm comparing Nominal remaining kWh from the BMS at start/end with the kWh from the energy meter at start/end.

Interestingly, TeslaFi underreports the efficiency quite a bit. Energy added as reported by the API does not seem to be the right value to use. I'm wondering what that value actually means, as I expected it to be identical to the difference in Nominal remaining, but there is a ~3% deviation.

So far the losses are 12.3% on average when charging at 400 V / 24 A.

As my average consumption for my S100D is 224 Wh/km, the *actual* average consumption is 255 Wh/km. That is a significant difference!
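
The calculation itself is just the two deltas; a sketch with made-up numbers (not the ones from my sessions):

```python
# Charging loss from wall-meter vs. BMS "Nominal remaining" deltas.
def session_loss(meter_start: float, meter_end: float,
                 bms_start: float, bms_end: float) -> float:
    """Fraction of wall energy that never made it into the pack."""
    from_wall = meter_end - meter_start    # kWh billed at the meter
    into_pack = bms_end - bms_start        # kWh change in Nominal remaining
    return 1.0 - into_pack / from_wall

# Illustrative numbers only: 40 kWh from the wall, 35 kWh into the pack.
print(f"{session_loss(1000.0, 1040.0, 30.0, 65.0):.1%}")   # 12.5%
```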
 
The Energy Added is a measure of the energy flowing into the battery terminals. Not all of this energy goes to storage; some is lost as heat, but at low charging rates it's usually not very much. There is no good way to measure that loss outside of the lab (where calorimetry can do it).

However, the 3% deviation is quite large (and probably wrong), and I think this has more to do with how the Nominal Remaining is calculated. Tesla has a very complicated estimator for battery state of charge (SOC). It does a good job at what it is supposed to do (tell you how much range you have left), but I would not consider it accurate enough for measuring efficiency. Tesla has one of the best (if not the best) battery gauges of any EV, but it's still not up to the task of calculating charging efficiency.

There have been many reported examples near the highest and lowest SOC where the amount of energy drawn from the battery is markedly different from the delta Remaining charge. For example, it can sometimes read 0% SOC on the dash display (which of course is non-zero SOC in the energy Remaining API) yet the car can drive and consume several more kWh while the estimated SOC barely changes because the cell voltage is not dropping as the estimator predicted it would. Since this happens on the discharge side, it most certainly happens on the charge side as well.

The SOC estimator uses cell voltages, temperatures, histories, battery voltage drops while under load and all sorts of other things to estimate the present state of charge, but this is a hard problem because over most of the important range of SOC there are only small battery voltage deviations, so don't fault Tesla that it's not super precise.

TLDR: Take the energy Remaining number with a healthy grain of salt. Using the energy added will overestimate efficiency (because not all of that energy went into SOC) but only by a little and it's the best value we have for estimating efficiency.

Edit "grain of salt" is a colloquialism in English that means to be skeptical of something.
 
  • Like
Reactions: FlatSix911