Welcome to Tesla Motors Club

Better to charge higher or run battery lower?

Yeah, makes sense. I was just making sure the car wasn't pulling a lot of power while I took my readings. The no-charge draw was next to zero, so the gross/net values are really close.

...and an interesting thing I noticed while taking the readings: I measured both of the 120 V lines, and one consistently read a little higher in amps than the other. I averaged the two values for the chart.

I also noticed that the current fluctuated about 0.2 A every 10-15 seconds. It would roll up +0.2, then roll down, then after a few seconds roll up again, etc. I took the readings at the bottom of those fluctuations so they were as consistent as I could get.
 
Looking at your TeslaFi data, which shows a 17.7 mi/hr charge rate at 20 A, here are some rough calculations of efficiency at different rates. The numbers are all rounded, though, so there's extra error.

For our purposes here let's assign your car an arbitrary 240 Wh/mi internal consumption constant.

20A:
17.7 mi/hr x 240 Wh/mi = 4.25 kW (charge rate reported by the car, I believe this to be the best estimate of power delivered to the cells)
Charging power was 20 A x 233 V = 4.66 kW
91.2% efficiency.

10A:
It's at 8.5 mi/hr at 10 A (the 2nd 10 A entry, since the first one is probably a higher current in the taper, like 10.4 A?).
8.5 x 240 = 2.04 kW (power delivered to cells)
10 A x 237 V = 2.37 kW (charger power)
Efficiency = 86.1%

5% lost efficiency from 20 A to 10 A

7A (the 2nd one again):
5.8*240/(7*239) = 83%

Ignoring some unknown decimal places on the "charger actual current", the trend in efficiency calculated this way worsens as the current decreases, implying some fixed overhead losses.
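Those per-rate calculations can be reproduced with a quick script. The 240 Wh/mi constant is the arbitrary one assigned above, and the readings are the ones quoted from TeslaFi, so treat the output as illustrative:

```python
# Estimate onboard charging efficiency from TeslaFi-style readings.
# Assumes the arbitrary 240 Wh/mi internal consumption constant from above.
WH_PER_MI = 240

def charge_efficiency(mi_per_hr, amps, volts):
    """Ratio of estimated power into the cells to AC power at the plug."""
    battery_kw = mi_per_hr * WH_PER_MI / 1000   # e.g. 17.7 * 240 = 4.25 kW
    charger_kw = amps * volts / 1000            # e.g. 20 * 233 = 4.66 kW
    return battery_kw / charger_kw

for mi_hr, a, v in [(17.7, 20, 233), (8.5, 10, 237), (5.8, 7, 239)]:
    print(f"{a:>2} A: {charge_efficiency(mi_hr, a, v):.1%}")
```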
 
Ah interesting stuff! I'll have to look over your calculations again in the morning. It's time for bed and 4:30am comes up pretty quick! ;-)

I set the car to charge at 32A to see what TeslaFi reports the efficiency at. Someone had said it was believed that 32A was the magic number. We will see. Looking back in TeslaFi charge history, 20A gets about 94-96% efficiency. The overnight charges I've had at 30A showed around 93-96% efficiency.

I'll let 32A run for a number of days and then maybe I'll switch it over to 40A and see how that runs with efficiency. Fun to experiment with the different rates to find out if there are any sweet spots. Feel like I'm exploring a MMORPG but with electrons instead of mobs. heh
 
At what voltage and current is your 95% number tuned for? Or does it not matter?

The onboard charger efficiency metric is independent of input voltage. My car sees 238 volts at full load (24 amps via NEMA 14-30).

The inputs to determine power (in kW) going to the battery (after overhead/conversion losses) are:
- volts
- amps
- onboard ac to dc efficiency (%)
- overhead (watts)

Then you determine the charge time by dividing that figure into the energy (in kWh) to be added to the pack.
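A minimal sketch of that arithmetic. The 95% efficiency and 300 W overhead here are placeholder assumptions for illustration, not measured figures:

```python
def battery_power_kw(volts, amps, charger_eff=0.95, overhead_w=300.0):
    """Power reaching the pack after conversion losses and fixed overhead.
    charger_eff and overhead_w are illustrative placeholders."""
    return (volts * amps * charger_eff - overhead_w) / 1000

def hours_to_add(kwh_added, volts, amps, **kwargs):
    """Time to put kwh_added into the pack at a given AC voltage/current."""
    return kwh_added / battery_power_kw(volts, amps, **kwargs)

# e.g. adding 10 kWh at 238 V / 24 A (the NEMA 14-30 figures from the post)
print(f"{hours_to_add(10, 238, 24):.2f} h")
```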
 
I ran some super quick numbers doing a sweep of 30, 25, 20, 15, 10, 5A charge rates at an L2 station and polling the API. If I use ~350W I get close to 95% charger efficiency at all levels. SR+, no accessories on (HVAC off).

I will repeat later with everything from 5 through 30 A and maybe make a chart.

EDIT: Replaced first chart with one that also shows car’s estimated time to charge (~2 kWh) ... needs further analysis ... I don’t think my calculations match the car’s estimates, nor does 250 W...
[attached chart: charge efficiency sweep with the car's estimated time-to-charge]
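One way to back out a fixed-overhead figure like that ~350 W is to search for the wattage that makes the implied charger efficiency most constant across current settings. A sketch using the model from the post above (battery power = V x A x efficiency - overhead); the sample readings are invented for illustration, not actual data:

```python
def fit_overhead(samples, candidates=range(0, 501, 10)):
    """Pick the fixed overhead (W) that makes the implied charger
    efficiency, eff = (battery_w + overhead) / (volts * amps),
    most constant across the samples. Brute-force grid search."""
    def spread(ohw):
        effs = [(bw + ohw) / (v * a) for v, a, bw in samples]
        return max(effs) - min(effs)
    return min(candidates, key=spread)

# hypothetical readings: (plug volts, plug amps, watts into the pack)
samples = [(208, 30, 5578), (208, 20, 3602), (208, 10, 1626)]
print(fit_overhead(samples), "W")
```

With these invented samples (generated from a 95% / 350 W model) the search recovers 350 W; with real noisy data the minimum would just be the best-fitting grid value.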
 
If I use ~350W I get close to 95% charger efficiency at all levels. SR+, no accessories on (HVAC off).

Good data.

Were you inside or outside the car, and was the screen off? It is kind of splitting hairs to figure out whether the overhead is 250W or 350W, or whatever.

Also, I would not assume constant charger efficiency (as you can see, it is not constant - slightly lower efficiency at lower current). Though it may not vary that much. It is again, kind of splitting hairs - both your model and @pdx_m3s's model seem very close, and the dominant reason that charging faster is more efficient is the fixed overhead. It is I guess the only way you can "fit" your data - you just assume a constant overhead and then see what the charger efficiency works out to be - but I wouldn't just assume the value of overhead watts that gives the most constant charger efficiency is the "correct" value.

Specifically: I'd kind of expect a small drop in efficiency (transiently) as you go past 16A of charging, because then it has to use both chargers. Whether that puts 1A through one charger and 16A through the other, or 8.5A through each, I have no idea (and at 16A, is it putting 8A through each or 16A through one? - I would guess the latter). If the data is precise and accurate enough, it should be possible to see what happens. As you continue to go up to higher charge currents (say more than 20A), then the charger efficiency should get really good again since there would be plenty of current for both chargers to operate efficiently. The exact behavior will depend on how Tesla activates the chargers and whether they share loads (presumably they do what is most efficient overall, which is determined by the efficiency curve vs. output load of each AC-DC converter and the demanded current - but perhaps not).

Also if it's possible to grab the input voltage and current from the API as well (not clear whether you got that from the API or the screen), that might give you more accuracy of course.
 
Good data.

Were you inside or outside the car, and was the screen off?

I was inside, all accessories were off (HVAC, radio), but the screen was on in daytime mode at auto-brightness. The only way I have to control the current is from the in-car charge screen so to gather these 6 data points rather quickly I didn't think to change it then get out of the car and back in to change it each time :D

It is kind of splitting hairs to figure out whether the overhead is 250W or 350W, or whatever.

Yes, agreed ... but I enjoy hair-splitting and was intrigued by the possibility of proving or disproving if the charger efficiency was constant or not :)

Also, I would not assume constant charger efficiency (as you can see, it is not constant - slightly lower efficiency at lower current). Though it may not vary that much. It is again, kind of splitting hairs - both your model and @pdx_m3s's model seem very close, and the dominant reason that charging faster is more efficient is the fixed overhead. It is I guess the only way you can "fit" your data - you just assume a constant overhead and then see what the charger efficiency works out to be - but I wouldn't just assume the value of overhead watts that gives the most constant charger efficiency is the "correct" value.
Agreed.

Specifically: I'd kind of expect a small drop in efficiency (transiently) as you go past 16A of charging, because then it has to use both chargers. Whether that puts 1A through one charger and 16A through the other, or 8.5A through each, I have no idea (and at 16A, is it putting 8A through each or 16A through one? - I would guess the latter). If the data is precise and accurate enough, it should be possible to see what happens. As you continue to go up to higher charge currents (say more than 20A), then the charger efficiency should get really good again since there would be plenty of current for both chargers to operate efficiently. The exact behavior will depend on how Tesla activates the chargers and whether they share loads (presumably they do what is most efficient overall, which is determined by the efficiency curve vs. output load of each AC-DC converter and the demanded current - but perhaps not).

Also if it's possible to grab the input voltage and current from the API as well (not clear whether you got that from the API or the screen), that might give you more accuracy of course.

This was all automated input collection from the API (other than columns calculated from that info). Anything in lower_case_with_underscores is the actual API field name (all from the charge_state API endpoint, except where specified otherwise, e.g. drive_state.power).

The API reports the same values as the car charge screen. It is actually querying the car via Tesla's servers to get those numbers. Every time I looked they matched.

I was just intrigued by @pdx_m3s 's data and wanted to try to replicate it. I agree we are close. Only reason I bumped up from 250 to 350 was to try to get to his constant charger efficiency.

I hadn't considered the 2-chargers factor either. I'm relying purely on the API and have no current clamps or other measuring devices.

As an aside, I'm happy with my 219 Wh/mi figure for my SR+ (previously calculated from many Supercharger data samples recovered manually from a video). At 30 A, the calculated charger power is 5.45 kW while the drive_state API reports -5 kW. At 25 A the calculated power is 4.51 kW and the drive_state API still reports the same -5 kW. This tells me I'm in the sweet spot: any lower than 219 and my 25 A number would have rounded down to 4 kW (it was only 0.01 kW above the rounding threshold), and any higher than 219 and the 30 A number would have rounded up to 6 kW (mine was 0.05 kW below the rounding threshold).

My assumption is that the drive_state.power field is the DC power delivered to the battery for charging, rounded to nearest kW, and that the reported charge_state.charge_rate in mi/hr is that same underlying number multiplied by a constant dependent on your trim (219 Wh/mi for SR+).

When you are driving, drive_state.power is a positive number, unless you are doing regen, then it is negative.

Also note that the loss is clear in at least one data point even from the rounded power numbers. At 30A the charge_state.charger_power was 6 kW, while the drive_state.power was -5 kW (for all my other data points, 25, 20... 5A the charger and drive state powers match).
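That bracketing argument can be checked mechanically. A sketch: the mi/hr rates here are back-computed from the kW figures above, so this only illustrates the rounding logic, not independent data:

```python
# Check which Wh/mi constants are consistent with the rounded kW readings.
# charge_rate values (mi/hr) are back-computed from the post's kW figures.
readings = [  # (charge_rate mi/hr, |drive_state.power| kW, amps)
    (5.45 / 0.219, 5, 30),   # ~24.9 mi/hr
    (4.51 / 0.219, 5, 25),   # ~20.6 mi/hr
]

def consistent(wh_per_mi):
    """True if charge_rate * wh_per_mi rounds to the reported kW everywhere."""
    return all(round(rate * wh_per_mi / 1000) == kw for rate, kw, _ in readings)

print(consistent(219), consistent(216), consistent(222))
```

Values near 219 survive both rounding checks, while constants a few Wh/mi away fail one of them, which is the "sweet spot" argument above.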
 
I was inside, all accessories were off (HVAC, radio), but the screen was on in daytime mode at auto-brightness.

350W seems reasonable then. I always see 2A @ 240V reported on the screen briefly when no charging is taking place, before the Wall Connector relay opens, which would be a minimum of 1.5*240 = 360W in that state.

It will definitely be lower when you're not in the car and the screen is off. How much lower, I don't know.

Looks like super reliable data. Will be cool to see the "per amp" data plotted if you actually do get a chance to measure it later.
 
Ah yeah, I think I've seen that 2 A too... I'm at a 208 V install at work, I think, so my same "2 A" could still be ~360 W at 1.7 A :)
Hmm, wait, I think there's usually a 2 V reading there too? Hmm.

Anyways, they really need a display setting 'for geeks' that gives us at least one decimal point. C'mon Tesla! :)
 
Back to the OP's question, is it detrimental to allow the battery to reach a low SOC? I live in a semi-remote area, and I misjudged on the drive home and arrived with 9 miles of estimated range (didn't check the percentage). Ok on occasion, or always bad?
It’s a cumulative effect. Once or twice wouldn’t be measurable, but don’t make a habit of it.
 
Updated info from a new test performed in the same manner as previously (see my prior quoted posts above for info), visualized with 2 theoretical models of the charge efficiency using 250 or 350 watts of fixed charging overhead and 95% underlying charger efficiency.

This was for a 208-volt L2 charger, with a Model 3 SR+, multiple data points taken (2-3) at each amp setting in the car, from 5->30A.

Note, I did not notice any drop in efficiency crossing from 16 A to 17 A, as one might expect if crossing that point triggers a 2nd 16 A charger unit to kick in and lower the efficiency.

[attached chart: measured charge efficiency vs. charge current (5-30 A, 208 V), with 250 W and 350 W fixed-overhead model curves]


The 250 W overhead line (blue-dashed) is a fairly good fit.
 
Isn’t this the 350W line?

Yes!

Also, were you sitting in the car for this (sounds like yes)?

... and yes. Same scenario as previously posted where I change the current in the car via the charge screen, then capture the data via the API after it settles (and a few times repeated with results averaged).

Since I have no control over the current other than the on-screen menu, it's very unlikely I will repeat this test from outside the car at all 26 current settings, but I do already have some data at 30 A where it was charging normally for a couple of hours before the test.

When I get the chance I’ll see if I can spot a difference between the two, but it may be within the margin of error since the current has no decimal places — it could theoretically be 29.5-30.499 A when it says “30 A” and google tells me a 15” LCD uses only 18 W. I’m not sure what other savings there might be if outside of the car with the screen off while the car is charging — I can’t think of any, but there may be some sensors powered down that offer more savings? e.g. airbags?? I might try a different current setting from in and out of the car next time as well to see if I can notice the difference. If it was possible to notice I’d expect a 20 W difference or about 0.5% difference at 20 A, and 1% at 10 A ... hmm, I guess that is probably detectable. My 10 A calculations were 78.7%, 79.3%, and 79.3%. 20 A was 87.5, 87.5, and 87.7%.

... ok, so I went back now to look at the 30 A data I have ... the average efficiency from 2 data points 10 minutes and 1 hr 10 min before the test was surprisingly a whopping 2.2% better. That implies about 130 W less loss ... there may be more going on here than just the screen. Perhaps the battery cooling system was running during my test as well. It was in an underground parking garage at somewhere around 25°C and 60% SoC. Hmmm. I actually thought the alarm might add some loss while locked and the difference might be less than the LCD screen power. Welp, I will definitely be gathering more data to compare at a few more current settings, maybe 10, 20, and 30 A.

I may also try to incorporate I-squared R loss into the model in addition to the fixed overhead and fixed % losses to see if I can get a better fit.
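For anyone curious, here is a brute-force sketch of such a three-term fit (fixed overhead + proportional conversion loss + I²R). The sample points are generated from made-up parameters just to show the mechanics; real measurements would go in samples:

```python
from itertools import product

def fit_loss_model(samples):
    """Grid-search a three-term loss model:
        battery_w ~= eff * V * I - overhead_w - R * I**2
    over plausible ranges. samples = [(volts, amps, battery_watts)].
    Purely illustrative; a real fit would use proper least squares."""
    def sse(eff, ohw, r):
        return sum((bw - (eff * v * a - ohw - r * a * a)) ** 2
                   for v, a, bw in samples)
    grid = product([e / 100 for e in range(90, 100)],   # efficiency 0.90-0.99
                   range(0, 401, 50),                    # overhead 0-400 W
                   [r / 100 for r in range(0, 21, 5)])   # R 0-0.20 ohm
    return min(grid, key=lambda p: sse(*p))

# hypothetical samples generated with eff=0.96, overhead=250 W, R=0.10 ohm
samples = [(208, a, 0.96 * 208 * a - 250 - 0.10 * a * a) for a in (10, 20, 30)]
print(fit_loss_model(samples))
```

Three data points are just enough to pin down three parameters; with noisy readings you would want many more samples per current setting.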
 
Yet another chart. This one was at a different, slightly lower-powered "6 kW" ChargePoint station ... for whatever reason the numbers look lower than the previous data, but I slapped them onto the same chart nonetheless to compare with the prior one.

There are 6 new datapoints (from 18 underlying samples) of the charge efficiency at 10, 20, 30A at ~196V taken with the same data sampling from the API technique, but from both inside the car, and outside the car with the doors locked (and the display off).

You can see the difference is not insignificant: over 2% at 30 A, and about 1.6-1.7% at 20 A and 10 A. I would have expected it to be smaller at 30 A, but for some reason it wasn't. I probably need a lot more data samples to weed out random rounding/sampling errors.

[attached chart: charge efficiency at 10, 20, and 30 A (~196 V), measured from inside vs. outside the car]


So, in the end, the conclusion is that butt in seat, screen on (daytime mode), no HVAC, costs about 34-141 watts compared to the car being locked with the screen off and no driver present.
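For reference, that watt range is just the efficiency gap times the AC input power. A quick sketch using round-number deltas in the range reported above:

```python
VOLTS = 196  # approximate L2 station voltage from the test above

def extra_watts(amps, delta_eff, volts=VOLTS):
    """Extra AC draw implied by an efficiency gap of delta_eff at a given current."""
    return delta_eff * volts * amps

for amps, delta in [(10, 0.017), (20, 0.017), (30, 0.022)]:
    print(f"{amps} A: ~{extra_watts(amps, delta):.0f} W extra with driver present")
```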
 