Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Brand spanking new Model 3 LR AWD first full charge (460/499 km) range

Good to confirm. So the mystery is how the predicted range is determined.

It could be that the energy app uses the true* remaining energy above the non-brick shutoff level, while the battery gauge uses miles above “dashboard zero”, which could leave some variable amount of energy on the table if you stopped driving at “0”.

I haven’t done much number crunching on the energy app since I concur with you that the battery gauge is more important.

Another thing I wonder is about the delayed or non-existent increase in range when doing regen. If you just did a lot of regen the battery gauge doesn’t seem to tick up to reflect that ... maybe the energy app is factoring that in right away while the battery gauge uses it to refill some invisible buffer to use later.


*true = the car’s best estimate at the truth :)
 
Brand spanking new Model 3 LR AWD EPA rated range is 499 km or 310 miles. I have added about 560 km on the odometer and this is the 2nd charge. I charged the car to 100% and it's reporting only 460 km of range.

Now that's about 8% difference for a car with less than 600 km on it. Do I need to drain the battery to 5% and then recharge to 100% to see if there is anything to be worried about?

I haven't been hammering it or anything; the trip meter reads 164 Wh/km overall.
Which may translate to about 460 km, if my math is correct: (75 kWh / 16.4 kWh per 100 km) × 100 km ≈ 457 km.

I just drove about 50 km on it and the battery is reporting 82%, which should be around 409 km, but it's showing around 380.

Shed some light please.

Thanks
Mileage is an estimate, just like in your gasoline car. If you had a 20-gallon tank and an estimated 25 mpg, you'd expect a range of 500 miles. Maybe you will get it, maybe you won't. My BMW's estimated mileage would vary after every fill-up, and what I actually got varied even more based on how I drove. The electric car is the same. The battery percentage is correct: it's 100%, it's 50%, etc. The mileage is an estimate and will vary. If it freaks you out, just set the display to %.

You’ll get better range if you use cruise control a lot. An average is an average: for your 164 Wh/km, if you weren’t on cruise control, you spent part of the time using less energy and part of the time using more. Energy consumption rises steeply with speed, since aerodynamic drag grows with roughly the square of speed, so each extra unit of speed costs more energy than the last. I’m making up numbers to keep this easy, but say it takes 130 Wh to travel a km at 80 km/h; it might take 160 Wh at 90 km/h and 200 Wh at 100 km/h. Gas or electric, cruise control maximizes efficiency at a given speed because you aren’t accelerating and spending time at the steeper parts of the curve.
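To make the "steeper curve" idea concrete, here is a toy consumption model. The numbers are purely illustrative (a constant rolling-resistance term plus a drag term that scales with speed squared), not Tesla data:

```python
# Illustrative model only: energy per km = constant rolling loss
# + aerodynamic term growing with the square of speed.
def wh_per_km(speed_kph, rolling=80.0, aero_coeff=0.009):
    """Rough Wh/km for a made-up EV: rolling resistance plus v^2 drag."""
    return rolling + aero_coeff * speed_kph ** 2

for v in (80, 100, 120):
    print(v, "km/h ->", round(wh_per_km(v), 1), "Wh/km")
```

Note how each 20 km/h step costs more than the last: the jump from 100 to 120 km/h adds more Wh/km than the jump from 80 to 100 km/h, which is why time spent above your average speed hurts more than time below it helps.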

Folks get their panties in a twist over mileage estimates. Actual battery issues seem to be pretty rare, and the car notifies you if there is one. Enjoy your car.
 
Another thing I wonder is about the delayed or non-existent increase in range when doing regen

There is hysteresis. The range increases in 4-mile increments: if you don’t reach a full increment (say you add 2 miles of regen), the displayed range won’t change, but it also won’t decrement for the next mile until (for example) 3 rated miles are used up.
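The behavior described above can be sketched as a tiny state machine. This is an assumed model of the display logic, not Tesla's actual code; the 4-mile increment is the figure from the post:

```python
# Toy model of the described hysteresis: small regen gains are banked
# invisibly, and driving burns that credit before the display ticks down.
class RangeDisplay:
    INCREMENT = 4  # miles of banked regen needed before the display ticks up

    def __init__(self, rated_miles):
        self.internal = float(rated_miles)   # "true" rated miles tracked internally
        self.shown = int(rated_miles)        # what the driver sees

    def apply(self, delta_miles):
        self.internal += delta_miles         # regen positive, driving negative
        if self.internal >= self.shown + self.INCREMENT:
            self.shown = int(self.internal)  # big regen gain: display jumps up
        elif self.internal < self.shown:
            self.shown = int(self.internal)  # credit exhausted: display ticks down

d = RangeDisplay(100)
d.apply(+2)            # 2 miles of regen: display unchanged (credit banked)
print(d.shown)         # 100
d.apply(-1)            # drive 1 rated mile: comes out of the credit
print(d.shown)         # 100
d.apply(-2)            # credit gone after 3 miles used: display finally drops
print(d.shown)         # 99
```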
 
Brand spanking new Model 3 LR AWD EPA rated range is 499 km or 310 miles. I have added about 560km on odometer and this is 2nd charge. I charged the car to 100% and its reporting only 460 km range.

Now that's about 8% difference for a car with less than 600 km on it. Do I need to drain the battery to 5% and then recharge to 100% to see if there is anything to be worried about?

I haven't been hammering it or anything the odometer reads all time 164Wh/km.
Which may translate to 460km.. if my math is correct.. ( 75kWh/16kWh) * 100km = ~ 468km.

I just drove about 50km on it and the battery is reporting at 82% which should be around 409 but it's around 380.

Shed some light please.

Thanks

Where are you seeing 460km range and “around 380”? Next to the battery gauge at the top left, or in the energy app?

I wouldn’t suggest leaving it in percent as others have suggested ... that’s like sticking your head in the sand.

You just need to keep an eye on it and not fret over a few km changes, but larger drops you can fret about.

I’d say you are on the borderline, and if it’s worse after some time of charging regularly to 90% and a couple dips down to 50% or lower thrown into the mix, then I’d start bugging Tesla.

If it gets worse it might be a sign of a bad pack.

If you keep it on percent you won’t notice this.
 
Where are you seeing 460km range and “around 380”? Next to the battery gauge at the top left, or in the energy app?

I wouldn’t suggest leaving it in percent as others have suggested ... that’s like sticking your head in the sand.

You just need to keep an eye on it and not fret over a few km changes, but larger drops you can fret about.

I’d say you are on the borderline, and if it’s worse after some time of charging regularly to 90% and a couple dips down to 50% or lower thrown into the mix, then I’d start bugging Tesla.

If it gets worse it might be a sign of a bad pack.

If you keep it on percent you won’t notice this.
I am talking about the green battery gauge on the speedometer screen.

Question then. Should the kWh consumed since charge correctly reflect the energy consumed? Assuming there is 10 kWh consumed by the car on top of driving, then driving + other energy consumption should add up to the current capacity of the car, e.g. 75 kWh.

So driven + phantom + Sentry + AC + etc. should add up to ~75 kWh if the battery had no energy left.

Right now I've used 19 kWh and my battery reads 62% charge. Assume I've used 10 kWh in additional consumption plus 19 kWh of driving (~110 km; the car reads 16.4 kWh used since charge) over 4 days.

That means I used 29 kWh of the 75 kWh rated. That leaves 46 kWh, which is 61% of charge remaining. (This does match the % on the car.)

The question is: did my 16 h of Sentry + low AC consume over 10 kWh in 4 days? Or could it be that the battery had a fault and the capacity was low in the first place?
 
Question then. Should the kWh consumed since charge correctly reflect the energy consumed?

No. This meter (and the trip meter) ONLY counts energy use when not in Park.

The question is did my 16h of sentry + low AC consume over 10kWh in 4 days

Sentry consumes about 200 W, so 16 hours would be about 3.2 kWh (roughly 14 rated miles / 23 rated km), which would NOT show up on the meter in the car.
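That arithmetic is easy to check. Both figures here are rough assumptions: the ~200 W Sentry draw is the estimate from the post, and the ~230 Wh per rated mile gauge constant comes from measurements discussed later in this thread:

```python
# Standby drain estimate: watts x hours -> kWh -> rated km.
SENTRY_WATTS = 200              # rough Sentry Mode draw (assumption from the post)
HOURS = 16
WH_PER_RATED_KM = 230 / 1.609   # ~143 Wh per rated km (AWD gauge constant)

kwh = SENTRY_WATTS * HOURS / 1000
print(kwh)                                   # 3.2 kWh
print(round(kwh * 1000 / WH_PER_RATED_KM))   # ~22 rated km
```

So Sentry alone would explain only ~3 kWh of the OP's suspected ~10 kWh gap, and none of it shows on the in-car trip meter.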

Or could it be that the battery had fault and the capacity is low in the first place?

Based on what you first described it sounds like there may be some issues with your battery, but it is definitely TBD - see people's comments above. A picture showing % and rated miles remaining would help dispel confusion here.

E.g 75Kwh.

So driven + phantom + sentry + AC + etc = should add up to ~75kWh if Battery had no energy left.

That means I used 29kWh of 75kWh rated.

A couple comments:

1) Actual good battery kWh capacity is a true 78kWh based on all available information. (EPA submissions, running the battery until it is dead)
2) The meter in the car does not appear to measure "true" kWh. It seems to consistently read about 5% low. Really, it's just a meter, so it can read just about anything it wants; it doesn't matter what the units are. It is pretty consistent - which DOES matter for making it useful.
3) The maximum amount the meter could measure "Since Last Charge" is about 71.3 "kWh" (remember they aren't real kWh). This is for a full 310 to 0 rated miles discharge with no time spent in Park. AND on a battery that fully charges to 310 rated miles... (also assumes no net elevation loss - that would allow you to get higher numbers here since it's like an additional battery). So you see 71.3 "kWh" on the meter, when you consume about 75-76 true kWh (there is probably some reserve energy below 0 miles but don't count on it). I should mention that I've never proven this is the maximum, with an actual full discharge, but all trips I've taken have produced results consistent with this assertion.
4) YOUR battery seems to have LESS energy than the 71.3 "kWh" available. We "know" this because you stated your 100% charge was 286 rated miles (460 rated kilometers) (if we understood you correctly). That would mean your battery has 66 "kWh" available at a full charge. So that's the max YOU would ever see on the "since last charge" meter - if you did nothing but drive until 0 miles.
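The figures in (3) and (4) both follow from a single constant: on the AWD, the gauge appears to tick at ~230 "Wh" per rated mile (in trip-meter units, per measurements described later in this thread). A quick sketch:

```python
# One constant explains both numbers: ~230 "Wh" (trip-meter units)
# per rated mile on the LR AWD.
WH_PER_RMI = 230

def meter_kwh(rated_miles):
    """'kWh' the trip meter would show for discharging this many rated miles."""
    return rated_miles * WH_PER_RMI / 1000

print(meter_kwh(310))  # 71.3 "kWh": a full 310-rated-mile discharge, as in (3)
print(meter_kwh(286))  # 65.78 ~ 66 "kWh": the OP's 286-mile full charge, as in (4)
```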

This situation MAY resolve itself, but it is cause for concern - but you need to stick with it for a little longer to see whether the issue resolves itself.

As mentioned above, I would second the recommendation for you to keep the display set to "rated miles/km" not %, for now, until you understand what is going on.
 
No. This meter (and the trip meter) ONLY counts energy use when not in Park.



A couple comments:

1) Actual good battery kWh capacity is a true 78kWh based on all available information. (EPA submissions, running the battery until it is dead)
2) The meter in the car does not appear to measure "true" kWh. It seems to consistently read about 5% low. Really, it's just a meter, so it can read just about anything it wants; it doesn't matter what the units are. It is pretty consistent - which DOES matter for making it useful.
3) The maximum amount the meter could measure "Since Last Charge" is about 71.3 "kWh" (remember they aren't real kWh). This is for a full 310 to 0 rated miles discharge with no time spent in Park. AND on a battery that fully charges to 310 rated miles... (also assumes no net elevation loss - that would allow you to get higher numbers here since it's like an additional battery). So you see 71.3 "kWh" on the meter, when you consume about 75-76 true kWh (there is probably some reserve energy below 0 miles but don't count on it). I should mention that I've never proven this is the maximum, with an actual full discharge, but all trips I've taken have produced results consistent with this assertion.
4) YOUR battery seems to have LESS energy than the 71.3 "kWh" available. We "know" this because you stated your 100% charge was 286 rated miles (460 rated kilometers) (if we understood you correctly). That would mean your battery has 66 "kWh" available at a full charge. So that's the max YOU would ever see on the "since last charge" meter - if you did nothing but drive until 0 miles.

This situation MAY resolve itself, but it is cause for concern - but you need to stick with it for a little longer to see whether the issue resolves itself.

As mentioned above, I would second the recommendation for you to keep the display set to "rated miles/km" not %, for now, until you understand what is going on.

Just to clarify this, because I think it might be very confusing to people who haven't considered this before ... by "meter" above (specifically the red ones, as in (2)) you are referring to the trip meters or the battery gauge? ... since it seems to be confusing to me as well who has considered this :D

I call the things on the cards "trip meters", and the numbers and battery icon a "gauge"... but ... my understanding is the trip meters measure real kWh ... why would they not? and the gauge, displayed in km, indirectly measures "kWh" above zero.

If you are referring to the trip meters, how do you figure they are 5% low and do not measure "true" kWh?
If you are referring to the battery gauge, I agree.

Man, I really wish they'd posted screen shots of the trip meter when they ran the EPA test :D
 
by "meter" above (specifically the red ones, as in (2)) you are referring to the trip meters

Trip meters: the ones in the lower left hand panel.

why would they not?

Why would they be accurate? There is no reason for them to be.

I mean actual kWh as in real kWh. I do think the meters are self-consistent with the gauge.

gauge, displayed in km, indirectly measures "kWh" above zero.

The gauge measures “kWh” and the meter measures “Wh”/mi. (Or km). They are consistent with each other.


If you are referring to the trip meters, how do you figure they are 5% low and do not measure "true" kWh?
If you are referring to the battery gauge, I agree.

Because there is no way to reconcile the total displayed on the meter for a full discharge (which will not exceed 72”kWh”) vs. what we KNOW from the EPA test (78kWh (real)). There is no way there is 6kWh in reserve.

So we know the meter reads low. Or equivalently, we can say it doesn’t measure “real” kWh.

I think that the added kWh on the charging screen (when set to %) may reflect closer to “real” kWh. But I’m not sure about that. Would be easy to check. There are various experiments you can do using your charging efficiency formulas and see whether you can prove/disprove the assertions. I’ve got a few ideas but won’t go into them in detail since they are hard to describe without a lot of words :) . I’ll try them and post back at some point.
 
Why would they be accurate? There is no reason for them to be.

Well, to be clear, I expect them to do their best at measuring “real kWh”; any error or loss from low power draw or low regen inaccuracy is different from the battery gauge, which I feel has been intentionally programmed to tick down more slowly, for safety and a below-zero buffer.



Because there is no way to reconcile the total displayed on the meter for a full discharge (which will not exceed 72”kWh”) vs. what we KNOW from the EPA test (78kWh (real)).

There is no way there is 6kWh in reserve.

I still wonder about the EPA method of seemingly multiplying measured mAh by “average voltage”. Also, “dyno mode” turning traction control off might also enable a VW-testing mode that lets it use all of the buffer past zero? :)

So we know the meter reads low. Or equivalently, we can say it doesn’t measure “real” kWh.

Do we know this? Have you ever confirmed the Wh/rmi at a low SoC (below 20%)? The rate could change.

I think the meter wouldn’t consistently have the same error on purpose, so any error is true error and likely small if you always get the same results in your tests.

I think that the added kWh on the charging screen (when set to %) may reflect closer to “real” kWh. But I’m not sure about that. Would be easy to check. There are various experiments you can do using your charging efficiency formulas and see whether you can prove/disprove the assertions. I’ve got a few ideas but won’t go into them in detail since they are hard to describe without a lot of words :) . I’ll try them and post back at some point.

For all my data points I have charge energy added from the API in kWh (with 2 decimal places to boot!), so let me know what you are thinking. Just a simple reported-miles-added per energy-added “consumption” number?
 
Well, to be clear, I expect them to do their best at measuring “real kWh” and if their is any error or loss from low power draw or low regen inaccuracy that’s different than the battery gauge which I feel has been intentionally programmed differently to tick down slower for safety and a below zero buffer.

I don’t think it will necessarily be accurate, and it makes people feel better if the number is a little low. It’s 100% consistent with the gauge, as you know, though. The gauge when we use the 230Wh/rmi (AWD) constant and the trip meter are measuring the same number of kWh as far as I can tell.

I still wonder as to the EPA method of seemingly multiplying measure mAh by “average voltage”

They measure volts and amps with calibrated instruments. It should be robust. It is supposed to be repeatable and sometimes it is re-checked by the EPA.

Do we know this. Have you ever confirmed the Wh/rmi at a low SoC (below 20%)? The rate could change.

No, I have not checked that low. I’ve checked over a range of 30% to 90%.
The gauge could be nonlinear at the bottom end (tick off more slowly), which would resolve some of the discrepancy, but I have seen no evidence of that.

However, while I have not checked myself, there are various complaints out there that the trip meter cannot be made to read more than 72kWh for a full discharge (with no time spent in park, no vampire), and somehow people use this to argue the battery is smaller than Tesla says. Haha! It’s super odd...the much more likely explanation, backed up by the evidence we have (EPA!) is that the meter reads low and there is some reserve.


I think the meter wouldn’t consistently have the same error on purpose, so any error is true error and likely small if you always get the same results in your tests.

I think the meter consistently reads low.

For all my data points I have charge energy added from the API in kWh (with 2 decimal place to boot!) so let me know what you are thinking.

Could be useful. I’ll try looking at a couple things and if it seems promising, I can describe what you should gather from the API. The core thing I want to try is to compare the energy added on the display, on the charging screen, to the miles added to the battery gauge (converted to kWh using the 230 Wh/rmi constant). I suspect these may differ. And also measure energy from the Chargepoint concurrently (which obviously will be 5-10% higher).
 
The core thing I want to try is to compare the energy added on the display, on the charging screen, to the miles added to the battery gauge (converted to KWh using the 230Wh/rmi constant). I suspect these may differ. And also measure energy from the Chargepoint concurrently (which obviously will be 5-10% higher).

This is a recent supercharger session that I had some data points for. The bolded top row is the average of the signed deltas below.

[attached image: table of supercharger session data points]


This was over about 20 minutes and ~40% added. SR+ at a V2 station starting at 100 kW tapering to ~65 kW by the end. The session added ~100 miles of range.

The third column just proves that my 219 Wh/mi number is accurate (for the SR+) as the internal constant they use to map energy to distance on the charge screen (for both power -> ‘speed’, and energy -> ‘miles added’). Average is 0.000 and the min/max error was -0.24 to 0.23.
  1. Battery range is reported in miles with 2 decimal places (battery_range, same as battery_range_ideal on the 3)
  2. Energy added is reported in kWh with two decimal places (charge_energy_added)
  3. Miles added is reported in miles in half-mile increments (I only ever see .0 or .5) (charge_miles_added_rated, same as _ideal on the 3)
This last one is going to account for a lot more variance in the error in my first two columns, but even so, they mostly very tightly agree with each other. Especially of note is that they do not diverge as the session continues.

So I’d say the charge screen miles “added” and dashboard delta miles added are going to generally agree.

They get added at 219 Wh/mi but tick down at a lower rate ... perhaps this is ~5% loss going in and out of the battery? Hmmm. I haven’t thought about this too much.

Why would they (presumably using the same instruments on the DC side of things) intentionally measure two different numbers for kWh? And if not intentionally, why would the numbers differ if measured the same way?

Is there a 5% loss factored in from moving energy into then out of the battery?

In that case there isn’t “true kWh” but kWh-in and kWh-out. Although they could factor that in ahead of time when reporting the miles added.

I dunno. :)
 
A bit confused. Just need some definitions:

Energy Added / 219Wh/mi => This is the +kWh showing on the large charge screen during a charging session (but pulled from the API?)
(Units of charge kWh?)

Charge Screen Added (miles) => This is the + miles showing on the large charge screen during a charging session (but pulled from the API?) (Units of charge miles?)

Dashboard Delta => This is the difference in current displayed rated miles next to the battery gauge, from the initial value displayed there before starting the charge? (Units rmi?)

Final confusion:

Looking at your column headers:

Column 3 - Column 2
= ( Energy Added /219 Wh/mi - Charge Screen Added ) - (Dashboard Delta - Charge Screen Added)
= (Energy Added/219Wh/mi - Dashboard Delta)
= Column 1

However, Column 3 - Column 2 != Column 1 in your columns above.


In your data, if I'm reading it correctly, I see a 4.3% discrepancy there in column 2 (which is the sort of discrepancy I believe exists), but I'm just confused about your definitions and your data.

They get added at 219 Wh/mi but tick down at a lower rate ... perhaps this is ~5% loss going in and out of the battery? Hmmm. I haven’t thought about this too much.

Actually I think to make things work they'd have to tick down at a faster rate (210Wh/rmi would be faster than 219Wh/rmi). But yes I suspect this could be the case. Fortunately, there is no need for mystery: it's knowable by examining exactly the type of data you have gathered, and then supplementing that by observing the behavior of the battery gauge (delta rated miles) during a driving event and comparing to the trip meter (kWh).

Is there a 5% loss factored in from moving energy into then out of the battery?

Remember that in the EPA test they MEASURE ~78kWh (through 4 different current clamps). So if the discrepancy exists it's not due to internal battery losses because those could not be measured. There are 78kWh of AVAILABLE energy in the battery (for a particular current draw matching the EPA test setup - the available energy depends on current due to internal resistance losses).
 
A bit confused. Just need some definitions:

Energy Added / 219Wh/mi => This is the +kWh showing on the large charge screen during a charging session (but pulled from the API?)
(Units of charge kWh?)
The API shows the +kWh from the charge screen as charge_energy_added (kWh). I’ve converted this one energy number to miles by dividing by 219 Wh/mi so as to compare it to the dashboard miles change, and the miles added reported on the charge screen.

Charge Screen Added (miles) => This is the + miles showing on the large charge screen during a charging session (but pulled from the API?) (Units of charge miles?)

Yes. charge_miles_added_rated or _ideal (both the same for the 3)

Dashboard Delta => This is the difference in current displayed rated miles next to the battery gauge, from the initial value displayed there before starting the charge? (Units rmi?)
Yes, from the API as well (battery_range or battery_range_ideal, which are the same for the 3).

Final confusion:

Looking at your column headers:

Column 3 - Column 2
= ( Energy Added /219 Wh/mi - Charge Screen Added ) - (Dashboard Delta - Charge Screen Added)
= (Energy Added/219Wh/mi - Dashboard Delta)
= Column 1

However, Column 3 - Column 2 != Column 1 in your columns above.

I have the middle column labeled wrong (or flip the sign on all the numbers). Originally, I just had absolute values but then I changed to signed deltas.

You can see from the delta of column A vs column C that the numbers match column B exactly (but the wrong sign).

In your data, if I'm reading it correctly, I see a 4.3% discrepancy there in column 2 (which is the sort of discrepancy I believe exists), but I'm just confused about your definitions and your data.

No, these numbers are all in miles, no percent. 0.043 miles is the average of the signed errors.

After the end of a +100 mile charge, the differences are only 0.02, 0.07, 0.05 miles between the 3 different methods of measuring charge added.

Actually I think to make things work they'd have to tick down at a faster rate (210Wh/rmi would be faster than 219Wh/rmi). But yes I suspect this could be the case. Fortunately, there is no need for mystery: it's knowable by examining exactly the type of data you have gathered, and then supplementing that by observing the behavior of the battery gauge (delta rated miles) during a driving event and comparing to the trip meter (kWh).

Ya, by tick down at a “lower” rate I confusingly meant the number is lower than 219 Wh/mi, as measured using the trip-meters-over-a-long-trip method, so ya the miles drop “faster” when used than when added. I forget what my number is ... 205? 209? A lower number than 219 :)

Remember that in the EPA test they MEASURE ~78kWh (through 4 different current clamps). So if the discrepancy exists it's not due to internal battery losses because those could not be measured. There are 78kWh of AVAILABLE energy in the battery (for a particular current draw matching the EPA test setup - the available energy depends on current due to internal resistance losses).

Ya, I dunno. I think it’s a combination of (a) the magical energy buffer being used (filled up) on the way down and (b) miles below dash zero. I don’t see why they’d measure lower on the actual trip meter intentionally. I do see why they would on the gauge. Maybe the meter uses dashboard numbers instead of real numbers like you said.
 
Ya, I dunno. I think it’s a combination of (a) the magical energy buffer being used (filled up) on the way down and (b) miles below dash zero. I don’t see why they’d measure lower on the actual trip meter intentionally. I do see why they would on the gauge. Maybe the meter uses dashboard numbers instead of real numbers like you said.

Mystery solved!

So I just did a charge, and gathered some data.

Chargepoint: 13.297kWh (30A @ ~208V)

Stats Miles Added: 51.5 (Car Charge screen displayed +51 miles; Stats rounds to the nearest half mile...)
Stats kWh Added: 12.56kWh (Car Charge screen displayed 13kWh) (Charging Efficiency: 94.4%)

Wh/rmi added: 12.56/51.5 = 244Wh/rmi

I looked at more historical Stats data with longer charges - and the exact ratio is 245Wh/rmi. So the answer is 245Wh/rmi.


The key point is that: this does not match my 230"Wh"/rmi constant I get whenever I compare the Trip Meter results to the battery rated miles reduction for a trip with only driving.

I expected different constants, and that's what I got. Mystery solved.

Summarizing:

1) AC Energy = AC Energy from wall (From Chargepoint)

2) DC Available Energy Added = (AC Energy * AC-DC Efficiency - Overhead Energy) * Battery Charging Efficiency (which inherently accounts for discharge inefficiency) (Reported by Stats and the charge screen as kWh added in energy display mode)

3) Rated Miles Added (Reported by Stats and the charge screen as miles added in distance display mode)

4) My calculation: Rated "Energy" Added Using Trip Meter Constant (converted Rated Miles Added to "kWh" using my 230"Wh"/rmi constant for AWD, as usual)

DC Available Energy Added should basically match EPA discharge measurements - Tesla is measuring AVAILABLE energy added, not energy added (it takes some energy to shove energy into the battery, but that's accounted for in charging efficiency - what matters for EPA testing is how much you can take out of it). If you use the new constant above to extrapolate to a full charge, you get: 245Wh/rmi*310rmi = 76kWh -> this is exactly what I expect; gives about 2 kWh of reserve.

Of course, the last line, "Rated Energy Added" calculated with the "Wh"/mi constant will NOT match the DC Energy Added, because the constant is different!

The Trip Meter "Wh" are a little bigger than real Wh, meaning that the meter would appear to read low.

Here is a simplified example, summarizing what happens in AWD vehicles.

- Start at 279 rated miles, 90%.
- Do a 100-mile trip, at 280"Wh"/mi average (for example) as indicated on the trip meter.
- You will find (if you spent no time in park, your remaining rated range will be):
279 rmi - 100mi*280"Wh"/mi / 230"Wh"/rmi = 157.3rmi
- Then immediately charge the car to 90% again, the charging stats will be:
Rated Miles Added: 121.7mi
kWh Added: 29.82kWh (121.7mi * 245Wh/rmi)

Conclusion: True Efficiency: 29.82kWh/100miles = 298.2Wh/mi (NOT 280 "Wh"/mi)
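The worked example above can be reproduced step by step (constants as stated in this post; the last digit differs slightly from the post's 29.82 kWh / 298.2 Wh/mi because the post rounds the rated miles to 121.7 before multiplying):

```python
# Reproducing the simplified AWD round-trip example above.
GAUGE_WH_PER_RMI = 230    # "Wh" per rated mile the gauge ticks down at
CHARGE_WH_PER_RMI = 245   # true Wh per rated mile used when adding charge

start_rmi = 279                          # 90% on the dash
trip_miles, meter_wh_per_mi = 100, 280   # trip-meter reading for the drive

# Rated miles consumed, per the gauge constant:
rmi_used = trip_miles * meter_wh_per_mi / GAUGE_WH_PER_RMI
remaining = start_rmi - rmi_used
print(f"{remaining:.1f} rmi left")        # 157.3 rmi left

# Charging back to 90% adds energy at the *true* constant:
kwh_added = rmi_used * CHARGE_WH_PER_RMI / 1000
print(f"{kwh_added:.2f} kWh added")       # ~29.83 kWh added

true_wh_per_mi = kwh_added * 1000 / trip_miles
print(f"{true_wh_per_mi:.1f} true Wh/mi") # ~298.3, vs 280 "Wh"/mi on the meter
```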

Conclusions for AWD:

1) Trip meter Wh/mi in the car reads low by ~6% (for no good reason that I can identify)
2) True available AWD battery energy for a battery at 310 miles is 76kWh (to discharge to 0 miles)
3) The reserve energy below 0 miles is about 1.5kWh-2kWh -> about 6-8 rated miles, just over what Elon stated.
4) Total pack energy is ~78kWh, as indicated in EPA document.

Anyway...

So now all you need to do for the SR+ is do that "trip-meters-over-a-long-trip method at a reasonably constant temp" and see whether you still get your 219Wh/rmi! (You won't - I will guess you will get 210"Wh"/rmi.)

If you do get the same constant, that just means in the SR+ the trip meter does not read low. All your other data is still valid. There's no particularly good reason for it to read low other than to make people feel smug, so maybe it won't be the same as the AWD.

However, my guess is you'll calculate something like 210Wh/rmi, using the trip meter.


We already know, from your 219Wh/rmi constant, the following:

SR+ Battery Capacity: 219Wh/rmi * 240rmi = 52.56kWh
Reserve capacity: EPA document kWh - 52.56kWh = ~ 2kWh (I think the EPA document said the battery is about 55kWh.)
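The SR+ arithmetic above, spelled out (the ~55 kWh EPA pack figure is as recalled in the post, so the implied reserve is approximate):

```python
# SR+ capacity check: gauge constant x full-charge rated miles,
# compared with the ~55 kWh EPA pack figure recalled above.
SRPLUS_WH_PER_RMI = 219
FULL_CHARGE_RMI = 240
EPA_PACK_KWH = 55   # approximate, per the post

available = SRPLUS_WH_PER_RMI * FULL_CHARGE_RMI / 1000
reserve = EPA_PACK_KWH - available
print(available)           # 52.56 kWh available above 0 rated miles
print(round(reserve, 2))   # ~2.44 kWh implied reserve ("~2 kWh" above)
```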
 
Are you saying the Energy App is wrong?

Yes! Based on my post immediately above, I now know what the Energy App is doing:

It is using the Trip Meter Wh/mi (which read low) to calculate remaining range, but it is still scaling the battery rated miles remaining by the TRUE Wh/rmi. This leads to an error.

237 rated miles remaining
30-mile Efficiency: 230"Wh"/mi

Projected Range: 237 rmi * 245Wh/rmi / 230 "Wh"/mi = 252.45 mi (but note the units don't work - it is actually 252mi * (Wh/"Wh") )

Note this 252 miles matches your picture exactly.

Obviously, since it is using the trip meter Wh/mi, which read low, to calculate remaining range, the actual range is less than projected!

But we also know from above that there are 230 "Wh" / 245Wh for the AWD.

So we can correct the units above:

252mi * (Wh/"Wh") * 230"Wh"/245Wh = 237 miles

It's just an error on Tesla's part in their projection. Mystery remains why the trip meter reads low, though...
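The hypothesized unit-mixing error above can be sketched numerically (constants are the 245 Wh/rmi and 230 "Wh"/mi figures from this thread; 252.45 in the post vs 252.46 here is just rounding):

```python
# The Energy App's (hypothesized) unit-mixing error, per the analysis above.
TRUE_WH_PER_RMI = 245    # real Wh per rated mile (from charging data)
METER_WH_PER_MI = 230    # the trip meter's low-reading "Wh"/mi (AWD)

rmi_remaining = 237      # rated miles on the battery gauge
app_avg_wh_per_mi = 230  # 30-mile average efficiency shown in the app

# The app scales by the true constant but divides by meter-unit
# efficiency, producing an over-optimistic projection:
projected = rmi_remaining * TRUE_WH_PER_RMI / app_avg_wh_per_mi
print(f"{projected:.2f}")   # ~252.46 mi projected

# Correcting the mismatched units recovers the actual rated miles:
corrected = projected * METER_WH_PER_MI / TRUE_WH_PER_RMI
print(round(corrected))     # 237
```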

Now, it is possible that on some cars the Wh/mi on the trip meter does not read low (each individual would have to measure it according to the method described). Which would mean your projected range would actually be correct. So it's possible that on your car when you're doing 230Wh/mi, it would show up as 216Wh/mi in my car for the exact same drive. But I doubt it. Easy enough to check, anyway - just see how many rated miles tick off for a reasonably long trip and then we'll know your battery gauge relationship to Wh consumed on the trip meter.
 
I looked at more historical Stats data with longer charges - and the exact ratio is 245Wh/rmi. So the answer is 245Wh/rmi.

The key point is that: this does not match my 230"Wh"/rmi constant I get whenever I compare the Trip Meter results to the battery rated miles reduction for a trip with only driving.

I expected different constants, and that's what I got. Mystery solved.

Ya, I thought we already knew there were two constants? LOL. I got my hopes up with the first “mystery solved!”

One higher constant used to add energy and one lower constant that it ticks down at.

The larger one (245 for you) is also very easy to extract from 6 data points that are available to anyone at any time:
Go to the energy app, cycle through all 6 of the last x, y, z readings in miles and km (6 different distances), and multiply the two numbers together (projected miles × average Wh/mi), then divide by the dashboard miles. You should get 245 ± 1, is my guess (or 152 for km).
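That extraction recipe is simple to script. The readings below are hypothetical illustrations (not measured data) chosen to show how different projected-range/efficiency pairs should all collapse to the same constant:

```python
# Extraction recipe from above: (projected range * average consumption)
# / dashboard rated miles should give the hidden constant every time.
# These readings are made-up illustrations, not real logged data.
readings = [
    # (projected_mi, avg_wh_per_mi, dashboard_rmi)
    (252.5, 230, 237),
    (247.1, 235, 237),
    (261.6, 222, 237),
]

constants = [round(p * avg / dash, 1) for p, avg, dash in readings]
print(constants)   # each lands right at ~245
```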

It’s also calculable from the charge-data energy added as you did, and from the charge power to charge “speed” relationship (which is what I originally did to get 219). It’s the same constant in all three of these places (appears to be 245 for the LR AWD, and 219 for the SR+).

2) DC Available Energy Added = (AC Energy * AC-DC Efficiency - Overhead Energy) * Battery Charging Efficiency (which inherently accounts for discharge inefficiency)
[…]

Tesla is measuring AVAILABLE energy added, not energy added (it takes some energy to shove energy into the battery, but that's accounted for in charging efficiency)

How do you know Tesla is measuring available energy added? I think you are assuming this. So far I have been assuming nothing with respect to this number, other than that it matches what they add to the battery gauge. Do you think they are (a) measuring energy as added, or (b) just measuring SoC? Or (c) measuring energy as added but discounting it for inefficiency? (Talking about the DC side here only)
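The quoted energy-balance relation is straightforward to evaluate. The efficiency and overhead figures below are illustrative assumptions, not measured values from this thread:

```python
def dc_available_energy_added(ac_energy_kwh, ac_dc_eff=0.92,
                              overhead_kwh=0.3, batt_charge_eff=0.98):
    """DC available energy = (AC energy * AC-DC efficiency - overhead)
    * battery charging efficiency. All default values are assumptions."""
    return (ac_energy_kwh * ac_dc_eff - overhead_kwh) * batt_charge_eff

# E.g. a 76 kWh wall draw with these assumed losses:
print(round(dc_available_energy_added(76.0), 1))  # -> 68.2
```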

Conclusions for AWD:

1) Trip meter Wh/mi in the car reads low by ~6% (for no good reason that I can identify)

I dunno... I’m still unconvinced the trip meter is “reading low,” as opposed to some combination of factors: them NOT factoring in battery in-and-out losses while charging (so the “out” numbers are just naturally lower), some energy-buffer usage, and/or below-dashboard-zero math going on.

To me that was the mystery, and it’s far from de-mystified still for me :)

Say as you use 230 Wh to drive, it subtracts 245 Wh from the battery gauge and stuffs 15 Wh into an energy buffer to use for when it gets closer to 0% (and/or below 0%).

We have little data ourselves from testing the 230 constant below 20 or 10%, right?

I still think there’s a way the trip meter is accurate, and the battery gauge is showing us only part of the available energy while another part is hidden in the energy buffer.

Still not demystified over here :)

We know the regen getting added to the battery gauge has some hysteresis that you’ve determined is a couple miles, right? 2? 4? Where do you think that energy is when the battery gauge doesn’t show it? I think it’s in the “energy buffer” which is a hidden battery gauge that can hold up to 4 kWh, and they like to keep it somewhere in the middle as SoC drains down to save their bacon in case of miscalculations or sudden adjustments. It starts empty (at 100% SoC) and slowly fills as you drive.

If you drove 310 miles, 15 Wh/mi stolen from the battery gauge would be 4.65 kWh, so I think some of this gets given back to the battery gauge as you get to very low SoC ... so the battery gauge may tick down slower at some point. Possibly at 14%, when the 15 Wh/mi accumulated from 100% down to there would add up to 4 kWh and the buffer is full.
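The hidden-buffer conjecture can be written as a toy model. The 245 and 230 Wh/mi constants come from earlier in this thread; the 4 kWh buffer cap is pure speculation:

```python
GAUGE_CONST = 245     # Wh the battery gauge ticks down per mile (thread data)
TRIP_CONST = 230      # Wh the trip meter records per mile (thread data)
BUFFER_CAP_WH = 4000  # conjectured hidden-buffer size (speculation)

buffer_wh = 0.0
for mile in range(1, 311):                       # LR AWD: 310 rated miles
    buffer_wh = min(BUFFER_CAP_WH,
                    buffer_wh + (GAUGE_CONST - TRIP_CONST))  # 15 Wh/mi stashed
    if buffer_wh >= BUFFER_CAP_WH:
        print(f"buffer full after {mile} miles "
              f"(~{100 - mile / 3.10:.0f}% SoC remaining)")
        break
```

With these numbers the buffer tops out after 267 miles, i.e. right around 14% SoC, which is where the post above guesses the gauge might start ticking down more slowly.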
 
so the battery gauge may tick down slower at some point. Possibly at 14%, when the 15 Wh/mi accumulated from 100% down to there would add up to 4 kWh and the buffer is full.

Maybe. You’d have to demonstrate that. I doubt there is significant nonlinearity for a pack that is in good shape where the BMS knows what is up.

(a) measuring energy as added, or (b) just measuring SoC? Or (c) measuring energy as added but discounting it for inefficiency? (Talking about the DC side here only)

I think they are measuring the available energy that is added, as estimated by the SoC. I don’t know what you mean by c.

I’m still unconvinced the trip meter is “reading low” as opposed to some combination of factors such as them NOT factoring battery in-and-out losses while charging

I guess this is kind of a philosophical question. We know how much energy is drawn from the wall. So we know how much energy it takes to go a given distance. Really this boils down to how efficient you want to say the charging is, vs. the driving. For a given amount of energy from the wall, you can say the charging is very efficient (lots of energy put in - 76kWh) but driving is inefficient (the meter reads low). Or you can say that charging is inefficient (not as much energy put in - 71.3kWh) but driving is efficient - meter reads accurately.
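One way to see that the two accountings describe the same data: the ratio of the two pack-energy figures in this post matches the ratio of the two constants discussed earlier in the thread.

```python
# Two equivalent accountings of the same wall draw (figures from the post):
pack_if_meter_low = 76.0       # kWh into pack if the trip meter under-reads
pack_if_meter_accurate = 71.3  # kWh into pack if the trip meter is accurate

ratio = pack_if_meter_accurate / pack_if_meter_low
print(f"{ratio:.3f}")       # ~0.938, i.e. the meter reads ~6% low
print(f"{230 / 245:.3f}")   # ~0.939, the thread's two constants
```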

It really does not matter. Maybe you could use Supercharger data to help resolve the “dispute” - not sure.

Say as you use 230 Wh to drive, it subtracts 245 Wh from the battery gauge and stuffs 15 Wh into an energy buffer to use for when it gets closer to 0% (and/or below 0%)

Maybe. That is also a philosophical point. It makes no difference from a practical standpoint. You could argue you have zero reserve at 310 miles and 2 kWh reserve at 0 miles and you might be correct. You can prove or disprove it by measuring energy added to the pack I would think, but haven’t thought it through. If you assume the trip meter reads accurately, then that is exactly what is happening based on available data, I think. It is of little consequence in any case.

At this point, until I see other evidence, I’m confident that I understand it now. It aligns with the EPA data. I’m happy to assume there is about 76kWh available to me (which matches EPA not incl reserve), but the meter will only show me 72kWh or so for a full discharge.

If there is nonlinearity that someone can point to, that would be interesting to see. And then I can adjust my understanding. Obviously in pathological cases with bad SoC estimation the nonlinearity will exist, but in general I have not seen it.

As far as the hidden reserve from regen goes - I have no idea why they hide the miles for a while, but there is no mystery there. If they hide 3 miles, that makes the first mile to click off 4 times as energetic as a regular mile. It takes a long time to click! They don’t amortize it across the full SoC. It is super clear to me; I see it every day on my way to work. So no hidden reserve or anything there.
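The "more energetic first mile" arithmetic, as a quick check (the 3 hidden miles and 245 Wh/rmi are the figures used in this thread):

```python
GAUGE_CONST_WH = 245   # Wh per rated mile (thread's constant)
hidden_miles = 3       # rated miles of regen held back before display

# Energy that must be consumed before the first displayed mile ticks off:
first_tick_wh = (hidden_miles + 1) * GAUGE_CONST_WH
print(first_tick_wh, "Wh")   # -> 980 Wh, i.e. 4x a normal 245 Wh mile
```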

EDIT: while most of this is philosophical, I don’t think the last 30-miles range estimate on the Trip Page is correct, unless there really is that nonlinearity at the bottom of the battery that you suggest might exist. They are simply using the wrong constant - or they know there is significant non-linearity at the bottom of the battery.
 
On a related note: I dropped the car to 14% and charged it back to 80%, and now it shows 370 km range (229 mi) when it should be close to 399 km (247 mi). This is consistent with my full charge reporting 460 km (285 mi): around 8% of range is completely missing from the car. That missing 30 km (18 mi) is pretty huge for a brand new car with less than 900 km on the odometer.
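Checking the arithmetic in the post above (499 km is the EPA rating, 370 km the displayed figure at an 80% charge):

```python
epa_km = 499
soc = 0.80
displayed_km = 370

expected_km = epa_km * soc                 # what 80% "should" show
missing_km = expected_km - displayed_km
print(f"expected {expected_km:.0f} km, missing {missing_km:.0f} km "
      f"({missing_km / expected_km:.0%})")
```

This yields roughly 399 km expected and about 29 km (7%) missing, in line with the ~8% shortfall the poster saw at a full charge (460 vs 499 km).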
 

Just to add another wrinkle to all this ... every time I see “true” pack size estimates for the SR+ and LR, I see if they are in a 31:46 ratio and they never seem to be. I would expect them to be though, if there are 46 cells per brick in the LR models and 31 in the SR+ bricks.

With the numbers we just discussed (240 * 219) and (310 * 245) I can’t get them to fit this ratio, even with some additional buffer ... I actually have to *subtract* ~4 kWh from each to get a 31:46 ratio.

If we assume one number is correct, maybe it fits and there’s just some fudge factors going on.

(310 * 245) = 75.95 kWh * (31/46) = 51.18 kWh, using 219 Wh/mi implies 234 miles rated range, not 240 (unlikely?)

(240 * 219) = 52.56 kWh * (46/31) = 77.99 kWh, using 245 Wh/mi implies 318 miles rated range ... possible?
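The ratio check in this post, spelled out. The constants come from earlier in the thread; the 46:31 cells-per-brick figure is the poster's premise:

```python
lr_pack_wh = 310 * 245   # 75,950 Wh implied for the LR AWD
sr_pack_wh = 240 * 219   # 52,560 Wh implied for the SR+

# If cells-per-brick really is 46:31, scaling one pack should predict the other:
sr_predicted = lr_pack_wh * 31 / 46
lr_predicted = sr_pack_wh * 46 / 31

print(round(sr_predicted / 219))   # -> 234 rated miles implied for the SR+
print(round(lr_predicted / 245))   # -> 318 rated miles implied for the LR
```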