Ya, I dunno. I think it’s a combination of (a) the magical energy buffer being used (filled up) on the way down and (b) miles below dash zero. I don’t see why they’d measure lower on the actual trip meter intentionally. I do see why they would on the gauge. Maybe the meter uses dashboard numbers instead of real numbers like you said.
Mystery solved!
So I just did a charge, and gathered some data.
Chargepoint: 13.297kWh (30A @ ~208V)
Stats Miles Added: 51.5 (Car Charge screen displayed +51 miles; Stats rounds to the nearest half mile...)
Stats kWh Added: 12.56kWh (Car Charge screen displayed 13kWh) (Charging Efficiency: 94.4%)
Wh/rmi added: 12.56kWh / 51.5rmi = 244Wh/rmi
I looked at more historical Stats data with longer charges - and the exact ratio is 245Wh/rmi. So the answer is 245Wh/rmi.
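A quick Python sanity check of that arithmetic, using the Chargepoint and Stats figures above (nothing new, just the same numbers):

```python
# Charging-session figures from the post (Chargepoint + Stats readings).
ac_energy_kwh = 13.297   # AC energy from the wall (Chargepoint)
dc_added_kwh = 12.56     # DC available energy added (Stats)
rmi_added = 51.5         # rated miles added (Stats, nearest half mile)

charging_efficiency = dc_added_kwh / ac_energy_kwh    # ~94.5%
wh_per_rmi = dc_added_kwh * 1000 / rmi_added          # ~244 Wh/rmi

print(f"Charging efficiency: {charging_efficiency:.1%}")
print(f"Charging constant:   {wh_per_rmi:.0f} Wh/rmi")
```

The single-session ratio lands at 244; longer charges converge on 245, as noted above.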
The key point is that this does not match the 230"Wh"/rmi constant I get whenever I compare the Trip Meter results to the battery rated-miles reduction for a trip with only driving. I expected different constants, and that's what I got. Mystery solved.
Summarizing:
1) AC Energy = AC energy from the wall (from Chargepoint)
2) DC Available Energy Added = (AC Energy * AC-DC Efficiency - Overhead Energy) * Battery Charging Efficiency (which inherently accounts for discharge inefficiency) (reported by Stats and the charge screen as kWh added, in energy display mode)
3) Rated Miles Added (reported by Stats and the charge screen as miles added, in distance display mode)
4) My calculation: Rated "Energy" Added using the Trip Meter constant (converts Rated Miles Added to "kWh" using my 230"Wh"/rmi constant for AWD, as usual)
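The chain from (1) to (4) can be sketched in Python. To be clear: the AC-DC efficiency, overhead, and battery charging efficiency values below are made-up placeholders just to show the structure; only the wall energy and the two constants come from my data.

```python
# Hypothetical sketch of the energy bookkeeping above. The efficiency and
# overhead figures are illustrative placeholders, NOT measured values.
ac_energy_kwh = 13.297        # (1) AC energy from the wall (Chargepoint)
ac_dc_efficiency = 0.96       # assumed AC->DC conversion efficiency
overhead_kwh = 0.2            # assumed overhead (electronics, cooling)
battery_charging_eff = 0.99   # assumed battery charging efficiency

# (2) DC available energy added
dc_added_kwh = (ac_energy_kwh * ac_dc_efficiency - overhead_kwh) * battery_charging_eff

# (3) Rated miles added, via the 245 Wh/rmi charging constant
rmi_added = dc_added_kwh * 1000 / 245

# (4) Rated "energy" added, via the 230 "Wh"/rmi trip-meter constant
rated_energy_kwh = rmi_added * 230 / 1000

print(dc_added_kwh, rmi_added, rated_energy_kwh)
```

Note that (4) comes out smaller than (2) purely because the two constants differ, which is the whole point of the post.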
DC Available Energy Added should basically match EPA discharge measurements - Tesla is measuring AVAILABLE energy added, not total energy added (it takes some energy to shove energy into the battery, but that's accounted for in charging efficiency - what matters for EPA testing is how much you can take back out). If you use the new constant above to extrapolate to a full charge, you get 245Wh/rmi * 310rmi = 76kWh -> exactly what I expect; it leaves about 2kWh of reserve.
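That extrapolation, as a two-line check (the ~78kWh pack total is the EPA-document figure cited below):

```python
charge_constant_wh_per_rmi = 245   # from the Stats data above
full_range_rmi = 310               # AWD rated miles at full charge

available_kwh = charge_constant_wh_per_rmi * full_range_rmi / 1000  # 75.95
pack_total_kwh = 78                # approximate total pack energy (EPA doc)
reserve_kwh = pack_total_kwh - available_kwh                        # ~2 kWh

print(available_kwh, reserve_kwh)
```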
Of course, the last line, "Rated Energy Added" calculated with the "Wh"/mi constant will NOT match the DC Energy Added, because the constant is different!
The Trip Meter "Wh" are a little bigger than real Wh, meaning that the meter would appear to read low.
Here is a simplified example, summarizing what happens in AWD vehicles.
- Start at 279 rated miles (90%).
- Do a 100-mile trip at 280"Wh"/mi average (for example) as indicated on the trip meter.
- Assuming you spent no time in Park, your remaining rated range will be:
279rmi - 100mi * 280"Wh"/mi / 230"Wh"/rmi = 157.3rmi
- Then immediately charge the car to 90% again; the charging stats will be:
Rated Miles Added: 121.7rmi
kWh Added: 29.82kWh (121.7rmi * 245Wh/rmi)
Conclusion:
True Efficiency: 29.82kWh / 100mi = 298.2Wh/mi (NOT 280"Wh"/mi)
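The worked example above, as a quick Python check (same numbers, same two constants):

```python
# AWD example: trip-meter constant vs. charging-stats constant.
trip_constant = 230      # "Wh"/rmi (trip meter vs rated-mile drop)
charge_constant = 245    # Wh/rmi (from the charging Stats data)

start_rmi = 279          # rated miles at 90%
trip_miles = 100
meter_wh_per_mi = 280    # average shown on the trip meter

# Rated miles consumed, per the trip-meter constant
rmi_used = round(trip_miles * meter_wh_per_mi / trip_constant, 1)  # 121.7
remaining_rmi = start_rmi - rmi_used                               # ~157.3

# Charging back to 90% restores the same rated miles, but Stats
# converts them to kWh with the 245 Wh/rmi charging constant
kwh_added = rmi_used * charge_constant / 1000                      # ~29.82

true_wh_per_mi = kwh_added * 1000 / trip_miles                     # ~298.2
print(remaining_rmi, kwh_added, true_wh_per_mi)
```

298.2 vs. the displayed 280 is where the ~6% "reads low" figure below comes from.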
Conclusions for AWD:
1) The trip meter Wh/mi in the car reads low by ~6% (for no good reason that I can identify).
2) True available AWD battery energy for a battery at 310 rated miles is 76kWh (discharging to 0 miles).
3) The reserve energy below 0 miles is about 1.5-2kWh -> about 6-8 rated miles, just over what Elon stated.
4) Total pack energy is ~78kWh, as indicated in the EPA document.
Anyway...
So now all you need to do for the SR+ is run that trip-meters-over-a-long-trip method at a reasonably constant temperature and see whether you still get your 219Wh/rmi. If you do get the same constant, that just means the SR+ trip meter does not read low, and all your other data is still valid. There's no particularly good reason for the meter to read low other than to make people feel smug, so maybe the SR+ won't behave the same as the AWD. My guess, though, is you'll calculate something like 210"Wh"/rmi using the trip meter.
We already know, from your 219Wh/rmi constant, the following:
SR+ Battery Capacity: 219Wh/rmi * 240rmi = 52.56kWh
Reserve capacity: EPA document kWh - 52.56kWh = ~ 2kWh (I think the EPA document said the battery is about 55kWh.)
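And the SR+ arithmetic, same pattern as the AWD check (the 55kWh figure is my rough recollection of the EPA document, as noted):

```python
sr_constant = 219        # Wh/rmi: your measured SR+ constant
sr_full_rmi = 240        # SR+ rated miles at 100%

sr_available_kwh = sr_constant * sr_full_rmi / 1000   # 52.56 kWh
epa_pack_kwh = 55                                     # rough EPA-document figure
sr_reserve_kwh = epa_pack_kwh - sr_available_kwh      # ~2.4 kWh

print(sr_available_kwh, sr_reserve_kwh)
```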