Calculate usable battery capacity based on rated miles values

@Boatguy
thanks for all the info. I have had a similar experience, especially the 300 Wh/mi in the energy graph not matching the rated range. Jason's findings in the firmware conflict with the energy graph as well.
I think the rated range number is artificially manipulated to keep owners at peace. But at the same time, the fact stands that battery capacity cannot be accurately measured when the battery is charged and discharged partially, as happens in our cars. I think we should seriously let go of the rated range number as a measure of battery capacity; it is clearly not accurate. The car reports the battery capacity on the CAN bus, and regardless of weather and temperature, this number does not change much at all. At the same time, the reported capacity on my car has dropped about 1 kWh over the last 8 months, yet the rated range at 100% has not changed, so clearly the car does some serious rounding. I think Jason is right: we should take the rated range as a rough estimate, nothing more.
What do you use to get the value on the CAN bus?
 
I'll state again... and again and again: The value used by the car as the static Wh/Rated-Mile *DOES NOT EVER CHANGE*. The only thing that affects this number is the physical configuration of the vehicle, as specified in the first post in this thread, which doesn't change unless you do something unusual like a battery upgrade (to a different capacity group) or a motor upgrade (non-PD to/from PD). It is possible Tesla has modified it from much older firmware, but the values I've reported have been the same since ~4.x versions.

Finally have some new data about the 100 kWh pack: Model X P100D: 342 Wh/rated mile.

I will get the value for the Model S P100D next time I have that car online (forgot to grab it), but some math suggests it would be ~314 Wh/rated mile.
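
For anyone who wants to turn these constants into a capacity number, here is a minimal sketch; only the Wh/rated-mile values come from this thread, while the 294-rated-mile example and the dictionary labels are just illustrative placeholders.

```python
# Rough sketch: usable pack energy implied by the static Wh/rated-mile constants
# quoted in this thread. The rated-miles-at-100% figure is a placeholder --
# substitute your own car's number.

WH_PER_RATED_MILE = {
    "refresh non-P dual motor": 285,  # from earlier in this thread
    "Model S P100D (estimate)": 314,  # rough estimate quoted above
    "Model X P100D": 342,             # reported above
}

def usable_kwh(rated_miles_at_100: float, wh_per_rated_mile: float) -> float:
    """Usable energy implied by the 100% rated range and the static constant."""
    return rated_miles_at_100 * wh_per_rated_mile / 1000.0

# Hypothetical example: 294 rated miles at 100% on a 285 Wh/RM configuration
print(round(usable_kwh(294, 285), 1))  # ~83.8 kWh
```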
What do you estimate the actual charging efficiency would be from the plug to the battery? Not just the charger efficiency.
 
Go watch Bjorn's Kia Soul EV trip and then come back to claim that's true.
To clarify, I'm talking about granularity in terms of determining long-term life. For example, in a Tesla it's trivial to get a rough idea of degradation (in percentage) by looking at how much rated range has dropped.

I looked at the Kia Soul EV manual and it has a battery SOC meter (which reports straight percentage, so does not show degradation) and a range GOM (guess-o-meter) that varies based on previous energy use. I don't see a straightforward way to get an estimate of degradation. Maybe the European model is different (I didn't see Bjorn's video)?
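
For what it's worth, the rough degradation estimate mentioned above boils down to a one-liner; the 294/285 figures below are made-up examples, not measurements.

```python
# Rough degradation estimate from rated range, as described above.
# Both inputs are hypothetical examples.

def degradation_pct(original_rated_miles: float, current_rated_miles: float) -> float:
    """Percent drop in 100% rated range relative to when the car was new."""
    return 100.0 * (1.0 - current_rated_miles / original_rated_miles)

print(round(degradation_pct(294, 285), 1))  # ~3.1% in this made-up example
```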
 
I'll state again... and again and again: The value used by the car as the static Wh/Rated-Mile *DOES NOT EVER CHANGE*.
Then why does the empirical data produce a different value than the 285Wh you report?

Shouting does not make you correct. Talk about the data and the facts.

Tesla's rated range display is already one of the best in the industry in terms of granularity.
The granularity of all the displays is one mile, so how can one be better than another? Are you trying to make a point about accuracy or about granularity? And is it about displaying Rated Miles, which I think we're all going to agree is a pretty much useless concept that Tesla does not even apply consistently across its apps, or about actual odometer range?
 
Then why does the empirical data produce a different value than the 285Wh you report?

Shouting does not make you correct. Talk about the data and the facts.

The granularity of all the displays is one mile, so how can one be better than another? Are you trying to make a point about accuracy or about granularity? And is it about displaying Rated Miles, which I think we're all going to agree is a pretty much useless concept that Tesla does not even apply consistently across its apps, or about actual odometer range?

I've stated nothing but facts here. The firmware itself sets the rated consumption at 285 Wh/mi for the refresh non-P dual-motor configuration. Your refusing to accept that doesn't change the firmware. My "shouting" is simply because people seem to be skipping this particular fact to fit their arguments.
 
I've stated nothing but facts here. The firmware itself sets the rated consumption at 285 Wh/mi for the refresh non-P dual-motor configuration. Your refusing to accept that doesn't change the firmware. My "shouting" is simply because people seem to be skipping this particular fact to fit their arguments.
I'm trying to reconcile the fact you present with the empirical data. I just drove 2,000 miles and RM disappeared at a rate of 273 Wh/RM, not 285 Wh/RM.

I charged, either SC or destination L2, recorded the RM, and drove off (consumed kWh was always 0.0). At the next destination I recorded the kWh consumed since the last charge and the change in RM; divide and I get 273 Wh/RM.

So we have two facts that don't agree with each other. How do we explain that? I'm not attached to my theory or "argument", I'm searching for an explanation and would appreciate your help.
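
To make the arithmetic explicit, here is a small sketch of that leg-by-leg calculation; the two sample legs are invented numbers, not actual trip data.

```python
# Leg-by-leg Wh per rated mile, computed the way described above:
# (trip-meter kWh since last charge, rated miles at departure, rated miles at arrival)
legs = [
    (15.2, 250, 194),  # hypothetical leg 1
    (18.9, 240, 171),  # hypothetical leg 2
]

total_kwh = sum(kwh for kwh, _, _ in legs)
total_rm_used = sum(start - end for _, start, end in legs)
print(round(1000.0 * total_kwh / total_rm_used, 1), "Wh per rated mile")  # ~272.8
```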
 
I'm trying to reconcile the fact you present with the empirical data. I just drove 2,000 miles and RM disappeared at a rate of 273 Wh/RM, not 285 Wh/RM.

I charged, either SC or destination L2, recorded the RM, and drove off (consumed kWh was always 0.0). At the next destination I recorded the kWh consumed since the last charge and the change in RM; divide and I get 273 Wh/RM.

So we have two facts that don't agree with each other. How do we explain that? I'm not attached to my theory or "argument", I'm searching for an explanation and would appreciate your help.

Interesting. Assuming your notes are correct, I have no explanation for this behavior other than to assume that for some reason your trip meter's kWh count is simply wrong. 273 Wh/mi would mean even less usable capacity than I've measured in multiple 90 packs, since 273 Wh/mi × 294 mi ≈ 80.3 kWh... which is way off. Do you have a trip meter photo for a long drive from ~100%? Would be interesting to see.
 
Great post... very informative. Cold weather obviously has been having an impact on usable battery capacity, e.g. 400 Wh/mile consumption vs. the static 290 Wh/mile on my S70, resulting in slightly fewer miles, which is not bad.
 
Interesting. Assuming your notes are correct, I have no explanation for this behavior other than to assume that for some reason your trip meter's kWh count is simply wrong. 273 Wh/mi would mean even less usable capacity than I've measured in multiple 90 packs, since 273 Wh/mi × 294 mi ≈ 80.3 kWh... which is way off. Do you have a trip meter photo for a long drive from ~100%? Would be interesting to see.

This is the behavior I keep pointing out. He's taking the energy consumed as reported by the trip meter and dividing by the actual rated miles consumed, exposing the fudge factor. I do the opposite, which is mathematically identical: take the rated miles lost, multiply by the Wh/RM, then compare to what the trip meter says, which seems to heavily understate usage.
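
Put another way, that reverse comparison looks something like the sketch below; the rated-miles-lost and trip-meter figures are purely illustrative, and 285 Wh/RM is the firmware constant quoted earlier in the thread.

```python
# Reverse comparison described above: rated miles lost times the firmware constant,
# checked against the trip meter's reported kWh. All inputs are illustrative.
WH_PER_RM = 285            # firmware constant quoted earlier for this configuration
rated_miles_lost = 125     # hypothetical
trip_meter_kwh = 34.1      # hypothetical

expected_kwh = rated_miles_lost * WH_PER_RM / 1000.0
shortfall_pct = 100.0 * (expected_kwh - trip_meter_kwh) / expected_kwh
print(f"{expected_kwh:.1f} kWh expected vs {trip_meter_kwh} reported "
      f"({shortfall_pct:.1f}% apparent understatement)")
```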
 
I'm trying to reconcile the fact you present with the empirical data. I just drove 2,000 miles and RM disappeared at a rate of 273 Wh/RM, not 285 Wh/RM.

I charged, either SC or destination L2, recorded the RM, and drove off (consumed kWh was always 0.0). At the next destination I recorded the kWh consumed since the last charge and the change in RM; divide and I get 273 Wh/RM.

So we have two facts that don't agree with each other. How do we explain that? I'm not attached to my theory or "argument", I'm searching for an explanation and would appreciate your help.

What you are describing (a discrepancy between the actual rated-mile consumption rate and the firmware-set value) is happening in my case, and in every case that I have seen reported. There has to be a straightforward explanation for this, but judging from your experience writing to Tesla, they don't seem very interested in providing it.
 
The granularity of all the displays is one mile, so how can one be better than another? Are you trying to make a point about accuracy or about granularity? And is it about displaying Rated Miles, which I think we're all going to agree is a pretty much useless concept that Tesla does not even apply consistently across its apps, or about actual odometer range?
I clarified in my other post, but I'm talking specifically about granularity in terms of determining degradation. Once you account for the conditions @wk057 pointed out upthread (calibration, range mode, reading immediately off charge), it's a good way to monitor degradation.

Most other EVs have either a far less granular display (like the 12 bars in the Leaf) or no real way to monitor degradation (they have GOM range displays that vary based on the conditions the car is driven and can't be relied on for the same purposes).
 
Interesting. Assuming your notes are correct, I have no explanation for this behavior other than to assume that for some reason your trip meter's kWh count is simply wrong. 273 Wh/mi would mean even less usable capacity than I've measured in multiple 90 packs, since 273 Wh/mi × 294 mi ≈ 80.3 kWh... which is way off. Do you have a trip meter photo for a long drive from ~100%? Would be interesting to see.
I don't have a dash photo, but I've included a screen shot of the spreadsheet I created for the trip. The data was transcribed from the instrument display at each stop and entered into the spreadsheet that evening.
[Attachment: trip spreadsheet screenshot, 2017-01-28]
 
I don't have a dash photo, but I've included a screen shot of the spreadsheet I created for the trip. The data was transcribed from the instrument display at each stop and entered into the spreadsheet that evening.
View attachment 212377

Try measuring your kWh by metering your charge, independently of the car.

edit - actually you can also use the energy added as reported by the car during charge as well. Not as good as above, but better than using reported consumption.
 
I don't have a dash photo, but I've included a screen shot of the spreadsheet I created for the trip. The data was transcribed from the instrument display at each stop and entered into the spreadsheet that evening.
View attachment 212377
I remember a long thread from a few years ago about the efficiency numbers and how they appear to be nonlinear, but I forget what the factor was. I think it has to do with the buffer, "below zero" range, and also non-propulsion energy usage.

I did find this reference:
Can someone PLEASE explain how rated range is calculated.

Throughout the years and various updates, the below-zero thing seems to have been eliminated or disproven, but the basics may still apply. Instead of using a fixed trip Wh / (Wh/mi) = RM, it may be a formula like trip Wh / (Wh/mi) + constant mi = RM. In other words, the trip Wh may not correspond directly with the internal Wh; there may be some other constant in between (back then it was guessed to be a "below zero" buffer).
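
If that constant-offset idea were right, two legs of data would in principle pin down both numbers. Here is only a sketch of that idea with invented leg data, not a claim about the actual firmware.

```python
# If rated-mile consumption followed  RM_used = trip_kWh * 1000 / (Wh/mi) + C
# per leg, two legs would determine both unknowns. Purely a sketch with made-up data.
kwh1, rm1 = 15.2, 56.0   # hypothetical leg 1: trip kWh, rated miles consumed
kwh2, rm2 = 18.9, 69.0   # hypothetical leg 2

wh_per_mi = 1000.0 * (kwh2 - kwh1) / (rm2 - rm1)   # slope
c = rm1 - 1000.0 * kwh1 / wh_per_mi                # per-leg rated-mile offset
print(round(wh_per_mi, 1), "Wh/mi slope,", round(c, 2), "rated-mile offset per leg")
```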
 
Try measuring your kWh by metering your charge, independently of the car.

edit - actually you can also use the energy added as reported by the car during charge as well. Not as good as above, but better than using reported consumption.
Unless you always charge to 100%, which is very rare at SCs, that won't tell you how much was consumed on the prior leg. And if you use an L2 charger, then you have to make assumptions about charge efficiency and vampire loss.

Why do you think the kWh consumed since last charge, as reported in the Trip Meter, is less accurate?
 
I remember a long thread from a few years ago about the efficiency numbers and how they appear to be nonlinear, but I forget what the factor was. I think it has to do with the buffer, "below zero" range, and also non-propulsion energy usage.

I did find this reference:
Can someone PLEASE explain how rated range is calculated.

Throughout the years and various updates, the below-zero thing seems to have been eliminated or disproven, but the basics may still apply. Instead of using a fixed trip Wh / (Wh/mi) = RM, it may be a formula like trip Wh / (Wh/mi) + constant mi = RM. In other words, the trip Wh may not correspond directly with the internal Wh; there may be some other constant in between (back then it was guessed to be a "below zero" buffer).
That could be of interest, but for the moment I'm looking for an explanation of the discrepancy between what @wk057 found in the BMS, and my empirical data.
 
I'm trying to reconcile the fact you present with the empirical data. I just drove 2,000 miles and RM disappeared at a rate of 273 Wh/RM, not 285 Wh/RM.

I charged, either SC or destination L2, recorded the RM, and drove off (consumed kWh was always 0.0). At the next destination I recorded the kWh consumed since the last charge and the change in RM; divide and I get 273 Wh/RM.

So we have two facts that don't agree with each other. How do we explain that? I'm not attached to my theory or "argument", I'm searching for an explanation and would appreciate your help.

My 2013 S85 is almost exactly the same.
 
Unless you always charge to 100%, which is very rare at SCs, that won't tell you how much was consumed on the prior leg. And if you use an L2 charger, then you have to make assumptions about charge efficiency and vampire loss.
Charge to 90%. There's an additional variable sure, but over many samples the error should average out.

Why do you think the kWh consumed since last charge, as reported in the Trip Meter, is less accurate?

I'm sure of it. It's not just inaccurate; there appears to be a large fudge factor, because energy is always conserved and the trip meter's consumption does not reflect that. Try comparing the self-reported energy-added numbers while charging versus the value reported as consumed. People have come up with all kinds of theories as to why that is, giving Tesla the benefit of the doubt. I no longer do. In the most flattering light it's simply a bug, one that they've neglected to fix for a very long time.

If your data were my data, I would inflate your calculated Wh/RM by 10%. I get the feeling the fudge factor is much bigger for P models, however.

edit -

And if you use an L2 charger, then you have to make assumptions about charge efficiency and vampire loss.
I have not simply made assumptions; I have made measurements.
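
For illustration, the kind of cross-check being discussed would look something like the sketch below; every figure in it (wall-meter reading, charging efficiency, vampire allowance, trip-meter kWh) is a hypothetical placeholder rather than an actual measurement.

```python
# Energy-conservation cross-check discussed above: meter the recharge at the wall,
# apply an assumed charging efficiency, allow for vampire drain, and compare against
# the trip meter's reported consumption. Every value here is hypothetical.
kwh_from_wall = 38.0          # wall-meter reading for the leg's recharge (placeholder)
charging_efficiency = 0.90    # assumed L2 charging efficiency (placeholder)
kwh_vampire = 0.5             # allowance for idle drain between legs (placeholder)
kwh_reported_consumed = 31.0  # trip-meter kWh for the same leg (placeholder)

kwh_driven = kwh_from_wall * charging_efficiency - kwh_vampire
gap_pct = 100.0 * (kwh_driven - kwh_reported_consumed) / kwh_driven
print(f"trip meter understates by ~{gap_pct:.1f}% in this made-up example")
```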
 
Charge to 90%. There's an additional variable sure, but over many samples the error should average out.



I'm sure of it. It's not just inaccurate; there appears to be a large fudge factor, because energy is always conserved and the trip meter's consumption does not reflect that. Try comparing the self-reported energy-added numbers while charging versus the value reported as consumed. People have come up with all kinds of theories as to why that is, giving Tesla the benefit of the doubt. I no longer do. In the most flattering light it's simply a bug, one that they've neglected to fix for a very long time.

If your data were my data, I would inflate your calculated Wh/RM by 10%. I get the feeling the fudge factor is much bigger for P models, however.

edit -

I have not simply made assumptions; I have made measurements.
Can you please share your measurements with us? And how do they compare with the 310 Wh/RM that the OP found in the BMS for your P90D?