Welcome to Tesla Motors Club

Calculate usable battery capacity based on rated miles values

It might be interesting to note that Tesla used 3.5 miles per kWh / 285 Wh/mile for the RAV4 EV.

Unfortunately, Toyota did not retain the Rated Range / Ideal Range concept, so they take the perfectly good Tesla data and spit out Guess-O-Meter (GOM) data on the dash.

A new-condition 2012-2014 Toyota RAV4 EV will achieve 146 Rated Range miles:

146 rated miles = 41.8 kWh usable * 3.5 mi/kWh
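As a quick sketch of the arithmetic above (the 41.8 kWh and 3.5 mi/kWh figures are the ones quoted in the post):

```python
# Rated range = usable capacity (kWh) x rated efficiency (mi/kWh).
# Figures quoted in the post for the RAV4 EV:
usable_kwh = 41.8
miles_per_kwh = 3.5   # equivalently ~285 Wh/mile

print(round(usable_kwh * miles_per_kwh))  # 146 rated miles
```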
 
Can you please share your measurements with us? And how do they compare with the 310Wh/RM which the OP found in the BMS for your P90D?
I was using 312, but I knew the value to be accurate only to within about 1%. So 310 sounds fine; it just means I have a half kWh less energy capacity than I thought before. This number was confirmed in several ways: by comparing charge energy added to RM added, seeing how many kWh went into an n% charge and then working backwards from 100% RM, etc. So I had independently arrived at that being the right number long before the value was fetched directly from firmware.
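The charge-energy-added cross-check reduces to a single ratio; a sketch with an invented charge session (not a logged value):

```python
def wh_per_rated_mile(charge_energy_added_kwh, rated_miles_added):
    """Pack constant estimated from a single charge session."""
    return charge_energy_added_kwh * 1000 / rated_miles_added

# Invented session: 15.5 kWh of "charge energy added" gained 50 rated miles.
print(wh_per_rated_mile(15.5, 50))  # 310.0 Wh per rated mile
```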

My charge efficiency was measured at 91.5%-92.5%, the difference between energy metered at the wall and the charge energy added reported by the car. Charger configuration and charging current affect that number, as do current conditions, i.e. whether the pack requires active heating or cooling. Vampire loss is just a matter of sampling the car long enough; earlier last year I was getting a value of ~70W average continuous drain. I'm also now using online loggers to track it continuously, and I can see it's rather lumpy. Neither value is included in trip meter consumption.
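Both measurements described above are simple ratios; the session and parking numbers below are invented so they land inside the quoted ranges:

```python
def charge_efficiency(wall_kwh, energy_added_kwh):
    """Fraction of wall energy reported as 'charge energy added'."""
    return energy_added_kwh / wall_kwh

def vampire_drain_watts(kwh_lost, hours_parked):
    """Average continuous parasitic drain while parked."""
    return kwh_lost * 1000 / hours_parked

# Invented session: 10.0 kWh metered at the wall, 9.2 kWh reported added.
print(f"{charge_efficiency(10.0, 9.2):.1%}")     # 92.0%, inside the band
# Invented park: 1.68 kWh lost over 24 hours.
print(f"{vampire_drain_watts(1.68, 24):.0f} W")  # 70 W average drain
```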

I've also spot-checked %, RM, and ideal miles at many different SoC levels, and I have seen no evidence of non-linearity; they always have mathematically fixed ratios to each other.
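The fixed-ratio claim amounts to checking that RM divided by reported % is constant across SoC; a toy check with hypothetical readings:

```python
# If %, RM, and ideal miles are linear in each other, the RM/% ratio
# should be constant across SoC. Hypothetical spot-check readings:
samples = [(90, 243.0), (60, 162.0), (30, 81.0)]  # (reported %, rated miles)
ratios = [rm / pct for pct, rm in samples]
print(all(abs(r - ratios[0]) < 0.01 for r in ratios))  # True -> linear
```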

Because of the above, I can only conclude that battery level, expressed either as reported %, RM, or kWh through multiplication, is a more accurate representation of the energy that is in the pack or has left the pack. It disagrees wildly with the numbers produced by the trip meter in my car: the mode of the discrepancy is a large 10%, with the value sometimes much higher and sometimes much lower, but still positive, never negative. If the problem were just measurement difficulty, I'd expect to see as many negative values as positive ones, i.e. a mean of 0% over time. This is not true. My car is an energy hog.
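The discrepancy being described can be computed like this, using the 310 Wh/RM constant from earlier in the thread; the trip values are hypothetical:

```python
WH_PER_RM = 310  # pack constant discussed above

def trip_discrepancy(rm_start, rm_end, trip_meter_kwh):
    """Relative gap between battery-level energy (via rated miles)
    and the trip meter's reported consumption."""
    battery_kwh = (rm_start - rm_end) * WH_PER_RM / 1000
    return (battery_kwh - trip_meter_kwh) / battery_kwh

# Hypothetical trip: 100 RM consumed (31 kWh) but the meter shows 27.9 kWh.
print(f"{trip_discrepancy(250, 150, 27.9):.0%}")  # 10%, the mode described
```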
 
If I'm understanding this, you're saying that energy goes into the battery, net of charge efficiency, and is reflected as RM or SOC at a rate of 310Wh/RM (or the equivalent SOC). Is that correct?
 
Charging losses are not 13%; more like 8%, according to the numbers reported in real time. Those could be fudged, but the figure also matches the spec'd efficiency and the test value from @wk057 IIRC.
It doesn't matter what the charging losses *are*; the important point is what loss factor the EPA applies.

Your driving style is irrelevant to figuring out 100% usable battery capacity of a new car.
 

In a perfect world you would be right. Unfortunately, this is not the case. The cells in the Model S/X pack have an internal resistance, or IR, and its effect is not linear. When you pull, let's say, 20kW from an 85kWh pack, you waste a bit of power to this internal resistance; it's turned into heat. At low power the loss is roughly proportional: pull 10kW and you waste about the same fraction as at 30kW. But if you pull much more than that, a lot more waste heat is generated.

Bogus numbers for the sake of the example:
e.g. pulling 20kW gives you 200W of waste heat (1%)
e.g. pulling 200kW gives you 8000W of waste heat (4%)
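Under a constant-voltage approximation the waste heat is I²R, so at fixed voltage 10x the power means ~100x the heat and ~10x the loss fraction. A sketch with invented pack values (350 V, 30 mΩ; not actual Model S figures):

```python
# Waste heat as I^2 * R under a constant-voltage approximation.
# Hypothetical pack values (not actual Model S figures):
PACK_VOLTS = 350.0   # assumed nominal pack voltage
PACK_OHMS = 0.030    # assumed pack internal resistance

def waste_heat_watts(power_kw):
    current = power_kw * 1000 / PACK_VOLTS   # amps drawn from the cells
    return current ** 2 * PACK_OHMS          # watts dissipated inside the pack

for kw in (20, 200):
    heat = waste_heat_watts(kw)
    print(f"{kw} kW draw -> {heat:.0f} W of heat ({heat / (kw * 1000):.1%})")
# 10x the power gives ~100x the heat, i.e. ~10x the loss fraction.
```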

The energy used that's computed by the BMS and the trip meter only accounts for the actual power you pull from the battery. It does not account for the waste heat generated inside the pack. You also need to run the coolant pump a bit harder, but that is taken into account, since you're pulling that additional power from the battery.

In fact, the car has two "energy remaining" counters under the hood. One is called "Ideal Remaining" and the other is called "Remaining". The ideal one represents how much energy you can pull from the pack if you don't exceed a specific threshold (I don't know the exact value). The other one is computed using the latest X miles/km of driving.
 
You are describing the fraction of usable battery capacity that reaches the drivetrain. Similarly, you could talk about variables that affect motor efficiency, or air friction at high speeds.

So I completely agree that the range will vary with how the car is driven and in what environment -- but it does not change the usable battery capacity.
 

The theory may or may not be true. However, one thing is clear: a Tesla owner should NEVER need to know such a thing. If consumption is high, it should be reported.
 
You are describing the fraction of usable battery capacity that reaches the drivetrain. Similarly, you could talk about variables that affect motor efficiency, or air friction at high speeds.

So I completely agree that the range will vary with how the car is driven and in what environment -- but it does not change the usable battery capacity.


Nope, that's not what I meant. Everything you mentioned (motor efficiency, air friction, hills, whatever the car experiences that might take more power) pulls current that is metered by the car. The battery pack has a shunt (a current-measuring device) that sits between the cells and the rest of the system. All of the losses to heat or inefficiencies that are downstream of it (from the shunt to the motors) are accounted for in the calculations. Whether you have denser air, a flat tire, or are going uphill, the car measures the additional power and adds it to the trip meter.

The issue lies upstream: when pulling power from the cells to the shunt, some of that power goes to waste. That's fine, and everybody expects it. The problem is that when you consume lots of power, the amount that goes to waste is not proportional: 2x the power is more than 2x the waste heat. And none of this is metered by the shunt!

That means that if you want to use the trip meter (which uses the shunt) to check the usable capacity of the pack, you need to avoid pulling too much power. And I'm not even mentioning the inaccuracies of the trip meter itself: it simply adds up what data it has, but it can drop frames and not update for a couple of milliseconds because of lag. Milliseconds every couple of seconds means accumulating errors.

This is not a theory, this is a fact. Every single battery manufacturer lists the capacity of a cell at a specific discharge rate. If you want to get the full, printed-on-the-cell capacity, you need to pull a maximum of 0.5A or some similarly low value. Check the NCR18650 chart from Panasonic. This is not the cell in the Model S, but look at the bottom-right graph: the blue line, taken at 0.2C, or 0.2x the capacity (540 mA in that case), extracts a lot more energy than the purple one at 2C (5.4A in that case). The charts differ for each cell type, but pulling more power always means wasting more heat, and more than proportionally.
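A toy model of the effect: the I·R voltage sag lowers the terminal voltage for the whole discharge, so a higher C-rate delivers fewer watt-hours from the same amp-hours. The cell values below are invented, not the Panasonic datasheet numbers, and the model ignores the early-cutoff effect visible in the real curves:

```python
# Toy model: the I*R drop lowers terminal voltage for the whole discharge,
# so the same amp-hours deliver fewer watt-hours at a higher C-rate.
# Invented cell values, NOT the Panasonic datasheet numbers:
CAPACITY_AH = 2.7   # nameplate capacity
AVG_OCV = 3.6       # average open-circuit voltage over the discharge
CELL_OHMS = 0.08    # cell internal resistance

def delivered_wh(c_rate):
    amps = c_rate * CAPACITY_AH
    terminal_v = AVG_OCV - amps * CELL_OHMS   # voltage sag under load
    return CAPACITY_AH * terminal_v           # energy actually delivered

low_rate, high_rate = delivered_wh(0.2), delivered_wh(2.0)
print(f"0.2C: {low_rate:.2f} Wh, 2C: {high_rate:.2f} Wh "
      f"({1 - high_rate / low_rate:.0%} less)")
```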

I'm with you on the fact that from an owner's perspective this should be easier... but batteries are hard, especially knowing the current % SoC of the battery when it's under load, i.e. while driving. I can't speak for other models, but my 85D trip planner is REALLY accurate. Sure, it starts out optimistic, especially in winter, but it readjusts after a couple of miles/km of driving, and its prediction is always spot-on. Do a couple of launches and, yeah, you've now wasted a lot of power that the car can't really account for... but the range estimate will go down after a few minutes of driving. While it can't add that to the trip meter, it knows the voltage is now lower and that remaining range is less.
 
For sure.

"Usable battery capacity" is per a set of environmental conditions. Call them EPA stc :)
 
I can't speak for other models, but my 85D trip planner is REALLY accurate. Sure, it starts out optimistic, especially in winter, but it readjusts after a couple of miles/km of driving, and its prediction is always spot-on. Do a couple of launches and, yeah, you've now wasted a lot of power that the car can't really account for... but the range estimate will go down after a few minutes of driving. While it can't add that to the trip meter, it knows the voltage is now lower and that remaining range is less.

Mine is utter crap. Start a trip and it says I'll get back with 47% battery; I get back with 17%. That's with turning the performance down and feather-footing it as much as possible. If you watch the ending charge estimate, it keeps diverging right up until the end of the trip. It's not like it sees your current consumption and extrapolates.

edit - I can only come to the conclusion that this is due to fudging of the actual consumption of P models.
 

@llavalle made the point that I debated making, and he is completely correct. The output capacity of a cell varies with the draw rate because of the internal resistance (this is abundantly clear from cell spec sheets). For evidence that this applies equally to EVs, you can look at the INL datasheets, with testing at various speeds/test cycles, and see that the capacity varies.
Vehicle Testing - Light Duty - BEV | Advanced Vehicle Testing Activity

So the amount of energy you get out of the pack will never be exactly the same as you put in (unless you have a battery with zero internal resistance, which doesn't exist in the real world).

This does become an issue when rating an EV's capacity. The easiest way is just to use the nameplate capacity of the cells from the manufacturer and multiply by the number of cells. Other ways are to measure it over the EPA cycle or over a steady-state cycle.
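The nameplate method is just multiplication; a sketch using widely reported (unofficial) figures for the Model S 85 pack:

```python
# Nameplate method: cell count x cell capacity x nominal voltage.
# Widely reported (unofficial) figures for the Model S 85 pack:
CELLS = 7104        # 16 modules x 444 cells
CELL_AH = 3.25      # approximate nameplate capacity per cell
NOMINAL_V = 3.6     # nominal cell voltage

pack_kwh = CELLS * CELL_AH * NOMINAL_V / 1000
print(f"{pack_kwh:.1f} kWh nameplate")  # 83.1 kWh nameplate
```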
 

I always lose ~10% from what the trip meter estimates within the first 30 minutes of freeway driving. I expect it to adjust on the second leg of the trip, but it makes the same mistake.
 

Also, I could design several compensation mechanisms right here on the spot, the first being a lookup table that maps current draw and SoC to an internal-loss number. A simple, straightforward implementation.
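A minimal sketch of that first mechanism, with an invented loss table (real firmware would calibrate the breakpoints and interpolate between them):

```python
# Sketch of the lookup-table idea: map (current draw, SoC) to an estimated
# internal-loss fraction. All table values are invented for illustration.
CURRENT_BREAKPOINTS = [0, 100, 300, 600]   # amps
SOC_BREAKPOINTS = [10, 50, 90]             # percent
LOSS_TABLE = [          # rows: current breakpoints, cols: SoC breakpoints
    [0.002, 0.002, 0.002],
    [0.010, 0.008, 0.007],
    [0.030, 0.025, 0.022],
    [0.070, 0.055, 0.048],
]

def loss_fraction(amps, soc):
    """Nearest-breakpoint lookup; real firmware would interpolate."""
    i = min(range(len(CURRENT_BREAKPOINTS)),
            key=lambda k: abs(CURRENT_BREAKPOINTS[k] - amps))
    j = min(range(len(SOC_BREAKPOINTS)),
            key=lambda k: abs(SOC_BREAKPOINTS[k] - soc))
    return LOSS_TABLE[i][j]

print(loss_fraction(290, 55))  # 0.025: ~2.5% estimated internal loss
```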
 
I always lose ~10% from what the trip meter estimates within the first 30 minutes of freeway driving. I expect it to adjust on the second leg of the trip, but it makes the same mistake.
Probably from startup costs. I should add that the example above was with preheating the cabin AND the battery, using the max battery option, so there were no startup costs on the way there.

I will also point out that it kept diverging; it was not a one-time cost.
 
edit - I can only come to the conclusion that this is due to fudging of the actual consumption of P models.

Could be. There's definitely some fudging going on. I know a couple of Model S owners, and it seems it's either:
#1) Trip meter is super accurate
#2) Trip meter is way off.

For some reason, most people in #1 drive more slowly and/or have cars equipped with the base 19in wheels and, for the most part, non-P models. Most people in #2 have P models, drive faster than the posted speed limit, and/or have 21in wheels.

I guess the model (as in math model, not Model S model) they use to predict energy consumption assumes 19in wheels and driving at or below the speed limit. A local P85D owner can beat the rated consumption on his car, no problem, around here... he tells me the trick is to never exceed ~80kW of power and to drive 100kph on the highway. He also has 19in wheels with the low-rolling-resistance Michelin tires.