
How I Recovered Half of my Battery's Lost Capacity

This is one of the main reasons I think the OP actually observed their pack being balanced, not some sort of calibration.

To summarize for those who don't want to read the whole thread:
  • The BMS is actually very accurate. To recover that much range, the cause can't just have been the BMS estimate being off.
  • OP's changed habits allowed the car to balance much more than before, according to our collective understanding of the Model 3's balancing procedure.
  • Imbalance can reduce usable pack capacity, but that capacity may be recovered if the pack simply hadn't had enough time to correct the imbalance.
  • Recovering from imbalance is very, very slow on Model 3.
I also just can't believe that calibration would take something on the order of months. By the time you "calibrate", what have you calibrated based on? Months-old data? The pack capacity has likely gone down by then, so it requires "calibration" again, and so the cycle repeats. It just can't work that way, at least not wholly and to this degree.



I'm confused by the top-end lock you're referring to, can you elaborate? (Maybe I missed something in the thread, sorry if I did)



I bet this was Sentry or Summon Standby or something, especially since you mentioned you couldn't charge it (maybe implying you weren't at home?). If I assume 12 h overnight, that's about 240 W average, which implies the car was awake the whole time, as that would be about the right draw. Or perhaps it was uploading/downloading something really slowly, or there was some other reason for it to be awake. Who knows, but the numbers correspond strongly to being awake.
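As a rough back-of-envelope check (a sketch only; the 12 h window and the ~115 Wh per rated km constant are assumptions from this thread, not measurements):

```python
# Back-of-envelope: what average draw matches an overnight range loss?
# All numbers here are illustrative assumptions from this thread.
range_lost_km = 25        # assumed overnight rated-range loss
wh_per_rated_km = 115     # rough Model 3 rated-energy constant, Wh per rated km
hours_parked = 12         # assumed overnight parking window

energy_lost_wh = range_lost_km * wh_per_rated_km   # ~2,875 Wh
avg_draw_w = energy_lost_wh / hours_parked         # ~240 W

print(f"average draw: ~{avg_draw_w:.0f} W")  # in line with an awake car, not a sleeping one
```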

If your max range drops by 25 km overnight, this is not true degradation. The BMS has decided that it either doesn't believe the cells are safe to charge to a higher voltage, or that they self-discharge too quickly to hold a voltage above a certain limit, so it limits your max charge.

This thread, and many others, try to figure out why the BMS thinks the way it does, and how to change its way of thinking so that it will unlock range again.

The OCV (open-circuit voltage) reading theory is a good one, as many people have noticed chunks of range coming back and going away, which is highly suggestive of the BMS waiting for certain info (such as those readings) and then unlocking the max kWh it allows you to put into the battery. For most people this seems to happen once every 3 months or so.
 
Anecdotally, on my 2018 LR RWD (17k miles): I have no home charging, so I charge once every 1-2 weeks. Mostly L2, but for the last year mostly DC fast charging (CHAdeMO and Supercharger). I almost always charge to 90% and run down to 10%.

My 100% SOC is still 312 miles. For about one week I had the 325 mi 'upgrade', and then the next update put it back at 310-315.

I never run Sentry mode because I don't have home charging. I almost always do 70%+ charge cycles, because I try to go as long as possible without charging since I can only charge at public locations, and my car goes into deep sleep whenever possible to avoid phantom drain and needing to charge.

So the fact that I've seen pretty much 0% indicated degradation while living OP's advice would seem to affirm his claim. A feasting-and-fasting charging lifestyle seems to leave the BMS with the least estimated loss.
 
Like many others, I have been concerned with loss of 100% indicated battery range on one of my Model 3s. My P3D (build date 9/13/2018, delivery date 10/8/2018) had gotten down to 270.3 miles at 100% charge on January 20, 2020, at about 30,700 miles, which is a loss of 40.8 miles since the car was new.

I posted about going to the service center to talk with them about battery degradation, which I did on March 9, 2020. It was a great service appointment and the techs at the Houston Westchase service center paid attention to my concerns and promised to follow up with a call from the lead virtual tech team technician. I detailed this service visit in the following post:

Reduced Range - Tesla Issued a Service Bulletin for possible fix

While that service visit was great, the real meat of addressing the problem came when I spoke to the virtual tech team lead. He told me some great things about the Model 3 battery and BMS. With the knowledge of what he told me, I formulated a plan to address it myself.

So here is the deal on the Model 3 battery and why many of us might be seeing this capacity degradation.

The BMS is responsible not only for charging and monitoring the battery, but also for computing the estimated range. The way it does this is to correlate the battery's terminal voltage (and the terminal voltage of each group of parallel cells) to the capacity. The BMS tries to constantly refine and calibrate that relationship between terminal voltage and capacity to display the remaining miles.

For the BMS to execute a calibration computation, it needs data. The primary data it needs to do this is what is called the Open Circuit Voltage (OCV) of the battery and of each parallel group of cells. The BMS takes these OCV readings whenever it can, and when it has enough of them, it runs a calibration computation. This lets the BMS update its estimate of capacity vs. battery voltage. If the BMS goes for a long time without running calibration computations, then the BMS's estimate of the battery's capacity can drift away from the battery's actual capacity. The BMS is conservative in its estimates so that people will not run out of battery before the indicator reads 0 miles, so the drift is almost always in the direction of estimated capacity < actual capacity.
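For intuition, here is a minimal sketch of the kind of OCV-to-SoC lookup a BMS performs. The curve values below are rough, generic Li-ion numbers made up for illustration; Tesla's actual tables and calibration math are not public.

```python
import bisect

# Illustrative open-circuit-voltage -> state-of-charge curve for a single
# cell. These are generic placeholder numbers, NOT Tesla's tables.
OCV_CURVE = [  # (volts, SoC fraction), sorted by voltage
    (3.00, 0.00), (3.45, 0.10), (3.55, 0.25), (3.65, 0.45),
    (3.80, 0.60), (3.95, 0.75), (4.05, 0.88), (4.18, 1.00),
]

def soc_from_ocv(volts: float) -> float:
    """Linearly interpolate state of charge from a rested (open-circuit) voltage."""
    voltages = [v for v, _ in OCV_CURVE]
    i = bisect.bisect_left(voltages, volts)
    if i == 0:
        return OCV_CURVE[0][1]
    if i == len(OCV_CURVE):
        return OCV_CURVE[-1][1]
    (v0, s0), (v1, s1) = OCV_CURVE[i - 1], OCV_CURVE[i]
    return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)

print(f"SoC at a rested 3.70 V: {soc_from_ocv(3.70):.0%}")  # ~50%
```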

So, when does the BMS take OCV readings? To take a set of OCV readings, the main HV contactor must be open, and the voltages inside the pack for every group of parallel cells must stabilize. How long does that take? Well, interestingly enough, the Model 3 takes a lot longer for the voltages to stabilize than the Model S or X. The reason is the battery construction. All Tesla batteries have a resistor in parallel with every parallel group of cells. The purpose of these resistors is pack balancing. When charging to 100%, these resistors allow the low cells in the parallel group to charge more than the high cells in the group, bringing all the cells closer together in terms of their state of charge. However, the drawback to these resistors is that they are the primary cause of vampire drain.

Because Tesla wanted the Model 3 battery to be the most efficient it could be, Tesla decided to decrease the vampire drain as much as possible. One step they took to accomplish this was to increase the value of all of these resistors so that the vampire drain is minimized. The resistors in the Model 3 packs are apparently around 10x the value of the ones in the Model S/X packs. So what does this do to the BMS? Well, it makes the BMS wait a lot longer to take OCV readings, because the voltages take 10x longer to stabilize. Apparently, the voltages can stabilize enough to take OCV readings in the S/X packs within 15-20 minutes, but the Model 3 can take 3+ hours.
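If the cell relaxation behaves like an RC settling time (tau proportional to the bleed resistance), the 10x claim is at least self-consistent. A trivial check, using only the figures quoted above:

```python
# If voltage relaxation time scales with the bleed-resistor value (tau ~ R*C),
# a ~10x larger resistor means roughly 10x longer to stabilize.
sx_settle_min = 17.5     # midpoint of the 15-20 minute S/X figure quoted above
resistor_ratio = 10      # Model 3 resistors ~10x the S/X value, per the post

m3_settle_min = sx_settle_min * resistor_ratio
print(f"Model 3 settle time: ~{m3_settle_min / 60:.1f} hours")  # ~2.9 h, i.e. "3+ hours"
```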

This means that the S/X BMS can run the calibration computations a lot more easily and a lot more often than the Model 3's. 15-20 minutes with the contactor open is enough to get a set of OCV readings. This can happen while you're out shopping or at work, allowing the BMS to get OCV readings while the battery is at various states of charge, both high and low. This is great data for the BMS, and lets it run a good calibration fairly often.

On the Model 3, this doesn't happen. With frequent small trips, no OCV readings ever get taken, because the voltage doesn't stabilize before you drive the car again. Also, many of us continuously run Sentry mode whenever we're not at home, and Sentry mode keeps the contactor engaged, so no OCV readings can be taken no matter how long you wait. For many Model 3s, the only time OCV readings get taken is at home after a battery charge is completed, as that is the only time the car gets to open the contactor and sleep. Finally, 3 hours later, OCV readings get taken.

But that means that the OCV readings are ALWAYS at your battery charge level. If you always charge to 80%, then the only data the BMS is repeatedly collecting is 80% OCV readings. This isn't enough data to make the calibration computation accurate. So even though the readings are getting taken, and the calibration computation is being periodically run, the accuracy of the BMS never improves, and the estimated capacity vs. actual capacity continues to drift apart.

So, knowing all of this, here's what I did:

1. I made it a habit to make sure that the BMS got to take OCV readings whenever possible. I turned off Sentry mode at work so that OCV readings could be taken there. I made sure that TeslaFi was set to allow the car to sleep, because if it isn't asleep, OCV readings can't get taken.

2. I quit charging every day. Round-trip to work and back for me is about 20% of the battery's capacity, and I used to normally charge to 90%. I changed my standard charge to 80%, and then I began charging the car at night only every 3 days. So day 1 gets OCV readings at 80% (after the charge is complete), day 2 at about 60% (after 1 work trip), and day 3 at about 40% (2 work trips). I arrive back home from work with about 20% charge on that last day, and if the next day isn't Saturday, then I charge. If the next day is Saturday (I normally don't go anywhere far on Saturday), then I delay the charge for a 4th day, allowing the BMS to get OCV readings at 20%. So now my BMS is getting data from various states of charge throughout the range of the battery.

3. I periodically (once a month or so) charge to 95%, then let the car sleep for 6 hours, getting OCV readings at 95%. Don't do this at 100%, as it's not good for the battery to sit with 100% charge.

4. If I'm going to take a long drive (i.e., a road trip), then I charge to 100% to balance the battery, then drive. I also try to time it so that I get back home with around 10% charge, and if I can do that, then I don't charge right away. Instead, I let the car sleep 6 hours so it gets OCV readings at 10%.

These steps allowed the BMS to get many OCV readings that span the entire state of charge of the battery. This gets it good data to run an accurate calibration computation.
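To see why spreading the readings across states of charge matters, here's a toy least-squares sketch. The linear model and the voltages are made up for illustration; the real BMS computation is unknown, but any fit degenerates the same way when every sample sits at one charge level:

```python
# Toy illustration: calibrating a voltage-vs-SoC relationship by least squares.
# Readings taken at a single charge level leave the slope unidentifiable;
# readings spread across SoC pin the whole curve down.

def fit_line(points):
    """Ordinary least squares for v = a*soc + b; returns (a, b) or None."""
    n = len(points)
    mean_s = sum(s for s, _ in points) / n
    mean_v = sum(v for _, v in points) / n
    sxx = sum((s - mean_s) ** 2 for s, _ in points)
    if sxx == 0:              # every reading at the same SoC
        return None           # slope cannot be determined
    a = sum((s - mean_s) * (v - mean_v) for s, v in points) / sxx
    return a, mean_v - a * mean_s

always_80pct = [(0.80, 3.95), (0.80, 3.96), (0.80, 3.94)]       # charge-every-night habit
spread_out   = [(0.20, 3.50), (0.40, 3.65), (0.60, 3.80), (0.80, 3.95)]

print(fit_line(always_80pct))  # None: no usable calibration
print(fit_line(spread_out))    # (slope, intercept): a usable fit
```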

So what are the results?

[Attached chart: 100% charge rated range over time (20200827Battery100PctRange.png)]


On 1/20/2020 at 30,700 miles, I was down to 270 miles full range, which is 40.8 miles lost (about 13% of the original ~311 miles). The first good, accurate recalibration occurred 4/16/2020 at 35,600 miles and brought the full range up to 286 miles. Then another one occurred on 8/23/2020 at 41,400 miles and brought the range up to 290 miles, now only a 20 mile loss (about 6.5%).

Note that to get just two accurate calibration computations by the BMS took 7 months and 11,000 miles.

So, to summarize:

1. This issue is primarily an indication/estimation problem, not real battery capacity loss.
2. Constant Sentry mode use contributes to this problem, because the car never sleeps, so no OCV readings get taken.
3. Long voltage stabilization times in the Model 3 prevent OCV readings from getting taken frequently, contributing to BMS estimation drift.
4. Constantly charging every day means that those OCV readings that do get taken are always at the same charge level, which makes the BMS calibration inaccurate.
5. Multiple accurate calibration cycles may need to happen before the BMS accuracy improves.
6. It takes a long time (a lot of OCV readings) to cause the BMS to run a calibration computation, and therefore the procedure can take months.

I would love it if someone else would perform this procedure and confirm that it works, especially if your Model 3 is one that has a lot of apparent degradation. It will take months, but I think we can prove that this procedure will work.
Great work! I love the thread!
 
I charge up to ~ 80% in the winter and ~ 70% in the summer, and recharge when the battery is 20 - 30% for my usual at home routine. Charges to 100% and Supercharging are uncommon events.
The calculated range for my June 2018 Model 3 LR was 312 when new, bumped up to 316 after the software 'unlock' (for lack of a better term) and has been 318 - 320 for many months.

Last week I collected 'scan my tesla' data for the first time. Rather than guess what correlates, I'll just post a screen capture:
SMT says degradation is 4.5%. I'm not sure, but in any case I'll use the app to trend whatever it is measuring/calculating.

[Attached screenshot: Scan My Tesla battery data (SMT.jpg)]
 
SMT says degradation is 4.5%. I'm not sure, but in any case I'll use the app to trend whatever it is measuring/calculating.

Yeah, that seems right. You have that initial hidden capacity loss (likely an inflated energy-per-unit-distance constant until hitting the threshold) to account for, which is why 318/325 is not equal to 95.5%. It is a bit more complicated on the LR RWD due to the range change, but that appears to have just been some capacity unlock (maybe from the low end - I don't know; I'm not aware of any regen changes at 100% with the update, so probably not from the high end).

My 100% SOC is still 312 miles.

So that is 234 Wh/rmi × 312 rmi ≈ 73.0 kWh, and 73.0 / ~78 kWh = 93.6% of your original capacity, or just 6.4% capacity loss. A very good result for a car of this age (though the mileage is fairly low). You're definitely above the 50th percentile!


Looks like SMT assumes 77.8kWh as a typical starting capacity, but fairly sure it can sometimes be higher (based on what I have heard people say only, though, so no idea). Also the EPA docs say it's often as high as 79kWh (but those may be slightly "different" kWh...meaning they have a small error in the BMS measurement - or, possibly, during the EPA test, the actual BMS remaining capacity goes to indicated zero somewhat before the car stops rolling and shuts down).
 
... I think I have a handle on these different battery capacity numbers. I invite corrections as needed.

  • Nominal new pack capacity is 77.8 kWh
  • The EPA value constant for my LR model is 237 Wh/mile
  • ~3.4 kWh is the usable 'energy buffer' below '0% SoC remaining' on the display before the car shuts down. This amount decreases as the battery ages
  • The 'miles remaining' on the Tesla display includes the energy buffer in its calculation
  • ~3 kWh of capacity is typically -- perhaps universally -- lost in the first few months of car use. I don't know the physics or chemistry, but the loss appears to be distinct from what we track as normal degradation over time. This is why it is reasonable to trend battery capacity from 75 kWh once the car is a couple of months old

Going by the above, my SMT report shows 74.4 kWh 'nominal full pack' which is the capacity from 100% SoC on the display until the car shuts off. Starting from 75 kWh for a new but broken in pack, I have lost 0.6 kWh of usable capacity over two years, which works out to a loss of 1.25 EPA miles a year.
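A quick sanity check on those figures, using only the constants listed above:

```python
# Check the "1.25 EPA miles a year" figure from the poster's own constants.
broken_in_kwh = 75.0       # assumed capacity after the initial break-in loss
current_kwh = 74.4         # SMT 'nominal full pack' reading
epa_wh_per_mile = 237      # LR constant listed above
years_owned = 2

lost_kwh = broken_in_kwh - current_kwh            # 0.6 kWh
lost_miles = lost_kwh * 1000 / epa_wh_per_mile    # ~2.5 EPA miles total
print(f"~{lost_miles / years_owned:.2f} EPA miles lost per year")  # ~1.27
```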
 

The energy buffer is always 4.5% of the current capacity (so 3.4 kWh is correct for a new battery). From independent cell tests the pack is likely an 80 kWh pack, so there's an additional 1.5 kWh or so of brick protection, to keep the cells from over-discharging (which dissolves the copper current collectors).

Miles remaining does not include the energy buffer (as evidenced by it showing 0 km at 0%). The EPA number (340 miles or whatever it was) does, however, include it. Given that no one drives EPA-style, this is fairly irrelevant.

75 kWh is the official capacity of the pack, as it doesn't include the energy buffer: 75 + 3.4 kWh = 78.4 kWh.

When talking about pack capacity, it is common convention not to include the 4.5% energy buffer or the brick protection.

Most cars do indeed lose around 2-3 kWh in the first few months, so you end up with a capacity of around 72.5 kWh or so, which equals 485 km / 300 miles. The original RWD Model 3 only drops below 310 miles once it has lost over... 2.5 or 3 kWh. The reason is that Tesla initially wanted the typical range to display 322 miles but chose, for unknown reasons, to hide 12 miles and display 310 miles like the later AWDs.
That means that you need more than 3% degradation for the display to show any degradation at all.
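That threshold roughly checks out. A sketch, borrowing the ~223 Wh per displayed rated mile RWD constant mentioned later in this thread (treat both numbers as thread claims, not official figures):

```python
# How much degradation the original RWD hides before the display moves.
hidden_miles = 322 - 310       # rated miles Tesla hides, per the post above
wh_per_rated_mile = 223        # RWD displayed-mile constant claimed later in the thread
usable_kwh_new = 75.0          # 0-100% usable capacity, per the post above

hidden_kwh = hidden_miles * wh_per_rated_mile / 1000   # ~2.7 kWh
threshold = hidden_kwh / usable_kwh_new                # ~3.6%
print(f"degradation hidden before display drops below 310 mi: ~{threshold:.1%}")
```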
 
Miles remaining does not include the energy buffer
I was relying on driver testing as exemplified by
How Tesla Calculates Range (hidden buffer)

A full battery shows an RM (remaining miles) that matches the SMT nominal battery capacity, while an RM of zero occurs when 4.5% of nominal battery capacity remains. The energy buffer grows linearly from 0 to 4.5% of nominal capacity as the RM drops from max to zero.
 
A full battery shows RM that match the SMT nominal battery capacity, while a RM of zero occurs when 4.5% of nominal battery capacity remains. The energy buffer grows linearly from 0 to 4.5% of nominal capacity as the RM drops from max to zero.

Ehm, no. If you have 1% left you have 5 km. The buffer doesn't get used until you are below 0% and below 0 km.
 
The energy buffer grows linearly from 0 to 4.5% of nominal capacity as the RM drops from max to zero.

That is an incorrect conclusion to draw from that specific post by @TimothyHW3. The energy buffer is always 4.5% of your energy at 100%, regardless of what your SoC is. The way to explain this is that while (for the 2018/2019 AWD) there is an energy constant of 245 Wh/rmi applying to the whole battery, including the buffer, ONLY when at 100%, there is a separate energy content constant for each *displayed* rated mile which is 4.5% less: ~234 Wh/rmi (AWD 2018/2019 only). Using this constant at any SoC will always give you the available energy before you hit the buffer. (The trip meter will confirm this, though it will read about 1% lower.)

So the buffer is always there, always the same size above 0% SoC. It doesn’t grow as you drive. The reason it is NOT just a philosophical distinction is: to add back that rated mile energy, it only takes 234Wh/rmi (however there are some charging losses so obviously it actually takes more). The car itself of course claims each rated mile added is 245Wh/rmi, but if you follow that to its logical conclusion it would lead to an unrealistically high charging efficiency (which is inconsistent with the EPA results which suggest about 90% efficient charging). This is 100% measurable - no one has to believe me, you can just check it yourself and you’ll find this to be correct. (Again, all for 2018/2019 AWD)

The energy buffer grows linearly from 0 to 4.5% of nominal capacity as the RM drops from max to zero.

No it does not. You can look at SMT to see this. Energy buffer is just always there, only starts shrinking below 0%. Or you can see observations outlined above. It’s definitely knowable! Each rated mile you use (on an AWD) consumes 234Wh. For a RWD it is about 223Wh.

You have 5 km (~ 0.75 kWh) on the remaining range display, but much more usable energy in the battery. That is the point.

That's correct, but the buffer size does not change as you discharge. It is confusing due to what Tesla displays on the charging screen as the energy per rated mile added, but that is just a computed number that does not reflect actual energy added to the battery. (You can compare the charging screen's energy added to what SMT shows was added during a charging event, if you want to confirm that the charging screen number is not a measurement - just rated miles * 245 Wh/mi (AWD).)
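To make the two formulations concrete, here is a small sketch using the constants claimed in this exchange (2018/2019 AWD only; these are thread claims, not official Tesla figures):

```python
# Usable energy vs. total energy on a 2018/2019 AWD, per the constants
# claimed above (thread claims, not official figures).
WH_PER_DISPLAYED_RMI = 234   # energy per *displayed* rated mile
WH_PER_RMI_AT_100 = 245      # buffer-inclusive constant, valid ONLY at 100%
rmi_at_100 = 310             # example 100% rated-mile figure
rmi_now = 155                # example: display currently shows 155 rmi

total_kwh = rmi_at_100 * WH_PER_RMI_AT_100 / 1000        # ~76.0 kWh incl. buffer
buffer_kwh = 0.045 * total_kwh                           # static ~3.4 kWh above 0 rmi
usable_now_kwh = rmi_now * WH_PER_DISPLAYED_RMI / 1000   # ~36.3 kWh before the buffer

# Consistency check: 234 * 1.045 ~= 244.5 ~= 245, i.e. the two constants
# differ by exactly the 4.5% buffer.
print(f"usable now: {usable_now_kwh:.1f} kWh, buffer: {buffer_kwh:.1f} kWh")
```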
 
The reason it is NOT just a philosophical distinction
I don't think so, since addition of energy to the battery would work in reverse -- the reserve would shrink according to the way I set it up.
So this still looks 'philosophical' to me, and since the math works out the same when calculating degradation I'm not going to lose any sleep.

I agree though that SMT always declaring the same energy reserve independent of SoC points to your interpretation. Thanks for the discussion.
 
I don't think so, since addition of energy to the battery would work in reverse -- the reserve would shrink according to the way I set it up.
So this still looks 'philosophical' to me

Well, if it works that way, that would mean that you are adding less energy per rated mile than you think, as you charge (so charging efficiency would be correct). And using less energy than you think per rated mile as you discharge.

This is functionally equivalent (but philosophically different I suppose) to a smaller constant (234Wh/rmi AWD 2019/2018, 223Wh/rmi RWD) for charging and discharge - which is what is happening.

So, yes, you could think of it as the buffer getting bigger as you discharge - then that truly would be equivalent to what I am saying (less energy per rated mile going in and out!). But physically that’s not what is happening. The buffer will be 4.5% of your 100% energy when you hit 0 rated miles. What it is before that point is really not important, of course...since you are not using it. But it is easiest to think about it as being static (and probably most accurate).

However, it is important to realize that each rated mile (displayed) is 234Wh (AWD 2019/2018). Not 245Wh. That 245Wh is 4.5% bigger, and accounts for the buffer (so this is the constant to use if you want to figure out your total capacity at 100%, including buffer - but the formula only works as a representation of energy available in the battery when you use the 100% rated miles - you can’t use miles at 50% and multiply by 245Wh/rmi to calculate remaining energy - to do that accurately you have to use the 4.5% of 100% energy for the buffer and add 234Wh/rmi * remaining rated miles).

Coming full circle, to explicitly describe the equivalency, if you want, you can think of each rated mile (AWD 2019/2018) as “using” 245Wh - but only 234Wh is used, while 11Wh is stuffed into the buffer (not used!). And the reverse happens when charging. But that seems like unnecessary complication, and the fact remains with this formulation that only 234Wh are actually being used (or added!) per rated mile.

It’s not a huge issue, but to think about it most simply (and closest to reality), think of the buffer as a static 4.5% of your 100% energy available. It never gets drawn down or changes in size unless your battery capacity reduces (when it scales to match the reduction - it is always 4.5% of the 100% value), or you drop below 0 rated miles.

Probably best to take this to a more relevant thread if we want to discuss further - this is related to battery capacity but is a little off topic here.

But since you have SMT, you can always calculate using the formulas I provide and verify that it really does work this way, if you want.
 
We’ll see if it lasts but I’ve gained 6 miles after 2 weeks of not plugging in and charging every day. I go from about 90% to 30% over the course of a week. The car sits at home in a garage for extended periods when I’m not driving.
 
Looks like SMT assumes 77.8kWh as a typical starting capacity, but fairly sure it can sometimes be higher (based on what I have heard people say only, though, so no idea). Also the EPA docs say it's often as high as 79kWh (but those may be slightly "different" kWh...meaning they have a small error in the BMS measurement - or, possibly, during the EPA test, the actual BMS remaining capacity goes to indicated zero somewhat before the car stops rolling and shuts down).

The idea I've had in my head is simply that the discharge rates during the EPA testing cycles are lower than what Tesla otherwise counts on during driving. The fudge factor accounts for power differences, but not for the impact that power has on extractable energy.

For example, in real environments with wind resistance, we could expect the car is using ~43% more power (1/~0.7 constant ~= 143%). That constant exists for other reasons too, I'm just doing rough extrapolation and not using actual aerodynamic data. Actually, your "porkiness factor" from the constants thread is perhaps a better thing to use, but I digress.

It is well known that higher discharge rates on typical Li-ion cells result in less energy extracted. The first graph on this page shows that, for example: Calculating the Battery Runtime - Battery University ("energy" here would be the area under the curves).

The difference between actual highway driving (let's say 15 kW) and the simulated load in the rating tests (let's say 10.5 kW) isn't huge in terms of C-rate. We're talking something like 0.19C vs 0.13C. But the difference between the SMT "Tesla" value and the EPA values is also small, so that may be the reason for the discrepancy.

My above theory may be based on a misunderstanding and complete garbage.
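For reference, those C-rates are straightforward to reproduce (pack size and power draws are the assumed round numbers from the post above):

```python
# The C-rates quoted above, spelled out. All inputs are assumptions.
pack_kwh = 78.0        # assumed nominal pack energy
highway_kw = 15.0      # assumed real-world highway draw
epa_cycle_kw = 10.5    # assumed average draw over the rating cycles

print(f"highway:   {highway_kw / pack_kwh:.2f}C")    # ~0.19C
print(f"EPA cycle: {epa_cycle_kw / pack_kwh:.2f}C")  # ~0.13C
```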
 

The capacity is always the same; Tesla aims for around 77 kWh.


There is some small variation in how much you can extract from the pack depending on discharge speed, but for a Model 3 battery there are other factors which influence this more, e.g. heat losses.

Some packs are better than others and can just squeeze out a few more amp-hours, but overall the Model 3 battery pack is meant to hold 78.5 kWh, with the bottom 4.5% locked out from the display (so 0 to 100% is 75 kWh). And from cell trials we know that the pack is actually an 80 kWh pack with a 2.5 kWh brick protection.

I'm sure that if discharged at 0.001C the pack might be able to give, say, 85 kWh, but this is irrelevant for usual use.
 
there is some small variety in how much you can extract from the pack depending on discharge speed but for a model 3 battery there are other factors which influence this more i. e. heat losses.

I believe he was referring to the discrepancy between the Tesla-measured capacity in the EPA submissions (usually measured in excess of 79kWh, to when the car stops moving, for a car with ~4000 miles on it), and the typical ~78kWh that is usually observed for a new car on the CAN bus using SMT (I am sure we have plenty of data from SMT for cars with 4000 miles and typically they will be between 77 and 78kWh I think).

The question is why there is a discrepancy there. It’s not a large difference or particularly important, but it is a curious discrepancy.

Could be scaling, could be heat losses (Tesla CAN bus capacity numbers could assume a higher average current draw than the EPA test average draw), could be a little anti-brick reserve below the buffer which can actually be used in low-draw conditions, etc. I have no idea.

In the end the starting capacity is (typically) between 78kWh and 79.5kWh, it seems. But fairly sure SMT would not ever show a number as high as 79.5kWh? Even if that is the “actual” capacity (for a given current draw).
 
I believe he was referring to the discrepancy between the Tesla-measured capacity in the EPA submissions (usually measured in excess of 79kWh, to when the car stops moving, for a car with ~4000 miles on it), and the typical ~78kWh that is usually observed for a new car on the CAN bus using SMT.

The question is why there is a discrepancy there. It’s not a large difference or particularly important, but it is a curious discrepancy.
Indeed. Upon re-reading my post I realised I didn't make that clear, since I had stripped down the part of your post I was quoting; that's my bad.

Although I kinda want to know more about these cell trials, especially at what rate they were discharging and at what temperatures, etc.
 