This thread really sparked an interest in me, so I got into searching and thinking. Here's my speculation about what's happening.
DISCLAIMER: I'm not an engineer or a technician. Everything I know about EVs and batteries I learned online, reading voraciously about this very exciting technology.
First, I'll sum up what I understood so far:
- OP @supratachophobia has a Model S 90D. He also monitors his car's CAN bus messages using fan-made software like TM-Spy to spy on his BMS values. He claims that his car shows rated miles on the dash by dividing the nominal capacity his pack reports (including the alleged 4 kWh buffer) by the rated-mile consumption figure (revealed by @wk057 in his amazing work). However, his observations suggest this Rated Miles = nominal capacity / RM efficiency formula changes as the battery SoC gets lower, so much so that he loses 4-6% in the bottom 20% of his charge.
- @wk057 supports this idea and suggests this just might be Tesla's way of trying to hide the above-average degradation of the original 90 packs. As we all remember, the original 90s lost a lot of rated miles quickly when deliveries first started. Then their Supercharging C-rates dropped, and then maybe this update was pushed, hiding the lost rated miles by spreading the degradation across the entire discharge. To make things more worrying, he shared his working theory that in virtually all cars, including the Model 3, consumption figures are understated in the first half of the discharge and overstated in the second half.
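Just to make the OP's dash formula concrete, here's a toy Python sketch. Both constants are hypothetical placeholders, not real Tesla values:

```python
# Sketch of the dash formula the OP describes (all constants hypothetical):
# Rated Miles = nominal pack capacity / rated-mile efficiency constant
nominal_kwh = 85.8        # example BMS-reported nominal capacity, buffer included
rm_wh_per_mile = 295.0    # hypothetical rated-mile consumption constant, Wh/mi

rated_miles = nominal_kwh * 1000 / rm_wh_per_mile
print(f"rated miles at full: {rated_miles:.0f}")

# The OP's observation: in the bottom 20% of charge, the displayed figure
# runs 4-6% short of what this simple division predicts.
shortfall = rated_miles * 0.05
print(f"a 5% shortfall would be ~{shortfall:.0f} miles")
```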
Here's how I think about it before we jump the gun on Tesla.
Batteries are not a pint, a jug or a litre. What you get out of one depends on A LOT of different factors that are hard - if not impossible - to replicate to the decimal every single time. So SoC determination is always a guess. We read from the cars' CAN bus that they have xx kWh of energy in them, but we don't know how that energy figure was derived. Basic math suggests that for the 90 pack, every cell is 3.3 Ah at 3.7 V nominal, and there is a 96S74P setup, which works out to 86.7 kWh nominal. In reality, however, battery capacity is the total area under the discharge voltage curve, which changes a lot with every situation. So nominal values just give us a ballpark idea.
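The back-of-the-envelope math above, spelled out:

```python
# The 90-pack nominal-capacity estimate from the post:
cells_series = 96      # 96S
cells_parallel = 74    # 74P
cell_ah = 3.3          # Ah per cell
cell_v_nominal = 3.7   # V nominal per cell

pack_wh = cells_series * cells_parallel * cell_ah * cell_v_nominal
print(f"{pack_wh / 1000:.1f} kWh nominal")  # prints: 86.7 kWh nominal
```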
Also, the voltage range is determined by Tesla, not the battery itself. In Tesla's case, 4.2 V = full and 3.0 V = empty. So, as you can see from the graph below (I know it isn't accurate, it's just to give an idea), assuming a linear nominal voltage x amp-hour = capacity formula is wrong. When you consider the discharge period and assume linearity, it could be that capacity is understated in the first half and overstated in the second. I think the way 'consumption' is shown is just the delta SoC divided by the distance traveled, repeated every n time period, which could explain the variance in consumption figures depending on SoC,
@wk057. (sorry for the terrible drawing)
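To illustrate the non-linearity point, here's a toy sketch with a made-up discharge curve (not real NCA cell data): because the voltage sags as the cell empties, the first half of the amp-hours carries more than half of the total energy, so any gauge that treats energy as proportional to Ah would under-count early and over-count late.

```python
# Toy discharge curve for a single cell: sketch only, not real Tesla data.
Q = 3.3  # Ah, nominal cell capacity from the 90-pack math above

def v(q):
    # Hypothetical voltage after discharging q amp-hours:
    # starts at 4.2 V, sags toward 3.0 V, dropping faster near empty
    x = q / Q
    return 4.2 - 0.7 * x - 0.5 * x ** 4

def energy_wh(q0, q1, steps=10000):
    # Trapezoid-rule area under the voltage curve between q0 and q1 (Wh)
    dq = (q1 - q0) / steps
    return sum((v(q0 + i * dq) + v(q0 + (i + 1) * dq)) / 2 * dq
               for i in range(steps))

top = energy_wh(0, Q / 2)      # Wh delivered in the first half of the Ah
bottom = energy_wh(Q / 2, Q)   # Wh delivered in the second half
total = top + bottom
print(f"first half:  {top:.2f} Wh ({top / total:.1%} of total)")
print(f"second half: {bottom:.2f} Wh ({bottom / total:.1%} of total)")
```

The exact split depends entirely on the curve shape, which is why nominal V x Ah can only ever be a ballpark.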
However, I refuse to believe Tesla would report kWh capacity in their BMS just by multiplying nominal voltage by rated amp-hour capacity. So I got into researching how fuel gauging is done in batteries. Gotta say, it is much more complex than I expected. Here's what I've got so far:
There isn't a single true method. There are multiple methods, and a combination of them is used. We don't know which Tesla uses, but in general:
1) Book-keeping, also called coulomb counting. The manufacturer knows the rated ideal capacity of the battery when full and tells the BMS at the beginning. Starting from the first cycle, the BMS counts the coulombs going in and out, keeping the books to report how much 'should be' in the pack.
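A minimal sketch of what a coulomb counter does - all numbers hypothetical, just the idea:

```python
# Minimal coulomb-counting ("book-keeping") sketch. Numbers are illustrative.
class CoulombCounter:
    def __init__(self, rated_ah):
        self.rated_ah = rated_ah      # capacity the BMS is told at the start
        self.remaining_ah = rated_ah  # what "should be" left in the pack

    def step(self, current_a, dt_s):
        # current_a > 0 = discharging, current_a < 0 = charging (wall / regen)
        self.remaining_ah -= current_a * dt_s / 3600.0

    def soc(self):
        return self.remaining_ah / self.rated_ah

cc = CoulombCounter(rated_ah=244.2)   # 74 * 3.3 Ah from the 90-pack math
cc.step(current_a=100.0, dt_s=3600)   # one hour at a steady 100 A discharge
print(f"SoC after 1 h @ 100 A: {cc.soc():.1%}")
```

The weakness is obvious from the code: any sensor drift in `current_a` compounds forever, which is exactly why a correction method is needed.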
However, this can accumulate errors over time, so an adjustment method is also needed:
2) Impedance and open-circuit voltage measurement: A battery's voltage doesn't say much about SoC under load, but when the contactors are open - hence the pack is at open-circuit voltage - the battery's impedance, voltage and temperature inputs give an accurate reading of SoC and SoH. There are very complex and proprietary algorithms for this, but every time we put the car in 'P' and leave, the SoC is adjusted.
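The simplest possible version of that correction is a rest-voltage-to-SoC lookup. The table below is made up for illustration - real OCV curves are chemistry- and temperature-dependent, and the actual algorithms also fold in impedance:

```python
# Toy open-circuit-voltage correction: when the car rests (contactors open),
# map the settled cell voltage back to SoC via a lookup table.
# Table values are invented for illustration, not a real NCA curve.
OCV_TABLE = [(3.0, 0.0), (3.4, 0.2), (3.6, 0.5), (3.9, 0.8), (4.2, 1.0)]

def soc_from_ocv(v_rest):
    # Clamp below/above the table, linearly interpolate in between
    if v_rest <= OCV_TABLE[0][0]:
        return 0.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v_rest <= v1:
            return s0 + (s1 - s0) * (v_rest - v0) / (v1 - v0)
    return 1.0

print(f"resting at 3.50 V -> {soc_from_ocv(3.50):.0%} SoC")
```

The BMS would then nudge the coulomb counter's book value toward this reading each time the pack rests.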
So my theory is that Tesla reports the rated miles on the dash using the coulomb counter's figure when full. However, as the pack is depleted, other methods are used to adjust the remaining capacity more accurately, so the rated-miles figure is also adjusted. It could also be that the CAN bus message of remaining nominal/usable kWh we decoded is only what the coulomb counter is reporting. AFAIK, the UI SoC and the CAN bus SoC don't match up completely: when the UI says 0%, the CAN bus can still report some usable kWh. So expecting the BMS CAN bus kWh figure to always match the rated-miles figure to the decimal could be wrong because of this, @supratachophobia.
However, Tesla isn't off the hook just yet, because even if they are reporting RM when full using the coulomb counter's figures, they shouldn't be using nominal capacity; they should be using usable. My theory explains the anomalies and non-linearity we see during discharge, but Tesla could indeed be hiding the above-average degradation of the early 90s just by changing a line of code for the rated miles displayed at high SoC.
Really interested to hear what you guys think. Please correct me if I'm wrong, as everything I learned, I learned online.