
Wiki Sudden Loss Of Range With 2019.16.x Software

I've already done that. Real degradation is the original capacity when new minus the existing capacity when fully charged. Software capping a battery so that it cannot reach its current existing capacity is not any kind of degradation at all.

I will remind you that I'm STILL on v8 and that my actual degradation is 11 miles down from the original 253 miles. This is after 100K miles. Using existing fleet data, I can expect to lose another 5 miles over the next 100K miles.

If I were to take v9, I could very well end up losing another 20% on top of my existing minimal 4.5%, and that loss would be instant.

I will also remind you that the only known cases where manufacturers software capped batteries after the fact are cases where those batteries were deemed defective and the capping was done as a safety measure until those devices could be returned for full refunds.

BTW, we're officially into rinse and repeat mode o_O

So glad I'm still on v8 :p
Are you able to read back CAN bus data via ScanMyTesla to get max cell voltage data?
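
For anyone curious how tools like ScanMyTesla or TM-Spy get at this, below is a minimal, illustrative python-can sketch for sniffing raw frames on a Linux SocketCAN interface. The message ID and byte scaling are hypothetical placeholders, not confirmed Tesla values; the real IDs and scalings were reverse-engineered by the app authors.

```python
import can

BMS_CELL_MSG_ID = 0x6F2  # HYPOTHETICAL placeholder, not a confirmed Tesla ID

# Open a Linux SocketCAN interface; assumes the OBD dongle is bridged to "can0".
bus = can.interface.Bus(channel="can0", interface="socketcan")
try:
    while True:
        msg = bus.recv(timeout=1.0)   # returns None on timeout
        if msg is None or msg.arbitration_id != BMS_CELL_MSG_ID:
            continue
        # Illustrative decode only: pretend bytes 2-3 hold one cell's
        # voltage as a little-endian integer in units of 0.1 mV.
        raw = int.from_bytes(msg.data[2:4], "little")
        print(f"cell voltage ~ {raw * 0.0001:.3f} V")
finally:
    bus.shutdown()
```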
 
Okay, I've been staying by the river with the luggage while the pundits explain and debate some things that have me more confused than ever. Here is my question:

My range in our 2014 S85 appears to be the same after all these updates, roughly 255-257 miles on a 100% charge. (I guess this is deemed batterygate.)

My Supercharging rate has plummeted dramatically so charging from X% to Y% takes 20-30% longer. (I guess this is chargegate.)

My very narrow understanding of electricity can be boiled down to volts times amperes equals watts. Therefore, since the charging rate is much slower, the car is receiving fewer watts. At least that is what I think.

So many of these posts are referring to 4.2 volts or 4.1 volts with handy jargon like vmax and other terms.

In the bad old days, the screen used to display volts and amperes when Supercharging. Generally, the voltage was in the 300-400 range, and amperes in excess of 250. The volts would rise slightly while the amperage dropped more quickly when the car was getting long into the taper.

I cannot reconcile the car receiving 300+ volts during Supercharging sessions with all this talk of 4.2 volts. Clearly, I am missing something.

What is everyone talking about, please? Thank you for realizing that not everyone has a PhD in electrical engineering.
 

The 4.2 volts refers to the average voltage of a single cell while NOT under load.
[Screenshot: TM-Spy readout showing individual cell voltages]
Really just better if you read the thread, since this has all been thoroughly explained before... multiple times.
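
To answer the reconciliation question directly: the 300-400 V on the screen is the whole pack, not a cell. By the commonly cited Model S 85 topology, 96 cell groups are wired in series (16 modules x 6 series groups), so pack voltage is roughly 96 times the per-cell voltage. A quick sketch:

```python
# Back-of-envelope: reconcile per-cell volts with the 300-400 V the screen
# shows. Assumes the commonly cited Model S 85 topology of 96 cell groups
# in series (16 modules x 6 series groups).
CELLS_IN_SERIES = 96

for cell_v in (3.00, 3.66, 4.10, 4.20):  # Vmin-ish, Vnom, capped Vmax, full Vmax
    print(f"{cell_v:.2f} V/cell -> pack ~ {cell_v * CELLS_IN_SERIES:.0f} V")

# 3.00 V/cell -> pack ~ 288 V
# 3.66 V/cell -> pack ~ 351 V
# 4.10 V/cell -> pack ~ 394 V
# 4.20 V/cell -> pack ~ 403 V
```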
 
I, like you, found all of this pretty new. But the 4.2 Volts thing is pretty simple. It may help to think of it like a fuel gauge. The technocrats and pedants will no doubt feel faint at this Janet & John explanation, so take it as VERY broad brush, or as a concept.
Each battery cell is designed to have a nominal voltage (Vnom). Think of this as the start point, or midpoint. (Vnom is actually 3.66V.) It's the start point because the cell can take more charge than that, up to a higher voltage, and lose charge, ending up with less voltage.
However, whilst adding more charge (raising the voltage) is possible, charging beyond a certain level has the potential to damage the cell. The job of the Battery Management System (BMS) is to stop that happening. That top figure (Vmax) is 4.2V, which is pretty much an industry standard. The same goes at the bottom end (Vmin).
When all the cells in the battery pack are at Vmax, the BMS reports the battery as 100% full (hence fuel gauge). When all the cells are at Vmin, it reports it as 0% (fuel gauge). This doesn't mean the battery is actually 100% full or 0% empty, just that Vmax or Vmin has been reached. When all the cells are at 4.2V, the sum of all the charge in all the cells totals 70 kWh in a 70 kWh battery (think 70 gallons).
Now the sneaky bit, and the point of this thread. If the BMS changes Vmax from 4.2V to something less, say 4.07V, then when all the cells are at 4.07V (the new Vmax), the BMS sees the cells are all at Vmax, reports the battery as 100% full, and stops the cells from taking on any more charge. But now the sum of all the cells at 4.07V only totals 58 kWh (58 gallons), not the original 70 kWh (70 gallons). This is batterygate.
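
To make the fuel-gauge arithmetic concrete, here's a toy sketch of the mechanism. It treats stored energy as roughly linear in cell voltage between Vmin and Vmax; real Li-ion charge-vs-voltage curves are NOT linear, so this won't reproduce the exact 70 -> 58 kWh figure, it only shows how a lower Vmax shrinks what "100%" means.

```python
# Toy version of the fuel-gauge analogy above. Illustrative only.
VMIN, VMAX_DESIGN = 3.0, 4.2   # cell limits as discussed in this thread
PACK_KWH_DESIGN = 70.0         # nameplate energy with every cell at 4.2 V

def reported_full_kwh(vmax_cap):
    """Energy on board when the BMS declares '100%' at vmax_cap volts/cell."""
    return PACK_KWH_DESIGN * (vmax_cap - VMIN) / (VMAX_DESIGN - VMIN)

for cap in (4.20, 4.10, 4.07):
    print(f"Vmax {cap:.2f} V -> ~{reported_full_kwh(cap):.0f} kWh shown as full")
```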

Chargegate is a completely different kettle of fish.
 
Except that for Li-ion batteries there is actually a disclaimer that the company doesn't guarantee the tank will continue to hold 20 gallons over the life of the car. It would have been better had Tesla specified what capacity degradation they would have considered "abnormal" over the years, but that would have been hard, since it depends on the battery chemistry, on the way the battery was charged and discharged over the years, and also on sheer luck with how "above average" the cells in the pack are. My guess is that they themselves didn't yet know how the batteries would age over the years, because they were on the bleeding edge of how much capacity you could cram into a given volume and weight.

On the other hand, if you could prove that even when new your gas tank couldn't hold 20 gallons, or only held 20 gallons when you filled up part of the gas tank that shouldn't have been filled with fuel, while someone else's gas tank did hold 20 gallons, then you have a different case (and one for which class actions against manufacturers have succeeded in the past).
You keep arguing that a Tesla can only be expected to retain a majority (if that much) of the capacity it had when new (and you haven't defined "new" yet: 1 day, a week, a month...?). I don't think that sort of thing will fly in U.S. courts; it hasn't worked for others, and it certainly isn't going to work for Tesla in the marketplace. Tesla lives or dies by reputation. What do you think will happen to sales if what you are claiming becomes the common impression?
 
From a study on battery aging, here is the capacity fade of a Panasonic NCA cell similar to the one used in your 85 kWh packs:

[Graph: capacity fade vs. equivalent full cycles under CC-CV charging]


The charging method is CCCV at 77°F. 3 A (1.1C) would need 95 min for 0 to 100% SoC at 4.2 V, but imho it is comparable to the 2014 SuC charging protocol as far as degradation is concerned.
The values fit well with the datasheet, where 0.5C CCCV at 77°F is recommended, and ~500 equivalent full cycles (EFCs) is the nominal figure at 80% capacity.
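
A quick sanity check on those numbers, assuming the ~2.7 Ah rated capacity of an NCR18650PD-class cell (my assumption, not from the post), and a ~250-mile rated charge per full cycle:

```python
# C-rate arithmetic for the figures above.
capacity_ah = 2.7   # assumed rated capacity of an NCR18650PD-class cell
current_a = 3.0

c_rate = current_a / capacity_ah
print(f"C-rate at 3 A: {c_rate:.2f}C")         # ~1.1C, matching the post

# The constant-current phase alone would take 1/C hours...
print(f"CC-only time: {60 / c_rate:.0f} min")  # ~54 min
# ...but CC-CV tapers the current once the cell reaches 4.2 V, which is
# how a full 0-100% charge stretches out to the quoted ~95 min.

# 500 EFCs at roughly one rated charge (~250 mi) per cycle:
print(f"500 EFCs ~ {500 * 250:,} miles before 80% capacity")
```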
 
Thank you for effectively explaining what some people on this thread are having a really hard time grasping. There are many valid explanations as to why the BMS is reducing max voltage. All are possibilities, just as it is possible Vmax reduction is done solely for legally questionable reasons.

Some people here have a very narrow view of what constitutes "degradation" and how it's dealt with. According to them, reducing Vmax is never used to deal with any degradation type. I find this line of thinking naïve given all the chemical, mechanical, and electrical interactions in a battery. Regardless, a possible solution shouldn't be excluded just because it isn't usually done. People need to keep an open mind and stop assuming the worst. I'm sure Tesla's crap communication skills aren't helping.

I find it amusing to have internet experts continuously explain what voltage is to those of us with electrical engineering/technical backgrounds. Perhaps I should use the Funny button more, similar to our passive aggressive friends in this thread. On second thought, I’ll just hit the straight up Disagree to be clear.
I think it goes beyond "crap communication skills" when the only information they have given is either blatantly false or misleading.

They have denied that this capping is safety related. They are denying that victims' batteries are anything but "normal", yet those batteries are being treated very differently than most others. They claim the cap is for longevity, but it is a reasonable question whether that longevity is just enough to get them past the warranty, when the only known customer alternative after that time is a $20,000 replacement.
 

Thanks for the question, and I see @sorka and @Ferrycraigs have been helping out to explain.

But regarding this:
not everyone has a PhD in electrical engineering

No worries. You do not have to have a PhD in EE to figure this out. In fact, there is a poster who touts his EE credentials every chance he gets, but astonishingly he has gotten everything wrong all along. Thanks for posting in this thread with your good questions.
 

Would you please provide the link to this study?

The graphs seem to indicate the Tesla BMS at the SuC has basically been doing the job of the traffic cop at my town's busy intersection: he just sits in a V8 SUV, polluting the air with the engine running while playing with his cellphone, waiting for an accident to happen, when all along he should be out directing the heavy traffic and "protecting" people. Our batteries have not been protected by design, it looks to me. Shove as much high voltage and amperage into that pack as possible so we can claim no competition. And if it's damaged, well, we software band-aid it by capping your capacity; that way even our warranty obligation can be handled by software updates.

Also, which datasheet are you referring to?

Thanks.
 
Would you please provide the link to this study?
The dissertation of Peter Keil, which was linked here already. https://mediatum.ub.tum.de/doc/1355829/file.pdf

Also, which datasheet are you referring to?
NCR18650A

Please note that the graph is at 77°F, where Li-plating is the main driver of degradation at 3 A! I don't know how the thermal management at the SuC was back in 2014, but maybe some of you were already reading out battery temperature at that time?
 
We know they are capping volts. This is theft of horsepower and range - products we paid for. They are guilty of manipulating officially tested EPA ratings à la dieselgate - not good. We don't need to know why they did this; we know they did, and that it is illegal.

If they did it to reduce the impact of a design flaw, that's a recall problem. If they did it to reduce the impact of warranty claims on Tesla's budget, it's still a warranty issue. What we know for certain is that it is not something owners have done, and it is not natural. Tesla is avoiding telling us the reason for the thefts, but they are still guilty of theft. They need to return what was stolen; it's as simple as that.

If they can't safely return what was stolen, they need to perform a safety recall. If they can't afford to warranty the product as they sold it... that isn't our fault either. In every possible explanation, Tesla is punishing owners for Tesla's mistakes. This is why they are not telling us why they did it - and probably why they returned *some* of the stolen property starting a day after the class action suit was filed. They're afraid of telling us the reason, and want to avoid the discovery process exposing it. Unfortunately, anything less than 100% restoration of the missing volts is still theft and keeps the suit active - discovery is inevitable as long as they refuse to charge batteries that they have chosen to reduce with software-capped voltages.

The batteries can't be in spec if the voltage is software capped. The EPA's rated spec requires a charge to 4.2 V, and anything less than that at 100% is a dieselgate manipulation. Customer cars can't be software manipulated so they no longer meet what the EPA certified - VW tried this, and that's where the term dieselgate comes from. The spec requires 4.2 V; range, performance, and any other side effects of voltage manipulation are just side effects. We are not discussing range, or "within spec range"; we are discussing voltage that has been illegally capped. Range is impacted, but it is a shadow of that central issue and not the issue itself.

Similarly, Tesla could replace our batteries with real 60 kWh packs that have the same range as the software cap allows.

Wonderful post!
That sums it all up. Nothing more to add.
This is what the lawyers should base the arbitration upon.
 
The title of this thread is “Sudden loss of Range with 2019.16.x software”. For me, that sudden loss of Range is 100% down to an artificial capping of the battery. I do not see it as due to degradation.
Indirectly, if the affected cars have batteries that are beginning to fail (and that is still only supposition), then I agree the fact that they are beginning to fail could be described as degradation, which led to Tesla capping the battery. But I regard that as an indirect link. So I may have a foot in both camps, but it feels more like a full foot in one and a toe in the other.
 
For anyone wanting more information on Li-ion batteries, I found these two articles to be especially relevant and helpful:

Charging Lithium-Ion Batteries

https://batteryuniversity.com/index.php/learn/article/how_to_prolong_lithium_based_batteries

Just FYI, this site is more or less a scam. While most information is at least somewhat factual, there are no sources to back up any of the claims of the author. Also, there are a lot of generalizations that are just not applicable for most use cases or just plain wrong. There is definitely not any institution called a "Battery University".


Great study, but two important caveats:
1) The cell used in this study, the Panasonic NCR18650PD, is not the exact chemistry of the Tesla cells and most likely bears the most similarity to the earliest "A" battery packs, which were capped at 90 kW for Supercharging. Starting with the "B" packs, new cells were used that were more optimized for fast charging, with a corresponding increase in max Supercharging speed to 120 kW.

2) CC-CV charging at 3 A (1.1C) or more is definitely hurting the battery and not advisable in any case. This is however not the charging protocol that Tesla uses for Supercharging! The actual Supercharging protocols were also tested in the study and give very different results. Bear in mind this corresponds to an older cell in the "A" batteries that was only charged at 1.1C max (corresponding to the "SC-3A (3.2V)" trace in the graphic below). The other protocols are faster and lead to rapid degradation. It becomes clear that while regular Supercharging definitely increases degradation, if you limit charging to 4.1 V (~90%) you can expect a reasonable cycle life of 500 cycles to 80%. Keep in mind that the actual degradation is likely to be much lower due to less stressful AC charging, improved chemistry in the actual cars, and the batteries' "self-healing effect". This effect has unfortunately never been studied extensively, but it is clear from multiple studies and sources that Tesla batteries seem to recover a lot of lost capacity when they are stored for a longer time at low-ish SoCs. Maybe I will make a longer post about this, if anyone is interested.
[Graph: capacity fade for the different Supercharging protocols tested in the study]
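
To visualize the taper that earlier posts describe (volts creeping up while amps fall off), here's a minimal single-cell CC-CV charge simulation. All constants are made up for illustration; real Supercharger protocols are stepped and pack-level, as the study shows.

```python
# Toy cell model: linear open-circuit voltage plus a fixed internal
# resistance -- illustrative numbers only, not measured Tesla values.
CAP_AH, R_INT = 2.7, 0.05       # assumed capacity (Ah) and resistance (ohm)
V_MIN, V_MAX = 3.0, 4.2         # open-circuit voltage span
I_CC, I_CUTOFF = 3.0, 0.15      # CC-phase current and CV cutoff (A)
DT_H = 1.0 / 60.0               # one-minute time steps

soc, t_min = 0.0, 0
while True:
    ocv = V_MIN + (V_MAX - V_MIN) * soc   # open-circuit voltage at this SoC
    if ocv + I_CC * R_INT < V_MAX:
        i = I_CC                          # CC phase: hold current constant
    else:
        i = (V_MAX - ocv) / R_INT         # CV phase: hold voltage, current tapers
    if i < I_CUTOFF:
        break                             # charger calls it "full"
    soc = min(1.0, soc + i * DT_H / CAP_AH)
    t_min += 1
    if t_min % 15 == 0:
        print(f"t={t_min:3d} min  I={i:4.2f} A  SoC={soc:5.1%}")
```

The printout shows current pinned at 3 A early on, then collapsing toward the cutoff once the cell hits 4.2 V, which is why the last few percent of a charge take so long.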
 
The title of this thread is “Sudden loss of Range with 2019.16.x software”

Sadly, NOT for the poster you are replying to.

Remember what he has believed all along: he keeps repeating different versions of the same thought process, with no intention of giving up and accepting the facts presented here, only energetically and fervently disagreeing with the impacted owners' points of view. Let me quote one of his posts as a reminder (the bolding and underlining are mine, for emphasis):

They are not reducing battery capacity. They are reducing usable capacity, which was never advertised to anybody. Having said that, if your range decreases by 30% from the advertised range when the car was new, then you have a legitimate battery warranty claim. It’s obvious that Tesla is reducing the max voltage on the cells for a legitimate concern (probably related to fires). I understand your frustration, but I’m not sure what exactly you would have them do. Replace everyone’s pack instead of reducing max cell voltage? That’s not a realistic solution and is not a contractual obligation they have as long as they don’t continue to reduce the range so it hits the 30% warranty threshold. If people don’t want Tesla messing with their vehicle operation, then disconnect the vehicle from the internet and avoid the updates. Every Tesla owner knows that updates can increase or decrease performance. It’s part of the Tesla experience.
 

Yes, I forgot about Mr. Keil's paper, even though I have read it once and bookmarked it.

Regarding monitoring the battery temperature during Supercharging: well, the owners relied on that mighty Tesla BMS (remember that guy) to do that job and to keep our batteries protected before any damage was done. I certainly never used any tool to watch the Fahrenheit when Supercharging. I napped ;)
 

Bear in mind that the NCR18650A is a very different cell from the NCR18650PD or the Tesla 18650s. The former is rated for a max discharge of 2C (6 A), while the PD is rated for 10 A and can therefore take higher currents.
 
Starting with the "B" packs, new cells were used that were more optimized for fast charging

No, they didn't change the cells between the "A" and "B" packs. The difference was that the internal wiring was beefed up.