Okay, I've been staying by the river with the luggage while the pundits explain and debate some things that have me more confused than ever. Here is my question:
My range in our 2014 S85 appears to be the same after all these updates, roughly 255-257 miles on a 100% charge. (I guess this is deemed batterygate.)
My Supercharging rate has plummeted dramatically so charging from X% to Y% takes 20-30% longer. (I guess this is chargegate.)
My very narrow understanding of electricity can be boiled down to volts times amperes equals watts. Therefore, since the charging rate is much slower, the car is receiving fewer watts. At least that is what I think.
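For what it's worth, that formula checks out against the sort of numbers the screen used to show. A quick sum in Python (the 350 V and 250 A below are just example values in the ranges mentioned here, not readings from any particular session):

```python
# Rough illustration of P = V * I at a Supercharger stall.
volts = 350.0   # example pack voltage during a session
amps = 250.0    # example charging current
watts = volts * amps
print(f"{watts / 1000:.1f} kW")  # prints "87.5 kW"
```

So if the same session now delivers fewer amps at a similar voltage, the car really is receiving fewer watts, exactly as you suspect.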
So many of these posts refer to 4.2 volts or 4.1 volts, with handy jargon like Vmax and other terms.
In the bad old days, the screen used to display volts and amperes while Supercharging. Generally, the voltage was in the 300-400 range, with amperes in excess of 250. The volts would rise slightly while the amperage dropped more quickly once the car got deep into the taper.
I cannot reconcile the car receiving 300+ volts during Supercharging sessions with all this talk of 4.2 volts. Clearly, I am missing something.
What is everyone talking about, please? Thank you for realizing that not everyone has a PhD in electrical engineering.
I, like you, found all of this pretty new. But the 4.2 volts thing is pretty simple. It may help to think of it like a fuel gauge. The technocrats and pedants will no doubt feel faint at this Janet & John explanation, so take it as VERY broad brush, or as a concept.
Each battery cell is designed around a nominal voltage (Vnom). Think of this as the start point, or mid point. (Vnom is actually 3.66 V.) It's the start point because the cell can take on more charge than that, rising to a higher voltage, and can lose charge, ending up at a lower voltage.
However, whilst adding more charge (and hence voltage) is possible, charging beyond a certain level can damage the cell. The job of the Battery Management System (BMS) is to stop that from happening. That top figure (Vmax) is 4.2 V, which is pretty much an industry standard. The same applies at the bottom end, where there is a minimum voltage (Vmin).
When all the cells in the battery pack are at Vmax, the BMS reports the battery as 100% full (hence fuel gauge). When all the cells are at Vmin, it reports 0% (fuel gauge again). This doesn't mean the battery is literally 100% full or 0% empty, just that Vmax or Vmin has been reached. When all the cells are at 4.2 V, the sum of all the charge in all the cells totals 70 kWh (in a 70 kWh battery). (Think 70 gallons.)
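To put the fuel-gauge analogy in concrete terms, here is a deliberately crude Python sketch of the concept. The Vmin of 3.0 V is my own assumption for illustration, and a real BMS does far more than a straight-line voltage mapping (coulomb counting, temperature compensation, cell balancing), so treat this as the broad-brush idea only:

```python
# Very simplified "fuel gauge": map cell voltage onto 0-100%
# between Vmin and Vmax. A real BMS is far more sophisticated;
# this only illustrates the concept described above.
V_MIN = 3.0   # assumed bottom cutoff, volts per cell (illustrative)
V_MAX = 4.2   # industry-standard top cutoff, volts per cell

def reported_percent(cell_volts: float) -> float:
    clamped = max(V_MIN, min(V_MAX, cell_volts))
    return 100.0 * (clamped - V_MIN) / (V_MAX - V_MIN)

print(reported_percent(4.2))   # prints 100.0 -> "full"
print(reported_percent(3.0))   # prints 0.0   -> "empty"
```

The key point: the gauge reads off voltage, not the actual energy in the cells, which is what makes the next bit possible.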
Now the sneaky bit, and the point of this thread. If the BMS changes Vmax from 4.2 V to something less, say 4.07 V, then when all the cells reach 4.07 V (the new Vmax), the BMS sees that every cell is at Vmax, reports the battery as 100% full, and stops the cells from taking on any more charge. But now the sum of the charge in all the cells at 4.07 V totals only 58 kWh (58 gallons), not the original 70 kWh (70 gallons). This is batterygate.
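Two quick sums tie this back to the 300+ volts question upthread. Cell groups in the pack are wired in series, so their voltages add; the series count of 96 below is an assumed figure based on commonly cited Model S pack layouts, not an official one, and the 70/58 kWh values are just the example numbers above:

```python
# Why the screen shows 300-400 V while individual cells sit near
# 4.2 V: series-wired cell groups add their voltages. 96 groups
# is an assumption, not an official Tesla figure.
SERIES_GROUPS = 96
print(f"Pack at old Vmax: {SERIES_GROUPS * 4.2:.1f} V")   # prints "403.2 V"
print(f"Pack at new Vmax: {SERIES_GROUPS * 4.07:.1f} V")  # prints "390.7 V"

# How much reported-full energy the capped Vmax costs, using the
# example 70 kWh and 58 kWh figures from the explanation above:
lost_pct = 100.0 * (70.0 - 58.0) / 70.0
print(f"Energy behind '100%' down about {lost_pct:.0f}%")  # about 17%
```

So both readings are right at once: hundreds of volts across the whole pack, roughly 4 volts per cell.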
Chargegate is a completely different kettle of fish.