What do you estimate the actual charging efficiency would be from the plug to the battery? Not just the charger efficiency.
To clarify I'm talking about granularity in terms of determining long term life. For example, in a Tesla it's trivial to figure out a rough idea of degradation (in percentage) by looking at how much rated range has dropped. I looked at the Kia Soul EV manual and it has a battery SOC meter (which reports straight percentage, so does not show degradation) and a range GOM (guess-o-meter) that varies based on previous energy use. I don't see a straightforward way to get an estimate of degradation. Maybe the European model is different (I didn't see Bjorn's video)?
Then why does the empirical data produce a different value than the 285 Wh/mi you report? Shouting does not make you correct. Talk about the data and the facts. The granularity of all the displays is one mile, so how can one be better than another? Are you trying to make a point about accuracy or granularity? And are you talking about displaying Rated Miles (which I think we're all going to agree is a pretty much useless concept that Tesla does not even apply consistently across its apps) or actual odometer range?
I've stated nothing but facts here. The firmware itself sets the rated constant at 285 Wh/mi for the refresh non-P dual-motor configuration. Your refusing to accept that doesn't change the firmware. My "shouting" is simply because people seem to be skipping this particular fact to fit their arguments.
I'm trying to reconcile the fact you present with the empirical data. I just drove 2,000 miles and RM disappeared at a rate of 273 Wh/RM, not 285 Wh/RM. I charged, either SC or destination L2, recorded the RM, and drove off (consumed kWh was always 0.0). At the next destination I recorded the kWh consumed since the last charge and the change in RM; divide the two and I get 273 Wh/RM. So we have two facts that don't agree with each other. How do we explain that? I'm not attached to my theory or "argument"; I'm searching for an explanation and would appreciate your help.
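The leg-by-leg arithmetic described above can be sketched as follows; the kWh and rated-mile numbers below are hypothetical placeholders, not values from the actual spreadsheet:

```python
# Rated-mile consumption rate from trip-meter data, as described above.
def wh_per_rated_mile(kwh_consumed, rm_start, rm_end):
    """Trip-meter energy divided by rated miles lost over the leg."""
    rm_lost = rm_start - rm_end
    return kwh_consumed * 1000 / rm_lost

# Hypothetical leg: 13.65 kWh consumed while rated miles fell 250 -> 200.
print(round(wh_per_rated_mile(13.65, 250, 200), 1))  # 273.0 Wh/RM
```

Repeating this over every leg and averaging is what produces a stable figure like the 273 Wh/RM reported.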
Interesting. Assuming your notes are correct, I have no explanation for this behavior other than to assume that for some reason your trip meter's kWh count is simply wrong. 273 Wh/mi would mean even less usable capacity than I've measured in multiple 90 packs, since 273 Wh/mi * 294 mi = 80.3 kWh... which is way off. Do you have a trip meter photo for a long drive from ~100%? It would be interesting to see.
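The capacity sanity check in that post is just the consumption rate times the full rated range; the 294 mi figure is taken from the post itself:

```python
# Usable pack energy implied by an observed Wh/RM rate and a 100% rated range.
wh_per_rm = 273          # empirically observed rated-mile consumption rate
full_rated_miles = 294   # rated miles at 100% for this configuration (per post)
implied_kwh = wh_per_rm * full_rated_miles / 1000
print(round(implied_kwh, 1))  # 80.3 kWh, below any measured 90-pack capacity
```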
Great post... very informative. Cold weather has obviously been having an impact on usable battery capacity, e.g. a 400 Wh/mile consumption vs. the static 290 Wh/mile on my S70, resulting in slightly fewer miles, which is not bad.
This is the behavior I keep pointing out. He's taking the energy consumed as reported by the trip meter and dividing by the actual rated miles consumed, exposing the fudge factor. I do the opposite, which is mathematically identical: take the rated miles lost, multiply by the Wh/RM, then compare to what the trip meter says, which seems to heavily understate usage.
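Both directions of that comparison reduce to one ratio. A minimal sketch, using the 285 Wh/RM constant wk057 reports for this configuration; the trip-meter kWh figure is hypothetical:

```python
FIRMWARE_WH_PER_RM = 285  # rated constant reported for this configuration

def fudge_factor(trip_meter_kwh, rm_lost):
    """Ratio of energy the trip meter reports consumed to energy the
    rated-mile constant says should have been consumed. 1.0 = no gap."""
    expected_kwh = rm_lost * FIRMWARE_WH_PER_RM / 1000
    return trip_meter_kwh / expected_kwh

# Hypothetical: trip meter says 13.65 kWh while 50 rated miles were lost.
print(round(fudge_factor(13.65, 50), 3))  # 0.958 -> meter understates ~4%
```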
What you are describing (discrepancy between rated miles actual consumption rate vs. firmware set value) is happening in my case, and in every case that I have seen reported. There has to be a straightforward explanation for this, but from your experience in writing to Tesla, they don't seem very interested in explaining it.
I clarified in my other post, but I'm talking specifically about granularity in terms of determining degradation. Once you account for the conditions @wk057 pointed out upthread (calibration, range mode, immediately off charge), it's a good way to monitor degradation. Most other EVs have either a far less granular display (like the 12 bars in the Leaf) or no real way to monitor degradation (they have GOM range displays that vary based on the conditions the car is driven in and can't be relied on for the same purpose).
I don't have a dash photo, but I've included a screen shot of the spreadsheet I created for the trip. The data was transcribed from the instrument display at each stop and entered into the spreadsheet that evening.
Try measuring your kWh by metering your charge, independently of the car. edit - actually you can also use the energy added as reported by the car during charge as well. Not as good as above, but better than using reported consumption.
I remember a long thread a few years ago about the efficiency numbers and how they appeared to be nonlinear, but I forgot what the factor was. I think it has to do with the buffer, "below zero" range, and also non-propulsion energy usage. I did find this reference: Can someone PLEASE explain how rated range is calculated. Throughout the years and various updates, the below-zero thing has seemingly been eliminated or disproven, but the basics may still apply. Instead of using a fixed trip Wh / (Wh/mi) = RM, it may be a formula that uses trip Wh / (Wh/mi) + Constant mi = RM. In other words, the trip Wh may not necessarily correspond directly with the internal Wh; there may be some other constant in between (back then it was guessed to be a "below zero" buffer).
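The old-thread theory can be sketched like this; the 2-mile constant below is purely illustrative, not a value anyone has confirmed:

```python
def rated_miles(trip_wh, wh_per_mi, constant_mi=0.0):
    """Old-thread theory: RM = trip Wh / (Wh/mi) + a constant offset.
    constant_mi = 0 reduces to the simple fixed-rate model."""
    return trip_wh / wh_per_mi + constant_mi

# With no offset, 14250 Wh at 285 Wh/mi is exactly 50 rated miles...
print(rated_miles(14250, 285))       # 50.0
# ...but a nonzero offset makes the apparent rate differ from 285:
print(rated_miles(14250, 285, 2.0))  # 52.0 -> 14250/52 is ~274 Wh/RM
```

A small constant like this would make the observed Wh/RM come out below the firmware value even if the trip meter and the rated constant were both correct, which is one possible reconciliation of the two figures.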
Unless you always charge to 100%, which is very rare at SCs, that won't tell you how much was consumed on the prior leg. And if you use an L2 charger, then you have to make assumptions about charging efficiency and vampire loss. Why do you think the kWh consumed since last charge as reported in the Trip Meter is less accurate?
That could be of interest, but for the moment I'm looking for an explanation of the discrepancy between what @wk057 found in the BMS, and my empirical data.
Charge to 90%. There's an additional variable, sure, but over many samples the error should average out. I'm sure of it: it's not just inaccurate, there appears to be a large fudge factor. Because energy is always conserved, and the trip meter consumption does not reflect that. Try comparing the self-reported energy-added numbers while charging versus the value reported as consumed. People have come up with all kinds of theories as to why that is, giving Tesla the benefit of the doubt. I no longer do. In the most flattering light it's simply a bug, one that they've neglected to fix for a very long time. If your data were my data, I would inflate your calculated Wh/RM by 10%. I get the feeling the fudge factor is much bigger for P models, however. edit - I have not simply made assumptions; I have made measurements.
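A sketch of the averaging approach suggested above, with made-up session numbers (both the energy-added and reported-consumed values are illustrative, not measurements):

```python
# Each tuple: (kWh added during a charge, kWh the trip meter reported
# consumed on the following leg). Values are made up for illustration.
# Ignores charging losses and vampire drain, which add noise that
# should average out over many samples.
sessions = [(15.0, 13.6), (22.0, 20.1), (18.5, 16.9)]

# Per-session fudge factor: reported consumption / energy put back in.
factors = [consumed / added for added, consumed in sessions]
avg = sum(factors) / len(factors)
print(round(avg, 3))  # 0.911 -> trip meter understates by roughly 9-10%
```

An average near 0.9 over many sessions would correspond to the ~10% inflation suggested above.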
Can you please share your measurements with us? And how do they compare with the 310Wh/RM which the OP found in the BMS for your P90D?