I hesitate to mention another explanation for the range drop that some 40kwh folks are seeing. I have no facts to support this theory, and I certainly hope it is not the case. The possibility is that, as the battery cells/components are tested during production, they are split into three streams: those that fail the test, those that pass, and those that are marginal. Obviously, failed components are rejected. However, given the short supply of battery components at the time the 40's were being built, there might have been a temptation to use marginal cells in the 40's, on the logic that the pack was over-designed. Again, I have no data on this, but we do see this sort of thing happen in production.
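The three-stream sort described above can be sketched as a toy example. To be clear, the thresholds and capacity numbers here are made up for illustration only; they are not Tesla's actual test limits or process:

```python
# Toy sketch of three-stream cell binning -- illustrative only.
# Both thresholds below are hypothetical, not real Tesla QC limits.
REJECT_BELOW = 2.90   # assumed minimum acceptable capacity, Ah
PASS_ABOVE = 3.05     # assumed "clean pass" threshold, Ah

def bin_cell(measured_ah):
    """Sort one tested cell into reject / marginal / pass."""
    if measured_ah < REJECT_BELOW:
        return "reject"
    if measured_ah < PASS_ABOVE:
        return "marginal"
    return "pass"

cells = [2.80, 2.95, 3.10, 3.02, 3.20]
print([bin_cell(c) for c in cells])
# -> ['reject', 'marginal', 'pass', 'marginal', 'pass']
```

The temptation in the theory above amounts to routing the "marginal" stream into production packs instead of scrapping it when supply gets tight.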
Interesting point. Imagine you're in QC at Tesla, with limited supply, and need to set a cutoff for outright cell rejection. You already know the buffer between the 40kwh rating and the 60kwh pack in those cars is enough that you can use more cells that hold marginally less voltage. I'm only noodling your statement, but it would seem reasonable not only to use the cells that test out to a bottom capacity tier, but also to allow more variation within the pack. The flip side would be a greater need for balancing, right? It may not just be a state-of-charge issue from the lack of full charging.
It oversimplifies, but would you rather have "40" cells with individual voltage characteristics all in an upper band, or "60" cells with lower results and potentially greater variability, set to deliver the capacity of a "40"? I'd still be happy, as a 40kwh owner, to have the extra cells, but it depends on QC choices I'm only speculating about.
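The balancing worry above can be shown with a toy model (my assumption for illustration, not Tesla data): treat the pack as a series string of parallel cell groups, where, without balancing, the weakest group limits how much charge the whole string can deliver. Two packs with the same average capacity then differ in usable capacity purely because of spread:

```python
def usable_capacity(brick_ah):
    """Series string: without balancing, every group ("brick") can only
    cycle as much charge as the weakest brick holds, so usable capacity
    is roughly min(brick) * number_of_bricks. Toy model, not Tesla's."""
    return min(brick_ah) * len(brick_ah)

# Two hypothetical 5-brick strings, both averaging 155.0 Ah per brick:
tight = [155.0, 154.5, 155.5, 155.25, 154.75]  # tightly matched bricks
loose = [155.0, 150.0, 160.0, 158.0, 152.0]    # same average, more spread
print(usable_capacity(tight))  # -> 772.5
print(usable_capacity(loose))  # -> 750.0
```

Same total cell capacity, but the looser pack gives up usable range to its weakest group, which is exactly the gap a balancing system has to keep closing.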
Having owned an EV for more than one winter, and having recently gotten involved with lithium batteries in RC cars, I can say I officially know boo about Tesla batteries. The EV's range, however, did go back up with the temps last year (in a way that a great big swell of Tesla owners have yet to experience). It will be nice when Tesla's battery diagnostics are less guarded. It can't be much more than the same process, times a few thousand 18650's.