Nope, I didn't miss that. Did you miss my note about charging from 10% to 80%?

You are correct, of course, but that is factored into the average "kWh for charging".

Let's imagine a race with two identical cars with 75 kWh batteries. Both start at Charger A with 10% charge and want to end up at Charger B, 100 miles away, with 10% charge. I plan to drive at 90 mph and will charge just enough to make that possible. You plan to drive at 100 mph and charge just enough to make that possible.

We both plug in and start charging. For this race, it doesn't matter how fast the low end of the battery charges, because our identical cars are both getting the same rate, whether that's 120 kW or 10 kW. Let's say both batteries hit 72% charge at 12:00 noon, and I unplug and start driving at that time.

Driving at 90 mph uses 464 Wh/mi for a total of 46.4 kWh or 62% of my battery, leaving 10% as I arrive at Charger B. Drive time is 66 min 40 sec, making my arrival 1:06:40.
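The 90 mph leg is easy to check with a few lines of arithmetic. This is just a sketch using the figures assumed above (464 Wh/mi is the consumption I'm assuming at 90 mph, not a measured number):

```python
# Worked numbers for the 90 mph driver, using the figures assumed in the text.
battery_kwh = 75.0
distance_mi = 100.0
speed_mph = 90.0
wh_per_mi = 464.0  # assumed consumption at 90 mph

energy_used_kwh = wh_per_mi * distance_mi / 1000  # energy for the 100 mi leg
pct_used = energy_used_kwh / battery_kwh * 100    # as a share of the pack
drive_min = distance_mi / speed_mph * 60          # driving time in minutes

print(f"{energy_used_kwh:.1f} kWh used ({pct_used:.0f}% of battery)")
print(f"Drive time: {int(drive_min)} min {round(drive_min % 1 * 60)} sec")
```

Starting from 72%, using 62% of the pack leaves right at 10% on arrival.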

Driving at 100 mph uses 535 Wh/mi for a total of 53.5 kWh or 71% of your battery, meaning you need to continue charging from 72% to about 81% after I've left. You're well into the battery's taper, but I'll be generous and say you're still getting 60 kW. So, charging the extra 7.1 kWh you need takes 7 min 6 sec, plus 1 hour of drive time puts you in at 1:07:06.
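Same sketch for the 100 mph driver. The extra energy needed is the consumption difference between the two cars (so both arrive with the same charge), and redoing the arithmetic exactly gives 7 min 6 sec of extra charging at the assumed 60 kW taper rate:

```python
# Worked numbers for the 100 mph driver, using the assumptions in the text:
# 535 Wh/mi at 100 mph, 464 Wh/mi at 90 mph, 60 kW charge rate in the taper.
battery_kwh = 75.0
distance_mi = 100.0
wh_per_mi_100 = 535.0
wh_per_mi_90 = 464.0
charge_kw = 60.0

used_kwh = wh_per_mi_100 * distance_mi / 1000             # 53.5 kWh for the leg
extra_kwh = used_kwh - wh_per_mi_90 * distance_mi / 1000  # beyond what the 90 mph car needed
extra_charge_sec = extra_kwh / charge_kw * 3600           # time to add it at 60 kW
total_sec = distance_mi / 100.0 * 3600 + extra_charge_sec # 1 h drive + extra charging

print(f"Extra charge: {extra_kwh:.1f} kWh -> {extra_charge_sec / 60:.1f} min")
print(f"Arrival: {int(total_sec // 60)} min {round(total_sec % 60)} sec after noon")
```

That puts the 100 mph car in a bit after 1:07, versus 1:06:40 for the 90 mph car.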

So, driving at 100 mph comes in slower, contradicting the "charge speed = drive speed" rule. Not only was average charge speed irrelevant (the charge up to 72% could have been done on L2 and it wouldn't have changed the results), but even the ending charge speed (60 kW = a 206 mph "charge speed") didn't match the optimal drive speed.
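For reference, the "charge speed in mph" conversion is just charger power divided by a reference consumption; the 206 mph figure implies an assumed consumption of roughly 291 Wh/mi (my back-calculation, not a number stated above):

```python
def charge_speed_mph(charge_kw: float, wh_per_mi: float) -> float:
    """Miles of range added per hour of charging at the given power and consumption."""
    return charge_kw * 1000 / wh_per_mi

# ~291 Wh/mi reference consumption reproduces the 206 mph figure for 60 kW
print(f"{charge_speed_mph(60, 291):.0f} mph")
```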