What have you found the charging efficiency to be? Maybe 85%? Presumably, total use at the service entrance would be another 2% higher due to line loss from the panel to the 14-50?
With the caveat that this is only one tiny data point, I did an efficiency experiment today:
• Charged to my usual 60% (at 240 V, 32 A, 7.7 kW).
• Drove the car; according to the trip data: 75.9 miles, 20.0 kWh, 264 Wh/mile.
• Charged back to 60%; the wall meter said I used 23 kWh, and I'd guess that's ±0.5 kWh (the meter readings are a bit coarse).
That works out to 87.0% ± 1.9% efficient. The way to drastically reduce those error bars is to run the experiment over a longer period of time, such as a week or a month. But I would have to forgo opportunity charging to do it, and I am reluctant to do that.
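For anyone who wants to check the arithmetic, here's a minimal sketch; the ±0.5 kWh figure is just my guess at the meter's read precision, as noted above:

```python
# Reproduces the efficiency arithmetic above. The 0.5 kWh meter
# uncertainty is a rough guess, as noted in the text.

trip_kwh = 20.0       # energy consumed driving, per the car's trip data
wall_kwh = 23.0       # energy drawn according to the wall meter
wall_err_kwh = 0.5    # assumed meter-reading uncertainty (coarse readings)

eff = trip_kwh / wall_kwh
# A lower wall reading implies higher efficiency, and vice versa.
eff_hi = trip_kwh / (wall_kwh - wall_err_kwh)
eff_lo = trip_kwh / (wall_kwh + wall_err_kwh)

print(f"efficiency: {eff:.1%}")                     # 87.0%
print(f"range: {eff_lo:.1%} to {eff_hi:.1%}")       # 85.1% to 88.9%
print(f"error bar: ±{(eff_hi - eff_lo) / 2:.1%}")   # ±1.9%
```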
There are numerous variables that will affect the charging efficiency of a battery of a given size:
• Battery temperature: A cold battery will take extra energy to warm and a warm battery will take extra energy to cool.
• Charging speed: Faster charging means the car's support systems run for less time, so fixed overhead losses are smaller; very slow Level 1 charging at 120 V, 12 A is substantially less efficient than typical Level 2 charging (see the rough numbers sketched after this list).
• Battery SOC: Charging a battery near empty is more efficient than charging a battery near full (because the charge rate tapers but the overhead loss continues apace).
• Measurement error: Just how accurate is the usage data provided by the car?
• Preheating/precooling: Cabin or battery preconditioning will, obviously, substantially reduce apparent charging efficiency.
• "Vampire" losses: How does one account for vampire losses when running the experiment over an extended period of time?