There is a difference between efficiency and overhead.
Efficiency typically has to do with power losses incurred within a system; in this case the UMC and the in-car charger are the major components. They cannot convert 100% of the energy from AC to DC to deliver to the pack, but 90+% is attainable. I've seen 94-98% claimed for the Tesla chargers. Note that lower currents are often more efficient due to lower resistive losses.
This is different from the overhead associated with the car running its own systems. The car can consume something like 300-400W when parked. So if you are charging at 120V/12A (1.44kW), and the car eats 400W of that, then only ~72% of the energy delivered by the output side of the charger (after losses) is going to make it into the battery.
If you are charging at 240V/40A (9.6kW), then ~96% of the charger output is making it to the battery, with the rest being eaten by the car's systems while idle. (Both examples above ignore the few percentage points of charger inefficiency.)
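The arithmetic in the two examples above can be sketched out as follows. This is just an illustration of the overhead math, assuming a constant ~400W idle draw while charging; the function name and exact figures are mine, not measured data:

```python
# Fraction of charger output that reaches the pack, given a fixed
# idle draw from the car's own systems while charging.
IDLE_DRAW_W = 400  # assumed constant overhead, per the estimate above

def fraction_to_battery(charge_rate_w, idle_draw_w=IDLE_DRAW_W):
    """Share of the charger's output power actually going into the battery."""
    return (charge_rate_w - idle_draw_w) / charge_rate_w

# 120V / 12A ~= 1.44 kW
low = fraction_to_battery(120 * 12)    # ~0.72
# 240V / 40A = 9.6 kW
high = fraction_to_battery(240 * 40)   # ~0.96
print(f"120V/12A: {low:.0%} to battery; 240V/40A: {high:.0%} to battery")
```

Charger inefficiency (the 94-98% mentioned earlier) would multiply on top of these numbers, but as noted it's only a few points either way.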
But in both those cases, it's car overhead accounting for the major loss, which doesn't affect the charger efficiency. The car is going to consume 400W regardless of whether you are charging at 120V, 240V, or not at all. It does, however, have an impact on how much time it takes to charge the battery... so whether the car saps that energy from a slow charge rate all night, or saps it from the fully charged battery after you finish charging at a higher rate, makes no difference, really.
All of which is to say: while there is undoubtedly some variation in efficiency at different input voltages, the impact isn't enough to get worked up over, IMO.