I’m going by this post (below) which indicates efficiency is higher at 240V for all amperages.
Well, that table contains no examples of a 120V source with higher power than the 240V sources, so it doesn't actually cover "all amperages." The entries start at equal power, and then only the 240V ones go higher, so of course the table can't show whether a high-power, low-voltage source could be more efficient: there are simply no such examples in it.
I'm also confused about where some of those numbers come from. For the 2.88 kW examples I gave, the table shows both of them, but it has a column called "kW actual" where they're listed at different power levels. And then there is this quote after the table:
"the other values are guesses."
I prefer to go with math over guessing about whether 120 × 24 equals 240 × 12 or not (it does: both are 2,880 W). But yeah, voltage boost transformers are not ideal, so starting from the higher voltage to get to 400V might well be more efficient if the total power were equal to begin with.
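To make the "math over guessing" point concrete, here is a trivial sketch (my own illustration, not anything from the linked post) showing the two circuits deliver identical power:

```python
# Delivered power for a simple resistive load is P = V * I,
# so 120 V at 24 A and 240 V at 12 A carry the same power.
def power_w(volts: float, amps: float) -> float:
    """Power in watts: P = V * I."""
    return volts * amps

print(power_w(120, 24))  # 2880.0
print(power_w(240, 12))  # 2880.0
```

Any claimed efficiency difference between those two therefore has to come from somewhere other than the delivered power itself.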
But there is a small factor I hadn't remembered to bring up earlier: running higher current does produce a bit more heat from resistance in the wires (the I²R loss). So at the same power level, the circuit running higher amps has a little extra heat loss in the wiring.
I'll grant that it's rare to find real examples that could actually disprove this stuff, because you just don't find 50 or 60A circuits at 120V, which is what it would take to deliver much higher power (and possibly better efficiency) than the really low-amp 240V ones.