Yes, I agree that efficiency is better when the inverter is driven at higher wattage. But I'd argue that difference is very small compared to the clipping loss.
I do think the industry could, with minimal effort, put out some studies, or even just some basic calculations, explaining both the energy AND cost efficiencies of oversizing vs. undersizing vs. right-sizing.
It might be true that the efficiency difference between running at 80% vs. 100% of capacity is fairly small; looking at some SolarEdge inverter curves, the difference seemed like 0.1% or less.
But in general I think folks with arrays oversized relative to the inverter misjudge two things:
-One, they look at how long they're clipping during the best part of the year and overestimate how much energy they're losing. Even if you're clipping for, say, three hours, the actual area (energy) lost under the curve can be pretty tiny, because the insolation curve is fairly flat near its peak when you clip for "only" three hours. The energy lost that day is a few percent, and this only happens for a few months of the year. At DC/AC ratios above roughly 1.3-1.4 the area gets larger, but by then you're clipping for 4-6 hours a day.
-Two, they overlook the losses during the worst parts of the year. For at least three months of the year, my array barely peaks at 50% of its DC rating, which also means that for half of each day it's operating below 25% of DC rating. Looking at the SolarEdge inverter curves, some of them lose about 1% efficiency between 20-40% power, and maybe 5% between 0-20%. Of course, during those low-power periods you're losing that efficiency on smaller amounts of power, so it takes some fairly involved integration to estimate. But ballparking it, I would not be surprised at all if 1% of energy were lost across all production over the ENTIRE winter season, and that's assuming perfect insolation (no storms, clouds, etc.). Then add in all the cloudy days in spring and fall, plus the beginning and end of every day all year round, when the inverter is operating at 0-40% capacity.
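To put rough numbers on both effects, here's a back-of-envelope sketch. All the numbers are assumptions of mine, not datasheet values: a clear day is modeled as a 12-hour half-sine of DC power, the inverter clips at its AC limit, and efficiency is a stylized step function of inverter load fraction (94% below 20% of inverter rating, 98% below 40%, 99% above), loosely shaped like the curves described above.

```python
import math

def day_energy(dc_peak, ac_limit, hours=12, steps=1000):
    """One clear day, modeled as a half-sine of DC power.

    dc_peak and ac_limit are fractions of the array's DC rating.
    Returns (dc_energy, clipped_energy, conversion_loss) in
    rating-hours.
    """
    def eff(load_frac):  # fraction of the INVERTER's rating
        # Illustrative steps, not from any real datasheet:
        if load_frac <= 0: return 0.0
        if load_frac < 0.2: return 0.94   # ~5-6% loss below 20%
        if load_frac < 0.4: return 0.98   # ~1-2% loss at 20-40%
        return 0.99
    dt = hours / steps
    dc = clipped = loss = 0.0
    for i in range(steps):
        p = dc_peak * math.sin(math.pi * (i + 0.5) * dt / hours)
        dc += p * dt
        p_in = min(p, ac_limit)           # clipping at the AC limit
        clipped += (p - p_in) * dt
        loss += p_in * (1 - eff(p_in / ac_limit)) * dt
    return dc, clipped, loss
```

Running this for a summer day (peak at 100% of DC rating) and a winter day (peak at 50%) with a DC/AC ratio of 1.2 (`ac_limit = 1/1.2`): the summer day clips for over four hours yet loses only a mid-single-digit percentage of its energy to clipping, while the winter day never clips but loses a larger fraction of its energy to low-load inefficiency than the summer day does.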
So while I don't have the full technical understanding, or the calculus, to do a complete analysis, I wouldn't be surprised at all if inverter ratios of 1.0 to 1.2 generate NO more energy over the course of an entire year, i.e. no economic value even if the larger inverter were free. And you could be economically even worse off if you have to make a big step up, say going from a 7.6 kW to a 10 kW inverter (a roughly 32% increase in capacity), even if it were free, so it's not just about the extra $300 or so to upsize: the bigger inverter greatly extends the periods of the year where you're operating below 20% and 40% of its rating, which is exactly where inverter efficiency really drops off.
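To sanity-check my own conjecture, here's a standalone yearly sketch of that 7.6 kW vs. 10 kW trade-off. Everything in it is an idealized assumption: 12-hour half-sine days, a daily peak that swings seasonally between 50% and 100% of array rating, zero clouds, and a stylized stepped efficiency curve (94% below 20% of inverter rating, 98% below 40%, 99% above). Note that under these perfect clear-sky assumptions the model actually shows a small net gain from upsizing, because every single summer day clips; real weather cuts into that, since each overcast day removes a clipping day but keeps all the low-load morning and evening hours.

```python
import math

def annual_ac(ac_limit, steps=144):
    """AC energy over a stylized clear-sky year, in rating-hours.

    ac_limit = inverter AC rating / array DC rating.
    Assumed inputs: 12-hour half-sine days, seasonal peak between
    0.5 and 1.0 of array rating, stepped efficiency curve.
    """
    def eff(load_frac):  # fraction of the inverter's own rating
        if load_frac <= 0: return 0.0
        if load_frac < 0.2: return 0.94
        if load_frac < 0.4: return 0.98
        return 0.99
    total = 0.0
    dt = 12 / steps
    for day in range(365):
        # daily peak: 1.0 at midsummer (~day 172), 0.5 at midwinter
        peak = 0.75 + 0.25 * math.cos(2 * math.pi * (day - 172) / 365)
        for i in range(steps):
            p = peak * math.sin(math.pi * (i + 0.5) * dt / 12)
            p_in = min(p, ac_limit)          # clipping
            total += p_in * eff(p_in / ac_limit) * dt
    return total

# 7.6 kW inverter on a 9.12 kW array (DC/AC ratio 1.2) vs. a
# 10 kW inverter on the same array (ratio ~0.91, never clips)
small = annual_ac(7.6 / 9.12)
big = annual_ac(10.0 / 9.12)
print(f"net change from upsizing: {100 * (big - small) / small:+.2f}%")
```

Even in this best case for the big inverter, the net gain comes out to only a percent or two of annual production, and the extra low-load losses eat into the clipping it recovers, which is why I suspect that once clouds and real weather are included, the economics of the upsize get even thinner.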