I think they've got algorithm problems. My suspicion is that it has fixed voltage limits, and there's clear evidence for this. There's a particular CS-90 I've used that has the lowest voltage I've seen anywhere, and I'm told it now always backs the current down 25%. My office charger is on the low side, and it backs down about half the time. I recently charged on a CS-90 delivering more than 235 volts, and it never backed down at all. So it's pretty clear there's a fixed threshold. Bad design.
Every location has a different nominal voltage. If it's fed from a three-phase source, it will be 208V plus or minus several volts. If it's split-phase (often loosely called "two phase"), it will usually be in the range of 220V to 240V, but in my experience there's a lot more variation in that number.
The grid itself is a major source of voltage variation. I've observed it at my office: significant swings in a fairly short period, and it clearly wasn't our own load doing it, because we don't draw that much (no car was plugged in at the time).
You can't have fixed limits - it simply doesn't work. What it SHOULD do is record the nominal open-circuit voltage when it first connects, then calculate a threshold some fraction below that point, at which it triggers the back-off. That way a site that is nominally 208V is judged by its own sag, not by a number chosen for a 240V site. Better still, it should run the algorithm only for 40A charging, because the hardware in higher-power charging stations is far more robust. More to the point, if there's an overheating issue with the UMC, then the fix belongs in the adapter: a thermal cutout.
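The relative-threshold idea above can be sketched in a few lines. This is a hypothetical illustration, not the station's actual firmware: the class name, the 5% droop fraction, and the sampling flow are all my assumptions.

```python
# Sketch of a relative voltage back-off: trigger on sag below a recorded
# baseline rather than on a fixed absolute voltage. All names and the 5%
# figure are illustrative assumptions, not the real charger logic.

DROOP_FRACTION = 0.05  # back off if voltage sags more than 5% below baseline


class ChargeController:
    def __init__(self):
        self.baseline_v = None  # open-circuit voltage, captured at plug-in

    def on_connect(self, measured_volts: float):
        # Record the nominal open-circuit voltage before drawing current.
        self.baseline_v = measured_volts

    def should_back_off(self, measured_volts: float) -> bool:
        # Compare against a threshold derived from this site's own baseline,
        # so a nominally-low 208 V site isn't penalized by a fixed limit.
        threshold = self.baseline_v * (1.0 - DROOP_FRACTION)
        return measured_volts < threshold
```

With these assumed numbers, a 240V site would back off below 228V while a 208V site would only back off below about 197.6V, instead of both being held to one fixed cutoff.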
Backing the power down 25% increases the charging time by 33%, since time scales as the inverse of power: 1/0.75 ≈ 1.33. This is seriously impacting road charging, and will leave people sitting for extra hours waiting for their cars to charge. I'd like to see this algorithm either fixed, or removed entirely and a more effective solution implemented.
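For anyone checking the arithmetic, here's the power-to-time relationship as a tiny snippet (the function name is mine, just for illustration):

```python
# Delivered energy is power x time, so for a fixed amount of energy the
# session time scales as the inverse of the power fraction.
def charge_time_multiplier(power_fraction: float) -> float:
    return 1.0 / power_fraction

# A 25% back-off leaves 75% of the power, stretching the session ~1.33x.
print(round(charge_time_multiplier(0.75), 2))  # 1.33
```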