I think there's a distinction to be made between 'phantom' braking (where there is no visible cause) and braking out of an abundance of caution.
In the example from the original post, and in some others here, there was a car waiting to pull out of a side junction. If that car starts nudging forward, the autopilot sees it and has to work out what is going to happen. It cannot predict what the driver will do, so if the car is moving, however slowly, it simply has to assume it will cross your path, and apply the brakes. The alternative is to keep driving at full speed and risk ending up in a situation where there isn't room to stop if the other car continues.
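That worst-case reasoning can be sketched in a few lines. This is purely illustrative, not how any real autopilot is implemented; the function name, the deceleration figure, and the safety margin are all assumptions picked for the example:

```python
# A minimal sketch of the worst-case logic described above: assume the
# nudging car keeps moving into our path, and brake if we could no longer
# stop short of the junction. All numbers are illustrative assumptions.

def should_brake(ego_speed_mps: float,
                 distance_to_junction_m: float,
                 other_car_speed_mps: float,
                 max_decel_mps2: float = 6.0,   # assumed comfortable hard braking
                 reaction_time_s: float = 0.5,  # assumed system latency
                 margin_m: float = 5.0) -> bool:
    """Return True if we must start braking now to guarantee we can stop
    short of the junction, assuming the other car continues forward."""
    if other_car_speed_mps <= 0.0:
        return False  # the other car is stationary: no conflict assumed
    # Distance covered during the reaction time, plus the standard
    # braking distance v^2 / (2a).
    stopping_distance = (ego_speed_mps * reaction_time_s
                         + ego_speed_mps ** 2 / (2.0 * max_decel_mps2))
    return stopping_distance + margin_m >= distance_to_junction_m

# At 30 m/s (roughly motorway speed), 80 m from a junction where a car
# is creeping out at walking pace:
print(should_brake(30.0, 80.0, 0.5))   # prints True
```

The point the sketch makes is that the decision flips on a hard threshold: the instant the other car shows any forward motion, the software has to treat it as a potential conflict, which is exactly why cautious braking looks like 'phantom' braking to a passenger who has already judged the other driver as waiting.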
Most human drivers will see this situation and assume from experience that the other car will not pull out, so they continue at their current speed. However, there are thousands of incidents every day where the other car does pull out and a crash occurs; a cursory look at YouTube will show this, and anyone who rides a bicycle around town will have stories of drivers pulling out in front of them without looking. Humans can and do cause crashes all the time.

As a human driver, I am cautious. I was taught defensive driving when I learned to drive, and as I cycle a lot I know only too well how inattentive many drivers are. If I see a car nudging out of a side street, I ease off the accelerator slightly to give myself time to react should something happen. I also slow down significantly if I see young children waiting to cross a road, or walking along a narrow pavement, because I know that children often run out without looking.
Now, if you are programming autopilot software and you get to the part where you teach the car what to do in this situation, what do you do? Program it to continue regardless and hope for the best, or program it to be cautious and avoid a crash? Clearly you'd do the latter, or the lawsuits would be piling up in no time.
So I think some appreciation of the task at hand is required of users. The same applies to the issue of braking when passing lorries on motorways: it's nearly always because the lorry is drifting into your lane. If you are attentive while autopilot is on, you can see the braking coming a mile off. I can't see how any software can avoid doing this, until such time as each car has a massively capable AI system that can predict human behaviour from probabilities and so on, and even then it will still come down to a straight choice: proceed and risk a crash, or slow down.