This happens to me frequently, and is one of the major regressions of Autopilot, along with increased "phantom" and micro braking.
In fact, it's an extremely dangerous regression. For me, this happens sometimes (but not always) when overtaking: the Tesla will suddenly decide it wants to match the speed of the car I'm passing. The result is the Tesla slamming on the brakes just as I'm alongside the other car, which is unpredictable and quite dangerous for those around me... and uncomfortable and alarming for me and my passengers.
I've spent two years "explaining" random AP behaviour to concerned passengers, but I'm finally bored of it. It's gone from a system that was dumb enough to be predictable, yet mostly fit for purpose, to a system that's actively dangerous and unpredictable even when working as intended. I don't expect it to improve on my car.
I think the single biggest "feature" responsible for the majority of these regressions is "cut-in detection", which clearly doesn't work very well, at least not in the UK, where roads are narrower and there's apparently a penchant for driving relatively close to the outer lane line (I'd never noticed how common that is, until now).