This happens to me frequently, and is one of the major regressions of Autopilot, along with increased "phantom" and micro braking.
In fact, it's an extremely dangerous regression. For me, this will happen sometimes (but not always) when overtaking a car: the Tesla will suddenly decide it wants to match the speed of the car as you're overtaking. This results in the Tesla slamming on the brakes, just as you're alongside the other car, which is unpredictable and quite dangerous for those around me... and uncomfortable and alarming for me and my passengers.
I've spent two years "explaining" random AP behaviour to concerned passengers, but I'm finally bored of it. It's gone from a system that was dumb enough to be predictable, yet mostly fit for purpose, to a system that's actively dangerous and unpredictable even when working as intended. I don't expect it to improve on my car.
I think the single biggest "feature" responsible for the majority of these regressions is "cut-in detection", which clearly doesn't work very well, at least not in the UK, where roads are narrower and drivers apparently have a penchant for driving relatively close to the outer lane line (I'd never noticed how common that is until now).
You captured my feelings and sentiments on Autopilot. In addition, my vehicle has been randomly disconnecting very frequently over the past few updates. I stopped using AP with passengers in the car because they would freak out every time the car started beeping loudly and the screen flashed red for no reason.
I find the phantom braking dangerous too (braking all of a sudden and spilling coffee all over myself was a wake-up call), although now that I understand what causes it, I know which situations it has difficulty with.
FYI, it has happened with a car on my left on an on-ramp, and also with cars on my right when I'm in the left lane.
It's a regression, but it seems to be a regression caused by a new safety feature they've added. What I understand is that the car detects the end of a lane and now tries to find an open spot in the adjacent lane so that it can swerve into it in case the driver isn't paying attention when the lane ends. It's trying to avoid a potentially more dangerous situation.
I get it, and the car needs to be able to do things like this for FSD. Is it polished and working well? No.
I understand that Autopilot was more predictable when it was dumber, although you have to expect the system to fail more and more as they add new features. It's a complex beast, and it will take some trial and error to fine-tune everything. Even if I don't believe it, Elon promised "feature complete FSD" by the end of the year. If that's true, I expect a super buggy version of FSD with lots of random issues like these. Tesla will have to focus on each one and use a deep learning approach to fine-tune each feature. Let's not forget that this is beta software and that Tesla is using a brute-force CNN approach that relies on trial and error.
Ever look at the Teslafi firmware tracker? There always seem to be 2-3 major firmware versions running in parallel across the fleet. I think Tesla puts out competing releases with different tunings, keeps the one with the most confidence (fewest false positives), and continues to evolve this way.
I would rather Tesla keep evolving and testing software on us with the objective of progression than keep a stable "dumb" version deployed. We know this will be a constant adventure, and the better the perception model gets, the more complex the planning model will have to be. Just last night there was a garbage truck picking up our compost bins, and it was parked on a one-lane street. I can't wait to see how an FSD Tesla will handle this for the first time: it will have to pass the truck in the opposite lane in a safe manner. I fully expect this to be a disaster in the beginning, with the hope that Tesla keeps learning from our dangerous experiences to make Autopilot better.
My hope is that once Autopilot is "feature complete", the Dojo computer will be able to fine-tune all the parameters at once, since it will have the full set to play with. Right now, we are living with a machine that does not fully understand the world around it, so it's dangerous in some cases. We have to accept that and stay vigilant if we want to help Tesla make it to FSD in the near future. I use Autopilot less since Tesla updated my car to 2019.32.2.1, but I also believe Tesla knows that I use it less now, and I expect V10 AP to be an improvement.