Just a note: watching your videos is somewhat nerve-wracking, but perhaps for a different reason than you'd expect. I disengage FSD beta well before the car gets into such dangerous situations. Your comments show you're aware of the risk, yet you don't actually disengage until much later. Disengage the moment things look hairy. We don't need to push it so far that we end up in bad situations. FSD beta is known not to be autonomous driving; we are explicitly told to pay attention and stay in control at all times, and that it may do exactly the wrong thing at the wrong time. We should use it as if we understand this.
I have been testing FSD beta on nearly every drive since public access opened 1.5 years ago (including three cross-country trips). I only allow the car to get into these situations when no other drivers are around. Some of these situations arise only when other drivers are around, so they go untested, but that's okay. Tesla doesn't need us crashing to get the data to advance AV solutions. Disengaging because we aren't sure, or are worried the car may do the wrong thing, is exactly the data Tesla needs.
As Tesla continues to state, this is an early, limited release. As beta testers, our primary directive is still to operate the vehicle safely. That can be done with FSD beta engaged, but in cases like these it requires earlier disengagements. There are SO MANY edge cases and situations the developers need to address, both for safety and for convenience/comfort. Let's advance these ADAS and AV programs by giving them useful data, not clickbait for short sellers, haters, and regulators looking for an excuse to stop or slow innovation.