Unfortunately, with AutoPilot you have to be ready to hit the brake or the accelerator at any time. It requires constant monitoring. Any time I feel the car braking in a way that a fellow human driver would not expect or appreciate, I immediately tap the accelerator to override the behavior and then let AutoPilot resume. From reading the reports on "self driving cars", it seems the most common cause of fender benders is the overly cautious rules built into the self driving logic. Humans expect a certain type of behavior from their fellow drivers, and if they don't get it, it can mean a hit in the rear or elsewhere.
Despite what Elon may lead you to believe, there is still a decade or so of work to be done on self driving. Regulators have yet to agree on a set of standards to make coexistence with human drivers work well, or to isolate automated cars from human traffic (at least in some scenarios on the road). I purchased the FSD package so I can stay as up to date with the latest features as possible on my current Tesla. But I never expected FSD to be able to safely drive itself while I turn my attention elsewhere, at least not during the time I own my current Tesla.
Yeah, the Porsche driver was an a$*. But we also carry the responsibility of playing well with others, and we know a ton of Tesla drivers will never be responsible enough to take that on. That unwillingness to actively monitor the car is why Waymo and some others skipped Levels 2 and 3 entirely and went straight for full autonomy (Level 4, with Level 5 as the eventual goal).