I don't see how it would prevent more accidents than the same technology applied to automatic emergency braking.
It seems like proving that city NoA improves safety will be very tricky. When the inevitable accidents happen, Tesla will need solid statistical evidence that it improves safety overall. So far, misuse of Autopilot has only injured or killed the driver; it will be very different when a third party is injured or killed.
It is indeed going to be a PR challenge, convincing the public that the accidents they see are vastly outnumbered by the accidents that never happened because FSD prevented them.
... FSD will have a calming behavior. ...
Based on my own experience I really believe this is true: When I am driving I get angry at the stupidity of other drivers. When I'm a passenger I just sit back and relax. And when I have EAP engaged, I stay alert, but stupid behavior by other drivers doesn't anger me, because EAP is dealing with it.
Right, and it's that one death (out of ten or ten thousand) that will keep the regulators and lawyers up at night (and IMO, significantly delay the rollout of a higher level of self-driving).
Contrary opinions have been expressed, but I really think that insurance companies will have a big influence on regulators and lawmakers, and once insurance companies, which are the world's experts on risk, see that FSD systems have become safer than human drivers, they'll push hard for regulatory approval. The real issue is that we're still at least a decade away from generalized non-geofenced FSD. As I keep pointing out, Tesla has nothing yet that does not require constant driver attention.
Please, folks. Tesla does not "promise" to do anything. You all know that Musk shares his plans, but to construe these as "promises" is a little too much excitement over nothing. At least Daniel, here, admits that it might have been an implied promise in his opinion.
If you need a promise, get it in writing and get it signed by Musk himself. Otherwise, it's all hot air.
The implied promise is the timeline: that if you pay for FSD you'll get FSD within the expected lifetime of your car. Because if you say "Buy this car and at some point in the future it will do X," and X does not become available before normal wear and tear renders the car junk, then your promise was a lie.
Before Tesla moved the goal posts and changed the definition of FSD, including at the time I bought my car, Tesla and Musk said that if I paid for FSD (which I didn't) my car could operate as a robotaxi. My car will never be robotaxi-capable, because by the time FSD actually becomes available it will require significant hardware upgrades, which will not be realistically possible.
Then Tesla moved the goal posts, and now "full self-driving" means the car will navigate city streets but will require a human driver to be alert and ready to take full control instantly whenever the car fails to perform properly. Tesla lied, and in moving the goal posts it has tacitly admitted that it cannot fulfill its original promise. By everyone else's definition, "full self-driving" means the car does not require a driver. They never should have called the package "FSD." They should have called it EEAP (extended enhanced Autopilot) and said that the $6,000 (or whatever) would get you all the new driver-assist features as they become available.
The Tesla Model 3 is the best car that's ever been built. It's really sad that they had to encumber it with an impossible promise.