Suicidal is right. The system behaves exactly as it was designed to. It's all assistive: YOU are the DRIVER of any current automobile on public roads anywhere on the planet, Tesla included, and in Tesla's case even more so.
If anything happens outside of the ordinary, it's your fault, 1000%. No court case, no money, and possibly the loss of your life or the lives of others.
If the car were meant to stop for every object, it would never move. Your eyes are the eyes of the car.
Tell that to everyone buying a Tesla today and they would probably walk away from it. Most do not understand what is actually happening.
This is technically true. I have been on test drives from the Tesla store where I was encouraged to turn on EAP on surface streets. This is human nature: when I had my EAP trial, I did it all the time, and I suspect you do too, Knightshade. If Tesla REALLY wanted us to use EAP only on highways without crossing traffic, it would make the feature unavailable on other roads, as Cadillac's Super Cruise system does.
So my point is that even smart guys can make poor decisions based on previous experience with the EAP system. You get comfortable, and then when a failure case arises you are not ready to take over in time.
Even Elon has said that most EAP accidents happen to experienced users.
Some points to unpack here:
1. Autopilot's behavior changes from release to release. Under certain conditions it improves, and under others it regresses; I have one case open with the advanced Autopilot team right now to correct a regression. These changes can make it more difficult for the operator to predict the vehicle's behavior. The car looks and feels the same, but it does not always operate identically.
By way of example, my neighbor, who drives a Model X, already got into a minor Autopilot-related accident because the vehicle moved in a way he did not anticipate. He reacted as quickly as he could, reducing the damage, and Tesla did not charge him for the labor to repair his Model X. He was perplexed by the car's behavior and spent a month working with Tesla to identify what led up to the collision. He and I now operate Autopilot with the same level of discipline we had at the very beginning of our ownership, which is to expect the unexpected. When driving this way, it is unclear whether AP is our assistant, or whether the assistant is us.
2. Some of us have experienced a complete shutdown of the Autopilot system while engaged. I have experienced this nearly a dozen times, documented it on video, and posted about it here:
AP disengaged while driving - Radar Failure (release 2019.8.3). These types of failures are examples of the system not operating "as intended," and it is Tesla's responsibility to correct them.
What is surprising to me, given the number of users who've experienced this issue and who travel this heavily trafficked route, is how long Tesla is taking to address it. We are currently at two months, and I have been in touch with Tesla engineering regularly.
It would not be accurate to say that Autopilot (or any system) works as intended 100% of the time. Tesla must get as close to that number as possible, but given the number of APE failures in this release, I don't think we're even hitting 98% uptime. I've had drives lasting over an hour where it was entirely unavailable, and when it fails, it stays down for 10-15 minutes. Tesla likely needs to reach 99.99 or 99.999 percent uptime.
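To put those uptime figures in perspective, here is the standard availability arithmetic. This is purely illustrative math on the percentages mentioned above, not anything Tesla has published:

```python
# Rough downtime-per-year figures for a few availability levels.
# Purely illustrative arithmetic; these are not Tesla-published targets.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for uptime_pct in (98.0, 99.99, 99.999):
    downtime_min = MINUTES_PER_YEAR * (1 - uptime_pct / 100)
    print(f"{uptime_pct}% uptime -> ~{downtime_min:,.0f} minutes down per year")
```

At 98% uptime you are looking at roughly 10,500 minutes (about a week) of unavailability per year, versus under an hour at 99.99% - which is why the difference between those numbers matters so much.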
If given a choice, I would prefer the system to fail gracefully over unanticipated actions.
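By "fail gracefully" I mean something like the following fault-handling sketch: on a failure, prefer a predictable hand-off to the driver over continuing with degraded perception. This is purely illustrative; none of these names or states come from Tesla's actual software:

```python
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()            # sensors healthy, keep assisting
    ALERT_AND_HAND_OFF = auto()  # graceful: warn the driver, disengage predictably
    EMERGENCY_SLOW = auto()      # last resort: slow the car in its lane

def handle_sensor_fault(sensor_ok: bool, driver_responding: bool) -> Action:
    """Illustrative policy: degrade in a predictable order rather than
    taking unanticipated actions with faulty inputs."""
    if sensor_ok:
        return Action.CONTINUE
    if driver_responding:
        return Action.ALERT_AND_HAND_OFF
    return Action.EMERGENCY_SLOW
```

The point of the sketch is the ordering: a system that always falls back the same way is one the driver can plan around, which is exactly what unanticipated actions prevent.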
3. Recently, Tesla announced that Elon is getting more actively involved with the AP team, and there have already been additional staff changes made.
When Elon gets directly involved, it is an indication that he is not comfortable with the pace or quality of the work previously left to others. He has set an extraordinarily high bar - achieving FSD this year - and for that goal to be realized, the pace of development clearly must accelerate. Tesla is up against several significant constraints (putting aside external and financial ones):
- Hardware (Both with the limits of HW 2.5, as well as new challenges arising from HW 3)
- Software (Feature enhancement and stability)
- NN training (achieving performance on par with, or preferably better than, a human). I would imagine streamlining the path from field-data incidents to NN training is a critical deliverable.
- Addressing what may be an unlimited number of corner cases
Tesla will also have to deal with technical debt and sustaining engineering for older hardware that can't get the updates Tesla has planned for its newer vehicles. Watching this development process unfold is perhaps the most interesting activity the industry has seen.