Daniel in SD
"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.
When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it." - Elon Musk
I agree with Elon on this: people think the system is more capable than it is because of their personal experience. I see people posting about how far they've gone on Autopilot without a disengagement as proof that the system is safe enough to use without paying attention to the road. I've never seen anyone post that they use Autopilot without supervision because Tesla told them they could.