Welcome to Tesla Motors Club

My friend's model X crashed using AP yesterday

Just a reminder, everyone, that Autopilot can turn itself off if it senses a difficult situation. Usually in those cases you have less than a second to react to avoid an accident. Auto steer can also disengage due to a slightly too forceful driver nudge on the steering wheel, or an accidental brush against the brake, resulting in a situation where you think AP is on and it really isn't. Finally, Autopilot isn't fail-safe, and sometimes it can indeed miss a turn or drift out of a lane.

All this to say that you HAVE to be looking forward ready to take over at any point in your drive. Obviously this driver did not do this, which is a cautionary lesson for everyone. I'm glad no one got seriously hurt.
 
I'm going to just throw this conspiracy theory out there. Take it for what it's worth:

It's well known that there are a number of Chinese-backed Tesla competitors. In the past two days on this forum alone, I've already come across two posts clearly by Chinese posters who claim damage due to AP failure. This is the second. WeChat is a popular messaging app used by almost everyone from China (and used by almost no one who isn't Chinese).

I'm not saying this poster isn't legitimate (although, by the way, just because he claims to have an X doesn't necessarily make him legitimate). I'm just saying that we should all be mindful of the competitive landscape out there. I do not work for Tesla and am not associated with the company in any way other than the fact that I have an S on order, but even I feel like some of the anti-Tesla reporting and forum posts are a little odd.

Singling out one nationality may be going a bit too far -- case in point: Edward Niedermeyer. As to this case, I do find it interesting that pictures of the wooden posts were available (they were taken at night after the crash) but were not posted originally by the OP.
 
Though I share your sentiment (driving 60 mph on a highway without a shoulder is asking for trouble), I think it's best to base any conclusions on facts. So far, we know very little about the incidents in which owners claim AP was involved, or whether and when the driver tried to correct AP's actions.

Yep. Agreed. Perhaps there need to be clearer disclaimers or maybe hands need to be on the wheel at all times.
 
Sorry to hear about the accident, and glad no one is badly hurt, except the car.
Please don't blame AP for this. AP is not designed to detect debris on the road. The driver is responsible for taking control in this situation. In my mind, this is negligence on the driver's part; Tesla is not to blame.
 
I guarantee that will change nothing.

Maybe, maybe not. The system isn't supposed to be fail-proof; it's just supposed to be safer than manual driving. The main culprit in these few accidents appears to be over-reliance on the technology. Additional steps to engage AP may help reduce that over-reliance. I'd wager that after all this recent media coverage, there's been a DIP in accidents involving AP-enabled cars, simply because everyone is now super aware of having to stay alert even on AP.
 
Maybe, maybe not. The system isn't supposed to be fail-proof; it's just supposed to be safer than manual driving. The main culprit in these few accidents appears to be over-reliance on the technology. Additional steps to engage AP may help reduce that over-reliance. I'd wager that after all this recent media coverage, there's been a DIP in accidents involving AP-enabled cars, simply because everyone is now super aware of having to stay alert even on AP.

Definitely a possible silver lining. Hopefully people are asking questions and researching more.
 
Just a reminder, everyone, that Autopilot can turn itself off if it senses a difficult situation. Usually in those cases you have less than a second to react to avoid an accident. Auto steer can also disengage due to a slightly too forceful driver nudge on the steering wheel, or an accidental brush against the brake, resulting in a situation where you think AP is on and it really isn't. Finally, Autopilot isn't fail-safe, and sometimes it can indeed miss a turn or drift out of a lane.

All this to say that you HAVE to be looking forward ready to take over at any point in your drive. Obviously this driver did not do this, which is a cautionary lesson for everyone. I'm glad no one got seriously hurt.

That window may sound short, and it is: it takes longer than a second to put your hands back on the wheel and start steering.
 
I'm anxious to see if a black-box analysis shows that AP was even in use (regardless of OP claims) - in my experience, the vast majority of crashes occurring at 2 AM are due to the driver falling asleep at the wheel, and alcohol is involved more often than not.
 
A "How-to" video would be sufficient, I think. The use of AP, IMHO, isn't complicated enough to warrant a training program.
John, I say this with no intent to provoke: how many miles have you driven on AP?

I do not own an AP car but have driven my parents' car on AP for a few hours in total, and only on Southern California freeways. My opinion is that, for many people (but nowhere near "all" people), AP is in fact complex enough -- and a sufficiently different driving experience -- to warrant a real-life training period with a properly trained person in the passenger seat.
I'm going to just throw this conspiracy theory out there. Take it for what it's worth...
I think there is no chance your conspiracy theory has any foundation in reality.
I'm rapidly coming to the conclusion that Tesla should not allow AP use on non-divided highways, period.
I would not object if Tesla did exactly that, for a period of time until Tesla determined that AP had advanced sufficiently to expand the types of roads it should be used on.

It continues to amaze me that many owners will use AP on roads that Tesla clearly recommends against using it on -- public roads that are used by literally billions of people (since there are now many Teslas in China).

As the number of AP Teslas grows dramatically this year, crashes resulting from stupid drivers using AP where they shouldn't or because they weren't paying attention will inevitably increase. As Elon might say, this is simply the "law of large numbers".
 
The lines are visible during the day, but at night there could, perhaps, have been a patch of localized fog. We're in need of a meteorology buff. Local weather data: OBS:WHITEHALL

The second set of pics appears to have been taken at night, illuminated by a flashlight or camera flash. If the posts are visible in those conditions, then presumably they would be visible when illuminated by vehicle headlights.
 
Though I share your sentiment (driving 60 mph on a highway without a shoulder is asking for trouble), I think it's best to base any conclusions on facts. So far, we know very little about the incidents in which owners claim AP was involved, or whether and when the driver tried to correct AP's actions.

The worst thing about this is that whether AP was on or off, how AP was used, or any other facts - it really doesn't matter much once the damaging headline about "another Tesla AP car crashes" gets picked up by syndicated media. There is *never* a followup story with the facts. Damage is done.
 
I'm rapidly coming to the conclusion that Tesla should not allow AP use on non-divided highways, period.

I suspect they are trying to have their cake and eat it too: using customers as guinea pigs to beta-test AP on normal roads, gathering copious amounts of data, while trying to avoid liability for the inevitable crashes. That's negligence.

First part: that only works if Tesla has a system in place to report and correct false road data. If not, then you're just going to have a lot of frustrated customers whose AP cars won't engage on a road that's clearly a divided highway because the map database is wrong. Furthermore, there are different kinds of divided highways: limited access, non-limited access (freeway or at-grade intersections)... Where do you draw the line?

Secondly, I couldn't disagree more with you on that one. Negligence? Come on... What Tesla has proven, whether they intended to or not, is that people never want to take responsibility for their own actions or lack thereof. FL incident, person was watching a movie and not paying any attention which is completely against the T&Cs for the feature. The list goes on, all of which are the fault of the driver...

Jeff
 
[...]

I would not object if Tesla did exactly that, for a period of time until Tesla determined that AP had advanced sufficiently to expand the types of roads it should be used on.

It continues to amaze me that many owners will use AP on roads that Tesla clearly recommends against using it on -- public roads that are used by literally billions of people (since there are now many Teslas in China).

As the number of AP Teslas grows dramatically this year, crashes resulting from stupid drivers using AP where they shouldn't or because they weren't paying attention will inevitably increase. As Elon might say, this is simply the "law of large numbers".

Is it the car manufacturer's responsibility to teach people how to drive?

I think a simple rule of thumb should be that if the driver feels it would take less than three seconds to correct AP's actions (before they result in an accident), they should take over, or at least put their hands on the wheel. Driving 60 mph on a road without a shoulder qualifies, as does approaching an unregulated intersection. It's quite similar to driving on cruise control.

EDITED for clarity.
 
I think all Tesla could do about this would be to create a training and certification program for AP with a sign off that you had received the training, understood the limitations and were responsible for the operation of the vehicle at all times. No certification, no autopilot.

One thing Tesla could consider would be to develop a simulator app, so you could practice using AP in a stationary vehicle. The smartphone or tablet could be attached to the visor, say, and the driver would look at it as if looking through the windshield. The car would be in simulator mode and would know what the app is showing, so the driver's expected reactions could be trained and repeated until they are satisfactory.

Training could be encouraged by having a ranking of the participants visible to each other.

Or something like that.
 
.... The paint lines are almost gone next to the creek and unfortunately it is a two lane road, not for use with AP.

I drove through Montana a month ago, and watching the road, I noticed that AP kept having trouble finding the lines. Montana evidently does not take a lot of concern in keeping road lines painted, and where I was, there had been a lot of patch paving, many places of which covered part or all of the lines. I turned off AP while in Montana.

We need to educate drivers as to limitations of AP. It may have gone off and the driver didn't notice. Who knows. Looks like a steep slope off the road into fence posts.
 
...My opinion is that, for many people (but nowhere near "all" people), AP is in fact complex enough -- and a sufficiently different driving experience...

As someone who owns an AP car I've had just the opposite experience. Very intuitive. Easy to engage, use, disengage. Clear visible and audible indicators when engaged. Clear visible and audible indicators when disengaged.

Within a few minutes of using it I knew what to expect...and what it could and couldn't do. Basically assistance with speed control and assistance with lane keeping.

As an "old guy" who can't figure out Snapchat if I can figure this out then anyone can!
 
First part: that only works if Tesla has a system in place to report and correct false road data. If not, then you're just going to have a lot of frustrated customers whose AP cars won't engage on a road that's clearly a divided highway because the map database is wrong. Furthermore, there are different kinds of divided highways: limited access, non-limited access (freeway or at-grade intersections)... Where do you draw the line?

Tesla has already drawn the line. Anywhere Tesla restricts AutoSteer speed today, Autopilot should be disabled. The same "dirty data" issue that exists with Autopilot restriction already exists with AutoSteer speed restrictions.
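The line-drawing rule above ("anywhere AutoSteer speed is restricted, Autopilot should be disabled") can be sketched in a few lines. This is purely an illustration of the commenter's proposed logic; the function names and the "posted limit + 5 mph" cap on non-divided roads are assumptions for the sketch, not Tesla's actual software.

```python
from typing import Optional

def autosteer_speed_cap(is_divided_highway: bool, posted_limit_mph: int) -> Optional[int]:
    """Return the hypothetical AutoSteer speed cap in mph, or None if unrestricted.

    Assumes a "posted limit + 5 mph" cap on non-divided roads, mirroring the
    restriction discussed in this thread.
    """
    if is_divided_highway:
        return None  # no AutoSteer restriction on divided highways
    return posted_limit_mph + 5

def autopilot_allowed(is_divided_highway: bool, posted_limit_mph: int) -> bool:
    """The proposed rule: allow Autopilot only where AutoSteer is unrestricted."""
    return autosteer_speed_cap(is_divided_highway, posted_limit_mph) is None

print(autopilot_allowed(True, 65))   # divided highway: AP would engage
print(autopilot_allowed(False, 60))  # undivided road: AP would refuse
```

Whether the same map data used for the speed cap is reliable enough to gate engagement entirely is exactly the "dirty data" objection raised earlier in the thread.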

Secondly, I couldn't disagree more with you on that one. Negligence? Come on... What Tesla has proven, whether they intended to or not, is that people never want to take responsibility for their own actions or lack thereof. FL incident, person was watching a movie and not paying any attention which is completely against the T&Cs for the feature. The list goes on, all of which are the fault of the driver...

I absolutely agree that driving fundamentally requires personal responsibility.

Legally, however, negligent design on the part of the manufacturer can occur if there is "unreasonable risk of foreseeable injury." In this case, the injury was clearly foreseen, as Tesla has restricted AutoSteer speeds precisely to avoid injury. The only question is whether AP on crappy roads represents an unreasonable risk. Since Tesla doesn't promise that AP will work on crappy roads, and actually advocates that it not be used in these situations, disabling it seems perfectly reasonable, and leaving it on seems unreasonable--at least to me.
 