Welcome to Tesla Motors Club

Another Model X crash, driver says autopilot was engaged

Where did you get the info that it was engaged eleven seconds before? I didn't see that part in Elon's tweet.

It's from: http://jalopnik.com/musk-autopilot-was-off-in-pa-tesla-model-x-crash-acco-1783695454

A Tesla spokesperson has told Jalopnik:

We got access to the logs. Data from the vehicle shows that Autosteer was not engaged at the time of this collision. Prior to the collision, Autosteer was in use periodically throughout the approximately 50-minute trip. The most recent such use ended when, approximately 40 seconds prior to the collision, the vehicle did not detect the driver’s hands on the wheel and began a rapidly escalating set of visual and audible alerts to ensure the driver took proper control. When the driver failed to respond to 15 seconds of visual warnings and audible tones, Autosteer began a graceful abort procedure in which the music is muted, the vehicle begins to slow and the driver is instructed both visually and audibly to place their hands on the wheel. Approximately 11 seconds prior to the collision, the driver responded and regained control by holding the steering wheel, applying leftward torque to turn it, and pressing the accelerator pedal to 42%. Over 10 seconds and approximately 300m later and while under manual steering control, the driver drifted out of the lane, collided with a barrier, overcorrected, crossed both lanes of the highway, struck a median barrier, and rolled the vehicle.
 
Sort of. It was engaged until eleven seconds before the crash. It had begun to disengage because the driver's hands had gone undetected on the wheel for longer than the maximum threshold. After many alerts, the vehicle began to slow, and the driver then took over.

So basically the Tesla was doing what some have asked it to do, i.e., react to a lack of driver input. It was slowing down and probably wouldn't have wrecked, except the driver took over and then human error intervened.
 
Sort of? Eleven seconds (where did you get that info, by the way? Did Elon tweet it later?) is a VERY long time when driving. At 60 mph, you've traveled almost 0.2 miles!
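That "almost 0.2 miles" figure checks out. A quick sanity check of the distance covered, assuming a constant 60 mph for the full eleven seconds:

```python
# Distance traveled in 11 seconds at a constant 60 mph.
# 60 mph = 60 * 5280 ft / 3600 s = 88 ft/s
speed_ft_per_s = 60 * 5280 / 3600   # 88.0 ft/s
distance_ft = speed_ft_per_s * 11   # 968.0 ft
distance_miles = distance_ft / 5280
print(round(distance_miles, 3))     # 0.183 miles, i.e. "almost 0.2 miles"
```

That is roughly three football fields of travel between taking over and crashing, which is consistent with the "approximately 300m" in Tesla's statement.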

I say "sort of" because I think it's important to distinguish between three classes of accidents:
  1. Autopilot was never on.
  2. Autopilot was always on.
  3. The accident occurred just after Autopilot disengaged.
The third category is the most interesting. All manner of possibilities lie there, such as:
  • Autopilot sensed a situation it couldn't handle, and disengaged itself.
  • The driver became inattentive, and Autopilot disengaged itself (as occurred here).
  • The driver was disoriented (awoke with a start, perhaps), and disengaged Autopilot.
  • The driver felt that a crash was imminent, and disengaged Autopilot. This is somewhat of a "damned if you do, damned if you don't" scenario. If you take control and you crash, Autopilot wasn't engaged. However, if you don't take over from Autopilot and you do crash, you're damned for failing to override Autopilot. The question here is what caused the imminent crash situation.
 
I say "sort of" because I think it's important to distinguish between three classes of accidents:
  1. Autopilot was never on.
  2. Autopilot was always on.
  3. The accident occurred just after Autopilot disengaged.
The third category is the most interesting. All manner of possibilities lie there, such as:
  • Autopilot sensed a situation it couldn't handle, and disengaged itself.
  • The driver became inattentive, and Autopilot disengaged itself.
  • The driver was disoriented (awoke with a start, perhaps), and disengaged Autopilot.
  • The driver felt that a crash was imminent, and disengaged Autopilot.
Agree and there's a lot more they need to do before we know what actually happened.
 
I say "sort of" because I think it's important to distinguish between three classes of accidents:
  1. Autopilot was never on.
  2. Autopilot was always on.
  3. The accident occurred just after Autopilot disengaged.
The third category is the most interesting. All manner of possibilities lie there, such as:
  • Autopilot sensed a situation it couldn't handle, and disengaged itself.
  • The driver became inattentive, and Autopilot disengaged itself (as occurred here).
  • The driver was disoriented (awoke with a start, perhaps), and disengaged Autopilot.
  • The driver felt that a crash was imminent, and disengaged Autopilot.

Although in this case, none of those scenarios appear to apply.
As noted, the driver took over control via torque on the wheel. He then also applied pressure to the throttle.
This was done 11 seconds before the crash, which is quite a long period of time.
 
Notice the escalating warnings before the driver reacts. More importantly, all the driver had to do was take over and steer safely. Instead, he accelerates. THEN he runs off the road and overcorrects. My 15-year-old daughter just discussed with me the need to ease back onto the road if you run off the shoulder. The wreck had little to do with AP and a lot to do with basic driving skills.
 
Although in this case, none of those scenarios appear to apply.
As noted, the driver took over control via torque on the wheel. He then also applied pressure to the throttle.
This was done 11 seconds before the crash, which is quite a long period of time.

An eleven second span would imply, to me, that the driver was not only inattentive, he was confused or impaired. For example, the driver could be waking up from a nap, confused about how to reenable Autopilot, drunk, or needing a very long time to reorient himself.

Driver incapacitation appears to be a theme in Autopilot crashes. Autopilot seems to be turning people into zombies--or at least allowing people to turn themselves into zombies.
 
Sort of. It was engaged until eleven seconds before the crash. It had begun to disengage because the driver's hands had been undetected on the wheel for the maximum threshold. After many alerts, the vehicle began to slow and driver then took over.

Where did you get that additional bit of information? I haven't seen anything besides Elon's tweet. (Which says that had the driver been using AP there wouldn't have been an accident.) Are you maybe talking about a different accident? (That sounds more like the one in Montana where it just drifted off the road into the guard "rails".)

And when the only issue is a lack of hands on the wheel, doesn't AP shut down by maintaining the lane of travel and slowing down with the flashers on? It only shuts off completely if it loses confidence and has no idea what to do.
 
Alright, so he was ignoring/missing the warnings up until the car slowed down. He acted by grabbing the steering wheel and pressing the accelerator down. 10 seconds later he drifts out of his lane and crashes.

I wonder if he thought Autopilot was back on after he intervened (based on his statement to police)? Either that or he was still distracted/impaired in some manner while driving (or both - thought AP was on AND was distracted/impaired).
 
Where did you get that additional bit of information? I haven't seen anything besides Elon's tweet. (Which says that had the driver been using AP there wouldn't have been an accident.) Are you maybe talking about a different accident? (That sounds more like the one in Montana where it just drifted off the road into the guard "rails".)

And when the only issue is a lack of hands on the wheel, doesn't AP shut down by maintaining the lane of travel and slowing down with the flashers on? It only shuts off completely if it loses confidence and has no idea what to do.

It only disengaged when the driver forced the wheel and pushed the accelerator. Otherwise AP would have maintained control and slowed the car to a stop.
 
An eleven second span would imply, to me, that the driver was not only inattentive, he was confused or impaired. For example, the driver could be waking up from a nap, confused about how to reenable Autopilot, drunk, or needing a very long time to reorient himself.

Driver incapacitation appears to be a theme in Autopilot crashes. Autopilot seems to be turning people into zombies--or at least allowing people to turn themselves into zombies.

This is a leap made with no basis in fact.
Autopilot does not "turn people into zombies".
People who are behaving like zombies have their lives extended because of Autopilot.

For me, autopilot has allowed me to be even more attentive.
In this case, the driver made a conscious decision and two separate actions that indicate conscious choice.
 
So maybe it was AP's fault. Maybe after the first visual and audio warnings are ignored, it should continue with visual warnings only, then slow down and stop the car before waking the driver up. (Assuming the driver had dozed off.) I can see how suddenly being woken up with the car slowing down in traffic, while still groggy and dazed, could cause someone to overreact.
 
So basically the Tesla was doing what some have asked it to do i.e. react to lack of driver input. It was slowing down and probably wouldn't have wrecked except the driver took over and then human error intervened.
Right, isn't this exactly what people wanted Tesla to do? Now that it's clear the car does this, the complaint is that it deactivated too soon? Not sure what people actually expect Tesla to do.

Also, from the other thread about the truck crash, where the driver had a 3-5 second window (and people were saying that was plenty of time to react and brake to a stop), 11 seconds seems to be a lot of time to react. From the article, the alerts actually started 40 seconds before the crash.
 