Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier, killing the driver, last week near Mountain View, Calif., according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator, a barrier designed to reduce the impact into a concrete lane divider. The attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
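For anyone who wants to check the arithmetic behind the figures in the statement, a quick sanity check (the inputs are Tesla's claims as stated above, not independently verified data):

[CODE]
# Sanity check of the mileage statistics quoted above. The inputs are
# Tesla's claims, not independent data.

miles_per_fatality_us = 86e6      # all vehicles, all manufacturers
miles_per_fatality_ap_hw = 320e6  # Teslas with Autopilot hardware

ratio = miles_per_fatality_ap_hw / miles_per_fatality_us
print(f"{ratio:.1f}x fewer fatalities per mile")  # -> 3.7x, matching the claim

annual_deaths_worldwide = 1.25e6
lives_saved = annual_deaths_worldwide * (1 - miles_per_fatality_us / miles_per_fatality_ap_hw)
print(f"Implied lives saved per year: {lives_saved:,.0f}")  # -> ~914,000
# Consistent with "about 900,000", assuming the Tesla per-mile rate would
# transfer unchanged to the entire worldwide fleet.
[/CODE]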

Photo: @DeanCSmith/Twitter

 
So, why was Autopilot not disabled by Tesla after five visual warnings and one audible warning? And how could the car just drift out of the lane even with no one holding the steering wheel? Did this driver have the newly updated Autopilot software version that came out a couple of weeks ago?
 

What they are saying is that earlier in the drive, with AP engaged, the driver received visual cues (the blinking white IC) and an audible cue (similar to the emergency braking alert) to get his hands on the wheel. They are not saying those cues happened immediately before the accident. They did say that in the six seconds before the accident the software did not detect hands on the wheel, and it was probably close to issuing another visual cue.

I often have my hands on the wheel with AP engaged and visual cues can still occur - easily more than five on a long drive. I can't recall ever getting an audible cue. I think if you ignore the audible cue it kills AP and you can't use it again until the car has cycled through time in Park.
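To make that escalation concrete, here is a rough sketch of the behavior described above. The timings, messages, and function are illustrative guesses, not Tesla's actual logic:

[CODE]
# Hypothetical sketch of the hands-on warning escalation. The thresholds
# below are invented for illustration; the real values vary with speed
# and road type.

VISUAL_CUE_S = 30.0    # assumed hands-off time before the flashing visual cue
AUDIBLE_CUE_S = 60.0   # assumed hands-off time before the audible cue

def escalation(seconds_hands_off: float, ap_locked_out: bool = False) -> str:
    """Return the cue the system would show for a given hands-off duration."""
    if ap_locked_out:
        return "AP unavailable until the car has cycled through Park"
    if seconds_hands_off >= AUDIBLE_CUE_S:
        return "audible cue; ignoring it kills AP for the rest of the drive"
    if seconds_hands_off >= VISUAL_CUE_S:
        return "visual cue (flashing white instrument-cluster border)"
    return "no warning"

print(escalation(45.0))  # visual cue
print(escalation(70.0))  # audible cue
[/CODE]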
 
The company may have to initiate braking and take more decisive action to prevent misuse of the AP system: engage the emergency flashers, force a slowdown, enter Park, require a change of driver ... eventually the misusers will get the message to stop over-relying on the gadgets and pay attention.

AP is a great stress reliever in traffic, but it is not autonomous.
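Sketching that stricter enforcement as an ordered ladder, purely hypothetically (none of these steps beyond the first two exist today):

[CODE]
# Hypothetical escalation ladder for the stricter enforcement suggested
# above. The steps and their ordering are invented for illustration.

ENFORCEMENT_LADDER = [
    "visual hands-on warning",
    "audible hands-on warning",
    "engage emergency flashers",
    "force a gradual in-lane slowdown",
    "bring the vehicle to a stop and shift to Park",
    "require a change of driver before AP will re-engage",
]

def next_step(ignored_warnings: int) -> str:
    """Each ignored warning moves one rung up the ladder."""
    idx = min(ignored_warnings, len(ENFORCEMENT_LADDER) - 1)
    return ENFORCEMENT_LADDER[idx]

print(next_step(2))  # -> engage emergency flashers
[/CODE]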
 
So, was AP engaged at the time of the crash, and if so, why did the car veer out of its lane?
 
I have a Model S P100D and a Long Range Model 3. Both have Enhanced Autopilot. I use the autopilot and auto-steering on a regular basis, with the autopilot as a safety backup in case I'm not paying attention. Using the auto-steering is like allowing a beginner driver to take control of your car: I frequently have to take over because the auto-steering gets confused when the lines in the road diverge. It appears to be better with the Autopilot 2 upgrade. In my Model S, the perimeter of the driver's console flashes white when my hands are not on the steering wheel; in my Model 3, the left side of the console screen flashes blue. When in auto-steer I keep both hands loosely on the wheel so that I am ready to take control. It's kind of a game to see what auto-steer will do. It challenges my driving.
A couple of weeks ago, I was going north on 101 and exited onto HWY 280. At that off-ramp there is a Y in the road allowing you to go either south or north on 280, with a barrier between the north-bound and south-bound lanes just like the barrier between the HWY 85 fly-over and HWY 101. The lane lines diverge at this Y-junction between south-bound 280 and north-bound 280. My Tesla turned toward the center-line of the barrier, and I pulled the steering wheel back toward north-bound 280. I have been driving for 50 years and nothing like this has ever happened to me. I'm thinking the autopilot could benefit from linking with the navigator; that way, I could tell the car where we are going before I leave. You know, pre-flight planning. I also have an aircraft pilot's license.
I'm hopeful that the autopilot and auto-steering will one day make driving much safer.

Kevin Burns
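Kevin's pre-flight-planning idea, reduced to a sketch: let the planned navigation route break the tie when lane lines diverge at a fork. The function and the heading-based geometry here are invented for illustration and are not a description of Tesla's actual stack:

[CODE]
# Hypothetical: pick the fork branch whose heading best matches the
# planned navigation route. Names and geometry are invented.

def choose_branch(branches, route_heading_deg):
    """branches: list of (name, heading_deg) for each drivable path at the gore.
    route_heading_deg: heading the navigation route expects after the fork."""
    def heading_error(branch):
        _, heading = branch
        # smallest unsigned angle between branch heading and route heading
        return abs((heading - route_heading_deg + 180) % 360 - 180)
    return min(branches, key=heading_error)

# e.g. at the 101 -> 280 split: the route says head north on 280
print(choose_branch([("280 south", 170.0), ("280 north", 350.0)], 340.0))
# -> ('280 north', 350.0)
[/CODE]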
 
Kevin,
Your use and my use are very much the same. My 2015 S85D has AP1, which I've used for several years now, and I still feel the need to be within a second or two of taking over when it makes a mistake. I've found fewer mistakes in stop-and-go heavy traffic and on open roads, but frequent mistakes in handling idiot drivers and on roads with changing-radius turns and undulating hills.

I can't imagine looking away for more than 3 or 4 seconds. If anyone does, they are a hazard and the flashers should come on to warn others.
 
I like the analogy that Autopilot is like a newly minted (beginner) driver: great promise, but not enough experience yet, and it needs an adult supervising in tight quarters.
I hope this feature continues to learn, gains experience, and becomes a good driver. For now it is a welcome assistant, but I know who is supposed to be in charge. And my life depends on me being in charge.
 
Perhaps a bit off-topic, but ...
I've never understood the desire for full autonomy in personal vehicles. It seems to be a sacrifice of independence. For full autonomy to be safe, it will have to control virtually all the cars for coordinated merging, lane changing, etc., and operate at speeds which guarantee accidents like this can't happen. That means individuals won't choose their own speed or which vehicle they follow (e.g., smelly diesels), much like airliners under ATC routing. Since many drivers are much more aggressive than that, they will not be satisfied with relinquishing control. If some don't, the whole system will suffer by having to slow down even more to contend with them.

I think assisted driving requiring individual responsibility for outcome, as we have now, is much more resilient and flexible than autonomy. We'll have some opportunity to see how well software can contend with individual decisions en route when ADS-B is implemented in air traffic routing over the next few years.

Besides, when the road is squirrelly, that's when it's fun to DRIVE the car rather than just ride in it.
 
Perhaps a bit off-topic, but ...
I've never understood the desire for full autonomy in personal vehicles. It seems to be a sacrifice of independence. For full autonomy to be safe, it will have to control virtually all the cars for coordinated merging, lane changing, etc., and operate at speeds which guarantee accidents like this can't happen. That means individuals won't choose their own speed or which vehicle they follow (e.g., smelly diesels), much like airliners under ATC routing. Since many drivers are much more aggressive than that, they will not be satisfied with relinquishing control. If some don't, the whole system will suffer by having to slow down even more to contend with them.

No, for full autonomy to be safe it has to recognize objects accurately and quickly and then not do all of the stupid, dangerous stuff that humans do. And don't worry about other drivers trying it on while you're riding in an autonomous car, because if that happens regularly, the cars will report all of the dangerous driving their cameras capture to the police.

I think assisted driving requiring individual responsibility for outcome, as we have now, is much more resilient and flexible than autonomy. We'll have some opportunity to see how well software can contend with individual decisions en route when ADS-B is implemented in air traffic routing over the next few years.

It's more flexible, but it could end up being significantly more expensive and much less efficient.

Besides, when the road is squirrelly, that's when it's fun to DRIVE the car rather than just ride in it.

That's expensive fun that most people can't afford.
 


First off, I feel terrible for this guy and his family as this was such a tragic accident. Prayers to those affected!

It's very odd, if the victim's brother's comments are accurate, that the victim had complained to Tesla several times about this location. Personally, if I had felt the need to discuss this location with Tesla because of safety concerns, I would have at the LEAST kept my hands on the wheel and my foot hovering over the brake.

I know nothing besides what I have read, but something isn't right with this accident. The report of the driver not responding before the accident makes me think maybe he was suffering a medical condition. I know my AP will let you slightly veer the vehicle left/right if you tug on the steering wheel, so the driver could possibly have been slumped over the wheel.

Also, it would be interesting to know what software update this vehicle was on. 2018.10.4 was a significant improvement in AP2 behavior for both of our Model Xs. Something else I'm thinking about: I know some of the very early Model Xs were equipped with AP1, so I'm curious what AP hardware this vehicle was running.
 
OK, we get that the driver was absolutely not giving the road and traffic the attention that he should have. And that he had the follow distance set to the lowest safety factor (closest). And we get that 200 Teslas are traveling this spot every day (wow!) with no problem. So, to me, the big question is: WHY DID THE ACCIDENT HAPPEN, SINCE THE CAR WAS ON (unsupervised) AUTOPILOT (and no significant involvement of the driver occurred immediately before the collision)?
 
One missing puzzle piece concerns the autopilot protecting the occupants in case of an imminent collision. If and when the autopilot determines a potential collision and the driver fails to take corrective action, there should be a "final" alternative to save the day. I believe Tesla's engineers could learn from this sort of tragedy. Otherwise, the driver will always find himself/herself playing a guessing game with the autopilot as to when to "hand off" responsibility. Clearly, according to Tesla's statement, the driver failed to take corrective action for 6 seconds/150 meters. This is where the autopilot's collision-avoidance features should have intervened. This is an area of the autopilot's built-in safety and should be taken very seriously.
 

Was there anything to detect 6 seconds ahead?
 
Imminent collision means the system has detected that, relative to its motion and the road conditions, the vehicle is on a collision course with an object. In my view, at the six-second mark the autopilot could detect and predict that, at the current speed and in the current road conditions, it is not able to operate safely. At that point the vehicle is most likely NOT yet on an "unavoidable" collision course, and the driver may have only the next few seconds to take action. After those precious few seconds have passed, the autopilot can determine with a high degree of confidence that the collision is unavoidable. That is the moment at which I suggested the autopilot should take final action to protect the occupants.
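To put rough numbers on that: Tesla's statement implies about 150 m / 5 s ≈ 30 m/s (roughly 67 mph). A minimal sketch of the staged hand-off being suggested; the 8 m/s² deceleration and the thresholds are assumptions, not Tesla's values:

[CODE]
# Staged response sketch using the distances from Tesla's statement.
# The deceleration and thresholds are illustrative assumptions.

speed = 150.0 / 5.0                         # m/s (~67 mph), implied by the statement
decel = 8.0                                 # m/s^2, assumed hard-braking capability
stopping_distance = speed**2 / (2 * decel)  # ~56 m at this speed

def staged_response(distance_m: float) -> str:
    ttc = distance_m / speed                # time to collision, in seconds
    if distance_m <= stopping_distance * 1.2:
        return "brake autonomously: collision otherwise unavoidable"
    if ttc <= 5.0:
        return "alert driver: collision likely without intervention"
    return "monitor"

for d in (150, 90, 60):
    print(f"{d:>4} m ({d / speed:.1f} s): {staged_response(d)}")
[/CODE]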
 
The real benefit of autonomous driving is improved overall safety when the system is operated accordingly. As for ADS-B: ADS-B (In/Out) provides situational awareness in air traffic; it does not proactively harmonize the overall traffic flow, and it still requires the pilots in the system to pay attention to the information provided.