
Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier near Mountain View, Calif., last week, killing the driver, according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator, a device designed to reduce the impact into a concrete lane divider. The crash attenuator had reportedly been destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”
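As a rough cross-check of those figures (the counts are Tesla's; only the date arithmetic below is ours), the "roughly 20,000 times since the beginning of the year" and "over 200 trips per day" claims are at least in the same ballpark:

```python
from datetime import date

# Figures quoted from Tesla's statement
trips_per_day = 200

# Days elapsed in 2018 up to the crash date (March 23)
days_elapsed = (date(2018, 3, 23) - date(2018, 1, 1)).days  # 81

# At ~200 trips/day, expected year-to-date trip count
print(trips_per_day * days_elapsed)  # 16200 -- same order as the ~20,000 Tesla cites
```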

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
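For what it's worth, the "3.7 times" and "about 900,000 lives" figures can be reproduced from the numbers in the statement itself; the quick check below verifies only the arithmetic, not the validity of the underlying comparison:

```python
# Rates quoted in Tesla's statement, expressed as miles per fatality
us_miles_per_fatality = 86e6        # all vehicles, all manufacturers
tesla_miles_per_fatality = 320e6    # vehicles equipped with Autopilot hardware

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(round(ratio, 1))  # 3.7 -- the "3.7 times less likely" figure

# Lives saved if every car worldwide matched the Tesla rate
worldwide_deaths = 1.25e6
lives_saved = worldwide_deaths * (1 - 1 / ratio)
print(int(lives_saved))  # 914062 -- Tesla rounds this to "about 900,000"
```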

Photo: @DeanCSmith/Twitter

 
Pulling this off of Market Action: there are some good arguments in this and the 101 thread.

I totally get the logic, and I for one would be saying the same thing, but the reality is that when bad things happen, people need a scapegoat. And apparently a lot of people have complained about Autopilot. I understand Tesla has warnings in place, as well as terms and conditions for Autopilot, but I'm wondering: are there loopholes to be found from a lawyer's perspective?

Always? ;)

The problem I see is that people don't know what a normal autopilot does, so they have an incorrect view of what Tesla's AP does.
People who do know what airplane autopilot systems do aren't confused (maybe I should blame WALL-E)...
 
[Tesla issues strongest statement yet blaming driver for deadly crash...]

Tesla sent Dan Noyes a statement Tuesday night that reads in part, "Autopilot requires the driver to be alert and have hands on the wheel... the crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road."

---

As an owner of 3 Teslas, a fierce proponent of the idea that autonomous vehicles will eventually save thousands of lives... and a frequent user of AP1... my position is that Tesla is partly liable for this tragedy.

Tesla does little to nothing to educate its owners on the appropriate use of Autopilot. It's a new technology, still in somewhat primitive form. It's not something you learn in Driver's Ed, nor does the DMV test you on it. You might get 5 minutes on it from your Delivery Specialist... but that only helps if you're the first-time owner, and does nothing for others who might drive the vehicle or those buying used.

Tesla needs to require drivers to complete an online education program regarding Autopilot. This could also be done from the console screen or the app. Have them agree to terms of usage and associate that with a driver profile before Autopilot can be activated (yes, you can get around this; perfect should not be the enemy of good). Require re-verification at timed intervals (annual?).

Segway/Ninebot, for example, requires you to go through its app training before you can raise the speed limit from 6 mph to 12 mph... for a 60 lb device. Tesla requires no use agreement or verification for a 4,500 lb vehicle that can reach 90 mph in AP mode.
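To make this concrete, here's a rough sketch of the kind of software gate I'm imagining (every name here is hypothetical; this is nothing like Tesla's actual code):

```python
from datetime import date, timedelta

# Hypothetical driver-profile gate for Autopilot activation.
# None of these names correspond to real Tesla software.
RECERTIFICATION_INTERVAL = timedelta(days=365)

class DriverProfile:
    def __init__(self, name):
        self.name = name
        self.accepted_terms = False
        self.training_completed_on = None  # date of last completed training

    def may_activate_autopilot(self, today=None):
        today = today or date.today()
        if not self.accepted_terms:
            return False  # no signed terms of usage, no Autopilot
        if self.training_completed_on is None:
            return False  # training never completed
        # Annual re-verification: training must be recent enough
        return today - self.training_completed_on <= RECERTIFICATION_INTERVAL

profile = DriverProfile("primary driver")
profile.accepted_terms = True
profile.training_completed_on = date(2018, 1, 15)
print(profile.may_activate_autopilot(date(2018, 4, 1)))  # True
print(profile.may_activate_autopilot(date(2019, 4, 1)))  # False: re-verify first
```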

It won't be long before another AP tragedy happens. I'd like to see Tesla prosper by limiting their liability. However, their arrogant excuses and disrespectful attitude toward their owners aren't helping.

My .02 - flame suit on.



 
Not willing to endorse legal liability for Tesla -- Tesla is following the practice of many other companies in describing the risks of not properly using its product in its manuals -- but I wholeheartedly agree on the training.

Tesla still has some pretty massive flaws in their customer engagement and education strategies (e.g., as a very early MS owner, I'm exceptionally well-educated on using my car, and yet Tesla periodically updates the manual with brand-new information and never notifies me of the change. How hard would that be?)

Initial in-car training for using Autopilot seems like a great idea. Requiring a simple 15-minute instruction app would do wonders.
 
I think training is pointless (other than telling people explicitly that they are always the driver), because EAP is not guaranteed to do anything properly 100% of the time, so you cannot rely on it -- which is why the driver stays the driver.
I picture R. Lee Ermey:
"It's a car. It goes, stops, and turns. If you let it do these things on its own, you will have the same problems as anyone else who lets a car do whatever it wants.

It is your 15-year-old offspring fresh from Driver's Ed. Do not trust it.

Here are the things it will do if provoked: {list of features}. It may do them right, it may do them wrong. AP might save you from your own screw-up; it also may follow the wrong car, the wrong line, or just plain give up.

Now go out there and drive."
 
"Gore point susceptibility" may have to be added to the warnings until the statement is no longer true.

Already covered:
Warning: The list above does not represent an exhaustive list of situations that may interfere with proper operation of Driver Assistance components. Never depend on these components to keep you safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times.
Warning: Traffic-Aware Cruise Control is designed for your driving comfort and convenience and is not a collision warning or avoidance system. It is your responsibility to stay alert, drive safely, and be in control of the vehicle at all times. Never depend on Traffic-Aware Cruise Control to adequately slow down Model S. Always watch the road in front of you and be prepared to take corrective action at all times. Failure to do so can result in serious injury or death.
Warning: Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.
Warning: Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.
Warning: Autosteer is not designed to, and will not, steer Model S around objects partially or completely in the driving lane. Always watch the road in front of you and stay prepared to take appropriate action. It is the driver's responsibility to be in control of Model S at all times.
Warning: Many unforeseen circumstances can impair the operation of Autosteer. Always keep this in mind and remember that as a result, Autosteer may not steer Model S appropriately. Always drive attentively and be prepared to take immediate action.
Warning: Automatic Emergency Braking is not designed to prevent a collision. At best, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death.
Warning: Forward Collision Warning is for guidance purposes only and is not a substitute for attentive driving and sound judgment. Keep your eyes on the road when driving and never depend on Forward Collision Warning to warn you of a potential collision. Several factors can reduce or impair performance, causing either unnecessary, invalid, inaccurate, or missed warnings. Depending on Forward Collision Warning to warn you of a potential collision can result in serious injury or death.
Warning: Automatic Emergency Braking is designed to reduce the severity of an impact. It is not designed to avoid a collision.
Warning: The limitations described above do not represent an exhaustive list of situations that may interfere with proper operation of Collision Avoidance Assist features. These features may fail to provide their intended function for many other reasons. It is the driver’s responsibility to avoid collisions by staying alert and paying attention to the area beside Model S so you can anticipate the need to take corrective action as early as possible.
 
Nice list. Thank you.

The concern is that the NTSB may influence NHTSA to make the warnings more specific, in a way that hurts sales.

They may even require videos be watched before purchasing AP. Literacy is not what it once was.
Thanks, there are more, but I got bored with formatting...

I don't know that it would hurt sales; Tesla's system is better than the one on our new Ford... It's not like anyone reads the warnings anyway...
 
Tesla needs to require drivers to complete an online education program regarding Autopilot. This could also be done from the console screen or the app. Have them agree to terms of usage and associate that with a driver profile before Autopilot can be activated (yes, you can get around this; perfect should not be the enemy of good). Require re-verification at timed intervals (annual?).

Segway/Ninebot, for example, requires you to go through its app training before you can raise the speed limit from 6 mph to 12 mph... for a 60 lb device. Tesla requires no use agreement or verification for a 4,500 lb vehicle that can reach 90 mph in AP mode.
My initial reaction to this was to disagree, but upon reflection it probably would reduce accidents and ultimately minimize legal and PR risk for Tesla. Especially now, with Model 3 production ramping, the buyer pool of Teslas is becoming much more diverse, and there will frankly be a whole bunch of people who just don't understand Autopilot until they go through some kind of training.

What concerns me most here is that this was an Apple engineer, who shouldn't have required the training, or could have done the equivalent himself. He likely understood well what Autopilot does and doesn't do, what lane markings and traffic situations it can handle on its own and where he needed to intervene, and, most importantly, the crystal-clear fact that you still need to pay attention and watch for obstacles in your lane, like debris, potholes, or vehicles parked halfway into it or cutting into it. But somehow, some distraction caused him not to notice his situation.

An interesting data point will be his phone activity leading up to the crash.