Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier last week near Mountain View, Calif., killing the driver, according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator designed to reduce the impact into a concrete lane divider. The crash attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
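
As a rough sanity check, the release's headline numbers are at least internally consistent. A short sketch, using only Tesla's claimed inputs (none independently verified):

```python
# Back-of-the-envelope check of the figures in the release. All inputs are
# Tesla's own claims, not independently verified numbers.

us_miles_per_fatality = 86e6      # "one automotive fatality every 86 million miles"
tesla_miles_per_fatality = 320e6  # "one fatality ... every 320 million miles"

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"{ratio:.1f}x")  # 3.7x, matching "3.7 times less likely"

# "About 900,000 lives saved" assumes the whole world fleet matched Tesla's rate:
world_deaths_per_year = 1.25e6
print(f"{world_deaths_per_year * (1 - 1 / ratio):,.0f} lives/year")  # ~914,000

# Implied speed from "five seconds and 150 meters of unobstructed view":
print(f"{150 / 5 * 3.6:.0f} km/h (~67 mph)")  # 108 km/h
```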

Photo: @DeanCSmith/Twitter

 
This system is, and will always be, deadly dangerous.
It has almost killed me many times with its unpredictable behaviour.
Yesterday I almost got rear-ended again because it hit the brakes out of nowhere.
We are all just stupid test dummies, driving a beta system that never should have been on the street.
By all means, I love my cars and drive them every day, but this lane-assist system is dangerous.
Strange post. If you are serious about your remarks, then I can only assume you continue to use EAP because you paid a lot of money for it and hate to waste your money. But rather than speculate, I suppose I should ask: Why in the world are you still using AP?
 
As always, self-serving weasel words from Tesla. Lots of irrelevant "hey look over there" to try and distract from the core facts.

AP2 is known to randomly make huge driving corrections even in 100% perfect conditions and on sections of road where it has previously worked perfectly.

No driver input for 6 seconds is 100% consistent with the driver's hands being on the wheel at all times and simply not applying sufficient torque to be detected.
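
That distinction is easy to make concrete. Here is a minimal sketch of torque-based detection; the threshold is a made-up value, since Tesla's actual logic and parameters are not public:

```python
# Illustrative only: shows why "hands not detected" is a weaker claim than
# "hands not on the wheel" when detection works by measuring steering torque.
# The threshold below is a hypothetical value, not a real Tesla parameter.

DETECTION_THRESHOLD_NM = 0.5

def hands_detected(measured_torque_nm: float) -> bool:
    """A torque sensor sees force applied to the wheel, not physical contact."""
    return abs(measured_torque_nm) >= DETECTION_THRESHOLD_NM

# A relaxed hand resting lightly on the wheel applies almost no torque:
print(hands_detected(0.1))  # False -> logged as "hands not detected"
```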

My bet is AP2 swerved at the last moment leaving the driver with no time to react.

Having just read about this, you 'bet' you know what happened? You simply speculate that yes, maybe he did have his hands on the wheel but it wasn't detected? Thankfully, such speculation is not going to be how this is investigated. Let's get facts prior to going on a fudster rampage, shall we?

It always bears repeating: YOU are driving the car even when it is on "autopilot". This is clearly stated, and warnings are constantly given, such as warnings about putting your hands back on the wheel, as this driver did NOT do.

Also, let us not labor under the delusion that machine-operated vehicles will be perfect. They will not be, period, now or ever. Perfection is not the standard and can never be. The standard, in my opinion (in the absence of legislative, industry or other leadership, we have no actual standard), should be: is the machine substantially better than humans? Will it avoid wrecks 100% better than people, or 200%, or more? I have not seen credible numbers about the current safety level to know where we are right now, though we do know there is a difference between AP1 and AP2 at a minimum, suggesting that there are different safety levels out there now.

Let's assume though that Tesla makes a car that is 300% safer than human drivers. Guess what: that car is still going to be in wrecks, and people are still going to get hurt and/or killed. The numbers dictate that if you drive enough, no matter how safely, wrecks are gonna happen. So, a question is, how safe is safe enough? Are we OK with a machine 300% better than us that still makes mistakes, where people still die? We should be, but emotionally are we? If you personally are not, then by all means, take the wheel yourself! If you use this technology correctly, you will be ready to take over if/when it messes up. If you are not, is it correct to put all the blame on the machine? Not in my opinion.
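
The "wrecks are gonna happen" point is plain expected-value arithmetic. A sketch with an assumed fleet mileage and a hypothetical 3x-safer system:

```python
# Even a 3x-safer system produces nonzero deaths at fleet scale.
# The fleet mileage below is an assumption for illustration.

human_fatal_rate = 1 / 86e6                # fatalities per mile, US average per Tesla
machine_fatal_rate = human_fatal_rate / 3  # hypothetical "300% safer" system

fleet_miles_per_year = 10e9
print(f"{human_fatal_rate * fleet_miles_per_year:.0f} expected deaths/year")    # ~116
print(f"{machine_fatal_rate * fleet_miles_per_year:.0f} expected deaths/year")  # ~39: fewer, not zero
```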
 
I'd like to correct the title on the video. It was the driver that tried to kill the occupants. Autopilot is a driver aid, and it's still beta. The driver is supposed to keep hands on the wheel and eyes on the road. Why did it take the driver such a long time to see the car was taking the wrong path? The car goes "left" at 27 seconds, doing 59 mph. It wasn't until 32 seconds that the brakes were applied - that's 5 seconds to react. Why didn't the driver just turn the wheel and guide the car back into the lane?

5 seconds for a driver to take action is a long time.
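
For scale, here is the distance covered during that 5-second gap at the speed shown in the video:

```python
# Distance traveled during the 5-second reaction gap at the speed in the video.
speed_ms = 59 * 1609.344 / 3600  # 59 mph is about 26.4 m/s
print(f"{speed_ms * 5:.0f} m")   # ~132 m covered before the brakes were applied
```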


Here is the explanation: [embedded video]
 
You simply speculate that yes, maybe he did have his hands on the wheel but it wasn't detected?

...putting your hands back on the wheel, as this driver did NOT do.
My speculation that the driver DID have his hands on the wheel is just as valid as your speculation that the driver DID NOT have his hands on the wheel.

There is NOWHERE in the Tesla press release where it states that the driver DID or DID NOT have his hands on the wheel.

Teslas CANNOT detect hands on the wheel, only torque on the wheel, and Tesla knows this full well; hence the careful weasel wording trying to make casual readers THINK the driver did not have his hands on the wheel.


I do, however, agree with your point that autonomy is likely to improve automotive safety and lead to fewer human lives lost in accidents. This is a good thing.
 
If autopilot determines there is a potential collision and the driver fails to take corrective action, there should be a "final" alternative to save the day: the autopilot's collision-avoidance features should intervene. This is the area of autopilot built-in safety and should be taken very seriously.

Everyone who drives with Tesla autopilot knows it currently does not see cars stopped in the middle of the freeway or concrete barriers in the middle of the freeway. It may stop for a car at a stop light if you are going less than 45 mph, but at 75 mph it is programmed to ignore phantom images the radar "sees". It is a great lane-keeping system with a very good adaptive cruise control. It follows moving cars very well but has never been able to recognize stopped vehicles. We all hope it will someday, but it does not today.
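
A rough sketch of the behavior described above. This is not Tesla's code; the structure and thresholds are assumptions inferred from the post:

```python
# Illustrative model of why a radar-based TACC ignores stationary targets at
# speed: stationary returns are dominated by clutter (signs, bridges, barriers),
# so they are only acted on below a response-speed limit. Assumed values only.

MPH_TO_MS = 0.44704
STOP_RESPONSE_LIMIT_MS = 45 * MPH_TO_MS  # "less than 45 mph" per the post

def should_brake_for(target_ground_speed_ms: float, own_speed_ms: float) -> bool:
    if target_ground_speed_ms > 1.0:
        return True   # moving targets are tracked and followed reliably
    # Stationary returns are ignored above the response limit to avoid
    # constant false braking on roadside clutter.
    return own_speed_ms <= STOP_RESPONSE_LIMIT_MS

print(should_brake_for(0.0, 75 * MPH_TO_MS))  # False: stopped object ignored at 75 mph
print(should_brake_for(0.0, 40 * MPH_TO_MS))  # True: may stop below ~45 mph
```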
 
@kevinof

"I'd like to correct the title on the video. It was the driver that tried to kill the occupants. Autopilot is a driver aid and it's still beta. The driver is supposed to keep hands on the wheel and eyes on the road. Why did it take the driver such a long time to see the car was taking the wrong path - The car goes "left" at 27 seconds, doing 59 mph. Wasn't until 32 secs that the brakes were applied - that's 5 seconds to react. Why didn't they driver just turn the wheel and guide the car back into the lane?

5 seconds for a driver to take action is a long time. "




I don't think this was the first time the driver noticed the error. I think his delayed reaction shows he was trying to prove a point: that the system will follow the brighter of the two lines, which in this case is not a lane. Similarly, at the actual accident site, the lane markings / pavement joint could have confused the system.
 
Everyone who drives with Tesla autopilot knows it currently does not see cars stopped in the middle of the freeway or concrete barriers in the middle of the freeway. It may stop for a car at a stop light if you are going less than 45 mph, but at 75 mph it is programmed to ignore phantom images the radar "sees". It is a great lane-keeping system with a very good adaptive cruise control. It follows moving cars very well but has never been able to recognize stopped vehicles. We all hope it will someday, but it does not today.
It is not good that the combination of radar, cameras, and sonar cannot positively identify an object in the path (in time and space) of a moving vehicle. This issue must be dealt with as soon as possible.
 
Let's assume though that Tesla makes a car that is 300% safer than human drivers. Guess what: that car is still going to be in wrecks, and people are still going to get hurt and/or killed. The numbers dictate that if you drive enough, no matter how safely, wrecks are gonna happen. So, a question is, how safe is safe enough? Are we OK with a machine 300% better than us that still makes mistakes, where people still die? We should be, but emotionally are we? If you personally are not, then by all means, take the wheel yourself! If you use this technology correctly, you will be ready to take over if/when it messes up. If you are not, is it correct to put all the blame on the machine? Not in my opinion.

The problem is that this 300-or-whatever-percent-safer number is based on a demographic of drivers who are already that much safer than average without Autopilot. The average Model X/S owner is middle-aged and makes $290k per year. No teenagers, elderly, or drunks are driving those cars. I use the Volvo XC90 as a comparison. Volvo has more of those on the road than Model Xs, yet most years there are no deaths reported for XC90 drivers. The XC90 doesn't have Tesla Autopilot, so how does it achieve such a standard? It is all demographics. If Tesla made a $20k car that teenagers were buying, the accidents would pour in whether it had Autopilot or not. Actually, I think it would be worse to give a teenager Autopilot, because you know how irresponsibly it would be used.
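
A toy calculation shows the mechanism; the demographic risk factor below is invented purely for illustration:

```python
# If the buyer demographic already crashes less, the fleet looks safer than
# average with or without Autopilot. The 0.3 factor is made up for illustration.

avg_miles_per_fatality = 86e6   # US average, per Tesla's statement
demographic_risk_factor = 0.3   # assumed: this demographic has 30% of average risk

implied_miles_per_fatality = avg_miles_per_fatality / demographic_risk_factor
print(f"{implied_miles_per_fatality / 1e6:.0f}M miles per fatality")  # ~287M

# That is already in the neighborhood of Tesla's quoted 320M miles, so the
# portion attributable to Autopilot itself could be far smaller than 3.7x.
```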
 
I'd like to correct the title on the video. It was the driver that tried to kill the occupants. Autopilot is a driver aid and it's still beta. The driver is supposed to keep hands on the wheel and eyes on the road. Why did it take the driver such a long time to see the car was taking the wrong path - The car goes "left" at 27 seconds, doing 59 mph. Wasn't until 32 secs that the brakes were applied - that's 5 seconds to react. Why didn't they driver just turn the wheel and guide the car back into the lane?

5 seconds for a driver to take action is a long time.

Not only did this driver try to kill the occupants; he grossly endangered the lives of other drivers on the freeway as well by coming to a complete stop! :mad: If this was to prove a point, it came at a high risk to others.
 
This MX crash is scaring me a bit.

My question for any experts looking at this: Why was the barrier not seen by the radar, causing the system to initiate automatic braking? Also, is there a chance that a bug in this driver's version of the AP software froze its logic in this situation, causing it to fail to steer one way or the other while the user was not attentive?

Holder of a Day 0 Reservation for the Model 3
Currently driving a Subaru Forester

The static barrier problem is a hard one at freeway speeds. You don’t want the car braking for, say, an overhead sign that appears level with the road because you are cresting a hill, or a barrier that is in front of the vehicle as the crow flies but where the lane curves away, etc. So the car is programmed to ignore many static things to prevent false braking events, which are also unsafe. It is a fine line to walk, and Tesla hasn’t solved it yet.
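
The "crow flies vs. lane curves away" case can be sketched with simple geometry; this illustrates the decision problem, not Tesla's algorithm:

```python
# Illustrative geometry for "in front of the vehicle as the crow flies, but the
# lane curves away". A sketch of the decision problem, with assumed values.

LANE_HALF_WIDTH_M = 1.8

def lane_offset_at(range_m: float, curvature_per_m: float) -> float:
    """Lateral offset of the lane center at a given range, assuming constant
    path curvature (small-angle approximation: offset ~ k * s^2 / 2)."""
    return curvature_per_m * range_m ** 2 / 2

def target_in_path(range_m: float, target_lateral_m: float, curvature_per_m: float) -> bool:
    # Is the target where the lane will actually be at that range?
    return abs(target_lateral_m - lane_offset_at(range_m, curvature_per_m)) < LANE_HALF_WIDTH_M

# Barrier dead ahead at 150 m while the lane curves away (radius 500 m):
print(target_in_path(150, 0.0, 1 / 500))  # False: lane center is ~22.5 m to the side
# Same barrier on a straight road:
print(target_in_path(150, 0.0, 0.0))      # True: genuinely in the path
```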

A “lockup” type bug in AP is always possible, although I haven’t encountered that yet. AP seems more likely to hand control back to me suddenly (with noises and flashing messages) if it encounters stuff that overloads or confuses it.

I would not be scared from this event. Treat it as a good lesson in AP’s limits, and you probably won’t ever be in the same situation as this driver. Always keep your hands on the wheel and pay attention. AP still reduces fatigue on long drives because it reduces the mental math you normally do making micro-adjustments to the wheel. If AP seems like it’s doing something sketchy, take back control; don’t wait to see if it corrects.

I recommend enabling AP features in stages. Start with just TACC and get comfortable with Tesla’s implementation (which might be different than a prior car). Then read the entire Autosteer section of the manual before enabling Autosteer. Don’t enable Auto Lane change right away. Test out Autosteer on lightly travelled freeways that are cleanly marked and slowly get the hang of what the car is doing, what it does when it loses a lane line, etc. Disengage with the wheel several times to see what that is like. Then once you are comfortable with Autosteer alone, turn on Auto Lane change and see how that works and feels.

This was how I started with AP, and how my husband did as well. While it seems like complete overkill to folks who just turn it all on during delivery and test on the way home, I feel like it helped me not take AP for granted and treat it like the assistance system it is.
 
It is not good that the combination of radar, cameras, and sonar cannot positively identify an object in the path (in time and space) of a moving vehicle. This issue must be dealt with as soon as possible.

Tesla is working on it. It is not a self-driving system yet. That is why the driver has to keep their hands on the wheel and their eyes on the road and be prepared to take over any time the system drifts out of the lane. It is a great system, more than 99% effective. But you do have to take over the other 1% of the time, or the car will crash. Once the system is 99.99999% effective, we will be able to let it do the driving.
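
To put numbers on the gap between 99% and 99.99999%, treating "effective" as a per-mile success rate (an assumed framing, since the post doesn't define it):

```python
# Rough comparison of intervention burden at two reliability levels, treating
# "effective" as a per-mile success rate (an assumed framing).

miles_per_year = 12_000  # assumed typical annual mileage
for reliability in (0.99, 0.9999999):
    failures_per_mile = 1 - reliability
    print(f"{reliability}: one intervention every {1 / failures_per_mile:,.0f} miles "
          f"(~{failures_per_mile * miles_per_year:g} per year)")
# 0.99      -> every 100 miles, ~120 interventions/year: constant supervision
# 0.9999999 -> every 10,000,000 miles: most drivers never intervene
```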
 
Not only did this driver try to kill the occupants. He grossly endangered the lives of other drivers on the freeway as well by coming to a complete stop! :mad: If this was to prove a point it came at a high risk to others.

Overly dramatic. He filmed the demo at night with very little traffic and clearly was aware of what to expect. If you have a problem with testing on public roads, maybe you should talk to Elon...
 