Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier killing the driver last week near Mountain View, Calif., according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator, a highway safety barrier designed to reduce the impact of a collision with a concrete lane divider. The attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
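
For readers who want to check the arithmetic in the statement, here is a quick back-of-the-envelope calculation in Python (the 86-million-mile, 320-million-mile, and 1.25-million-death figures come from the statement above; everything else is simple division):

# Figures quoted in Tesla's statement.
miles_per_fatality_us = 86e6      # all vehicles, all manufacturers
miles_per_fatality_tesla = 320e6  # Teslas with Autopilot hardware

# Relative fatality rate per mile, per Tesla's framing.
print(miles_per_fatality_tesla / miles_per_fatality_us)   # ~3.7

# The "lives saved" extrapolation: apply the Tesla rate to the
# ~1.25 million annual road deaths worldwide.
worldwide_deaths = 1.25e6
at_tesla_rate = worldwide_deaths * miles_per_fatality_us / miles_per_fatality_tesla
print(worldwide_deaths - at_tesla_rate)   # ~914,000, i.e. "about 900,000"

# Note: this naive comparison ignores differences in vehicle age,
# road mix, and driver demographics between the two fleets.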

Photo: @DeanCSmith/Twitter

 
I see your line of thinking.
On Bayshore, anywhere on the peninsula, 6 seconds seems like an eternity. At 60 mph, that's 0.1 mi, or over 500 ft. I didn't see any mention of speed at the time of the accident, but the "1" setting will only produce about 50-60 ft of separation in my MS. In 500 ft there are 4 or 5 cars ahead. It would seem daunting for sensors and software to predict that much forward behavior ... it seems like the system would be in a constant state of alarm. That's what leads me to believe slower speeds would be necessary.
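
A quick check of those numbers in Python (the 60 mph speed is this post's assumption, not a figure from the logs):

MPH_TO_FTPS = 5280 / 3600       # 1 mph ~ 1.467 ft/s
speed_ftps = 60 * MPH_TO_FTPS   # 88 ft/s at an assumed 60 mph

# Distance covered during the 6 seconds with no detected hands.
print(6 * speed_ftps)           # 528 ft, i.e. about 0.1 mile

# The ~55 ft separation at the "1" setting, expressed as a time gap.
print(55 / speed_ftps)          # ~0.6 s to the car ahead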

The behavior of pilots under ADS-B is what drivers will need to adopt - a tall order.
 
As always, self-serving weasel words from Tesla. Lots of irrelevant "hey, look over there" to try to distract from the core facts.

AP2 is known to randomly make huge driving corrections even in 100% perfect conditions and on sections of road where it has previously worked perfectly.

No driver input for 6 seconds is 100% consistent with the driver's hands being on the wheel at all times and simply not applying sufficient torque to be detected.

My bet is AP2 swerved at the last moment, leaving the driver with no time to react.
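
For context, Tesla infers hands-on-wheel from steering-column torque rather than sensing touch directly, so a light grip can go undetected. A minimal sketch of how a threshold-based detector produces exactly that ambiguity (the threshold, sample rate, and function here are invented for illustration; the real system's parameters are not public):

# Hypothetical torque-threshold hands-on detector. A light but
# continuous grip that never crosses the threshold is reported
# as "hands off", even though hands never left the wheel.
TORQUE_THRESHOLD_NM = 0.5    # invented value; Tesla's is not public
HANDS_OFF_LIMIT_S = 6.0      # seconds of sub-threshold torque

def hands_detected(torque_samples_nm, sample_period_s=0.1):
    """Return False if torque stays below threshold for 6+ seconds."""
    quiet_time = 0.0
    for torque in torque_samples_nm:
        if abs(torque) < TORQUE_THRESHOLD_NM:
            quiet_time += sample_period_s
            if quiet_time >= HANDS_OFF_LIMIT_S:
                return False
        else:
            quiet_time = 0.0
    return True

# A steady light grip (0.2 Nm) for 7 seconds reads as hands-off:
print(hands_detected([0.2] * 70))   # False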
 
As always, self-serving weasel words from Tesla.

AP2 is known to randomly make huge driving corrections even in 100% perfect conditions and on sections of road where it has previously worked perfectly.

No driver input for 6 seconds is 100% consistent with the driver's hands being on the wheel at all times and simply not applying sufficient torque to be detected.

My bet is AP2 swerved at the last moment, leaving the driver with no time to react.

It is well known that AP2 is erratic, and it is well known that no driver input doesn't mean inattentiveness; but that isn't favorable to Tesla, so it tends to get ignored.
 
As always, self-serving weasel words from Tesla. Lots of irrelevant "hey, look over there" to try to distract from the core facts.

AP2 is known to randomly make huge driving corrections even in 100% perfect conditions and on sections of road where it has previously worked perfectly.

No driver input for 6 seconds is 100% consistent with the driver's hands being on the wheel at all times and simply not applying sufficient torque to be detected.

My bet is AP2 swerved at the last moment, leaving the driver with no time to react.

This is quite possible, and it's my worst fear about Tesla. Let's look for something positive to come out of this tragedy.
 
This accident in Mountain View has really brought home how dangerous driving can be if you do not have your wits about you. In 2016, 40,000 people died on US roadways. It is too easy to get a driver's license; you should be required to demonstrate more than just a rudimentary knowledge of the traffic laws.
Of those 40,000 automobile fatalities, one in eight is a pedestrian. When I drive in San Francisco I am always amazed at how many pedestrians don't look both ways before crossing; they just step off the curb. That is what happened in the Uber accident: the woman never saw the Uber vehicle.
When driving on the highway, I sometimes see skid marks that go at an angle off the roadway into a divider or to the outer edge of the road. I wonder how this happens...
 
This system is, and will always be, deadly dangerous.
It has almost killed me many times with its unpredictable behaviour.
Yesterday, again, I almost got rear-ended because it hit the brakes out of nowhere.
We are all just stupid test dummies, driving a beta system that never should have been on the street.
By all means, I love my cars and drive them every day, but this lane-assist system is dangerous.
 
From my perspective (as a lawyer, but not one involved in these matters professionally), I am very worried that Tesla seem to think they are "safe" if Autopilot is 2-3 times as good as a human driver, on the basis of some sort of utilitarian argument that 15,000 dead Americans, killed by Autopilot, is better than 40,000 dead, killed by themselves/each other. The courts would not support that view, and I fear Tesla could go bust finding out that they would be held to a much, much higher standard than any human driver, and that damages awards against them, even if few in number, could be stratospheric in terms of quantum.

Elon needs to get Autopilot at least 1,000 times safer than it is before it could even be considered to be advertised as "beta self driving" (let alone the finished article).

On the bright side, that should actually be achievable; Waymo seem to be well down that road, but not on Elon's timetable, nor, I fear, with the current array of sensors and cameras.
 
This is so sad. But it does point to the future of safety: cars with cameras, maps, and radar can report broken barriers, potholes, and other infrastructure issues to maintenance crews, improving response times, and can alert other vehicles to the hazard on the roadway.
My only issue with Tesla here is that they are misleading the public by implying the driver wasn't paying attention because his hands weren't frequently detected on the wheel. False alarms are common and inherent to the design of the steering torque sensor; Tesla is well aware it is an unreliable way of sensing hands on the wheel, let alone attention. What about the selfie cam? Was the driver distracted by the tractor trailer swerving next to him into his lane, or by other drivers not knowing which lane to choose at the last second, or was he texting, eating a cheeseburger, etc.?
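
Purely as a sketch of what such crowd-sourced infrastructure reporting could look like (the class, its fields, and the idea of a maintenance feed are all hypothetical; nothing here describes an actual Tesla feature):

from dataclasses import dataclass, asdict
import json, time

@dataclass
class HazardReport:
    """Hypothetical payload a vehicle could upload when its
    sensors flag damaged infrastructure."""
    latitude: float
    longitude: float
    hazard_type: str     # e.g. "crushed_attenuator", "pothole"
    confidence: float    # detector confidence, 0..1
    observed_at: float   # Unix timestamp

report = HazardReport(37.41, -122.08, "crushed_attenuator", 0.9, time.time())
print(json.dumps(asdict(report)))  # body for a hypothetical maintenance feed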
 
Elon needs to get Autopilot at least 1,000 times safer than it is before it could even be considered to be advertised as "beta self driving" (let alone the finished article).
As a lawyer, you probably recall the cases against mandating anti-lock brakes and airbags in all cars. The safety boards stepped up and had laws made to protect car manufacturers that were legitimately improving safety. People laugh now that public opinion once feared anti-lock brakes and airbags as "deadly" features.

I'm too young to recall the seatbelt debate, but I know plenty of older folks who claim that seatbelts should never have been installed in cars. What happens when the car catches fire and you can't escape because the seatbelt malfunctions! Or the accidents where seatbelts cut a person in half! Yes, old people know all the ways seatbelts can kill you, and not wearing one can save your life!
 
The Tesla had redundant steering systems, the organic system being the primary, Autopilot the secondary. It seems a condition was found that both were incapable of handling, resulting in an accident. The severity of the accident was the result of the condition of the safety barrier.
The organic steering system is at the peak of its design with few improvements possible, while the Autopilot steering system is at the beginning of its design with many improvements possible. I'll vote for the system that can be improved.
 
This is so sad. But it does point to the future of safety: cars with cameras, maps, and radar can report broken barriers, potholes, and other infrastructure issues to maintenance crews, improving response times, and can alert other vehicles to the hazard on the roadway.
My only issue with Tesla here is that they are misleading the public by implying the driver wasn't paying attention because his hands weren't frequently detected on the wheel. False alarms are common and inherent to the design of the steering torque sensor; Tesla is well aware it is an unreliable way of sensing hands on the wheel, let alone attention. What about the selfie cam? Was the driver distracted by the tractor trailer swerving next to him into his lane, or by other drivers not knowing which lane to choose at the last second, or was he texting, eating a cheeseburger, etc.?

I'm no expert, but Tesla is saying the guy who complained about Autopilot not working at that spot should have been paying more attention and not using the lowest follow-distance setting.

I have a car with dynamic cruise control, and one thing I always do is use it with the maximum distance to the next car, because I don't trust it even though it's been spot-on in its operation about 99% of the time.
 
I'm no expert, but Tesla is saying the guy who complained about Autopilot not working at that spot should have been paying more attention and not using the lowest follow-distance setting.

I have a car with dynamic cruise control, and one thing I always do is use it with the maximum distance to the next car, because I don't trust it even though it's been spot-on in its operation about 99% of the time.

Absolutely, unless he got a firmware update and decided to test whether AP was now "better" and made a deadly choice. I personally pay more attention on this one specific section of highway, where AP will emergency-brake for no reason. The SC just says the logs show a "noisy environment," but there is nothing special about that area. Also, "minimum distance" on my Tesla is about 3-4 car lengths; minimum is more than enough for a tractor trailer to cut me off, which occurs frequently in my area.
 
Absolutely, unless he got a firmware update and decided to test whether AP was now "better" and made a deadly choice. I personally pay more attention on this one specific section of highway, where AP will emergency-brake for no reason. The SC just says the logs show a "noisy environment," but there is nothing special about that area. Also, "minimum distance" on my Tesla is about 3-4 car lengths; minimum is more than enough for a tractor trailer to cut me off, which occurs frequently in my area.

This is helpful. Keep in mind, one tractor-trailer length is less than half a second of reaction time at highway speed. I've driven with my dynamic cruise control set to its shortest follow distance and was extra attentive in heavy traffic, because of course dynamic cruise control only helps; I'm still responsible if someone slams the brakes and my car hits the one in front.
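
Checking that figure (trailer length and speed are typical values, not specific to this case):

TRAILER_LENGTH_FT = 53                  # typical US semi-trailer
speed_ftps = 75 * 5280 / 3600           # 110 ft/s at an assumed 75 mph
print(TRAILER_LENGTH_FT / speed_ftps)   # ~0.48 s, under half a second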
 
Can someone confirm this was AP2? If it was, NHTSA is finally going to become aware of the bait-and-switch that happened: they were judging that Autopilot increased safety at almost the same moment Tesla was replacing the foundation of the system with its half-baked home-brew sensors. (At the time... nearly baked now.)
 
This MX crash is scaring me a bit.

My question for any experts looking at this: why was the barrier not seen by the radar, which should have caused the system to initiate automatic braking? And is there a chance that a bug in this driver's version of the AP software caused a freeze in logic in this situation, making it fail to steer one way or the other while the driver was not attentive?
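
One piece of general background that may bear on the radar question (offered as a known property of automotive radar processing, not as a confirmed cause of this crash): radar-based cruise systems routinely discount stationary returns, because at highway speed overhead signs, bridges, and roadside objects all produce stationary echoes, and braking for every one would make the system unusable. A simplified sketch of that filtering logic, with invented thresholds:

# Simplified stationary-target filter of the kind used in radar ACC.
# A return whose computed ground speed is ~0 may be dropped as
# roadside clutter. The threshold is illustrative, not Tesla's.
STATIONARY_GROUND_SPEED_MPS = 1.0

def keep_target(ego_speed_mps, rel_speed_mps):
    """Keep a radar return only if the object itself is moving."""
    ground_speed = ego_speed_mps + rel_speed_mps
    return abs(ground_speed) > STATIONARY_GROUND_SPEED_MPS

print(keep_target(31.0, -8.0))    # slower lead car: kept (True)
print(keep_target(31.0, -31.0))   # stationary barrier: dropped (False)

Camera vision is meant to cover the cases this kind of filter drops, which is why a stationary obstacle dead ahead is the classic hard case for these systems.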

Holder of a Day 0 Reservation for the Model 3
Currently driving a Subaru Forester