Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier near Mountain View, Calif., last week, killing the driver, according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator, a highway safety barrier designed to reduce the impact of hitting a concrete lane divider. The attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
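For readers checking the math, the 3.7x and “about 900,000” figures follow directly from the mileage numbers in the statement; here is a quick back-of-the-envelope check (illustrative only, and it assumes the per-mile fatality rates are directly comparable across fleets and countries):

Code:
# Figures quoted in Tesla's statement above
us_miles_per_fatality = 86e6        # all vehicles, all manufacturers
tesla_miles_per_fatality = 320e6    # vehicles equipped with Autopilot hardware
worldwide_deaths_per_year = 1.25e6

# Relative likelihood of a fatal accident per mile driven
print(f"{tesla_miles_per_fatality / us_miles_per_fatality:.1f}x")   # ~3.7x

# Lives saved per year if every mile were driven at the Tesla rate
saved = worldwide_deaths_per_year * (1 - us_miles_per_fatality / tesla_miles_per_fatality)
print(f"about {saved:,.0f} lives per year")                          # ~914,000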

Photo: @DeanCSmith/Twitter

 
The reality here is that the Tesla took an action that caused a fatality. If the driver's attention is required to override that action, then GM's Supercruise seems like a better supervisor, since it monitors whether your eyes are on the road; who cares if your hands are on the steering wheel?
 
Sometimes the lane markings are wrong (the construction-zone crash, and 101).
Sometimes the car in front is wrong (the fire truck crash, where the lead car swerved at the last second, sort of).
No matter which one you trust, you will be wrong sometimes, so weighting one more heavily than the other seems unreliable.
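To make that concrete, here is a toy sketch (hypothetical numbers and function name, nothing like any shipping system) of what weighting one cue over the other actually buys you: whichever source gets the higher weight, you inherit its failure mode.

Code:
def blended_lane_center(marking_estimate_m, lead_car_estimate_m, w_markings=0.7):
    """Blend two lateral-position estimates (meters of error from true lane center).

    w_markings is the trust placed in painted lines; the remainder goes to
    the car ahead. Purely illustrative -- not how any real system works.
    """
    return w_markings * marking_estimate_m + (1 - w_markings) * lead_car_estimate_m

# Faded or repainted markings point 1.5 m off (the construction / 101 case)
# while the lead car is tracking the real lane:
print(blended_lane_center(1.5, 0.0))   # 1.05 m of error from trusting the paint

# Lead car swerves at the last second (the fire-truck case)
# while the markings are fine:
print(blended_lane_center(0.0, 2.0))   # 0.6 m of error from trusting the leader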

I think the only system out there that for sure wouldn't have driven into the barrier is GM Supercruise. It only operates in situations where it knows exactly where it is. With enough time and cars, though, Supercruise will still kill people.

But Autopilot is a very different system, one that is much more versatile, and one I personally prefer; it's one of the reasons I bought a Tesla. Still, I think Tesla should have started with the goal of making the system as safe as possible. To do that, I would have liked to see them include as much technology as possible so it could handle as many situations as possible: lidar, or at a minimum stereoscopic vision, to better identify stationary objects; FLIR to help with pedestrian/animal detection; and rear and side radar to identify more threats than just what is in front.

Would this add cost? Yes, but Teslas are already more expensive than the vehicles that have those technologies today.
 
I get where you are coming from, but it's not like you can just drop in two cameras and have instant object detection. Tesla had a road map starting with MobilEye that didn't work out. At that point, the cars only had vision and radar, so that's what they had to work with. Long term, the non-stereo vision will get there, and it will happen sooner without the side development for other sensor types that would end up being replaced. (Pure opinion, of course.)
 
The thing I like most about Supercruise is the “eyes on the road” monitoring, since roads can change overnight due to construction, weather, etc. Until we are at full autonomy, keep your eyes on the road or disengage self-driving!
 
There was a review where the tester still managed to reply to all of his emails while using Supercruise: basically four seconds of looking away at a time, which is more than enough time to kill yourself. Any system can be defeated by a sufficiently determined human.
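For perspective on those four seconds, a rough distance check (plain kinematics, nothing Supercruise-specific):

Code:
# Distance covered during a 4-second glance at email at highway speed
speed_mph = 70
speed_mps = speed_mph * 1609.34 / 3600      # ~31.3 m/s
print(f"{speed_mps * 4:.0f} m traveled while looking away")   # ~125 m

That is in the same ballpark as the roughly 150 meters of unobstructed view Tesla says the driver had before the barrier.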
 
I'm not a fan of the eye tracking and suspect keeping a hand on the wheel is better.

More interesting is whether the Caddy's Supercruise could have used its lidar mapping to override the crappy Caltrans lane markings.

 
Many of these technologies have been in cars for as long as Tesla has been around. My 2013 Audi had FLIR, and I think the Subaru system has been around nearly as long. Don't forget Tesla is also the company that put 8 cameras in a car and was perfectly OK with only using 1, so they could have installed the hardware now and activated it over time.

I just get the impression Tesla is trying to see how much they can accomplish with as little technology as possible. Don't get me wrong, they have accomplished a lot with just one forward camera, but I think more technology would have helped mitigate some dangerous/lethal situations. Notice I said *some* and not *all*.
 
Why do you like hand-on-wheel better than eye tracking? I don't see how you can infer any level of attention from a hand on the wheel, but it seems you could if you monitored whether the driver's eyes are on the road.
 
True, any system can be defeated by a determined driver, but that is the same as distracted driving now; I can't disengage from paying attention for many minutes at a time, or more.

Plus, I think there needs to be a clean way to assign liability, and if the defense is “you must pay attention,” then whatever is the best way to determine that would seem appropriate.

As for the email example you mentioned: you would also know that the driver was not paying attention for those 4 seconds in which he could have hit someone else. Hand-on-wheel does not get you there.

And the irony of your car spying on you is noted.
 
I think before Tesla starts tracking faces and eyes, they should add more sensors (e.g., lidar) and create better redundancy for the safety of drivers. Clearly, the cameras and radar are not sufficient. What's the point of having 8 cameras when the car drives itself into a big red fire truck or a center divider on a clear day? I posted the video from the Mobileye CES presentation that someone quoted above in the thread, and I was 100% right about it.
 
The 8 cameras cover different directions, so 5 of them have nothing to do with frontal collisions. The software for object detection via the front camera set is still being developed. As we have seen, just having a lidar sensor is no guarantee of anything.
The level of sensor/software integration is the reason the driver is still fully responsible. Once the car advances to reliable recognition of not just road hazards but pedestrians, traffic signals, and cross traffic, then it will move toward the driver-as-passenger end of the spectrum.
 
Perhaps a very loud audible/visual warning could be implemented for objects EAP whitelists in its path that are stationary and large enough to be a threat. The faster you are going, the earlier the alarm. It wouldn't falsely slam on the brakes, but it would let the driver decide.
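A rough sketch of how that could work (entirely hypothetical thresholds and function name; nothing here reflects how EAP actually filters radar returns): warn when the time-to-collision with a large, stationary object in the path drops below a fixed number of seconds, so higher speed automatically means an earlier alarm.

Code:
def should_warn(distance_m, speed_mps, is_stationary, width_m,
                ttc_threshold_s=6.0, min_width_m=0.5):
    """Fire a loud alert instead of braking; the driver decides.

    Hypothetical logic: warn when time-to-collision with a stationary,
    sufficiently large object in the lane falls below ttc_threshold_s.
    """
    if not is_stationary or width_m < min_width_m or speed_mps <= 0:
        return False
    return distance_m / speed_mps < ttc_threshold_s

# At 31 m/s (~70 mph) a 6-second threshold warns ~186 m out;
# at 20 m/s (~45 mph) it warns ~120 m out -- earlier the faster you go.
print(should_warn(distance_m=150, speed_mps=31, is_stationary=True, width_m=1.0))  # True
print(should_warn(distance_m=150, speed_mps=20, is_stationary=True, width_m=1.0))  # False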
 
I would disagree with adding more sensors before driver monitoring: unless extra sensors completely eliminate the need to pay attention (full autonomy), you just make the problem worse by lulling more people into a false sense of safety, and your vehicle will still take an action that can kill you or others. I feel the problem is that these systems work so well that reasonable people think everything is fine when it really is not. For what it's worth, I am for adding more sensors, but I feel driver awareness is the higher priority.

And to all of the “driver needs to pay attention” people: that isn't happening enough now, and ask yourselves how vigilant people would be if Autopilot was fine for hours, days, or months before it swerved into something. The transition to full autonomy will be a tough one, because as the system gets better but still not good enough for full autonomy, the percentage of people trusting it as if it were fully autonomous will increase.

Also, before we get to a fully autonomous system, it would need a failure rate on the order of the mechanical failure rates that cause accidents now, which is very small. You have to set aside the accidents caused by alcohol/drug impairment, excessive speeding, etc., since safe drivers don't do those things. I think mechanical failure accounts for only about 10% of fatalities, and after that you have to set aside bad tires or brakes, since those are maintenance items the owner is responsible for. What is left is more like the Prius sudden-acceleration problem, and those failures are very, very rare. I don't think people will accept a product or feature where they have to weigh how likely it is to kill them or a family member; it has to be so unlikely that we don't even think about it.

As I see it, litigation/legislation will either kill this technology in the cradle or force the very safe outcome. Hopefully the latter.
 
So are you driving a car with Supercruise now? Since you sound like you know so much about its capabilities, how does its eye tracking handle drivers wearing dark tinted or mirrored sunglasses? We live in a hot, sunny area and wear dark polarizing sunglasses during daylight hours even in winter (as do most people here who care about their eyesight and about cataracts forming).
 
I'm not sure how but it still tracks eyes with sunglasses on and at night.
 
Hmm. OK, I'm clueless on this tech, but I guess that makes me wonder what kind of beams, I assume, are being sent at my eyes.

BTW, I understand from really limited reading that Supercruise maps out major, divided-only highways using their mapping cars, so it can't be used on a lot of local highways that people drive every day. From what I read somewhere, they will only be updating their mapping info quarterly, so how does it identify and navigate around construction areas that just appeared? From some of the things posted above it sounded like it would always be able to distinguish those areas, but that's not the impression I get if the maps are only updated quarterly. A lot of accidents can happen within a fiscal quarter.

Aren't our Teslas constantly gathering info on the roads driven and sending it "home" to update the map data? I'm sure there are more Teslas on the road than cars with Supercruise; in fact, isn't it only on one model of Cadillac right now?
 
One more comment on Supercruise (BTW, I'd never buy a Cadillac). I read that not all highway areas, even if mapped by their system, give back an "OK, you're on Supercruise" indication; it can be intermittent and sometimes not available for very long. It sounds like the driver-assist system could be switching on and off frequently, so the driver would still constantly need to have their hands on the wheel and be ready to brake. Not sure how that's really different from Tesla's Autopilot, which requires you to stay aware and be ready to take control.
 
Doesn't the latest Tesla AP have that kind of eye monitoring, or is that just in the Model 3 and not the S/X?
 
I don't have either one; I'm just a curious observer. I was only commenting that the best driver-awareness technology, no matter who it's from, would seem to be preferred if you are going to insist the driver stay aware and hold them liable. If that technology exists and isn't used, I don't think “we said pay attention” is going to go far in civil court, especially since the better the technology gets, the more it will lull people into ignoring the pay-attention warning.