
Tesla: Autopilot Was Activated During Fatal Model X Crash

Autopilot was activated when a Model X crashed into a concrete barrier near Mountain View, Calif., last week, killing the driver, according to a release from Tesla.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” the company said. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”

Damage to the Model X was severe; in fact, Tesla said it has “never seen this level of damage to a Model X in any other crash.” The company blames the severity of the crash on the absence of a crash attenuator designed to reduce the impact into a concrete lane divider. The crash attenuator was reportedly destroyed in a separate accident 11 days before the Model X crash and had yet to be replaced.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in an earlier statement. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

The U.S. National Transportation Safety Board is investigating the crash.

Here’s Tesla’s update in full:

Since posting our first update, we have been working as quickly as possible to establish the facts of last week’s accident. Our hearts are with the family and friends who have been affected by this tragedy.

The safety of our customers is our top priority, which is why we are working closely with investigators to understand what happened, and what we can do to prevent this from happening in the future. After the logs from the computer inside the vehicle were recovered, we have more information about what may have happened.

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum. The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision. The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

Over a year ago, our first iteration of Autopilot was found by the U.S. government to reduce crash rates by as much as 40%. Internal data confirms that recent updates to Autopilot have improved system reliability.

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.

Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur. It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.

No one knows about the accidents that didn’t happen, only the ones that did. The consequences of the public not using Autopilot, because of an inaccurate belief that it is less safe, would be extremely severe. There are about 1.25 million automotive deaths worldwide. If the current safety level of a Tesla vehicle were to be applied, it would mean about 900,000 lives saved per year. We expect the safety level of autonomous cars to be 10 times safer than non-autonomous cars.

In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred. Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for our customer’s family and friends. We are incredibly sorry for their loss.
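For context, the statement's headline figures follow from straightforward arithmetic on the numbers Tesla cites (this check is ours, not Tesla's, and assumes the quoted rates apply uniformly):

$$\frac{320 \text{ million miles per fatality}}{86 \text{ million miles per fatality}} \approx 3.7, \qquad 1.25 \text{ million deaths/year} \times \left(1 - \frac{86}{320}\right) \approx 910{,}000 \text{ lives/year}$$

Likewise, the five seconds and 150 meters of unobstructed view work out to about 30 m/s, roughly 67 mph, consistent with highway speed.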

Photo: @DeanCSmith/Twitter

 
Agree; in fact, the AP versions in these crashes require, no, demand that the driver pay attention at all times, because the system can misinterpret even minor wear on a lane marking that no human ever would.

Sort of funny comment, since the wear is caused by humans driving over the "do not cross" lines. Seems they get confused even with perfect lines.

I do agree that use of AP requires the driver to pay attention. Says so right in the manual.

Warning: Autosteer is a hands-on feature. You must keep your hands on the steering wheel at all times.
Warning: Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver. When using Autosteer, hold the steering wheel and be mindful of road conditions and surrounding traffic. Do not use Autosteer on city streets, in construction zones, or in areas where bicyclists or pedestrians may be present. Never depend on Autosteer to determine an appropriate driving path. Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.
 
Sort of funny comment, since the wear is caused by humans driving over the "do not cross" lines. Seems they get confused even with perfect lines.

I do agree that use of AP requires the driver to pay attention. Says so right in the manual.
What does it say in the manual about Automatic Emergency Braking functioning or not functioning? AEB should not be running into stationary objects at 70 mph.
All the discussion has been about AP, but this seems the most disconcerting miss.
 
Nope, they don't get confused about where the lanes are and where they should drive. Sun, rain, wind, and age also contribute to the erosion of the paint, maybe more than tires do.

If that were true, the lane markings would be evenly faded. As the posted images show, they are more worn where people cross them at the beginning of the split and solid afterward.
 
What does it say in the manual about Automatic Emergency Braking functioning or not functioning? AEB should not be running into stationary objects at 70 mph.
All the discussion has been about AP, but this seems the most disconcerting miss.

AEB doesn't run into objects; drivers run into objects. AEB is there to help save people from themselves.
Warning: Automatic Emergency Braking is designed to reduce the severity of an impact. It is not designed to avoid a collision.

From the manual (available for download for your own perusal), whose AEB description doesn't even mention stationary objects:
The forward looking camera(s) and the radar sensor are designed to determine the distance from an object (vehicle, motorcycle, bicycle, or pedestrian) traveling in front of Model 3. When a frontal collision is considered unavoidable, Automatic Emergency Braking is designed to apply the brakes to reduce the severity of the impact.

Warning: The limitations previously described do not represent an exhaustive list of situations that may interfere with proper operation of Collision Avoidance Assist features. These features may fail to provide their intended function for many other reasons. It is the driver’s responsibility to avoid collisions by staying alert and paying attention to the area beside Model 3 so you can anticipate the need to take corrective action as early as possible.
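To make the "reduce the severity, not avoid the collision" wording concrete, here is a minimal sketch of that behavior. This is emphatically not Tesla's implementation; the function name, the 6 m/s² braking limit, and the stopping-distance trigger are invented assumptions for illustration only.

```python
# Hypothetical sketch of "brake to reduce severity when a collision is
# unavoidable" -- NOT Tesla's AEB. All names and thresholds are invented.

def aeb_should_brake(distance_m: float, closing_speed_mps: float,
                     max_decel_mps2: float = 6.0) -> bool:
    """Trigger only once a frontal collision is already unavoidable.

    'Unavoidable' here means the stopping distance at maximum braking
    exceeds the remaining gap, so braking can only shed impact speed,
    not prevent the impact -- matching the manual's wording.
    """
    if closing_speed_mps <= 0:
        return False  # not closing on the object ahead
    stopping_distance_m = closing_speed_mps ** 2 / (2 * max_decel_mps2)
    return stopping_distance_m > distance_m

# Example: ~70 mph (31.3 m/s) toward a stationary barrier 60 m away.
# Stopping distance ~ 31.3^2 / 12 ~ 82 m > 60 m, so this sketch would
# brake to shed speed, but the collision itself would still occur.
print(aeb_should_brake(distance_m=60.0, closing_speed_mps=31.3))  # True
```

Note that nothing in this toy logic distinguishes a stationary barrier from a moving vehicle; whether a given system's sensors reliably register stationary objects at highway speed is exactly the question raised above.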
 
If that were true, the lane markings would be evenly faded. As the posted images show, they are more worn where people cross them at the beginning of the split and solid afterward.
Yes, of course, but drivers changing lanes too late know exactly what they are doing; it's a calculated risk, even if it is dangerous. And AP must handle those cases consistently, since drivers will not stop wearing down lane markers.
 
I do agree that use of AP requires the driver to pay attention. Says so right in the manual.

You're, of course, referring to the Tesla online manual that Tesla can and does change whenever the urge or need arises, without any notification to Tesla owners. There's a reason Tesla chose not to have a printed manual delivered with their cars, and it has little to do with saving trees, IMHO.
 
You're, of course, referring to the Tesla online manual that Tesla can and does change whenever the urge or need arises, without any notification to Tesla owners. There's a reason Tesla chose not to have a printed manual delivered with their cars, and it has little to do with saving trees, IMHO.

Nope, I'm referring to the revision-numbered and dated PDF manuals available for each release from multiple sources. You can even find versions that don't have AEB.
Heck, you can even find when they changed the listed functionality of the 3's trunk.
 
And the $64,000 question will be whether manufacturers will be held partially liable for ensuring the driver is engaged while Autopilot or other driver-assist systems are active. It's too soon to tell, but with all the Model 3 sales it's only a matter of time until a third party is injured and the issue is litigated, unless the regulators step in first.

So how is this different from seatbelts, added to reduce injury and death in an accident? People had them in their cars and were told to use them, but many still didn't. Manufacturers then made it so that your car would beep at you until you did. Still, people came up with ways to defeat it because they didn't want to wear them. Then we had laws enacted saying you'd be fined if found without a seatbelt on while operating the vehicle. Some still don't. Despite all that, I don't think manufacturers were held liable for personal injuries. Let's face it: there are people, and always will be, who just don't care. Drivers, however, have been sued for letting their passengers ride in the car without putting their seatbelts on. The driver is still the one in control and held accountable.

Because a lack of seatbelts increases injury to the driver; that is not the same as software deciding to veer off the road and injure a third party, requiring constant attention to override bad decisions. And if people want to compare it to cruise control, you're talking about a short period of time before something bad happens, so people will pay attention. With Autopilot or other driver-assist technology, the duration before something bad happens could be very long, and the longer it is, the more people will trust it when they should not.

My point was that some advisory agency can tell manufacturers to modify their cars to do something the agency thinks will make things safer for drivers, and it doesn't mean drivers use it as intended. Like paying attention, hands on wheel, and ready to take over. Pure and simple as that. I wasn't directly comparing seatbelts to driver-assist software. Look at the bigger picture and comparison here.

And software doesn't "decide", at least not yet, to run you off the road and kill you...maybe once it's sentient. ;) So let's not be melodramatic describing what assisted systems can and can't do right now. The software in today's cars isn't perfected to recognize everything and yes, it may interpret things wrong...which is why you must be mindful while driving, be attentive, and stay in control. If you don't think you can do that, like you have to do now in your non-driver-assisted car, by all means stay off the road. Guess you could always take an Uber.

Cruise control can still run you off the road at speed. Heck, a pilot in a plane can be passed out or dead and his plane will fly until it runs out of fuel or something else causes it to crash. Some day people won't know how to drive the cars of today even if they had to, due to dependence on automation. Just like kids today probably don't know how to use an abacus or a slide rule.

And you know FSD software will be programmed to make decisions in accident situations, just like you the driver do now. It will be a judgment call. Someone ultimately could still be injured. Machines will take in info and make decisions faster than the human mind, with the thought that they will make the better and faster decision to save your life. There won't always be a fool-proof win-win situation. People around the world are working to improve machine learning; it's coming, just like HDTV was going to replace broadcast whether you liked it or not or had a use for it.
 
My point was that some advisory agency can tell manufacturers to modify their cars to do something the agency thinks will make things safer for drivers, and it doesn't mean drivers use it as intended. Like paying attention, hands on wheel, and ready to take over. Pure and simple as that. I wasn't directly comparing seatbelts to driver-assist software. Look at the bigger picture and comparison here.

And software doesn't "decide", at least not yet, to run you off the road and kill you...maybe once it's sentient. ;) So let's not be melodramatic describing what assisted systems can and can't do right now. The software in today's cars isn't perfected to recognize everything and yes, it may interpret things wrong...which is why you must be mindful while driving, be attentive, and stay in control. If you don't think you can do that, like you have to do now in your non-driver-assisted car, by all means stay off the road. Guess you could always take an Uber.

Cruise control can still run you off the road at speed. Heck, a pilot in a plane can be passed out or dead and his plane will fly until it runs out of fuel or something else causes it to crash. Some day people won't know how to drive the cars of today even if they had to, due to dependence on automation. Just like kids today probably don't know how to use an abacus or a slide rule.

And you know FSD software will be programmed to make decisions in accident situations, just like you the driver do now. It will be a judgment call. Someone ultimately could still be injured. Machines will take in info and make decisions faster than the human mind, with the thought that they will make the better and faster decision to save your life. There won't always be a fool-proof win-win situation. People around the world are working to improve machine learning; it's coming, just like HDTV was going to replace broadcast whether you liked it or not or had a use for it.

Yes, the software did decide to run off the road: it went through a decision tree to choose the appropriate action for the current road conditions. The algorithm did not give up and transfer control back to the driver, nor continue straight; it decided that veering left was the correct decision on that road.
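To make "went through a decision tree" concrete, here is a toy sketch of a lane-following action selector. It is a hypothetical illustration, not Autopilot's actual logic; the confidence inputs, threshold, and action names are all invented.

```python
# Toy illustration: a lane-keeping controller actively *selects* an
# action every cycle rather than abstaining. Hypothetical sketch only;
# this is not Tesla's Autopilot logic.

from dataclasses import dataclass

@dataclass
class LaneEstimate:
    left_confidence: float   # detection strength of the left line (0-1)
    right_confidence: float  # detection strength of the right line (0-1)

def choose_steering(lane: LaneEstimate, handoff_threshold: float = 0.2) -> str:
    """Pick an action each control cycle; the system never simply 'stops deciding'."""
    if max(lane.left_confidence, lane.right_confidence) < handoff_threshold:
        return "alert_driver_and_hand_off"  # only when both lines are lost
    if lane.left_confidence > lane.right_confidence:
        return "track_left_line"            # e.g., follow a worn split's left edge
    return "track_right_line"

# A faded right-hand line at a lane split could make "track_left_line"
# the selected action -- a decision, even if it is the wrong one.
print(choose_steering(LaneEstimate(left_confidence=0.8, right_confidence=0.3)))
```

The point of the sketch is the shape of the logic: every branch returns an action, so "no action taken" by the driver still leaves the software committed to whichever branch its inputs selected.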

And the big picture was really about this:
Will the manufacturers be liable at all for a third-party injury or fatality if they say “read the manual and pay attention at all times,” or does that simple warning shield them from all liability lawsuits? I suspect it will not.

If there are other safety measures, such as hands-on-wheel detection, watch-the-road detection, shaky-seat warnings, etc., what level is needed to be in the clear and free the manufacturer from all liability? Those are probably the regulatory boards' current big questions.

And I am actually a huge fan of autopilot but I also know legislation or regulation can kill things pretty quick and sometimes based on feelings over facts.
 
Will the manufacturers be liable at all for a third-party injury or fatality if they say “read the manual and pay attention at all times,” or does that simple warning shield them from all liability lawsuits? I suspect it will not.

The warning won't shield them from lawsuits -- Tesla would need legislation to do that -- like the kind the gun manufacturers got.

Whether it provides an absolute or partial defence at trial remains to be seen.
 
The warning won't shield them from lawsuits -- Tesla would need legislation to do that -- like the kind the gun manufacturers got.

Whether it provides an absolute or partial defence at trial remains to be seen.

**Caution: your Takata airbag may not open properly and could cause injury. It is the responsibility of the driver to pay attention at all times and avoid high-speed collisions. ;) ;)
 
The warning won't shield them from lawsuits -- Tesla would need legislation to do that -- like the kind the gun manufacturers got.

Whether it provides an absolute or partial defence at trial remains to be seen.

I would not think so either.
Who is liable and who is the insured party for all accidents?

Here is a hint: my insurance card doesn't show Tesla as the insured party, and Tesla is not the one paying my premiums.

Case closed.
 
Who is liable and who is the insured party for all accidents?

Here is a hint: my insurance card doesn't show Tesla as the insured party, and Tesla is not the one paying my premiums.

Case closed.
To be clear, I'm not taking a side on who is liable (but probably will at some point :)).

I'm absolutely positive that Tesla has insurance and pays premiums, and that such insurance will be expanded in the future. So your insurance card not showing Tesla as the insured party doesn't support 'no liability'; it only shows that you are insured. If I drove your car and got in an accident, my insurance company would likely step in.

And again, not being drawn into the 'who is liable' discussion...
 
That's a momentary lapse of attention to me. It's a "very brief period of time", which is the definition of "moment". I've seen so many good people make much worse mistakes. Your heartlessness toward the deceased and their loved ones is, well, better left unsaid, as my mother taught me.

As to all your Chinese law ramblings: the car was on AP and we have the video. We also have other fatalities investigated in the US, and I've read the resulting reports; have you? The recommendations are that Tesla's warnings and nags are not sufficient, contrary to what you claim, resulting in action taken by Tesla in firmware updates.

Anyway, I've made my point. I hope some people took it to heart, and those who have a heart I am certain will.

Nice passive aggression! If people believe you, they have a heart; if they believe there are adequate warnings about paying attention while driving, they don't. You have no idea what my thoughts are about the deceased and their loved ones, because that is irrelevant to this discussion, and it is shocking to hear someone who claims to be a defense lawyer citing emotions as the basis for a legal issue.

We don't actually disagree much. I agree that Autopilot was a bad choice of name because of what it implies: you don't have to do anything, Autopilot is in charge (not what the warnings say, but what the name implies). That said, in this particular case, the driver was in no way acting reasonably, responsibly, or safely when he failed to react in any way to an obstruction in the lane that was visible for at least 5 seconds.

"Chinese law ramblings" - that's how you tried to cover the fact that you got way out on a limb about how the legal process works in this case, which happens to be in China and subject to Chinese law. Why not admit you were attempting to apply legal standards that are inapplicable in this particular case so we can discuss the real issues - is auto pilot safe enough yet? No, it isn't, unless the driver remains fully engaged and responsible at all times for the operation of the vehicle. I happen to think that is every driver's responsibility in every car, including one with autopilot.
 
We don't actually disagree much.

We disagree on the fundamental issue. You say the warnings are sufficient. I say they are not. Everything else is just ramblings.

I do find it odd that you say the name is a "bad choice" but in the same sentence say the warnings are fine. That argument would not go over well in court here, in the US, or in Hong Kong, all of which follow the same common law and similar rules of procedure, which I know well and don't need any limbs for assistance.

What I do not understand is your logic, and, sorry, but you do seem somewhat heartless to me -- but that's just my opinion. I think needless deaths and injuries will result from Tesla calling it AP, putting up videos of the vehicles self-driving, and not being reasonably prudent in warning people of the serious risk that a momentary lapse of attention could end your life -- starting with a change of the name. But please don't reply that you care for the victims. I'm sure people who don't want warnings on cigarette packages care for the victims of lung disease too, and the NRA cares about dead children from gun violence. You miss my point when you reply that you care or when you say that part of my argument is "irrelevant" and "shocking" to hear from a lawyer. In fact, that argument is at the core of every legal action involving AP. Its relevance is not even at issue in the cases. Any seasoned lawyer will tell you that sympathy is often more powerful in a courtroom than the facts or the law. In fact, it's probably the "sympathy factor" that motivates Tesla's insurer to settle these cases.
 
We disagree on the fundamental issue. You say the warnings are sufficient. I say they are not. Everything else is just ramblings.

I do find it odd that you say the name is a "bad choice" but in the same sentence say the warnings are fine. That argument would not go over well in court here, in the US, or in Hong Kong, all of which follow the same common law and similar rules of procedure, which I know well and don't need any limbs for assistance.

What I do not understand is your logic, and, sorry, but you do seem somewhat heartless to me -- but that's just my opinion. I think needless deaths and injuries will result from Tesla calling it AP, putting up videos of the vehicles self-driving, and not being reasonably prudent in warning people of the serious risk that a momentary lapse of attention could end your life -- starting with a change of the name. But please don't reply that you care for the victims. I'm sure people who don't want warnings on cigarette packages care for the victims of lung disease too, and the NRA cares about dead children from gun violence. You miss my point when you reply that you care or when you say that part of my argument is "irrelevant" and "shocking" to hear from a lawyer. In fact, that argument is at the core of every legal action involving AP. Its relevance is not even at issue in the cases. Any seasoned lawyer will tell you that sympathy is often more powerful in a courtroom than the facts or the law.

Your "ability" to read and understand people is amazing suspect for a lawyer. You know nothing about me or my beliefs but you continue to spout remarkably off target projections about what I believe.

I didn't miss anything in your post. You claim to be a defense lawyer - about which I have doubts - but defense lawyers don't argue sympathy; they argue law and facts. You next argue that people who don't like cigarette and gun warnings care - projecting that you have a clue what my position is on those very different topics - you don't. I guess you think I won't recognize straw-man arguments? Wrong. I hope your personal interactions involve actually learning something about what people believe before deciding that you know what they believe and judging them based on your inaccurate projections. I really do.

Warnings and the name are not the same thing. The name implies inaction on the driver's part to me, but there is hardly a universally accepted definition of what that means in the context of vehicle operation. That lack of clarity is why I don't like the name.

I find it telling that you refuse to accept the premise that the driver of every vehicle is primarily responsible for the safe operation of the vehicle, and that this driver failed miserably in this case, with tragic results. Even if a car has an operating system called "Jesus took the wheel, you can sit back and play with your phone," every driver is by law primarily responsible for the safe operation of the car.
 
I didn't miss anything in your post. You claim to be a defense lawyer - about which I have doubts - but defense lawyers don't argue sympathy; they argue law and facts.

Plaintiffs' lawyers argue sympathy any way they can -- and it's the elephant in the room. Crying family/friends on the stand vs. a big, bad, deep-pocketed insurance company. Any insurance defence lawyer will tell you sympathy is a huge risk factor, and it often takes the Court of Appeal, where sympathy is lessened since it's based on transcripts and not in-person witnesses, to overrule those decisions, which is costly. That's why I said:

In fact, it's probably the "sympathy factor" that motivates Tesla's insurer to settle these cases.

I find it telling that you refuse to accept the premise that the driver of every vehicle is primarily responsible for the safe operation of the vehicle

Now we're just going in circles, but once again, here's what I said on that issue:

I placed no blame. I made comments. If you read my posts in other fatality threads here, I said that, at law, the driver will likely be found to be at fault, but over 90% of these cases settle. Tesla has insurance, legal proceedings are costly (to say the least) and uncertain, and Tesla doesn't want a precedent set, so they settle with no admission of liability and an NDA. To date, no case has gone to trial on this issue, and there are no surprises there.

you claim to be a defense lawyer - about which I have doubts

Okay, we have a bet. How about $100? One of the mods here came to visit me at my cabin. He can be the judge and tell you what I do for a living:

Ohmman's Airstream Adventures
 
Plaintiffs' lawyers argue sympathy any way they can -- and it's the elephant in the room. Crying family/friends on the stand vs. a big, bad, deep-pocketed insurance company. Any insurance defence lawyer will tell you sympathy is a huge risk factor, and it often takes the Court of Appeal, where sympathy is lessened since it's based on transcripts and not in-person witnesses, to overrule those decisions, which is costly. That's why I said:

Now we're just going in circles, but once again, here's what I said on that issue:

Okay, we have a bet. How about $100? One of the mods here came to visit me at my cabin. He can be the judge and tell you what I do for a living:

Ohmman's Airstream Adventures

Stunning! Lovely! Awesome! Color me jelly. How about we make it you tell them what I do for a living? You've already said what you do.