Welcome to Tesla Motors Club

NTSB Releases Preliminary Report on Autopilot Crash

The U.S. National Transportation Safety Board issued a preliminary report on a fatal March 23 crash involving a Tesla Model X using Autopilot near Mountain View, Calif.

Investigators leveraged data pulled from the car’s computer that shows the driver’s hands were on the steering wheel for just 34 seconds during the minute before impact.

Data also showed that the Model X sped up to 71 miles per hour just before hitting a highway barrier. Tesla issued a release in March that included most of the information in the report. Tesla said “the driver had received several visual and one audible hands-on warning earlier in the drive” and that the driver “had about five seconds and 150 meters of unobstructed view of the concrete… but the vehicle logs show that no action was taken.”

The NTSB report said the crash remains under investigation, with the intent of issuing safety recommendations to prevent similar crashes. No pre-crash braking or evasive steering movement was detected, according to the report.

“Tesla Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur,” Tesla wrote in its March post. “It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

Read the full report here.

 
I agree with the theory that the car in front stayed in the left lane, effectively moving right relative to the Tesla. The Tesla then decided that the lane had moved further left, when actually it was following the gore markings. I suspect this is similar to its tendency to drift off onto exits from the right lane. As others have said, the acceleration was just to get back up to the set speed.

My total guess as to why the car didn't slow down is that the radar couldn't distinguish the barrier from the ground because it was too low. The radar's horizontal resolution should be more than enough, but its vertical resolution isn't great. The vision system was likely never trained on a target like that, so it didn't recognize it as anything in particular.

The driver not noticing all this going on is really unfortunate. I don't really see any explanation other than complete lack of attention.

IMHO Tesla should take a look at the markings there to see what the vision system thinks is going on and also try out the combined radar/vision system on targets like the damaged crash barrier.
 
I think the statement in the report that the driver's hands were not detected is misleading. My hands are always on the steering wheel while Autopilot is engaged, but the system will not detect them unless I apply torque to the wheel.

One more note: I don't understand why Autopilot didn't brake or slow down when it detected an object in front of it, in this case a solid concrete divider.
 
I think the statement in the report that the driver's hands were not detected is misleading. My hands are always on the steering wheel while Autopilot is engaged, but the system will not detect them unless I apply torque to the wheel.

One more note: I don't understand why Autopilot didn't brake or slow down when it detected an object in front of it, in this case a solid concrete divider.

"Not detected" is all they can say based on the sensor data (no torque detected). They are not saying the hands were not on the wheel.

The Doppler radar senses the speed difference between the car and the world. All stationary things (including the road surface) look similar.
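That point can be sketched in a few lines. To a Doppler radar, every stationary return, pavement or barrier, closes at roughly the ego car's own speed, so a clutter filter that discards "stationary-speed" returns discards a concrete barrier along with the road surface. A toy illustration, with all numbers and names invented:

```python
# Toy illustration: why a Doppler clutter filter cannot tell a concrete
# barrier from the road surface. All numbers are invented for the example.

EGO_SPEED = 31.0  # ego vehicle speed in m/s (roughly 70 mph)

# Each radar return: (label, closing speed relative to the ego car in m/s)
returns = [
    ("lead car ahead", 3.0),         # moving target: small closing speed
    ("road surface", EGO_SPEED),     # stationary: closes at ego speed
    ("concrete barrier", EGO_SPEED), # stationary: looks just like road clutter
]

def moving_targets(radar_returns, ego_speed, tol=1.0):
    """Keep only returns whose closing speed differs from the ego speed,
    i.e. objects that are themselves moving. Stationary returns are dropped."""
    return [label for label, v in radar_returns if abs(v - ego_speed) > tol]

print(moving_targets(returns, EGO_SPEED))  # ['lead car ahead']
```

The barrier and the pavement land in the same "stationary" bucket, which is why rejecting one tends to reject the other.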
 
I think the statement in the report that the driver's hands were not detected is misleading. My hands are always on the steering wheel while Autopilot is engaged, but the system will not detect them unless I apply torque to the wheel.

What they can clearly say is that for 3 seconds while the car was accelerating back to the target speed that had been set, with no other car in front of it, there was no attempt to either brake or turn the wheel...

The fact that the car didn't detect the object is no surprise, the manual makes it perfectly clear that at that speed and in those circumstances it isn't going to notice a stationary object that remains stationary and in that respect it is no different from any of the other similar systems on other cars...
 
Not necessarily.
"Data also showed that the Model X sped up to 71 miles per hour just before hitting a highway barrier"

I used to own a Honda Accord with TACC and you could set that at a specific MPH. If I had set mine at 71 and a driver was in front of me going 62, my car would be following at 62 plus whatever follow distance I had set (very similar to Tesla). Until the car in front swerves/changes lanes/otherwise disappears from my car's view and then my Honda would have sped up to 71 and smashed my inattentive face into the same guard rail as I looked at my twitter feed happily.
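That resume behavior is simple to sketch. Below is a minimal, hypothetical target-speed selector, not any manufacturer's actual logic:

```python
def tacc_target_speed(set_speed, lead_car_speed=None):
    """Pick the cruise target: match a slower lead car while one is tracked,
    otherwise resume the driver's set speed. Hypothetical sketch only."""
    if lead_car_speed is not None:
        return min(set_speed, lead_car_speed)
    return set_speed

# Following a 62 mph car with the set speed at 71 mph:
print(tacc_target_speed(71, 62))    # 62
# The lead car changes lanes and drops out of the track:
print(tacc_target_speed(71, None))  # 71: the car accelerates back to 71
```

Nothing in this logic asks whether the road ahead is actually clear; that is exactly the gap the posts here are arguing about.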

The common thread in a lot of these is, if you want a really cool fast fun safe car that aids you in relieving a lot of the tedium of driving, buy a Tesla. If you want to bury your nose in a phone/tablet or sleep, call an Uber.
Are you suggesting the guy who died in California had his nose buried in a phone/tablet? Come on, really? He probably had his hands on the wheel (last detected 6 seconds prior to the collision, but there are plenty of people on TMC who will tell you how they can drive with both hands on the wheel and still get nags, which means 30+ seconds not detected). I'm thinking the guy was using AP, as you say, to relieve a lot of the tedium of driving; sadly, 3 seconds of not watching it like a hawk (which is more "tedium" than driving yourself) cost him his life. And yes, I know Tesla is blaming the barrier dampers. Conclusion: never ever use AP anywhere there are obstacles along the way which are not dampened enough to save your life when you hit them at 70 mph. You also have to love how Tesla's statement told everyone how they nagged the driver, while neglecting to mention what came out of this report: the last nag was 15 minutes prior to the accident.
 
I'm thinking the guy was using AP, as you say, to relieve a lot of the tedium of driving; sadly, 5 seconds of not watching it like a hawk (which is more "tedium" than driving yourself) cost him his life.
I assume by 'it' you mean AP, as if it suddenly pointed the car in a new direction. What the driver should have been watching is 'it', the road. Normal following distance is 2 seconds or so. He had nearly 3 times that to drive the car.

Regardless, tedium is no excuse for unsafe driving. 5 seconds of not watching the road could have easily cost others their lives also.

And yes, I know Tesla is blaming the barrier dampers. Conclusion: never ever use AP anywhere there are obstacles along the way which are not dampened enough to save your life when you hit them at 70 mph.

Bologna. Tesla stated that the previously collapsed barrier made the crash worse.
The reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

You also have to love how Tesla's statement told everyone how they nagged the driver, while neglecting to mention what came out of this report: the last nag was 15 minutes prior to the accident.

They did not say they nagged the driver. They said there were warnings earlier in the drive.
The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.
 
Saying AP is less safe to use than not use seems to me to be a straw man argument... Very similar to an argument I recently had about AWD/4WD being less safe than RWD... the argument was, look at all the AWD cars that crashed... the stats say more cars crash with AWD than with RWD... uh, ok... so is it the AWD that is less safe, or the driver not knowing the limits? Exact same argument here... (and for the record, I said AWD is safer than just RWD, or FWD for that matter)


Honestly, I bet the same arguments were made about "cruise control" when it first came out too... you know, the old system that tried to maintain a set speed no matter what... If the driver didn't override it, it would take a 35 mph turn at 70 mph if that was the "set" speed...
Ok, here are some articles you may find interesting; they go way beyond a "straw man" in concluding both that Level 2 cannot be done safely and that the incremental Level 2 -> 3 -> 4 -> 5 path is a fallacy. Level 5 is a whole different problem (think: extending the range on an EV doesn't get you any closer to reaching the moon):

Robot Cars Can’t Count on Us in an Emergency

There’s growing evidence Tesla’s Autopilot handles lane dividers poorly

People who paid Tesla $3,000 for full self-driving might be out of luck
 
Regardless, tedium is no excuse for unsafe driving. 5 seconds of not watching the road could have easily cost others their lives also.
Agreed, the problem is AP actually causes conditions where most people are very prone to not pay full attention. Ironically, the better it gets, the more likely people will pay less attention. Concluded by Google, Toyota and other independent researchers. Check out the articles I linked above.
 
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being. In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.

Isn't that the same as me saying: Look at these people who were buckled in and killed, whereas if they were not buckled in they would have been thrown clear and survived. In fact, I can show you accidents where a seat belted person was crushed and the person next to him, unbelted, was thrown from the vehicle and survived.

Does that make wearing a seat-belt less safe?

Saying AP is less safe to use than not use seems to me to be a straw man argument...

Agreed. For example, what if a person is tired and would have drifted over the center line and collided head on with a passenger van, killing 5 people. Instead, he made it home safe on AP. We just can't account for those unknowns, but I still can imagine them and see why AP is safer, while still causing some deaths. We base our rules on bringing deaths down, not eliminating them, and we sacrifice some to save more (hence my seat belt example).

Agreed, the problem is AP actually causes conditions where most people are very prone to not pay full attention. Ironically, the better it gets, the more likely people will pay less attention. Concluded by Google, Toyota and other independent researchers. Check out the articles I linked above.

This is exactly my concern; I see it happening to me and have to tell myself to pay full attention. It's a flawed system, but I've seen the videos in addition to just agreeing to the beta button:


It disturbs me that Tesla even calls it Autopilot. I know it's technically correct, but only pilots really know what that means. To us lay people, it means what it says, and that only adds to the already false sense of security.
 
Agreed, the problem is AP actually causes conditions where most people are very prone to not pay full attention. Ironically, the better it gets, the more likely people will pay less attention. Concluded by Google, Toyota and other independent researchers. Check out the articles I linked above.

I agree the psychological factor is quite interesting. However, I don't agree with the idea that the majority of people will pay less attention. If the driver is responsible, there is little to be distracted by.

We already have a problem with people not paying attention without AP. I've been rear-ended twice (once at a light), and many people have been injured by distracted drivers not braking on the expressway. I've even seen bumper bumps in stop-and-go traffic. Even good drivers can be distracted by exterior objects or interior children. Is it not better to have a system that tries to keep you in the lane and away from the next bumper?

I think a lot of the criticism about AP comes from what people want/hope/expect it to be. They want it to be perfect, they expect it to handle everything, and when it doesn't their hopes are dashed. It should have ___, it should be able to ___. Yes, we want it to do that, and hopefully someday it will, for all our sakes. (And isn't that what we say of humans who hit us with their cars?) But right now, the truth is it is an upgraded cruise control lane/vehicle follower system. And that type of system cannot handle all the situations out there. It will try to keep you a safe distance from the next car, it will try to keep you in your lane, but it won't always, so keep guiding with your hand and keep that right foot ready, because at any time you may need to assert that you are the one driving the car that happens to mostly go in the right direction.
 
If the driver is responsible, there is little to be distracted by.

I don't know about that. When the car is "autopiloting" me around, I can sure find a lot to be distracted by (and have to consciously tell myself not to be, which is also the reason for the "nags"; I must not be alone in the distraction department if nags are required), whereas with no AP I am fixed to the road, knowing the car will go off it if I'm not. That's just not the case with an "automatic pilot" a double stalk-click away. And regardless of the fact that I am responsible, there is still much to be distracted by when AP takes over.

Again, I still think it saves more lives than it takes, but I think Tesla really needs to change the name and tell people exactly what you say in your post about its limitations. Instead, we get videos from Tesla showing FSD and saying the driver is only required by law, which sure fooled me at first and which I now find suspect. That's certainly not helpful in the AP education department.
 
I don't know about that. When the car is "autopiloting" me around, I can sure find a lot to be distracted by (and have to consciously tell myself not to be, which is also the reason for the "nags"; I must not be alone in the distraction department if nags are required), whereas with no AP I am fixed to the road, knowing the car will go off it if I'm not. That's just not the case with an "automatic pilot" a double stalk-click away. And regardless of the fact that I am responsible, there is still much to be distracted by when AP takes over.

Again, I still think it saves more lives than it takes, but I think Tesla really needs to change the name and tell people exactly what you say in your post about its limitations. Instead, we get videos from Tesla showing FSD and saying the driver is only required by law, which sure fooled me at first and which I now find suspect. That's certainly not helpful in the AP education department.


Admittedly, I am a terrible passenger (and thus am rarely one). I pay attention all the time, even when not driving. On a test drive, I tried autopark and that alone drove me crazy...

This weekend was my first trip with Ford ACC; I was constantly checking whether it was seeing the car in front and whether it was limiting my speed. I expect that paranoia wears off after a while. It did decide to drop speed by 10 MPH on the freeway for no apparent reason...

Curious: what distracts you?
 
This report points out a fatal flaw with Autopilot. It can see a car in front of you only if the closing speed is under 30 mph. If not, the system is blind to anything in front of it.

This is not true.
As early as 18.10.4 (on AP2+), and probably earlier, AP was able to see stopped cars with both radar and cameras, correlate this info, and even try to guess how much of an obstacle such stopped cars would present. I have seen very compelling evidence of this, and some people I know are working on putting together material in a form that could be presented to the lay public.
 
Wait... what? On what facts are you basing THIS statement? "In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP."

Do you claim to have access to data on ALL of the Tesla crashes on the planet in which AP was involved? And in all of that data that you must have, tell me how you determined that those incidents LIKELY wouldn't have occurred.

This is where I got the 40% figure:
https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

From the report:
Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.

So my comment meant just this: I believe I am 40 percent safer in my Tesla when using Autopilot than when driving my Tesla and NOT using it.
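For what it's worth, the "almost 40 percent" is a plain before/after rate comparison. A sketch using the per-million-mile rates commonly quoted from Figure 11 of that report (1.3 before Autosteer, 0.8 after; treat the exact values as my reading of the figure, not gospel):

```python
def rate_drop_percent(rate_before, rate_after):
    """Percent drop in crash rate (airbag-deployment crashes per million miles)."""
    return 100.0 * (rate_before - rate_after) / rate_before

before = 1.3  # crashes per million miles before Autosteer (as quoted from the report)
after = 0.8   # crashes per million miles after Autosteer

print(f"{rate_drop_percent(before, after):.0f}% drop")  # 38% drop
```

A 38% drop rounds to the report's "almost 40 percent".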

Have you ever had a massive sneeze hit you when traveling at 65 MPH in traffic? In that split second that your eyes close lots can happen. And I'm convinced that with Autopilot/TACC/Emergency Braking active in my car that second is going to be safer than say in my first car, a 1968 VW Beetle.

U.S. safety agency: prior probe did not assess 'effectiveness' of Tesla Autopilot
 
What the NTSB report ACTUALLY says:

"During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel."

It is VERY precise in its wording. It does NOT say the driver was not holding the steering wheel. It says the vehicle did not detect the driver's hands.

It is disingenuous, misleading and disheartening that this has been reported right here in the blog post as "investigators leveraged data pulled from the car’s computer that shows the driver’s hands were on the steering wheel for just 34 seconds during the minute before impact."
 
What the NTSB report ACTUALLY says:

"During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel."

It is VERY precise in its wording. It does NOT say the driver was not holding the steering wheel. It says the vehicle did not detect the driver's hands.

It is disingenuous, misleading and disheartening that this has been reported right here in the blog post as "investigators leveraged data pulled from the car’s computer that shows the driver’s hands were on the steering wheel for just 34 seconds during the minute before impact."
Hey @TMC Staff! Fix please.
 
I've been thinking about this a little more, and perhaps it's time to turn this around on its head so as to not make it seem anti-AP or anti-Tesla.

What we know:
- the driver DEFINITELY DID have his hands on the wheel 6 seconds before impact.

What we don't know:
- what the driver was doing (if anything) in those last 6 seconds.
 
"Assuming the lead car knows what they are doing" is precisely the mistake in the design here.
Not so fast. You are making it sound like the brilliant engineers at Tesla are morons and you are an Einstein. You figured out a massive hole in the algorithm? Really? Relax...

AP locks onto the car in front only when it can't see the lanes properly, which can happen even in good lane conditions when the cars are too close to each other. So will this lead to some false positives when the lead car suddenly moves away? Of course it will.

This is where the man-machine pair makes the car safer than either a man or a machine driving alone. The human is expected to use some basic common sense (and not be a You-You, aka a dick), pay some basic attention, and be cognizant of the surroundings when the camera is not sensing the lanes clearly.

There are times when AP is stressed, such as:

- lanes not clearly marked
- car in front too close as to obstruct lanes
- driving too fast for the conditions (for the machine)
- lanes split, leading to no-drive intermediate sections (CA accident conditions)
- driving next to a big-rig (lanes are obscured)
- excessive speed on curves

It is the expectation that the human would be watchful and help AP make correct decisions (or not let AP drive) in those situations.

Locking onto the front car is a great compromise that lets you drive in a variety of conditions. The only thing being asked in return is that the driver be watchful.

Every decision AP makes is probabilistic, rarely with 100% certainty. Humans are expected to fill the remaining uncertainty, and that, ladies and gentlemen, is what Level 2 is all about. It is a human-machine combo, not just the machine driving the car on its own.
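The division of labor described above can be caricatured in a few lines. The threshold and names are invented; this is emphatically not Tesla's code, just the Level 2 idea:

```python
def steering_reference(lane_confidence, lead_car_visible, threshold=0.7):
    """Caricature of a Level 2 fallback chain: steer to lane lines when the
    vision system trusts them, else lock to the lead car, else the human
    must take over. Invented threshold; illustrative only."""
    if lane_confidence >= threshold:
        return "lane lines"
    if lead_car_visible:
        return "lead car"        # inherits whatever the lead car does
    return "driver takeover"     # the human fills the remaining uncertainty

print(steering_reference(0.9, True))   # lane lines
print(steering_reference(0.4, True))   # lead car
print(steering_reference(0.4, False))  # driver takeover
```

The middle branch is the "lock onto the front car" compromise; the gore-point scenario is the nastier case where lane confidence is misplaced rather than low.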
 