NTSB Releases Preliminary Report on Autopilot Crash

The U.S. National Transportation Safety Board issued a preliminary report on a fatal March 23 crash involving a Tesla Model X using Autopilot near Mountain View, Calif.

Investigators reviewed data from the car’s computer showing that the driver’s hands were detected on the steering wheel for only 34 seconds during the minute before impact.

Data also showed that the Model X sped up to 71 miles per hour just before hitting a highway barrier. Tesla issued a release in March that included most of the information in the report. Tesla said “the driver had received several visual and one audible hands-on warning earlier in the drive” and that the driver “had about five seconds and 150 meters of unobstructed view of the concrete…but the vehicle logs show that no action was taken.”

The NTSB report said the crash remains under investigation, with the intent of issuing safety recommendations to prevent similar crashes. No pre-crash braking or evasive steering movement was detected, according to the report.

“Tesla Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur,” Tesla wrote in its March post. “It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

Read the full report here.

 
This report points out a fatal flaw with Autopilot. It can see a car in front of you only if the closing speed is under 30 mph. If not, the system is blind to anything in front of it. That includes street sweepers in China, tractor trailers in Florida and firetrucks in California and Utah, a car at a stop light in Arizona, and a concrete barrier in Mountain View.

Tesla needs to come clean about what is causing these accidents so that people know what to be watching for. Guess this is why the IIHS only tests AEB systems (along with Autopilot) at 12 and 25 mph. They don't work at faster closure rates.
 
This report points out a fatal flaw with Autopilot. It can see a car in front of you only if the closing speed is under 30 mph. If not, the system is blind to anything in front of it. That includes street sweepers in China, tractor trailers in Florida and firetrucks in California and Utah, a car at a stop light in Arizona, and a concrete barrier in Mountain View.

Tesla needs to come clean about what is causing these accidents so that people know what to be watching for. Guess this is why the IIHS only tests AEB systems (along with Autopilot) at 12 and 25 mph. They don't work at faster closure rates.

What do you mean by "closing speed"?
 
I think the closing speed is closer to 50 MPH.

In the manual:
Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.

So:
50 MPH - Stationary (zero MPH) = 50 MPH
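
If it helps to see the arithmetic spelled out, here is a minimal sketch (the function names and the use of the manual's 50 mph figure as a hard threshold are my own illustration, not anything from Tesla):

```python
# Closing speed is simply your speed minus the obstacle's speed.
# The manual excerpt above warns that TACC "may not brake/decelerate for
# stationary vehicles" when you are driving over 50 mph (80 km/h), so a
# stationary object at highway speed is the worst case it describes.

def closing_speed_mph(own_speed_mph, obstacle_speed_mph):
    return own_speed_mph - obstacle_speed_mph

def manual_says_detection_not_guaranteed(own_speed_mph):
    # Illustrative threshold taken from the manual's 50 mph caveat.
    return own_speed_mph > 50

print(closing_speed_mph(50, 0))                  # 50 -> 50 mph closing speed
print(manual_says_detection_not_guaranteed(71))  # True
```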
 
Not necessarily.
"Data also showed that the Model X sped up to 71 miles per hour just before hitting a highway barrier"

I used to own a Honda Accord with TACC, and you could set it to a specific MPH. If I had set mine at 71 and a driver in front of me was going 62, my car would have followed at 62 with whatever follow distance I had set (very similar to Tesla). Then, if the car in front swerved, changed lanes, or otherwise disappeared from my car's view, my Honda would have sped up to 71 and smashed my inattentive face into the same guard rail while I happily looked at my Twitter feed.
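
That "speed back up once the lead car disappears" behavior is just how conventional adaptive cruise control works. A rough sketch of the logic, purely illustrative and not any manufacturer's actual code:

```python
from typing import Optional

def target_speed_mph(set_speed: float, lead_speed: Optional[float]) -> float:
    """Naive adaptive-cruise target: match the lead car (capped at the set
    speed) while one is tracked; accelerate back to the set speed otherwise."""
    if lead_speed is None:              # no lead vehicle tracked
        return set_speed                # e.g. climb back to 71 mph
    return min(set_speed, lead_speed)   # e.g. follow at 62 mph

print(target_speed_mph(71, 62))    # 62 -> following the slower car
print(target_speed_mph(71, None))  # 71 -> lead car gone, speed back up
```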

The common thread in a lot of these is, if you want a really cool fast fun safe car that aids you in relieving a lot of the tedium of driving, buy a Tesla. If you want to bury your nose in a phone/tablet or sleep, call an Uber.
 
Cars without "autopilot" or any other braking/steering system with inattentive drivers at the wheel = exact same result. I'll take my chances continuing to use autopilot wisely and believe that I am at LEAST 40 percent safer in my Tesla when using it.
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being. In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.
 
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being.

Lane departure warning cuts crashes
Results of the new study indicate that lane departure warning lowers rates of single-vehicle, sideswipe and head-on crashes of all severities by 11 percent and lowers the rates of injury crashes of the same types by 21 percent.

Front crash prevention cuts rear-enders
Systems with automatic braking reduce rear-end crashes by about 40 percent on average, while forward collision warning alone cuts them by 23 percent, the study found.

In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.

Now who is claiming a fact without any data?

To @jkennebeck's point, an AP car will at least try to keep you in your lane and stop before hitting another car. The standard vehicle on the road does neither. An attentive driver will not (at-fault) crash in either (or will do so at the same rate).
 
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being. In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.

Wait... what? On what facts are you basing THIS statement? "In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP."

Do you claim to have access to data on ALL of the Tesla crashes on the planet in which AP was involved? And from all of that data you must have, tell me how you determined that those incidents LIKELY wouldn't have occurred.

This is where I got the 40% figure:
https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

From the report:
Figure 11 shows the rates calculated by ODI for airbag deployment crashes in the subject Tesla vehicles before and after Autosteer installation. The data show that the Tesla vehicles crash rate dropped by almost 40 percent after Autosteer installation.

So my comment meant just this: I believe I am 40 percent safer in my Tesla when using Autopilot than when driving my Tesla and NOT using it.
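
For anyone who wants to check the arithmetic behind "almost 40 percent": Figure 11 of the linked ODI report gives roughly 1.3 airbag-deployment crashes per million miles before Autosteer and 0.8 after (those two values are from my reading of the PDF, so verify them against the source):

```python
# Approximate before/after airbag-deployment crash rates (crashes per
# million miles) as reported in Figure 11 of the NHTSA ODI report above.
before = 1.3
after = 0.8

reduction_pct = (before - after) / before * 100
print(f"{reduction_pct:.0f}% reduction")  # ~38%, i.e. "almost 40 percent"
```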

Have you ever had a massive sneeze hit you when traveling at 65 MPH in traffic? In the split second that your eyes close, a lot can happen. And I'm convinced that with Autopilot/TACC/Emergency Braking active in my car, that second is going to be safer than, say, in my first car, a 1968 VW Beetle.
 
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being. In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.
I'm not sure if we can separate the autonomous crash avoidance parts of EAP from Autosteer/TACC.

On one hand, yes, there are plenty of people, doing who knows what, who treat Autosteer/TACC like a Level 3 or 4 autonomous system, and sooner or later they run into trouble. If Tesla disabled Autosteer/TACC, those accidents may have been avoided.

At the same time, NHTSA's initial assessment of Tesla's airbag deployment data from before and after Autosteer was deployed to the fleet suggested that adding Autosteer substantially improved safety.

I suppose you could suggest that Tesla (and I imagine all other manufacturers) disable Autosteer/TACC as a convenience feature because people could use it incorrectly and put themselves and others in danger. If Tesla stops selling EAP, where would the resources for developing EAP as a whole, which greatly benefit the safety side of things, come from? If Tesla had never sold/developed EAP, would whatever accident avoidance system they had instead be as good as EAP is now?
 
This report points out a fatal flaw with Autopilot. It can see a car in front of you only if the closing speed is under 30 mph. If not, the system is blind to anything in front of it. That includes street sweepers in China, tractor trailers in Florida and firetrucks in California and Utah, a car at a stop light in Arizona, and a concrete barrier in Mountain View.

Tesla needs to come clean about what is causing these accidents so that people know what to be watching for. Guess this is why the IIHS only tests AEB systems (along with Autopilot) at 12 and 25 mph. They don't work at faster closure rates.

I suspect the reasoning here is that above those lower speeds, it's too difficult to create a proper response algorithm. Any drive on a freeway at speed shows drivers weaving in and out of traffic, or intentionally cutting in with minimal room to make an offramp or to avoid a much slower vehicle in their lane. If the system reacted with emergency braking, it would also have to assess the distance to the vehicle immediately behind, and the presence of obstacles to the right and left and their distances, in order to make a decision.
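
To make the trade-off concrete, here is a toy decision sketch of my own; it is not how any real AEB stack works, just an illustration of why braking hard at speed is not automatically the safe choice:

```python
from dataclasses import dataclass

@dataclass
class Surroundings:
    obstacle_ahead_m: float   # distance to the obstacle in our lane
    vehicle_behind_m: float   # gap to the car behind us
    left_lane_clear: bool
    right_lane_clear: bool

def choose_response(s: Surroundings, own_speed_mps: float) -> str:
    """Toy high-speed avoidance choice: hard braking with a tailgater close
    behind trades one crash for another, so an evasive lane change can be
    preferable when an adjacent lane is clear."""
    stopping_distance = own_speed_mps ** 2 / (2 * 7.0)  # ~7 m/s^2 hard braking
    if s.obstacle_ahead_m > stopping_distance and s.vehicle_behind_m > 20:
        return "brake"
    if s.left_lane_clear or s.right_lane_clear:
        return "steer around"
    return "brake and warn driver"  # least-bad remaining option

# 31 m/s is roughly 70 mph; 60 m to the obstacle, a tailgater 8 m behind.
print(choose_response(Surroundings(60, 8, True, False), own_speed_mps=31))  # "steer around"
```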

There's lots of discussion in the computer science community these days regarding the "bias" that can be (unintentionally) designed into the algorithms used for this sort of thing. Everyone has their own view of what would constitute a "safe" decision. This is why Autopilot issues the hands-on reminders and other warnings. The individual operator's brain is required, so that any "final" decision is theirs and not some programmer's.
 
Believing something doesn't make it true. On what facts are you basing this belief? I'm not aware of anything in the factual record showing Autopilot is safer than a human being. In fact, all of the crashes in which AP was involved likely would not have occurred had the driver not been using AP. That makes it less safe, not more.
Only if the driver ignores myriad warnings and directions to pay attention.

Autopilot doesn't make you less safe. Not remotely close. It's not a replacement for you... it's an addition to you and your attentiveness, vision, and instincts. For that reason, it's most certainly safer when properly used.

Anyone who uses it improperly, even after agreeing to stay alert, stay attentive, and keep their hands on the wheel... if they don't, and they crash, they're responsible for that, not Autopilot.

If your car has an airbag which you erroneously believe protects you from anything, and you deliberately smash the car into a wall, killing yourself, is it the airbag's fault? Was the car with the airbag less safe? No. The person who misunderstood and ignored the warnings about the proper role of an airbag was at fault and caused their own demise.

Same here according to this evidence.
 

My take, my question,

The acceleration, from what I read, is the result of the car no longer seeing the slower car in front of it and trying to maintain the set cruise control speed.
I'm not really comfortable with the statement "the driver didn't have their hands on the wheel," because I get that warning a lot, and it's because I drive with a relatively light grip on the wheel. If Autopilot and I are agreeing, I'm not adding any torque to the wheel; to clear the warning, I have to actively "fight" the wheel to add torque.

So I don't feel that it is necessarily correct to hang the driver out as the bad guy over what can be a human interface issue. Just because the car is issuing warnings doesn't mean that I'm not attentive.
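
For what it's worth, the hands-on check is, as far as publicly documented, based on measured steering-wheel torque rather than touch, which is exactly why a light, agreeing grip can go unnoticed. A toy threshold check, with numbers and names entirely my own:

```python
def hands_detected(torque_samples_nm, threshold_nm=0.3):
    """Hypothetical torque-based hands-on check: hands only register when the
    driver applies more counter-torque than the threshold, so a light grip
    that agrees with the steering looks the same as no hands at all."""
    return any(abs(t) >= threshold_nm for t in torque_samples_nm)

light_grip = [0.05, 0.10, 0.02]  # relaxed hands, agreeing with Autopilot
firm_tug   = [0.05, 0.60, 0.10]  # brief deliberate tug on the wheel

print(hands_detected(light_grip))  # False -> triggers a "hands not detected" nag
print(hands_detected(firm_tug))    # True
```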

My query in this case, which the NTSB doesn't seem to address, is why the vehicle wasn't in a lane. What caused the vehicle to veer out of the lane and lose sight of the vehicle that it was following?

From the driver's side, since it seems as if they had had issues with this area previously, I'm not sure why the driver wasn't leery of the situation and didn't take over and control the vehicle. And why they didn't take control after what should have been a noticeable change in direction, and, heck, to top it off, why they stayed in the left lane.

Overall, I think that this was an incident of the kind that occurs on the road on a daily basis. After all, the crash attenuator at the gore point was already mangled. But if you have to assign blame, I think that everybody gets some: the driver, Tesla, as well as the NTSB.
 

I wasn't stating that I believed the NHTSA claimed this stat. And in the footnotes it says this:

22 The crash rates are for all miles travelled before and after Autopilot installation and are not limited to actual Autopilot use.

The assumption I made is that if a new system is made available and the resulting data show fewer crashes, that new system can be given at least partial credit. Unless we all think that, of all those miles analyzed, not one of them was driven with Autosteer enabled. Maybe the reality is 20%? Maybe 5%? But even at 0.01%, it's still safer than driving a car without it.
 
This report points out a fatal flaw with Autopilot. It can see a car in front of you only if the closing speed is under 30 mph. If not, the system is blind to anything in front of it. That includes street sweepers in China, tractor trailers in Florida and firetrucks in California and Utah, a car at a stop light in Arizona, and a concrete barrier in Mountain View.

Tesla needs to come clean about what is causing these accidents so that people know what to be watching for. Guess this is why the IIHS only tests AEB systems (along with Autopilot) at 12 and 25 mph. They don't work at faster closure rates.

It's too dangerous to use. In normal driving, the driver has to maintain 100% attention. With AP, the driver gets lazy and attention to the road drops. Then, when something out of the ordinary happens, the driver is not fully engaged in driving and people die.

Having something that works 95% of the time is far more dangerous than not relying on it at all. It has to be 100% or nothing.
 
My query in this case, which the NTSB doesn't seem to address, is why the vehicle wasn't in a lane. What caused the vehicle to veer out of the lane and lose sight of the vehicle that it was following?

I don't think the vehicle did 'veer out of the lane and lose sight of the vehicle it was following'; based on the report, I suspect it was the other way around...

At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.

At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.

At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.

At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

This suggests it was following another vehicle during the initial left manoeuvre into the gore zone, and then the vehicle it was following veered into the correct lane, causing the Tesla to lose its target and then accelerate.

3 seconds is an awfully long time for there to be no attempt at braking or steering input...
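
A quick back-of-the-envelope check of the report's numbers (my arithmetic, standard unit conversions only): the acceleration over those final seconds was modest, and Tesla's "150 meters of unobstructed view" works out to roughly the five seconds they quoted.

```python
MPH_TO_MPS = 0.44704

# Speed change over the last ~3 seconds before impact (62 -> 70.8 mph)
accel_mps2 = (70.8 - 62.0) * MPH_TO_MPS / 3.0
print(f"average acceleration ~ {accel_mps2:.2f} m/s^2")  # ~1.31 m/s^2

# Time to cover 150 m of unobstructed view at ~70 mph
seconds = 150.0 / (70.8 * MPH_TO_MPS)
print(f"150 m at 70.8 mph ~ {seconds:.1f} s")  # ~4.7 s, i.e. "about five seconds"
```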
 
Saying AP is less safe to use than not to use seems to me to be a straw man argument... very similar to an argument that I recently had about AWD/4WD being less safe than RWD... the argument was: look at all the AWD cars that crashed... the stats say more cars crash with AWD than with RWD... uh, ok... so is it the AWD that is less safe, or the driver not knowing the limits? Exact same argument here... (and for the record, I said AWD is safer than just RWD, or FWD for that matter)


Honestly, I bet the same arguments were made about "cruise control" when it first came out too... you know, the old system that tried to maintain a set speed no matter what... If the driver didn't override it, it would take a 35 mph turn at 70 mph if that was the "set" speed...