
Fatal autopilot crash, NHTSA investigating...

Did they vote on their findings, or did they make their findings known and get the manufacturer and airlines to correct the defects that were found?

The investigation of the first incident (United Airlines 585) didn't produce enough evidence to state a probable cause. The board decided (voted?) to publish the final report with "probable cause undetermined", only the fourth report in the NTSB's history to be published that way.

It wasn't until the 2nd incident (USAir 427) and the 3rd (Eastwind 517, which occurred during the USAir 427 investigation) that enough evidence could be recovered and tested to draw conclusions. The hydraulic servo unit from United 585 was too damaged to test, but USAir 427's unit was functional, and since Eastwind 517 didn't crash, its unit could be tested as well. The pilots of Eastwind 517 were also interviewed, which wasn't possible for the other two incidents since both crashes killed everyone aboard.

After the evidence was gathered, the final reports for USAir 427 and Eastwind 517 identified the hydraulic servo failure as the probable cause, and United 585's report was amended to state the same.

The reports resulted in action by the FAA to compel Boeing to replace the hydraulic servo unit with a redesigned unit on all Boeing 737s.
 
The driver set his car to "drive itself into a mountain".

Yes, but you can't stop the investigation there. You have to answer why.
  • Was there an equipment failure that wasn't obvious?
  • Are the design criteria for the system adequate for the job it is tasked to perform?
  • Are there human performance factors that aren't being dealt with properly such that the system as a whole is actually less safe than designed?
You can't just blame the driver(s) ... "Oh, they did something wrong". That's not good enough. We have to understand why they made the decisions they did, including all factors that led up to that in order to find the best way to improve things.
 
Not to reignite this very long debate, but the bottom line is that the operator is always responsible for the safe operation of the car. If there was some sort of failure, he was obliged to immediately assume control, something the evidence suggests did not occur.
 


True. I just wonder at what point human factors come into play. If the system by its nature lulls you into a sense of complacency, maybe changes should be made to overcome that psychology.

Maybe that explains the new interior-facing camera on the Model 3: monitoring eye direction and head position to make sure you are actually paying attention, versus hands-on-wheel checks.
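Tesla has never said publicly how (or whether) the Model 3's interior camera would gate Autopilot, so treat this as speculation. Here's a minimal sketch of camera-based attention monitoring, with made-up thresholds and a hypothetical attention_ok policy, assuming some upstream vision model already supplies gaze angles and an eyes-closed flag:

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_yaw_deg: float    # 0 = looking straight down the road
    gaze_pitch_deg: float  # 0 = horizon; strongly negative = looking at lap/phone
    eyes_closed: bool

# Hypothetical limits -- a real system would tune these from human-factors data.
MAX_YAW_DEG = 25.0
MIN_PITCH_DEG = -15.0
GRACE_SECONDS = 2.0  # brief mirror/instrument glances shouldn't trigger anything

def attention_ok(state: DriverState) -> bool:
    """True if the driver plausibly has eyes on the road."""
    return (not state.eyes_closed
            and abs(state.gaze_yaw_deg) <= MAX_YAW_DEG
            and state.gaze_pitch_deg >= MIN_PITCH_DEG)

def monitor_tick(state: DriverState, inattentive_s: float, dt: float) -> tuple[float, str]:
    """One control-loop tick: accumulate inattention time, map it to an action."""
    inattentive_s = 0.0 if attention_ok(state) else inattentive_s + dt
    if inattentive_s > 3 * GRACE_SECONDS:
        return inattentive_s, "disengage"  # hand control back / lock out AP
    if inattentive_s > GRACE_SECONDS:
        return inattentive_s, "alert"      # visual, then audible nag
    return inattentive_s, "ok"
```

The timer is the whole point, human-factors-wise: a glance away is normal, sustained disengagement is not.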
 
Not to reignite this very long debate, but the bottom line is that the operator is always responsible for the safe operation of the car.

Of course he is; that's the way the system is currently designed.

If there was some sort of failure, he was obliged to immediately assume control, something the evidence suggests did not occur.

This assumes he would know of such a failure, and know of it in time to take appropriate action. What if the failure was not indicated to him, so he didn't know about it? What if, once the failure was known to him, it was too late to take action? What if he attempted to assume control but was unable to for some reason?

These are the questions that have to be answered in order for us to understand exactly where this went wrong. You cannot assume anything, and cannot jump to conclusions until the investigation produces evidence of every last detail.
 
How would you react when you see a truck astride the road ahead of you? Wait for AP to slow you down, or would you immediately take over and bring the car to a halt?

Of course he should have seen the truck and slowed down or avoided it. 99.99999% of the time, that's exactly what a driver would do. But this time, someone didn't. WHY? That's why we're investigating.

In all likelihood, his attention was diverted due to factors that no one but the driver can be responsible for. But we cannot assume that without a full investigation and concrete evidence. That's all I'm saying.
 
We can, however, assume, per the earlier investigation, that the car was in sight of the truck for around seven seconds. Why did the truck driver fail to yield? We have laws in place to prevent this sort of thing.
 
Joshua Brown's family lawyer, Jack Landskroner, gave a statement one day before the September 12, 2017 National Transportation Safety Board hearing on probable cause:

"We heard numerous times that the car killed our son. That is simply not the case,"

"There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car."

"People die every day in car accidents,"

"Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements."

The family "takes solace and pride in the fact that our son is making such a positive impact on future highway safety."
 
The abstract of the NTSB final report on this incident is now available:

https://www.ntsb.gov/news/events/Documents/2017-HWY16FH018-BMG-abstract.pdf

This includes factual findings, probable cause, and new recommendations.

The probable cause is stated:

PROBABLE CAUSE

The National Transportation Safety Board determines that the probable cause of the Williston, Florida, crash was the truck driver’s failure to yield the right of way to the car, combined with the car driver’s inattention due to overreliance on vehicle automation, which resulted in the car driver’s lack of reaction to the presence of the truck. Contributing to the car driver’s overreliance on the vehicle automation was its operational design, which permitted his prolonged disengagement from the driving task and his use of the automation in ways inconsistent with guidance and warnings from the manufacturer.


The complete final report will be published as part of the public docket in a few weeks, available here:

Accident ID HWY16FH018 (Highway), occurred May 7, 2016 in Williston, FL, United States. Public docket released June 19, 2017; last modified June 28, 2017; 44 document items.
 
Here are the relevant findings about AP (the findings before these concern things like drug use or the lack of it, and the findings after these concern the need for standardized data collection):

5. If automated vehicle control systems do not automatically restrict their own operation to those conditions for which they were designed and are appropriate, the risk of driver misuse remains.
6. Because driving is an inherently visual task and a driver may touch the steering wheel without visually assessing the roadway, traffic conditions, or vehicle control system performance, monitoring steering wheel torque provides a poor surrogate means of determining the automated vehicle driver’s degree of engagement with the driving task.
7. The Tesla driver’s pattern of use of the Autopilot system indicates an overreliance on the automation and a lack of understanding of system limitations.
8. The Tesla driver was not attentive to the driving task, but investigators could not determine from the available evidence the reason for his inattention.
9. The way that the Tesla Autopilot system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

Here are the main NTSB recommendations:

To the US Department of Transportation:
1. Define the data parameters needed to understand the automated vehicle control systems involved in a crash. The parameters must reflect the vehicle’s control status and the frequency and duration of control actions to adequately characterize driver and vehicle performance before and during a crash.
To the National Highway Traffic Safety Administration:
2. Develop a method to verify that manufacturers of vehicles equipped with Level 2 vehicle automation systems incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
3. Use the data parameters defined by the US Department of Transportation in response to Safety Recommendation [1] as a benchmark for new vehicles equipped with automated vehicle control systems so that they capture data that reflect the vehicle’s control status and the frequency and duration of control actions needed to adequately characterize driver and vehicle performance before and during a crash; the captured data should be readily available to, at a minimum, National Transportation Safety Board investigators and National Highway Traffic Safety Administration regulators.
4. Define a standard format for reporting automated vehicle control systems data, and require manufacturers of vehicles equipped with automated vehicle control systems to report incidents, crashes, and vehicle miles operated with such systems enabled.

To manufacturers of vehicles equipped with Level 2 vehicle automation systems (Audi of America, BMW of North America, Infiniti USA, Mercedes-Benz USA, Tesla Inc., and Volvo Car USA):
5. Incorporate system safeguards that limit the use of automated vehicle control systems to those conditions for which they were designed.
6. Develop applications to more effectively sense the driver’s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.
 
If the system by its nature lulls you into a sense of complacency, maybe changes should be made to overcome that psychology.

Isn't that exactly what has happened?

The present level of AP 2 unpredictability keeps drivers holding the wheel and focused.

Good for driver alertness, not so great for nerves and ulcers.

So we have an absurd situation. One group fearing the unpredictability of a system that isn't good enough and the other fearing the return of driver complacency if the system improves.

Looks like no one can develop or test anything.

Maybe selecting the EAP/FSD options in the Design Studio needs to bring up a Ludicrous-like "Bring it on" / "I want my Mommy" confirmation choice.
 
This is a nothing burger. No matter what feature you have in a car, you could find an accident in a million miles where it contributed to a death. For example, the rear-view mirror has more than likely contributed, along with many other things, like a person texting, then looking up and getting distracted by the rear-view mirror just before going off a cliff. The point is that people who do dumb things in cars kill themselves, and that will not change, though Tesla vehicles today will protect those dumb people better than almost any other production car. Yes, there are different dumb things you can do in a Model S, but in general they are much safer, and it's not even close.
 
We have a situation where driver deaths are now increasing even while cars are concurrently getting safer and medical trauma technology improves. We are rapidly losing our driving skills.

We are deliberately crashing. You might not agree, but if I fire a rifle while blindfolded and it kills someone, we do consider that poor manners. And folk like me consider that a deliberate action.
 
Looks like no one can develop or test anything.

OR they could develop alternate attention verification methods such as the internal camera that is on the Model 3. And I would bet we will see geofencing of AP as a result of this as well.

(Which might not be a bad thing - the very first time my husband engaged AP in a loaner I had, it was on a narrow two-lane road with no shoulder - I had to tell him that was NOT a good idea. I have no idea why the car thought that road was acceptable for AP engagement.)
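For what it's worth, "geofencing" here just means refusing to engage outside the conditions the system was designed for, which is what NTSB recommendation 5 above asks for. A toy sketch with invented map attributes - nothing here reflects Tesla's actual engagement logic:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    # Hypothetical map attributes; a real system would pull these from map data.
    divided_highway: bool
    lane_width_m: float
    has_shoulder: bool
    cross_traffic: bool  # at-grade intersections, like the Williston crash site

def autopilot_allowed(seg: RoadSegment) -> bool:
    """Engage only inside the conditions the system was designed for."""
    return (seg.divided_highway
            and seg.lane_width_m >= 3.0
            and seg.has_shoulder
            and not seg.cross_traffic)

# The narrow two-lane road above would fail every one of these checks:
print(autopilot_allowed(RoadSegment(
    divided_highway=False, lane_width_m=2.7, has_shoulder=False, cross_traffic=True)))
# -> False
```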
 
OR they could develop alternate attention verification methods such as the internal camera that is on the Model 3.

They did.

This incident reinforced the hands-on-the-wheel checks, the frequent system nags, and the AP lockouts.

Lots of attention verification methods.

Throw an internal camera in as well? Sure, why not?

But it can only provide a different route to the same nags and lockouts as currently provided by the steering wheel sensors and audible warnings (see the sketch at the end of this post).

And just checking someone's eyes are looking ahead doesn't guarantee that the brain attached is seeing a hazardous situation.

Which brings us back to the circular argument: "Well, in that case, the car should do even more."

Neuralink connection to AP, anyone?

No system is ever going to be 100% safe.
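Mechanically, I'd expect a camera signal to feed the exact same escalation ladder the wheel-torque sensor feeds today. A sketch, assuming a simple strike counter (the Action names and thresholds are invented):

```python
import enum

class Action(enum.Enum):
    NONE = 0
    VISUAL_NAG = 1
    AUDIBLE_NAG = 2
    LOCKOUT = 3  # AP refuses to re-engage for the rest of the drive

# The ladder is the same no matter which sensor (steering torque, cabin
# camera, ...) reported the inattention; thresholds here are invented.
LADDER = [(3, Action.LOCKOUT), (2, Action.AUDIBLE_NAG), (1, Action.VISUAL_NAG)]

def escalate(strikes: int) -> Action:
    """Map the number of unanswered attention checks to a response."""
    for threshold, action in LADDER:
        if strikes >= threshold:
            return action
    return Action.NONE
```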
 
Neuralink connection to AP, anyone?

No system is ever going to be 100% safe.

No, it's a balance, obviously. I feel the original implementation of AP probably left the driver too many opportunities to misuse the system. Tesla apparently agreed, based on the changes they have made. I am not arguing the system should be perfect, but that doesn't mean they shouldn't improve things based on how the system is being used/misused.
 