
Autopilot Called Out in NTSB Report on Tesla Crash into Fire Truck



The National Transportation Safety Board (NTSB) said Wednesday that driver errors and Autopilot caused a January 2018 crash of a Model S into a parked fire truck.

According to the report: “The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Performance data collected during the investigation shows that the Tesla followed various lead vehicles in heavy traffic in the minutes before the crash. When the last lead vehicle changed lanes—3 to 4 seconds before the crash—revealing the fire truck in the Tesla’s path, the system did not immediately detect the hazard and instead accelerated the Tesla toward the stationary truck.

“By the time the system detected the stationary vehicle and gave the driver a collision warning—0.49 second before impact—the collision was imminent and the warning was too late, particularly for an inattentive driver,” the report said. “The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”
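To put that 0.49-second margin in perspective, here is a rough back-of-the-envelope calculation in Python; the travel speed and the 1.5-second perception-reaction time are illustrative assumptions, not figures from the report.

# Rough illustration of why a 0.49 s warning is too late (assumed numbers).
MPH_TO_MPS = 0.44704

speed_mph = 30.0         # assumed travel speed, not taken from the report
speed_mps = speed_mph * MPH_TO_MPS

warning_time_s = 0.49    # warning-to-impact interval cited by the NTSB
reaction_time_s = 1.5    # typical driver perception-reaction time (assumed)

distance_after_warning_m = speed_mps * warning_time_s
distance_to_react_m = speed_mps * reaction_time_s  # covered before braking even starts

print(f"Distance covered between warning and impact: {distance_after_warning_m:.1f} m")
print(f"Distance covered before an average driver reacts: {distance_to_react_m:.1f} m")

Even before any braking distance is counted, the warning arrives well inside an ordinary reaction time.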

The fire truck was unoccupied and the driver was not injured in the incident.

 
What about bridges and overhead signs? Also, what is the resolution of the radar? Can it really distinguish an object in front of you from one in the next lane over?

I'm sure it can distinguish which lane a target is in. The 2012 Charger I drove up until I got my Model 3 in May could do it quite well: even at some distance it would not track and follow a car in the lane to the left or right, but it could acquire a car in the same lane, and it could do so from way out there, which says it had pretty good resolution. If Dodge could do this back in 2011, then surely Tesla's radar is at least as good.

Now bridges and overhead signs are another story. The more I think about it, the more I'm inclined to think that the need to ignore bridges and overhead signs is the reason it fails to recognize a stationary object at times.
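For what it's worth, here's a toy sketch in Python of how that kind of filtering can go wrong. This is purely illustrative and not Tesla's actual logic: to a radar, a return whose closing speed matches your own speed looks "stationary," and a bridge, an overhead sign, and a stopped fire truck all pass that test.

# Toy illustration: a radar return looks "stationary" when its closing speed
# roughly equals the ego vehicle's own speed. Bridges, overhead signs and a
# stopped car all look alike by this test, so a naive filter that suppresses
# stationary clutter can also suppress a stopped vehicle in the lane.
def is_stationary(ego_speed_mps: float, closing_speed_mps: float,
                  tolerance_mps: float = 1.0) -> bool:
    return abs(closing_speed_mps - ego_speed_mps) < tolerance_mps

def should_track(ego_speed_mps: float, closing_speed_mps: float) -> bool:
    # Naive clutter filter: only keep returns that are themselves moving.
    return not is_stationary(ego_speed_mps, closing_speed_mps)

ego = 29.0  # roughly 65 mph
print(should_track(ego, closing_speed_mps=5.0))   # moving lead car    -> True, tracked
print(should_track(ego, closing_speed_mps=29.0))  # overhead sign      -> False, ignored
print(should_track(ego, closing_speed_mps=29.0))  # stopped fire truck -> False, also ignored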
 
That's because they decided to give it the misleading name "autopilot." I regard my EAP as a mature lane-keeping assist system, which is not "autopilot." As a lane-keeping assist system, which by definition requires driver attention, it could be considered mature software. But because they call it "autopilot," they have to say it's still in beta so they can tell you to keep your eyes on the road and your hands on the wheel, because it's not really an autopilot system.

Perhaps you think it is misleading because of your own definition of the term "autopilot". This is what Wikipedia has to say about autopilots:
"Autopilots do not replace human operators, but instead they assist them in controlling the aircraft. This allows them to focus on broader aspects of operations such as monitoring the trajectory, weather and systems." Musk has said the same thing in interviews. But I do agree that to most people, autopilot means "automatic pilot," needing no intervention to do its job. I also think that calling it a lane-keeping system doesn't describe it well either; it's more than that.
 
Mercedes and BMW (as of 2018) use forward-looking stereo cameras for their ADAS, in conjunction with radar. Hell, Subaru uses just two cameras for its “EyeSight” system, with no radar at all.

AP2 and AP3 cars have three forward-looking cameras; I don’t see why two of those cameras couldn’t be used as a stereo pair to measure the distance to a stationary object and react to it. I realize this thread is about an AP1 car, but just a thought.
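The stereo idea in a nutshell: with two cameras a known distance apart, depth falls out of the disparity between the two images as depth = focal length × baseline / disparity. A minimal sketch follows; the focal length, baseline, and disparity values are made-up numbers, not Tesla camera parameters, and a real system also needs rectification and feature matching.

# Minimal depth-from-disparity estimate for a rectified stereo pair.
def stereo_depth_m(focal_length_px: float, baseline_m: float,
                   disparity_px: float) -> float:
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or a bad match")
    return focal_length_px * baseline_m / disparity_px

# Assumed example values: a 1000 px focal length, cameras 0.3 m apart,
# and an object whose image shifts 6 px between the two views.
print(stereo_depth_m(focal_length_px=1000.0, baseline_m=0.3, disparity_px=6.0))  # -> 50.0 m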
 
Unbelievable that a person would be so trusting of a program and some sensors at highway speeds. This is still pretty experimental technology. It takes the stress out of moment-to-moment driving, but it doesn’t replace attentive focus on the road and traffic. That person was fortunate that nobody was harmed and that they weren’t injured. I’ll bet the repair bill for the Tesla and the fire truck exceeded their collision damage coverage, though...
 
It’s very frustrating when people don’t pay attention to the message about needing to keep their hands on the wheel at all times.

Tesla has a self-driving strategy other companies abandoned years ago

From the promoted comments at the end of the article (user throx):
    The attention problem is well known in engineering. It is very hard to get a human to concentrate on something that will turn out fine more than 99% of the time, even when there are serious or fatal consequences of failure. Trains are the classic example: tracks are almost always clear and signals are almost always correct, which means you have to devise all sorts of systems to keep the driver alert.

    The TSA has similar issues (among its many), where almost all passengers are not carrying contraband. They use systems that deliberately plant false images, or have red teams that try to run the checkpoints to theoretically keep the inspectors from just blindly passing anything.

    One system I was personally involved in gave green traffic signals to trams approaching intersections well over 99% of the time, but the tram company became extremely worried about tram drivers driving through intersections on red simply because they were so well trained to expect green. Solutions under discussion went as far as deliberately de-optimising the system to give reds more often, just to force attention.

    Ultimately, a professional engineer *will* be held partially responsible if they design an HMI that doesn't take operator attention behaviour into account. It's a risk factor you have to measure and account for, and you can't straight up blame the operator if you aren't adequately covering the inattention risk.
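To make the base-rate point in that comment concrete, here is a quick calculation; every number below is made up purely for illustration.

# Made-up numbers, just to illustrate the vigilance problem described above.
bags_per_shift = 2000
real_threat_rate = 1 / 1_000_000   # assumed: one genuine item per million bags
injected_test_rate = 1 / 500       # assumed: rate of synthetic "threat image" injection

shifts_between_real_finds = 1 / (bags_per_shift * real_threat_rate)
finds_per_shift_with_tests = bags_per_shift * (real_threat_rate + injected_test_rate)

print(f"Shifts between genuine finds: about {shifts_between_real_finds:.0f}")
print(f"Items to find per shift once test images are injected: about {finds_per_shift_with_tests:.1f}")

At one genuine find every few hundred shifts, there is nothing to stay sharp for; the injected test images are what give a screener something to catch on every shift.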
 
Excellent article and good ideas to think about. I test drove a new P100D before ultimately buying the CPO MS85 that I got. I was amazed at the AP abilities, but knew I would never fully trust it enough to let it drive and stop paying full attention to the road. What I could see myself doing, since I work 12.5 hr shifts and have a 50 minute commute, would be to engage AP if I'm particularly drowsy so that my car would be attentive if I unintentionally nodded off for a second -- key word there *UNINTENTIONALLY*. I'm the kind of driver that finds it almost impossible to be a passenger because I don't trust anyone else's driving (53 y/o and I've never had an accident). I don't see myself fully trusting any automated car in the next few years... maybe someday, but not today.
 
My 2019 Model 3 has detected issues ahead of the car in front of me and Autopilot responded appropriately even before I understood why. So, even though I am very pleased with its capabilities, I am STILL always attentive to traffic conditions. Unfortunately, the media loves the drama and implication that Tesla "self-driving" resulted in an accident.
 
“The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”
AEB on my Tesla is complete rubbish. So is the "hands on wheel" detection. I've been in scenarios where AEB should have kicked in and it did nothing. I also use AP with my hands on the wheel at all times and still get nagged like crazy unless I am constantly torquing the wheel.

Tesla needs to admit their current detection methods are flawed, and stop blaming drivers for believing Tesla's own hyperbolic and twisted statements about how safe AP is (and rename AEB to WTFDIB ... as in Why TF didn't it brake!).
If you keep up with the reports, you'll see there have been multiple situations where AP doesn't recognize a stationary vehicle. It seems to see vehicles while they're moving, but not always once they've stopped. That happened to me: I thought it would stop at a stop light because a car was stopped there; it didn't, but I did. I use AP to work and back, 30 miles each way, and seldom have a problem because it's basic highway traffic and very relaxing for me. I'm very attentive on sharp turns and in merging traffic.
 
Tesla's "detection" methods are not "flawed"; they are in development, and always will be. That is the case with all viable software products, and all of them always have bugs. As others so frequently state, Tesla never states or implies that you can sleep at the wheel (literally or figuratively). So, yes, it IS the driver's fault. I paid $6000 for "self driving," knowing full well it is not autonomous, and it's worth twice that.
 