
Autopilot Called Out in NTSB Report on Tesla Crash into Fire Truck



The National Transportation Safety Board (NTSB) said Wednesday that driver error and Autopilot's design caused a January 2018 crash of a Model S into a parked fire truck.

According to the report: “The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Performance data collected during the investigation show that the Tesla followed various lead vehicles in heavy traffic for minutes before the crash. When the last lead vehicle changed lanes—3 to 4 seconds before the crash—revealing the fire truck in the Tesla's path, the system did not immediately detect the hazard and instead accelerated the Tesla toward the stationary truck.

“By the time the system detected the stationary vehicle and gave the driver a collision warning—0.49 second before impact—the collision was imminent and the warning was too late, particularly for an inattentive driver,” the report said. “The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”

The fire truck was unoccupied and the driver was not injured in the incident.

 
The AEB system did not activate

Well, that is concerning. Three seconds at 21 to 31 mph was not enough time for Autopilot to detect a stationary fire truck.

It was a 2014 Model S with HW1. HW1 was aware of a decelerating car ahead 7 seconds before impact; that car changed lanes 4 seconds later. It took another 2.51 seconds before HW1 detected the fire truck and sounded the forward collision alarm, and it never applied AEB.

Shame that 3000 milliseconds isn't long enough to prevent a crash. Technology still has a ways to go.
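For what it's worth, the report's numbers can be sanity-checked with a little arithmetic. A minimal sketch (the 0.8 g hard-braking deceleration is my own assumption for dry pavement, not a figure from the report):

```python
# Back-of-envelope check of the NTSB timeline quoted above:
# 21-31 mph closing speed, warning 0.49 s before impact,
# lead vehicle changed lanes 3-4 s before the crash.

MPH_TO_MPS = 0.44704  # exact conversion factor

def distance(speed_mph: float, seconds: float) -> float:
    """Distance covered in metres at a constant speed."""
    return speed_mph * MPH_TO_MPS * seconds

def stopping_distance(speed_mph: float, decel_g: float = 0.8) -> float:
    """Idealised braking distance v^2 / (2a); 0.8 g is an assumed hard stop."""
    v = speed_mph * MPH_TO_MPS
    return v * v / (2 * decel_g * 9.81)

for mph in (21, 31):
    print(f"{mph} mph: warning gap {distance(mph, 0.49):.1f} m, "
          f"lane-change gap {distance(mph, 3):.1f}-{distance(mph, 4):.1f} m, "
          f"full-brake stop {stopping_distance(mph):.1f} m")
```

Even at 31 mph, an attentive driver had roughly 42-55 m of clear view after the lane change, against an idealised braking distance of about 12 m; the 0.49 s warning, by contrast, left only about 7 m.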
 
Well, that is concerning. Three seconds at 21 to 31 mph was not enough time for Autopilot to detect a stationary fire truck.

It was a 2014 Model S with HW1. HW1 was aware of a decelerating car ahead 7 seconds before impact; that car changed lanes 4 seconds later. It took another 2.51 seconds before HW1 detected the fire truck and sounded the forward collision alarm, and it never applied AEB.

Shame that 3000 milliseconds isn't long enough to prevent a crash. Technology still has a ways to go.

It was an AP1 car, so I don't see how it has much bearing on the current AP2/2.5/3 tech. It shows a weakness in AP1, but that doesn't mean AP2 has the same weakness, especially since AP2 uses completely different software.
 
“The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”
AEB on my Tesla is complete rubbish. So is the "hands on wheel" detection. I've been in scenarios where AEB should have kicked in and it did nothing. I also use AP with my hands on the wheel at all times and still get nagged like crazy unless I am constantly torquing the wheel.

Tesla needs to admit their current detection methods are flawed, and stop blaming drivers for believing Tesla's own hyperbolic and twisted statements about how safe AP is (and rename AEB to WTFDIB ... as in Why TF didn't it brake!).
 
Autopilot is a driving aid, it is great if used correctly.

Agreed. I think the biggest issue is that Tesla does very little to educate new owners on correct and appropriate use of AP. Ever since the Model 3 introduction, the delivery orientation is five minutes; it consists of pointing to the car, then pointing you toward the exit.

"The NTSB cited the driver's "inattention and over-reliance" on the advanced driver assistance system." ...

  • "over-reliance" is on Tesla, their marketing, Elon tweets, etc.
  • "inattention" is also on Tesla; torque-based wheel sensing is a kludge and doesn't work. They should be using cameras and eye tracking.

New owners are more attuned to Elon tweeting about how the car will drive you anywhere, removing steering wheels, and conflating terms like AP1/2/2.5/3, EAP, FSD, LMNOP (and selling vaporware)...

If safety really were their priority, they should require owners to go through some type of online training (either through the app or the center console) before AP can be activated (per driver profile). It's not a perfect solution, but it might save a life.


Segway requires safety training before activation... for a tiny 13 mph scooter.
Tesla... nothing... for a 4,500 lb car going 90 mph.
 
Forget the AEB for a sec, why did it accelerate towards a stationary object?
My car will do that. For example, a car in front of me slows then leaves my lane; AP accelerates back toward its programmed speed until it detects the stationary vehicle that was, say, 60 feet ahead of the car that moved out of the way. Then sudden deceleration. Which is one reason I have to keep an eye on it.
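That resume behavior is easy to see in a toy model. This is a hypothetical sketch of a generic traffic-aware cruise controller, not Tesla's code; the point is that once the tracked lead vehicle disappears, the speed reference simply falls back to the set speed:

```python
# Minimal sketch (hypothetical, not Tesla's implementation) of why a
# traffic-aware cruise control accelerates when the lead car leaves
# the lane: with no in-lane target, the reference reverts to set speed.

class Lead:
    """A tracked in-lane vehicle with its current speed (mph)."""
    def __init__(self, speed: float):
        self.speed = speed

def target_speed(set_speed: float, lead) -> float:
    """Pick the speed reference for the cruise controller.

    `lead` is the tracked in-lane vehicle, or None once it changes
    lanes. A newly revealed stationary object only matters if the
    perception layer promotes it to a tracked target in time.
    """
    if lead is None:
        return set_speed              # nothing tracked -> resume set speed
    return min(set_speed, lead.speed)  # otherwise follow the slower of the two

print(target_speed(65.0, Lead(25.0)))  # following the lead: 25.0
print(target_speed(65.0, None))        # lead gone: 65.0 (accelerate)
```

In this toy model the stopped vehicle only matters once perception promotes it to a tracked in-lane target, which is exactly the step the NTSB timeline says happened far too late.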
 
Forget the AEB for a sec, why did it accelerate towards a stationary object?

I suspect it was just accelerating to its set speed after the car in front of it switched lanes. It SHOULD have seen such a large object (a fire truck), and AEB should have detected it way earlier...

It's very sad that the NTSB is blaming the accident on coffee and bagels instead of asking your very fundamental question: "why did it accelerate towards a stationary object?"
 
Having driven a while now with both an AP1 and now an AP2 Tesla, I'd honestly have to say that I've seen little difference between the two systems' ability to react to obstacles.

Both systems seem to suffer from some of the same fundamental problems. For one thing, I don't think the sonar sensors are all that precise. For years now, both AP1 and AP2 systems have had that "glitch" where they start displaying cars dancing all over the road when they're beside you at a stop. I also notice, as I drive into my garage and it reads off the distance before hitting the front wall, that the number of inches jumps around some... never a smooth, linear countdown as I creep towards it.

Additionally, I think the processor speed isn't fast enough to make everything respond as quickly as people expect or demand. Maybe it's better on a Model 3, but I haven't tried one yet, so I can't say. With both AP1 and AP2, I've experienced it lighting up a car detected in front of me in red and beeping to warn of a possible collision when I was already slowing to a stop, so it was just a distraction or nuisance warning. Yet in other similar scenarios it doesn't trigger at all, despite coming into the area with the stopped traffic fairly quickly. I also see it lagging a bit behind "real time" as it tries to show me warnings that I'm close to curbs or other obstructions on either side of the vehicle.

In fact, if you don't turn off some of the avoidance features, Autopilot tends to hard-brake for no good reason at random times on the interstate, when it mistakenly thinks it saw something in your way ahead of you.


I suspect it was just accelerating to its set speed after the car in front of it switched lanes. It SHOULD have seen such a large object (a fire truck), and AEB should have detected it way earlier...

It's very sad that the NTSB is blaming the accident on coffee and bagels instead of asking your very fundamental question: "why did it accelerate towards a stationary object?"
 
The Mobileye TACC in my ELR will do the same thing. If someone merges out of my lane, leaving a stopped car in front, the car is "blind" to it and will resume to the speed setpoint.

-J
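The "blind" behavior described above is a well-known limitation of radar-based cruise control: returns with near-zero ground speed look the same as signs, bridges, and parked cars, so many systems filter them out rather than risk phantom braking. A hypothetical illustration of that filter (the 1 m/s cutoff is an invented threshold for the sketch, not any vendor's actual value):

```python
# Hypothetical sketch of the classic radar-ACC limitation discussed
# above: objects whose own ground speed is ~0 are discarded as
# roadside clutter, so a stopped truck never becomes a braking target.

STATIONARY_CUTOFF_MPS = 1.0  # assumed threshold, for illustration only

def is_braking_target(ego_speed: float, closing_speed: float) -> bool:
    """Keep a radar return only if the object itself is moving.

    ground_speed = ego_speed - closing_speed (all in m/s); returns at
    roughly zero ground speed are treated as stationary clutter.
    """
    ground_speed = ego_speed - closing_speed
    return abs(ground_speed) > STATIONARY_CUTOFF_MPS

# Moving lead car: ego 13 m/s, closing at 3 m/s -> ground speed 10 m/s
print(is_braking_target(13.0, 3.0))   # True: tracked and braked for
# Stopped truck: ego 13 m/s, closing at 13 m/s -> ground speed 0 m/s
print(is_braking_target(13.0, 13.0))  # False: filtered as clutter
```

The trade-off is deliberate: without the filter, every overhead gantry and parked car on a curve would trigger braking; with it, a stopped vehicle in-lane is invisible to the radar path and the camera has to catch it instead.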

I find this to be very worrisome and disappointing.

That machine vision can't figure out that there is a massive stationary object in the path.

In my many decades of driving I’ve had my share of situations where the car in front of me abruptly changed lanes because of a stationary obstacle.

And while it was stressful, my biovision and my human reaction time had zero problems understanding “oh *sugar*, there's an obstacle that I MUST AVOID.” Never did I think: oh cool, I can now accelerate into the obstacle at the max speed limit.

This is such a basic safety related skill that I don’t understand why it is apparently treated as an obscure corner case.

What’s the point of FSD if stationary massive objects can’t be detected 100% ?!?
 
HW2.5 seems to look at things in the current frame with no knowledge of what happened in previous frames.
The dancing cars make me wonder this.
Seeing a truck, then all of a sudden not seeing it and driving under it, makes me wonder this.
Seeing nothing on the road, then all of a sudden braking for no reason, makes me wonder this.

If this is the case I hope HW 3.0 has another layer of code on top of what they have now to coordinate what is happening over time.

Note: This is total guessing and just a generic observation.
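In the same guessing spirit as the post above, here is what a simple temporal layer could look like: a detection must persist for a few frames before it is trusted, and a confirmed track survives brief dropouts instead of flickering in and out frame by frame. Entirely hypothetical; the thresholds are made up for the sketch:

```python
# Hypothetical sketch of per-track temporal smoothing: confirm a
# detection only after several consecutive hits, and keep a confirmed
# track alive through short dropouts (assumed thresholds below).

class TrackSmoother:
    def __init__(self, confirm_frames: int = 3, max_misses: int = 2):
        self.hits = 0                    # consecutive-ish frames seen
        self.misses = 0                  # consecutive frames not seen
        self.confirm_frames = confirm_frames
        self.max_misses = max_misses

    def update(self, detected_this_frame: bool) -> bool:
        """Feed one frame's result; return True if the track is confirmed."""
        if detected_this_frame:
            self.hits += 1
            self.misses = 0
        else:
            self.misses += 1
            if self.misses > self.max_misses:
                self.hits = 0            # gone too long: drop the track
        return self.hits >= self.confirm_frames

track = TrackSmoother()
frames = [True, True, True, False, True]   # one-frame dropout mid-stream
print([track.update(f) for f in frames])   # [False, False, True, True, True]
```

Under this scheme, a truck that vanishes for a single frame stays tracked, and a one-frame phantom never triggers braking; the cost is a few frames of confirmation latency on genuinely new objects.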
 
I'm sorry, but you can blame the AP system all you want; how stupid can you be to let this happen? The biggest cause of accidents in any vehicle is inattention, and this is not an autonomous car.

I am also sorry... you can blame drivers for being stupid... or you can blame them for being gullible in believing the hyperbolic statements made by Elon/Tesla about its AP's capabilities.

Here's the 2014 Autopilot announcement. The first... but one of many examples.

I think the real stupidity is believing Elon's statements about AP... I'm sure they'll be true someday... however they weren't true in 2014... and still aren't true in 2019.