Autopilot Called Out in NTSB Report on Tesla Crash into Fire Truck

The National Transportation Safety Board (NTSB) said Wednesday that driver errors and Autopilot caused a January 2018 crash of a Model S into a parked fire truck.

According to the report: “The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Performance data collected during the investigation show that the Tesla followed various lead vehicles in heavy traffic for minutes before the crash. When the last lead vehicle changed lanes—3 to 4 seconds before the crash—revealing the fire truck in the path of the Tesla, the system was unable to immediately detect the hazard and accelerated the Tesla toward the stationary truck.

“By the time the system detected the stationary vehicle and gave the driver a collision warning—0.49 second before impact—the collision was imminent and the warning was too late, particularly for an inattentive driver,” the report said. “The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”

The fire truck was unoccupied and the driver was not injured in the incident.
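
To put that 0.49-second warning in perspective, here's a quick back-of-the-envelope calculation in Python. The speed used is an assumption for illustration only; the report's actual speed figure isn't quoted above.

```python
# Rough illustration of how little margin a 0.49 s warning leaves.
# ASSUMPTION: ~30 mph travel speed (illustrative only, not from the report).
speed_mph = 30
speed_mps = speed_mph * 0.44704        # mph -> m/s, about 13.4 m/s

warning_time_s = 0.49                  # collision warning lead time from the report
typical_reaction_s = 1.5               # commonly cited driver perception-reaction time

print(f"Distance covered in the 0.49 s warning window: {speed_mps * warning_time_s:.1f} m")
print(f"Distance covered during a typical 1.5 s reaction: {speed_mps * typical_reaction_s:.1f} m")
# ~6.6 m vs ~20 m: at that speed the warning comes well inside the distance
# an average driver needs just to begin braking.
```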

 
Forget the AEB for a sec: why did it accelerate towards a stationary object?
Because it didn't detect it. Why didn't it detect it?
1. Radar ignores stationary objects because of too many false-positive detections, such as from traffic signs (rough sketch of this below).
2. The camera system was NOT trained to detect stopped fire trucks; it was trained to detect lanes and the vehicles it follows.
New systems are 10x-100x more powerful hardware-wise than AP1. Probably still not enough for all the edge cases.
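
Here's a minimal sketch of the radar-filtering behavior described in point 1. It's purely conceptual: the data structure, field names, and threshold are invented for illustration, and this is not Tesla's actual implementation.

```python
# Conceptual sketch: why a radar-based follow system can "ignore" a stopped truck.
# Returns with near-zero ground speed are often discarded because overhead signs,
# bridges, and parked cars would otherwise generate constant false braking events.
# All names and numbers below are hypothetical.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the detected object
    ground_speed_mps: float   # object's speed relative to the road

STATIONARY_CUTOFF_MPS = 1.0   # made-up threshold for "not moving"

def candidate_follow_targets(returns):
    """Keep only moving objects as potential lead vehicles."""
    return [r for r in returns if abs(r.ground_speed_mps) > STATIONARY_CUTOFF_MPS]

scene = [
    RadarReturn(range_m=40.0, ground_speed_mps=0.0),   # stopped fire truck
    RadarReturn(range_m=60.0, ground_speed_mps=25.0),  # moving car in the next lane
]
print(candidate_follow_targets(scene))  # only the moving car survives the filter
# The stopped truck never becomes a follow target, so it falls to the camera
# and AEB logic to catch it, which in this crash happened far too late.
```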
 
... Tesla needs to admit their current detection methods are flawed, and stop blaming drivers for believing Tesla's own hyperbolic and twisted statements about how safe AP is (and rename AEB to WTFDIB ... as in Why TF didn't it brake!).
Tesla has admitted detection methods are flawed, by stating it is beta and stating drivers must remain in control at all times. If it wasn't flawed, if it was perfect, Tesla would be saying, go to sleep and magically arrive at your destination.
 
Tesla has admitted detection methods are flawed, by stating it is beta and stating drivers must remain in control at all times.

Not really. It's been in "beta" from the start, so the surfacing of flaws didn't change the equation.

They added hand detection LATER, as an afterthought... AP wasn't designed with driver attention in mind (because at first they were claiming it was hands free)... It's a kludge that doesn't work well.

How long will it be in Beta? Will AP1/2 be in beta forever? Beta implies there will be a final. Or is "beta" just a word used to limit their liability?
 
If somebody says "jump off the bridge, you will be fine," do you jump off the bridge?

Replace "somebody" with "Elon"... and I think you get different answers.

"The first part is the long range radar, which can see through basically anything"

"The car can do almost anything, so we're obviously able to do lane-keeping on freeways, to automatic cruise control, active emergency braking so it'll brake if it see's any object that you're going to collide with..." - Elon 2014 (regarding AP1)​

There are Elon/Tesla sycophants out there (and here)... so yes, I think some would jump if Elon told them they'd be fine.
 
Because it didn't detect it. Why didn't it detect it?
1. Radar ignores stationary objects because of too many false-positive detections, such as from traffic signs.
2. The camera system was NOT trained to detect stopped fire trucks; it was trained to detect lanes and the vehicles it follows.
New systems are 10x-100x more powerful hardware-wise than AP1. Probably still not enough for all the edge cases.

Seriously? That explanation may make sense from a technical point of view, but it's a testament to what a travesty all the talk from Musk about FSD is if, this late in the game, detecting MASSIVE stationary objects is apparently still a huge problem.

And people excuse this? Wow, this is unbelievable. So you are basically saying the cameras, the radars, and all the other sensors on the Tesla, together with the computer, right now aren't capable of reliably detecting a stopped fire truck. Absolutely laughable that this is still for sale under the grandiose “FSD” acronym.

It’s got as many cameras as a spider has eyes but it’s too dumb to identify a stopped truck. Eyeroll. Yeah, here are my $6000. Not. LOL!
 
It’s got as many cameras as a spider has eyes but it’s too dumb to identify a stopped truck. Eyeroll. Yeah, here are my $6000. Not. LOL!

To be fair, this was an AP1 incident. AP1 only has one camera (still forward facing, so it should have 'seen' it)

Also this from a Tesla press release:

Euro NCAP’s results demonstrate the impact of recent improvements made to our Automatic Emergency Braking (AEB) system that were extended to all Model S, Model X and Model 3 cars built since October 2016 via an over-the-air software update earlier this year.​

Since the car was AP1, it probably was built before 10/2016.
 
To be fair, this was an AP1 incident. AP1 only has one camera (still forward facing, so it should have 'seen' it)

Also this from a Tesla press release:

Euro NCAP’s results demonstrate the impact of recent improvements made to our Automatic Emergency Braking (AEB) system that were extended to all Model S, Model X and Model 3 cars built since October 2016 via an over-the-air software update earlier this year.​

Since the car was AP1, it probably was built before 10/2016.
Yes, but the same restrictions apply to my 2019 P3 with HW3 and that’s a problem.
 
This subject has been beaten so much, you can't even recognize the horse anymore. I stopped using AP 3 months into ownership and won't trust it UNTIL there has been substantive data and visual proof of all the bells and whistles that Elon claims FSD, EAP, and AEB will afford the Tesla driver. That aside, I know many on this site have had great experiences with AP. I've watched the videos, read the reviews, yadda, yadda. Happy that folks are enjoying the feature(s). R&D is the key, but just how much of that is truly being worked on, on a regular basis, is an unknown. Don't hear anything about it, either in the media or in Elon's tweets (I think he finally got reined in(?)). It's a waiting game, at best.
 
If you understand the limitations, the current level of automation is an excellent adjunct to an alert driver ready to take over in an emergency. You may recall seeing on the news the fully-autonomous vehicle last year being tested in Tempe, Arizona, where the "driver" was watching a video on her phone and the vehicle managed to run over a pedestrian pushing a bicycle across the street. We are all early adopters of technology that's barely past the cusp of development. Limitations will be found (we used to call these "undocumented features") and overcome. If Moore's Law applies to Teslas then we are indeed in for an exciting ride.
 
Based on this video clip (AEB test on AP2 at 50s mark)... I can't disagree with you....


I really wish Tesla would require drivers to watch a video like this before AP can be activated...
My $0.02 as a former aerospace engineer.

1. Tesla screwed up by calling it Auto Pilot instead of Assisted Driving or something akin to that.
2. Tesla is using their customers as Beta Testers.
 
I find this to be very worrisome and disappointing.

That machine vision can’t figure out if there is a massive stationary object in the path.

In my many decades of driving I’ve had my share of situations where the car in front of me abruptly changed lanes because of a stationary obstacle.

And while it was stressful, my biovision and my human reaction time had zero problems understanding “oh *sugar* there’s an obstacle that I MUST AVOID”. Never did I think, oh cool, I can now accelerate into the obstacle at max speed limit.

This is such a basic safety-related skill that I don’t understand why it is apparently treated as an obscure corner case.

What’s the point of FSD if stationary massive objects can’t be detected 100% ?!?
Hmmm. Maybe Autopilot didn't see the massive object in the lane because it was a stealth fire truck. Does anyone make stealth fire trucks? ;)
 
My car will do that. For example, a car in front of me slows, then leaves my lane; AP accelerates back toward its programmed speed until it detects the stationary vehicle that was, say, 60 feet ahead of the car that moved out of the way. Then sudden deceleration. Which is one reason I have to keep an eye on it.

Using TACC I've seen numerous instances in heavy traffic where a sudden clearance of the lane results in heavy acceleration (what is the point of that?) and late detection of the obstacle in front. A couple just today heading for the Golden Gate Bridge. But not all such instances feel like that. Maybe the acceleration is the same, but because I see the obstacle before the car does it really feels like quick accel?

The oddest part to me is that in most cases acceleration by TACC feels more gentle, more sensible. But, seemingly, when it's a terrible idea it goes full-bore.

I've learned to adjust the target speed down near to traffic speed ...but sometimes I forget to make the adjustment.

I catch the car, but the quick-accel and brake annoys the passenger! In some situations I think it best to turn off TACC, let alone AP.
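
To illustrate why losing the lead car can produce exactly that surge-then-slam behavior, here's a toy follow-controller in Python. The gains, limits, and overall structure are made up for illustration; this is not how Tesla's TACC actually works.

```python
# Toy cruise/follow controller: with no tracked lead it chases the set speed;
# when a slow or stopped target is picked up late, it has to brake hard.
# All gains and limits below are invented.

def desired_accel(ego_speed, set_speed, lead=None,
                  time_gap=1.5, speed_gain=0.5, gap_gain=0.3,
                  max_accel=2.0, max_brake=-6.0):
    """Commanded acceleration in m/s^2. lead is None or (gap_m, lead_speed_mps)."""
    if lead is None:
        cmd = speed_gain * (set_speed - ego_speed)        # nothing ahead: head for set speed
    else:
        gap_m, lead_speed = lead
        desired_gap = time_gap * ego_speed
        cmd = speed_gain * (lead_speed - ego_speed) + gap_gain * (gap_m - desired_gap)
    return max(max_brake, min(max_accel, cmd))

ego, set_speed = 10.0, 25.0   # m/s: crawling in traffic with a ~56 mph set speed

print(desired_accel(ego, set_speed, lead=(20.0, 10.0)))   # following: mild command
print(desired_accel(ego, set_speed, lead=None))           # lead changes lanes: full acceleration
print(desired_accel(ego, set_speed, lead=(15.0, 0.0)))    # stopped vehicle detected late: hard braking
```

The point of the toy model: the controller is doing exactly what it was told. The problem is the gap between losing one target and acquiring the next, which is exactly where the driver has to stay in the loop.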
 
... but it’s a testimony what a travesty all the talk from Musk about FSD is if this late in the game it is apparently a huge problem to detect MASSIVE stationary objects?
Like my text said and several others have stated, that hardware was AP1; the new hardware, AP3, is 100x more powerful. Although much more powerful hardware-wise, it still has limitations.
So you are basically saying the cameras
AP1 has a single camera.
Absolutely laughable that this is still for sale under the grandiose “FSD” acronym.
AP1 was never advertised as FSD.
It’s got as many cameras as a spider has eyes but it’s too dumb to identify a stopped truck. Eyeroll. Yeah, here are my $6000. Not. LOL!
Stop confusing the different versions of Autopilot. Although you're wrong about the AP hardware, you're not wrong about even the current hardware's ability to drive itself. I would agree it is a joke on those that believe.