
Autopilot Called Out in NTSB Report on Tesla Crash into Fire Truck



The National Transportation Safety Board (NTSB) said Wednesday that driver error and Tesla's Autopilot design caused a January 2018 crash of a Model S into a parked fire truck.

According to the report: “The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Performance data collected during the investigation show that the Tesla followed various lead vehicles in heavy traffic for several minutes before the crash. When the last lead vehicle changed lanes—3 to 4 seconds before the crash—revealing the fire truck in the path of the Tesla, the system was unable to immediately detect the hazard and accelerated the Tesla toward the stationary truck.

“By the time the system detected the stationary vehicle and gave the driver a collision warning—0.49 second before impact—the collision was imminent and the warning was too late, particularly for an inattentive driver,” the report said. “The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”

The fire truck was unoccupied and the driver was not injured in the incident.

 
We live in a fast-paced world, and technology is never perfect. It is best if we are able to adapt to such changes and learn to understand the technology and its capabilities.

I've been learning that this technology has flaws. I work with processors with nanosecond clock precision. Being unable to detect a stationary object in 3-4 seconds is simply bad design/tech. I personally would turn AP off.
 
With the number of times an AP crash has been on the news, all AP drivers should KNOW to pay attention. Yes, AP needs to work better. And people should report issues of AP not detecting objects. But to STILL be getting into accidents? Come on people. Pay attention!!!!!
People have been getting into accidents since cars were invented. I'm not sure why you all would expect that to change.
I just hope that Autopilot decreases the number of accidents, because otherwise it's eventually going to be banned on public roads. The NTSB doesn't care about how well you use Autopilot; they care how safely all Tesla drivers use Autopilot.
 
  • Like
Reactions: JeffnReno
People have been getting into accidents since cars were invented. I'm not sure why you all would expect that to change.
I just hope that Autopilot decreases the number of accidents, because otherwise it's eventually going to be banned on public roads. The NTSB doesn't care about how well you use Autopilot; they care how safely all Tesla drivers use Autopilot.

I would expect people to be looking at the road when using AP. The combination of attentive drivers AND the current form of AP should be safer than just an attentive driver.
 
I find this to be very worrisome and disappointing.

That machine vision can’t figure out if there is a massive stationary object in the path.

In my many decades of driving I’ve had my share of situations where the car in front of me abruptly changed lanes because of a stationary obstacle.

And while it was stressful, my bio-vision and my human reaction time have zero problems understanding “oh *sugar* there’s an obstacle that I MUST AVOID”. Never did I think, oh cool, I can now accelerate into the obstacle at the maximum speed limit.

This is such a basic safety related skill that I don’t understand why it is apparently treated as an obscure corner case.

What’s the point of FSD if massive stationary objects can’t be detected 100% of the time?!?

If you were a programmer, and understood the complexities, I’m sure you wouldn’t ask why the software can’t handle this simple, basic task. :)
 
I am also sorry... you can blame drivers for being stupid... or you can blame them for being gullible in believing the hyperbolic statements made by Elon/Tesla about AP's capabilities.

Here's the 2014 Autopilot announcement. The first... but one of many examples.

I think the real stupidity is believing Elon's statements about AP... I'm sure they'll be true someday... however, they weren't true in 2014... and they're still not true in 2019.

Pay attention at the wheel. That’s the message sent. That’s what didn’t happen here.
Very sad of course, but Autopilot is not FSD yet.
 
  • Like
Reactions: JeffnReno
Not really. It's been in "beta" from the start, so the surfacing of flaws didn't change the equation.

They added hand detection LATER, as an afterthought... AP wasn't designed with driver attention in mind (because at first they were claiming it was hands-free)... It's a kludge that doesn't work well.

How long will it be in Beta? Will AP1/2 be in beta forever? Beta implies there will be a final. Or is "beta" just a word used to limit their liability?

Replace "somebody" with "Elon"... and I think you get different answers.

"The first part is the long range radar, which can see through basically anything"

"The car can do almost anything, so we're obviously able to do lane-keeping on freeways, to automatic cruise control, active emergency braking so it'll brake if it see's any object that you're going to collide with..." - Elon 2014 (regarding AP1)​

There are Elon/Tesla sycophants out there (and here)... so yes, I think some would jump if Elon told them they'd be fine.

Seriously? That explanation may make sense from a technical point of view, but it’s testimony to what a travesty all the talk from Musk about FSD is, if this late in the game it is apparently a huge problem to detect MASSIVE stationary objects.

And people excuse this? Wow, this is unbelievable. So you are basically saying the cameras, the radar, and all the other sensors on the Tesla, together with the computer, right now aren’t capable of reliably detecting a stopped fire truck. Absolutely laughable that this is still for sale under the grandiose “FSD” acronym.

It’s got as many cameras as a spider has eyes but it’s too dumb to identify a stopped truck. Eyeroll. Yeah, here are my $6000. Not. LOL!

OMG. The forum software here needs to include a “Head Slap” icon we can choose from.
The “Thumbs down” disagree icon just doesn’t cut it for these replies. ...
 
  • Like
Reactions: liuping
FAP on my new S3-Raven (2 mo. old) didn't detect a deer crossing a 2-lane road while driving 45 mph, or at least I initiated braking before the car did. Unfortunately, not in enough time to avoid contact with the front bumper, but luckily there was no major damage. I was able to manually pop the bumper back into place and only had to have one ultrasonic sensor bracket replaced. Glad it wasn't a person running across the road.
 
  • Informative
Reactions: JeffnReno
If you were a programmer, and understood the complexities, I’m sure you wouldn’t ask why the software can’t handle this simple, basic task. :)

Oh please. Go jump in a time machine back to 2016 and tell that to Musk so that he shuts up about FSD. And while you are at it, ask the TMC forum programmers to include the “you stupid” button that you apparently want to have.

I never said it was an easy problem to solve. My point is that this is a FUNDAMENTAL problem that must be solved before you can even think about FSD, let alone sell it. And yet in September 2019 we aren’t any closer to this than apparently back in AP1 days. But you deduce that I think this is a simple problem for a computer?!? I also need a dunce button to rate your posts.

Or do you want your fancy robotaxi to crash into a fire truck at 70 mph because “it’s soooooo difficult, stooooooopid”?
 
  • Disagree
Reactions: Peter Lucas
Because it didn't detect it. Why didn't it detect it?
1. Radar ignores stationary objects because of too many false-positive detections, such as with traffic signs.

This is the key point. But what a flawed idea. A stationary object in the path of the car does not need to be identifiable. The algorithm should cause the car to stop for any/every stationary object in its path.
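To make the flaw concrete, here is roughly what that kind of velocity gate amounts to. This is a minimal sketch; the field names and tolerance are invented for illustration and are not Tesla's actual code.

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float        # distance to the return, in meters
    range_rate_ms: float  # Doppler range rate, m/s (negative = closing)

def is_stationary(t: RadarTarget, ego_speed_ms: float, tol_ms: float = 1.0) -> bool:
    """A return that closes at exactly our own speed is not moving
    relative to the road: a sign, an overpass... or a parked fire truck."""
    return abs(-t.range_rate_ms - ego_speed_ms) < tol_ms

def filter_targets(targets, ego_speed_ms):
    """The policy being criticized: drop every stationary return so that
    signs and roadside clutter don't cause constant phantom braking."""
    return [t for t in targets if not is_stationary(t, ego_speed_ms)]

# At 13.9 m/s (about 31 mph), a parked truck closes at -13.9 m/s and is
# indistinguishable, by velocity alone, from an overhead sign.
print(filter_targets([RadarTarget(88.0, -13.9)], ego_speed_ms=13.9))  # -> []
```

The same gate that suppresses phantom braking for signs also throws away the one return that mattered here.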
 
“The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”
AEB on my Tesla is complete rubbish. So is the "hands on wheel" detection. I've been in scenarios where AEB should have kicked in, and it did nothing. I also use AP with my hands on the wheel at all times and still get nagged like crazy unless I am constantly torquing the wheel.

Tesla needs to admit their current detection methods are flawed, and stop blaming drivers for believing Tesla's own hyperbolic and twisted statements about how safe AP is (and rename AEB to WTFDIB ... as in Why TF didn't it brake!).

Tesla can improve its “detection methods,” but nothing will replace common sense and personal responsibility from drivers. The system is not an autonomous driving system, and you have to be paying attention at all times. Using anything for other than its intended purpose, or past its intended purpose, will always create problems; we should not blame the misused item but the person not following simple and clear instructions.

I agree with your comment about the nagging. It’s useless and annoying. A driver can be paying full attention, with his/her hands ready to take action, without necessarily torquing the wheel to keep the system happy. I hope they come up with a better way to control this; maybe the interior camera monitoring for driver attention?
 
With apologies for skipping to the end of the thread:

From the article linked in the blog post:

According to the report: ”The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Emphasis mine.

In other words, the driver was an idiot who thought that "This is a BETA feature. Keep your hands on the wheel and your attention on the road at all times" meant "This is a self-driving car. Ignore the road and take a nap if you like."
 
  • Funny
Reactions: JeffnReno
Something no one has mentioned is that beta software is rarely released into the wild, even when the potential for doing harm is vastly less than is the case for Tesla's driver-assist features.

I've had a Model 3 for a few months and have experimented with self-steer and Traffic-Aware Cruise Control. I'm a very cautious driver, and I find that I cannot place much confidence in the systems. Even when I'm doing my best to be fully alert, I find that I get lulled, so that I'm not as alert as I would be if I were fully responsible for the driving. This makes me very nervous, and I have decided to only use these aids for very short periods.

Given the current state of unreliability I find it really amazing that these systems would be available to any but a select group of beta testers, testing under controlled conditions.
 
  • Like
Reactions: Octo
Something no one has mentioned is that beta software is rarely released into the wild, even when the potential for doing harm is vastly less than is the case for Tesla's driver-assist features.

I've had a Model 3 for a few months and have experimented with self-steer and Traffic-Aware Cruise Control. I'm a very cautious driver, and I find that I cannot place much confidence in the systems. Even when I'm doing my best to be fully alert, I find that I get lulled, so that I'm not as alert as I would be if I were fully responsible for the driving. This makes me very nervous, and I have decided to only use these aids for very short periods.

Given the current state of unreliability I find it really amazing that these systems would be available to any but a select group of beta testers, testing under controlled conditions.
I too have been driving my Model 3, for 4 months now, and I totally disagree with your assessment. In those 4 months I have driven over 14,000 miles. I routinely drive about 200 miles over mountain roads just for pleasure, and I would guess that about 90% of those miles are on Autopilot. I have read that Autopilot is for city driving, but I find that I prefer to drive myself in the city and use Autopilot when heading into the mountains on two-lane roads. Of course, I have to be prepared to take over, and I have multiple times. I have also found that I am more aware and awake on Autopilot than I was when driving my Prius on the same route, constantly getting sleepy. Because I can anticipate a problem before Autopilot reacts, I can disengage it and drive through without a problem. I have also noticed that Autopilot keeps me in the middle of the lane, whereas I have a tendency to crowd the center line.

Autopilot is not perfect, but it makes me a much better driver.
 
Well, that is concerning. Three seconds at 21 to 31 mph was not enough time for Autopilot to detect a stationary fire truck.

It was a 2014 Model S with HW1. HW1 was aware of a decelerating lead car 7 seconds before impact; that car changed lanes about 4 seconds later (roughly 3 seconds before impact). It took another 2.51 seconds before HW1 detected the fire truck and sounded the forward collision alarm, 0.49 seconds before impact, but it did not apply the AEB system.

Shame that 3000 milliseconds isn't long enough to prevent a crash. Technology still has a ways to go.
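For a sense of scale, here is a quick back-of-the-envelope check using the 0.49-second warning and the 21 to 31 mph figures above. The 1.5 s reaction time and roughly 0.7 g braking rate are generic textbook assumptions, not numbers from the report:

```python
MPH_TO_MS = 0.44704  # miles per hour to meters per second

def stopping_distance_m(speed_mph: float, reaction_s: float = 1.5,
                        decel_ms2: float = 6.9) -> float:
    """Distance covered during driver reaction plus hard braking (~0.7 g)."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v ** 2 / (2 * decel_ms2)

for mph in (21, 31):
    v = mph * MPH_TO_MS
    print(f"{mph} mph: {v * 0.49:4.1f} m of warning distance vs. "
          f"~{stopping_distance_m(mph):4.1f} m needed to stop")
```

On those assumptions, the 0.49-second warning bought the driver roughly 5 to 7 meters of travel when 20 to 35 meters were needed to stop.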
I have a 2015 AP1 P85D. In my experience, there is something in the software that differentiates between a stationary object (or an object moving at high speed towards your car) and a car you are following. I.e., the car doesn't mind looking directly at a car coming towards you on the outside of a bend, when the closing speed might be 100 mph, so there is a way to discriminate between a car travelling in the same direction that you don't want to hit and a vehicle coming in the opposite direction that it disregards. I suspect (and it has happened to me) that an object suddenly appearing in front of the car and appearing to be travelling at a high closing speed (i.e., you are moving and it isn't) is treated as an object moving in the opposite direction and not as something to be avoided. Just a thought, but my car has accelerated toward stationary objects a couple of times.
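To put that hypothesis in concrete terms, here is a minimal sketch of classifying radar returns purely by closing speed. All names and thresholds are invented for illustration, not taken from any Tesla software:

```python
def classify(range_rate_ms: float, ego_speed_ms: float, tol_ms: float = 1.0) -> str:
    """Classify a radar return purely by its Doppler closing speed."""
    closing = -range_rate_ms  # positive = the gap is shrinking
    if closing > ego_speed_ms + tol_ms:
        return "oncoming"      # closing faster than we move: opposite direction
    if abs(closing - ego_speed_ms) <= tol_ms:
        return "stationary"    # closes at exactly our speed: sign, bridge, parked truck
    return "lead vehicle"      # closes slower than our speed: same direction

ego = 13.9  # m/s, about 31 mph
print(classify(-9.0, ego))   # slower car ahead -> 'lead vehicle'
print(classify(-13.9, ego))  # parked fire truck -> 'stationary'
print(classify(-40.0, ego))  # car on the far side of a bend -> 'oncoming'
```

If returns in the "stationary" bucket are discarded as roadside clutter, a suddenly revealed parked truck lands in exactly the category the system is tuned to ignore, which would explain the acceleration toward it.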
 
  • Informative
Reactions: cadetsea
Based on this video clip (an AEB test on AP2, at the 50-second mark)... I can't disagree with you...


I really wish Tesla would require drivers to watch a video like this before AP can be activated...

I'm an electronics engineer and design embedded systems and write embedded system software. I have an FCC General Radiotelephone license with radar endorsement and have worked on shipboard radars many moons ago. I know a little bit about software and a little bit about radar. However, I have never seen the data sheet for the radar module that Tesla uses. But radar is radar.

That said, I find it hard to fathom how the AP software did not pick out the fake vehicle in the road. I've heard it said many times that it either doesn't detect stationary objects or, if it does, it ignores them. (OK, the paper car in the video may not return a radar echo, but a fire truck definitely would.) I can understand why adaptive cruise on other cars can NOT use radar for stationary objects: without knowing whether the object is in your lane, the software would be braking all the time (like when approaching a curve with a mailbox on the side of the road, dead ahead). A Tesla knows the lanes. It even draws them out for you, with little cars in front and beside you. So if it knows the lane ahead, and it gets this big fat radar echo from directly in front indicating some metallic object is sitting squarely in my lane, what on earth is stopping it (the software) from declaring a code-red emergency and (1) swerving, (2) slamming on the brakes, or (3) sounding the collision alarm?

I certainly don't expect this but would a Tesla software engineer please chime in?
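To make concrete what I mean by a code-red check, here is a minimal sketch. Everything in it (field names, lane width, the 2-second time-to-collision threshold) is invented for illustration, and it conveniently assumes the radar can localize the return, which is the hard part:

```python
import math
from dataclasses import dataclass

@dataclass
class RadarTarget:
    range_m: float        # distance to the return
    azimuth_rad: float    # bearing relative to the vehicle centerline
    range_rate_ms: float  # Doppler range rate (negative = closing)

def in_ego_lane(t: RadarTarget, lane_half_width_m: float = 1.8) -> bool:
    """Is the return laterally within our lane, per the vision system's lanes?"""
    return abs(t.range_m * math.sin(t.azimuth_rad)) < lane_half_width_m

def code_red(t: RadarTarget, ttc_threshold_s: float = 2.0) -> bool:
    """Flag an emergency if an in-lane return will be hit within ~2 seconds."""
    closing = -t.range_rate_ms
    if closing <= 0 or not in_ego_lane(t):
        return False
    return t.range_m / closing < ttc_threshold_s  # time to collision

truck = RadarTarget(range_m=25.0, azimuth_rad=0.0, range_rate_ms=-13.9)
print(code_red(truck))  # True: ~1.8 s to impact, squarely in lane
```

Whether the radar can actually localize a return precisely enough for an in-lane check like this is exactly the question the next reply raises.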
 
I'm an electronics engineer and design embedded systems and write embedded system software. I have an FCC General Radiotelephone license with radar endorsement and have worked on shipboard radars many moons ago. I know a little bit about software and a little bit about radar. However, I have never seen the data sheet for the radar module that Tesla uses. But radar is radar.

That said, I find it hard to fathom how the AP software did not pick out the fake vehicle in the road. I've heard it said many times that it either doesn't detect stationary objects or, if it does, it ignores them. (OK, the paper car in the video may not return a radar echo, but a fire truck definitely would.) I can understand why adaptive cruise on other cars can NOT use radar for stationary objects: without knowing whether the object is in your lane, the software would be braking all the time (like when approaching a curve with a mailbox on the side of the road, dead ahead). A Tesla knows the lanes. It even draws them out for you, with little cars in front and beside you. So if it knows the lane ahead, and it gets this big fat radar echo from directly in front indicating some metallic object is sitting squarely in my lane, what on earth is stopping it (the software) from declaring a code-red emergency and (1) swerving, (2) slamming on the brakes, or (3) sounding the collision alarm?

I certainly don't expect this but would a Tesla software engineer please chime in?
What about bridges and overhead signs? Also, what is the resolution of the radar? Can it really distinguish an object in front of you from one in the next lane over?
 
Something no one has mentioned is that beta software is rarely released into the wild, even when the potential for doing harm is vastly less than is the case for Tesla's driver-assist features.

It is my uninformed opinion that they're calling it beta software to cover their asses, and to emphasize the fact that you still need to pay attention. That's because they decided to give it the misleading name "autopilot." I regard my EAP as a mature lane-keeping assist system, which is not "autopilot." As a lane-keeping assist system, which by definition would require driver attention, it could be considered mature software. But because they call it "autopilot" they have to say it's still in beta so that they can tell you to keep your eyes on the road and your hands on the wheel because it's not an autopilot system.

I've had a Model 3 for a few months and have experimented with self-steer and Traffic-Aware Cruise Control. I'm a very cautious driver, and I find that I cannot place much confidence in the systems. Even when I'm doing my best to be fully alert, I find that I get lulled, so that I'm not as alert as I would be if I were fully responsible for the driving. This makes me very nervous, and I have decided to only use these aids for very short periods.

Given the current state of unreliability I find it really amazing that these systems would be available to any but a select group of beta testers, testing under controlled conditions.

I've said this elsewhere: It took me two weeks to get up the courage to try out EAP. And then I shut it off after five minutes. And that was on a freeway with only a few other cars around. But I gradually gained confidence, and now I use it everywhere it's suitable. It performs excellently, as long as you view it as an assist system, and not as an autopilot system. And as long as you recognize its limitations and disengage it in marginal conditions. It's an assist feature, not a game of "How long can I keep it engaged?"

Again, the case in the OP was the driver acting as though it was an autopilot system, which it is not.
 
  • Love
Reactions: erik_k