Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot Called Out in NTSB Report on Tesla Crash into Fire Truck



The National Transportation Safety Board (NTSB) said Wednesday that driver errors and Autopilot caused a January 2018 crash of a Model S into a parked fire truck.

According to the report: “The National Transportation Safety Board determines that the probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

Performance data collected during the investigation show that the Tesla followed various lead vehicles in heavy traffic for minutes before the crash. When the last lead vehicle changed lanes—3 to 4 seconds before the crash—revealing the fire truck in the Tesla's path, the system was unable to immediately detect the hazard and accelerated the Tesla toward the stationary truck.

“By the time the system detected the stationary vehicle and gave the driver a collision warning—0.49 second before impact—the collision was imminent and the warning was too late, particularly for an inattentive driver,” the report said. “The AEB system did not activate. Had the driver been attending to the driving task, he could have taken evasive action to avoid or mitigate the collision.”
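To put that 0.49-second figure in perspective, here is a rough back-of-the-envelope calculation. The 30 mph speed and the 1.5 s human reaction time are illustrative assumptions, not figures from the report:

```python
# Illustrative check (not from the NTSB report): how far a car travels during
# the 0.49 s between the collision warning and impact, versus the distance a
# typical ~1.5 s driver reaction time consumes. The 30 mph speed is assumed.
MPH_TO_MS = 0.44704  # miles per hour to metres per second

def distance_covered(speed_mph: float, seconds: float) -> float:
    """Metres travelled at a constant speed over the given interval."""
    return speed_mph * MPH_TO_MS * seconds

warning_gap = distance_covered(30, 0.49)   # distance left when the warning fired
reaction_gap = distance_covered(30, 1.5)   # distance an average reaction consumes

print(f"covered during the 0.49 s warning window: {warning_gap:.1f} m")
print(f"covered during a 1.5 s reaction time:     {reaction_gap:.1f} m")
```

The warning window covers only a fraction of the distance a normal reaction time alone would consume, which is the report's point: once the warning fired, even an attentive driver could not have braked in time.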

The fire truck was unoccupied and the driver was not injured in the incident.

 

Octo

Member
Jun 28, 2019
386
526
Dallas
Like my text said, and as several others have stated, that hardware was AP1; the new hardware, HW3, is 100x more powerful. Although much more powerful hardware-wise, it still has limitations.

AP1 has a single camera.

AP1 was never advertised as FSD.

Stop confusing the different versions of Autopilot. Although you're wrong about the AP hardware, you're not wrong about even the current hardware's ability to drive itself. I would agree it is a joke on those that believe.

Yeah, I didn’t clarify that I was talking about the latest iteration of Tesla’s hardware, which was a mistake when commenting on a thread about an AP1 car crash.

My point was that several years and significant HW improvements after AP1, stationary object detection is still not working well enough to be useful.

I didn’t follow Tesla back when they sold AP1; I was just exposed to the news every now and then. As someone who wasn’t interested in Tesla at the time and wasn’t in the market for Tesla-priced EVs, it sure seemed that Tesla’s marketing tried to create the impression that they were about to have fully Level 5 self-driving cars. Isn’t there a video from 2016 pitching the fully self-driving car?
While in reality AP1 couldn’t even detect a huge truck?!?

And now it’s the same: HW3 can’t detect objects beyond a toy level (dancing cars, etc.), but Tesla pitched their FSD vision half a year ago with the CEO talking robotaxis.

Until they fix this glaring defect in their vision system, the whole FSD effort is pointless.
 

ohioviper

New Member
Mar 4, 2019
3
2
Ohio
Auto lane keeping and Autopilot are enhancements you can appreciate, but you are still the pilot in charge of a several-thousand-pound hurtling projectile. I’m sure some drivers wonder why my car will suddenly slow itself from 70 mph to 30 mph and then accelerate again in front of them. If they could only hear my comments, maybe they would be entertained, but most likely they are repeating my comments while trying to keep from rear-ending me.
Sometimes you can rethink the incident and extrapolate a cause but many times not.
Anyone who truly believes the car will drive itself shouldn’t own one.
Love the car and look forward to each baby step in assisted driving. I would like to not be reminded so often that my hands need to be on the wheel. I keep a hand on, but apparently not as enthusiastically as it requires.
 
This is the same model and hardware that I drive, and I've had some similar issues where the Autopilot responded sub-optimally, but I was keeping a close watch on it and took control. We are at a point in this technology's development where it is extremely helpful, but not a replacement for a competent operator. You've got to stay on point with the task at hand (driving) in addition to using the Autopilot.
 

dskid

Member
Sep 16, 2018
70
41
Vancouver
The fundamental issue is radar. In theory, vision will have a better time identifying stationary objects, but I don’t believe it’s actually in use as of yet. The system is still fully reliant on radar, which has, and always will have, issues with stationary objects due to the way radar detects objects.

Should the system be better? Definitely. Are you supposed to be paying attention and ready to take over at all times? Also yes. I use AP a lot, but work within its limitations and it’s a great driver assist 95% of the time.
 
  1. There are technical reasons it didn't see the fire engine stopped in the lane, but I'll let you do your own research on why.
  2. "Autopilot" name is not the problem, as planes also have "autopilot", but still require the "driver" to be ready to take over at any time.
  3. Anytime you activate autopilot a warning comes up on the screen saying you must pay attention and be ready to take over at any time.
  4. No matter what Tesla does, people will be stupid and abuse the technology.
I understand their tech, understand its limitations, and still use Autopilot 90% of the time. I'm always ready to take over as needed and have done so on many occasions. That said, I enjoy using the tech, and it has greatly reduced my driving stress and, IMO, the possibility of getting into an accident.
 

rdunniii

Member
Jun 27, 2012
266
157
Reno NV
First of all, detecting an object, figuring out what to do, and taking action are separate things. If it isn't obvious to you that the point of HW3 is to lessen the time the system requires to analyze and respond, you just don't get it. If you don't understand that even HW3-equipped cars with fully released FSD are imperfect and will have accidents, even fatal ones, you're a fool. There will always be situations that the system, just like a human driver, is unable to respond to quickly or accurately enough.

All marketing is propaganda, and Elon has not figured out how to market things without saying things that can come back to bite him.

As I write this there is a banner on this page advertising a 7% Target Distribution Fund. It is a basically meaningless statement. If it is not a Ponzi scheme and you click on the link, there will be so many gotchas that you should understand 7% is the absolute best it can do and will never be realized over any time period longer than maybe a day, not even close. But it sure sounds good.
 

robertmanning

Member
Dec 8, 2018
143
115
New York
Look, everyone that is blaming the driver, please stop. The driver was at fault for not paying enough attention to driving AND the "autopilot" system doesn't work well enough to prevent the accident. This is what was determined.

Therefore, three things need to happen to prevent this in the future. ONE: Drivers in any model Tesla need to ALWAYS be alert and ALWAYS have their hands on the steering wheel. TWO: Tesla needs to communicate this to ALL drivers and add this communication to the screen when Autopilot is engaged, in a written or audible message. THREE: Tesla needs to do this immediately, and Tesla needs to stop communicating and selling future "Full Self Driving" capability UNTIL it comes to fruition.

The blame is two-sided.
 

JeffnReno

Member
Mar 4, 2016
244
130
Reno, NV
I've been using Enhanced AP for over a year, every chance I get. Yes, it still requires paying attention, and yes, it does some things differently than I would prefer. It has required me to take over more times than I can keep track of; the fact that I was able to take over because I was paying attention is probably why I still live to tell how much I enjoy using it. I don't mind admitting that part of the reason I paid for it was to support the development so it would get better over time, and it has. I've learned in certain situations to pay extra attention, or to simply turn it off and drive until I pass a spot where I know it might not react the way I want it to. This doesn't stop me from using it, and it even makes me want to test its improvements as updates come out. IMHO it is very much worth every penny I paid for it as-is, and it keeps getting better at no additional cost. What other car does that?
 

Daniel in SD

Well-Known Member
Jan 25, 2018
7,373
10,920
San Diego
It's quite possible that if Autopilot detected stationary objects much more reliably, there could be even more accidents, since people would be less vigilant. How well Autopilot works changes how people use it.
The question of blame is irrelevant to me. The question is whether it makes the roads safer or less safe in the real world, not how well some theoretical population of perfect drivers would use it.
 
Look, everyone that is blaming the driver, please stop. The driver was at fault for not paying enough attention to driving AND the "autopilot" system doesn't work well enough to prevent the accident. This is what was determined.

Therefore, three things need to happen to prevent this in the future. ONE: Drivers in any model Tesla need to ALWAYS be alert and ALWAYS have their hands on the steering wheel. TWO: Tesla needs to communicate this to ALL drivers and add this communication to the screen when Autopilot is engaged, in a written or audible message. THREE: Tesla needs to do this immediately, and Tesla needs to stop communicating and selling future "Full Self Driving" capability UNTIL it comes to fruition.

The blame is two-sided.
TWO: it already does
THREE: no, they don't.
 

Potatoee

Member
Jul 21, 2018
31
19
Reading
What I do not understand here is that, regardless of the Autopilot version, vision system, etc., there was a forward-looking radar on the car which should have handled or mitigated the collision. Very often the radar is used to trigger automatic braking/collision mitigation, adaptive cruise, etc. Anyway, I'm surprised that the radar was not able to see the fire truck and did not cause the vehicle to brake. I have to wonder if the control architecture in the vehicle actually delayed or got in the way of a "call for action" by the radar. Tesla uses Bosch, does it not?
 

Silicon Desert

Active Member
Oct 1, 2018
3,786
3,998
Sparks Nevada
Yeah, I didn’t clarify that I was talking about the latest iteration of Tesla’s hardware, which was a mistake when commenting on a thread about an AP1 car crash.
Kudos. Good for you. This catches my attention because I finally found someone who doesn't try to cover up a misstatement with another quote full of excuses. Fortunately there are some very knowledgeable people on here, yet all too often I see people getting upset when someone else attempts to correct a misquote. They get really defensive, as if they know everything and are never wrong. Those folks should lighten up a bit. Yeah, I know this is off topic, and apologies for that, but Octo got me thinking about this.
 

Potatoee

Member
Jul 21, 2018
31
19
Reading
The fundamental issue is radar. In theory, vision will have a better time identifying stationary objects, but I don’t believe it’s actually in use as of yet. The system is still fully reliant on radar, which has, and always will have, issues with stationary objects due to the way radar detects objects.

Should the system be better? Definitely. Are you supposed to be paying attention and ready to take over at all times? Also yes. I use AP a lot, but work within its limitations and it’s a great driver assist 95% of the time.

If you are saying that the radar does not detect stationary objects, I would have to disagree. Vehicle radars regularly detect stationary objects: e.g. guard rails, parked cars, pedestrians, etc. I'll also note that any object in front of a moving vehicle has a Doppler shift due to the vehicle's own motion and should be readily detectable. Despite all that, there could be a bug in the radar itself that omits the detection of any object that is stationary with respect to what it considers the roadway. Another possibility is that, since the fire truck was adjacent to a barrier, with both providing a return signal to the radar, perhaps it got confused about what was the roadway boundary itself.

I think this is an interesting problem regardless of the allocation of blame.
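The "stationary with respect to the roadway" filtering idea above can be sketched in a few lines. This is an illustrative toy, not Tesla's or Bosch's actual tracker logic; the class, function names, and the 1 m/s threshold are all assumptions:

```python
# Toy sketch of why an ACC radar tracker can ignore a stopped vehicle.
# A radar measures *relative* radial speed; adding ego speed recovers ground
# speed. Anything near zero ground speed (signs, guard rails, AND a stopped
# truck) looks identical, so a naive heuristic drops it as stationary clutter.
# All names and the threshold are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float
    relative_speed_ms: float  # closing targets have negative relative speed

def is_moving_target(ret: RadarReturn, ego_speed_ms: float,
                     threshold_ms: float = 1.0) -> bool:
    """Keep only returns whose ground-frame speed exceeds the clutter threshold."""
    ground_speed = ego_speed_ms + ret.relative_speed_ms
    return abs(ground_speed) > threshold_ms

ego = 29.0  # ~65 mph, an assumed ego speed
returns = {
    "slower lead car": RadarReturn(range_m=40.0, relative_speed_ms=-2.0),
    "stopped fire truck": RadarReturn(range_m=60.0, relative_speed_ms=-29.0),
    "overhead sign": RadarReturn(range_m=80.0, relative_speed_ms=-29.0),
}
for name, ret in returns.items():
    status = "tracked" if is_moving_target(ret, ego) else "filtered as clutter"
    print(f"{name}: {status}")
```

By Doppler alone, the stopped truck is indistinguishable from roadside clutter, which is why fusing in camera classification (or mapping the drivable corridor) is needed to disambiguate.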
 

Daniel in SD

Well-Known Member
Jan 25, 2018
7,373
10,920
San Diego
If you are saying that the radar does not detect stationary objects, I would have to disagree. Vehicle radars regularly detect stationary objects: e.g. guard rails, parked cars, pedestrians, etc. I'll also note that any object in front of a moving vehicle has a Doppler shift due to the vehicle's own motion and should be readily detectable. Despite all that, there could be a bug in the radar itself that omits the detection of any object that is stationary with respect to what it considers the roadway. Another possibility is that, since the fire truck was adjacent to a barrier, with both providing a return signal to the radar, perhaps it got confused about what was the roadway boundary itself.

I think this is an interesting problem regardless of the allocation of blame.
Yes, radar detects the firetruck 110% of the time. The problem is that extra 10%!
 

kev1n

Active Member
Nov 17, 2016
1,350
964
SF Bay Area
I'd argue that over a period of time the driver becomes complacent and over-confident in AP's abilities. Also, the periodic updates can change its behavior (for better or worse) while the driver is clueless about the side effects.

We live in a fast-paced world. Technology is never perfect. It is best if we are able to adapt to such changes and learn to understand the technology and its capabilities.
 

Rockster

Active Member
Oct 22, 2013
3,014
4,701
McKinney, TX
Agreed. I think the biggest issue is that Tesla does very little to educate new owners on the correct and appropriate use of AP. Ever since the Model 3 introduction, the delivery orientation is 5 minutes; it consists of pointing to the car, then pointing you toward the exit.

"The NTSB cited the driver's "inattention and over-reliance" on the advanced driver assistance system." ...

  • "over-reliance" is on Tesla, their marketing, Elon tweets, etc.
  • "inattention" is also on Tesla; torque-based wheel sensing is a kludge and doesn't work. They should be using cameras and eye-tracking.

New owners are more attuned to Elon tweeting about how the car will drive you anywhere, removing steering wheels, and conflating terms like AP1/2/2.5/3, EAP, FSD, LMNOP (and selling vaporware)...

If safety really were their priority, they would require owners to go through some type of online training (through either the app or the center console) before AP can be activated (per driver profile). It's not a perfect solution, but it might save a life.


Segway requires safety training before activation... for a tiny 13mph scooter.
Tesla... nothing... for a 4500lb car going 90mph.


I have an idea for safety training: How about a display message that cautions the driver to "keep hands on the wheel and be prepared to take over at any time"?

Just how much more training is required to understand that? Training in reading comprehension? Training in the definition of "at all times"? Training doesn't mitigate the owner's willful intent to disregard a clear, concise, single sentence warning.

I have no sympathy whatsoever for people who can't follow a one sentence mandate. Nor will I blame marketing copy for an owner's misunderstanding of the word "autopilot." I don't really believe that my Apple content lives in the clouds nor do I think that the wizard that Windows launches is a magical being who is able to cast spells. People with common sense can parse marketing language and understand reality.

Enough of spreading blame beyond the driver. The driver is 100% responsible. Period.
 

SO16

Active Member
Feb 25, 2016
3,359
11,117
MI
I’m sorry, but however much you blame the AP system, how stupid can you be to have this happen? The biggest cause of accidents in any vehicle is inattention; this is not an autonomous car.

Agreed.


With the number of times an AP crash has been in the news, and with the driver being reminded to pay attention and keep their hands on the wheel EVERY SINGLE TIME they turn on AP, all AP drivers should KNOW to pay attention. Yes, AP needs to work better. It will have to for FSD. And people should report issues of AP not detecting objects. But to STILL be getting into accidents? Come on, people. Pay attention!!!!!
 
