
NTSB Wants Information on Tesla Autopilot Accident

A Model S using Autopilot crashed into a firetruck near Los Angeles on Monday, prompting an inquiry from the U.S. National Transportation Safety Board, according to a report from Bloomberg.

The Tesla driver was reportedly traveling at 65 mph when he rear-ended the truck. There were no injuries in the crash.

The Bloomberg report says the NTSB has not decided if it will launch a formal investigation. The agency is currently “gathering information.”

The Culver City Fire Department shared a photo of the accident.


The NTSB announced the findings of an investigation earlier this year into the first known fatal crash involving a car using an automated driver-assistance system. The agency said that “operational limitations” of Tesla’s Autopilot system played a “major role” in the 2016 crash that killed one person. The driver’s 2015 Tesla Model S collided with a semi-truck while the car’s Traffic-Aware Cruise Control and Autosteer lane-keeping assistance features were in use.

Tesla’s repeated line on accidents is that “Autopilot is intended for use only with a fully attentive driver.”

The NTSB also noted in multiple reports that the driver kept his hands off the wheel for extended periods despite repeated automated warnings. In fact, the NTSB said the driver’s hands were on the wheel for just 25 seconds during a 37-minute period in which Autopilot was engaged. Still, the agency said Tesla’s system needs more safeguards: better systems to alert drivers and to detect surrounding traffic.

Monday’s collision reportedly occurred while the firetruck was parked in an emergency lane at the side of the highway, its crew attending to another accident.

 
Good find; that is very interesting and does appear to be similar to the firetruck crash. Seems like a flaw in the system (and in driver attention).
Indeed. I think this is some limitation of human nature (e.g. "I've never seen this situation before" = "This can't happen"). Whether we call this a flaw in the system or a known limitation, that's just mincing words. Bottom line is, you gotta watch out for stopped cars. Sure it might do the right thing 99 out of 100 times, but if that other 1 time is a fire truck and everyone needs to learn that lesson the hard way… that's not gonna work.
 
Good find; that is very interesting and does appear to be similar to the firetruck crash. Seems like a flaw in the system (and in driver attention).
TACC is not designed to detect STOPPED traffic when there is nothing between the car and the stopped object. Warnings about this abound, but some still try to test Darwin's theory.
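To make that limitation concrete, here is a minimal sketch (in Python, with invented names and thresholds) of how a radar-based follow controller can end up ignoring a stopped vehicle. It illustrates the general stationary-target-filtering technique only; it is not Tesla's actual implementation.

```python
# Simplified sketch of why radar-based cruise control can ignore stopped
# vehicles. Generic illustration only, NOT Tesla's code; all names and
# thresholds here are invented for the example.

def absolute_speed(ego_speed_mps: float, range_rate_mps: float) -> float:
    """Radar measures range rate (closing speed); adding ego speed
    gives the target's speed over the ground."""
    return ego_speed_mps + range_rate_mps

def select_follow_target(ego_speed_mps, radar_returns, min_moving_mps=2.0):
    """Pick the nearest target that is actually moving. Stationary returns
    are discarded because radar alone cannot reliably tell a stopped truck
    from an overpass, a sign, or a car parked on the shoulder."""
    moving = [
        r for r in radar_returns
        if abs(absolute_speed(ego_speed_mps, r["range_rate"])) > min_moving_mps
    ]
    return min(moving, key=lambda r: r["range"], default=None)

# A firetruck parked in the lane closes at ego speed, so its absolute
# speed is ~0 and it never becomes the follow target.
ego = 29.0  # ~65 mph in m/s
returns = [
    {"range": 120.0, "range_rate": -29.0},  # stopped firetruck: filtered out
    {"range": 60.0,  "range_rate": -2.0},   # slower moving car: tracked
]
print(select_follow_target(ego, returns))  # -> the moving car, not the truck
```

The point of the filter is to avoid phantom braking on roadside clutter; the cost is that a genuinely stopped vehicle in the lane looks identical to that clutter.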
 
Autopilot is embryonic. Ultimately, as with seat belts, the problem will not be the autopilot but rather the pilot.


I couldn’t agree more. It’s maddening to me that our society refuses to place blame solely on the human(s) responsible. When the user interface and user documentation clearly state that the technology is beta, that the user must pay attention at all times, and that the user must be fully prepared to take control at any time, it’s absolutely asinine to ask how responsible the technology is for the accident.

Whenever some idiot wrecks their Tesla while on autopilot we get the media yelling that the sky is falling, useless government pinheads justifying their salaries by “asking for information” and spending taxpayer money assessing blame, and even members of this forum wringing their hands and gnashing their teeth.

Asking how much blame the car holds for an autopilot accident is like asking how responsible the chainsaw is for the chainsaw juggler’s severed hand.
 
Maybe autopilot was on strike because the owner failed repeatedly to properly recharge the MS. My X, MX that is, is very temperamental if not fully charged:)

The humor here is great, thanks :)

Autopilot does not like being lied about. Whereas the saying used to be “it’s not nice to pee off Mother Nature,” the new saying, as AI crawls into our lives, should be “it’s not nice to pee off Autopilot.”

Someone should post a chart with our input as to whether it was “Autopilot’s fault” or the “driver’s fault.”
 
Indeed. I think this is some limitation of human nature (e.g. "I've never seen this situation before" = "This can't happen"). Whether we call this a flaw in the system or a known limitation, that's just mincing words. Bottom line is, you gotta watch out for stopped cars. Sure it might do the right thing 99 out of 100 times, but if that other 1 time is a fire truck and everyone needs to learn that lesson the hard way… that's not gonna work.
Couldn't agree more. Preposterous to blame the tech, and even more preposterous to try to distinguish AP1 vs AP2, given that neither deals well with unexpectedly stationary vehicles in fast-moving traffic. So what SHOULD happen: the human is alerted to the obstacle ahead of time by warning signs; in future, AP should read the sign and slow (or prepare to slow) accordingly. If there is no warning sign, then the human should intervene at the earliest opportunity if the car's tech cannot (likely currently, possible in future). If in those circumstances the stopping distance isn't enough, then it would never have been enough; some accidents will happen no matter what the tech, or the human, does to prevent them.
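For what it's worth, the hand-off logic proposed above could be sketched roughly like this; everything here is hypothetical, with invented names and values, not any shipping system.

```python
# Hypothetical sketch of the hand-off logic proposed in the post above.
# All function names, inputs, and strings are invented for illustration.

def plan_response(warning_sign_ahead: bool,
                  obstacle_range_m: float,
                  stopping_distance_m: float) -> str:
    if warning_sign_ahead:
        # Future AP: read the sign and pre-emptively slow.
        return "slow down in advance"
    if obstacle_range_m > stopping_distance_m:
        # There is still room: hand off to the human (or, eventually, brake).
        return "alert driver; brake if no response"
    # Not enough road left: no tech or human could stop in time.
    return "emergency braking; impact may be unavoidable"

print(plan_response(False, 120.0, 97.0))  # alert driver; brake if no response
```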
 
Latest news is that the driver was DUI quite a bit over the limit. He used Autopilot as an excuse.

Update:
The original Washington Post article differentiates between two drivers - one on the bridge with DUI and the other one slamming into the fire truck.
As usual, the Toronto Star did not do due diligence and merged it all into one. So much for journalism. They just scour the Internet and claim they are reporting on a story.
 
Latest news is that the driver was DUI quite a bit over the limit. He used Autopilot as an excuse.

Update:
The original Washington Post article differentiates between two drivers - one on the bridge with DUI and the other one slamming into the fire truck.
As usual, the Toronto Star did not do due diligence and merged it all into one. So much for journalism. They just scour the Internet and claim they are reporting on a story.

Unfortunately The Washington Post also characterized both incidents as crashes. The Bay Bridge DUI incident was an example of Autopilot preventing a potential crash.
 
Amazing that no one was hurt. Chalk another one up to Model S safety.

Unfortunately, I could see accidents like this, if they continue, causing the government to curtail Autopilot capabilities. Hope that Tesla comes up with a fast software fix to improve vehicle recognition.

It will be interesting to see what analysis of the situation leading up to the collision reveals.
 
Amazing that no one was hurt. Chalk another one up to Model S safety.

Unfortunately, I could see accidents like this, if they continue, causing the government to curtail Autopilot capabilities. Hope that Tesla comes up with a fast software fix to improve vehicle recognition.

It will be interesting to see what analysis of the situation leading up to the collision reveals.

I'm actually kind of happy the NTSB/NHTSA are investigating. The last investigation was probably the most informative neutral-toned analysis of Autopilot 1.0 that we've seen, and it also motivated Tesla to make some much-needed improvements like 2-car-ahead radar. Looking forward to reading the report. If the Reddit post is true, it sounds like there are more complex factors at play than the driver simply failing to pay attention.
 
FYI on a reddit post: Tesla allegedly on Autopilot hits firetruck with 65mph • r/teslamotors

The driver of the Tesla is my dad's friend. He said that he was behind a pickup truck with AP engaged. The pickup truck suddenly swerved into the right lane because of the firetruck parked ahead. Because the pickup truck was too high to see over, he didn't have enough time to react. He hit the firetruck at 65mph and the steering column was pushed 2 feet inwards toward him. Luckily, he wasn't hurt. He fully acknowledges that he should've been paying more attention and isn't blaming Tesla. The whole thing was pretty unfortunate considering he bought the car fairly recently (blacked it out too).
Edit #1: He had some minor cuts and bruises, but nothing serious. As for the 65 mph detail, the braking system could've intervened, so the actual impact speed may have been lower.
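A quick back-of-the-envelope check of that account: the 65 mph figure comes from the post above, but the reaction time, deceleration, and reveal distance below are assumptions for illustration, not data from this crash.

```python
# Stopping-distance arithmetic for the cut-out scenario described above.
# Only the 65 mph speed comes from the post; everything else is assumed.

MPH_TO_MPS = 0.44704

def stopping_distance(speed_mps, reaction_s=1.5, decel_mps2=7.8):
    """Reaction distance plus braking distance v**2 / (2*a)."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

v = 65 * MPH_TO_MPS                # ~29.1 m/s
needed = stopping_distance(v)      # ~98 m to stop from 65 mph
reveal = 40.0                      # assumed gap when the pickup swerved away
print(f"need {needed:.0f} m, had {reveal:.0f} m -> "
      f"{'stops in time' if reveal >= needed else 'impact'}")
```

If the lead pickup really did swerve away only a few car lengths from the truck, no combination of attention and braking would have avoided the impact entirely, though earlier braking would have reduced the speed.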