Welcome to Tesla Motors Club

Tesla ADAS Incident Reports

I'm starting this thread to analyze the incident reports that Tesla files with NHTSA regarding crashes that might be related to FSD, AP, or NOA. Note that Tesla has to report any incident where AP/FSD/NOA was engaged at any point within 30 seconds prior to the crash.

Here is the NHTSA site where you can see details about the report and download the data. The data is for all OEMs.


Data sheet, 2022: https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Incident_Reports_ADAS.csv

Data definitions: https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_Data_Element_Definitions.pdf

Tesla redacts a large amount of the data in these fields as "[REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]". Still, there are plenty of useful fields for analysis. We can use them to try to figure out how many crashes are AP/NOA and how many are FSDb, which is my first objective.

For example, below is a pivot table by "Roadway Type". Yellow is clearly AP/NOA, green is FSD, and the other two could be mixed.

[Attached image: pivot table of incidents by Roadway Type]


Here is the breakdown by posted speed limit. Again, we can assume anything below 60 mph is FSDb (though there are edge cases).


[Attached image: pivot table of incidents by posted speed limit]
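The two pivot tables above can be reproduced with pandas. A minimal sketch follows; the column names ("Make", "Roadway Type", "Posted Speed Limit (MPH)") are my assumptions based on the data-definitions PDF, so verify them against the actual CSV. A tiny stand-in frame is used here rather than downloading the full dataset.

```python
import pandas as pd

# In practice, load the full SGO dataset directly:
#   df = pd.read_csv("https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/"
#                    "SGO-2021-01_Incident_Reports_ADAS.csv")
# A small stand-in frame illustrates the mechanics here.
df = pd.DataFrame({
    "Make": ["TESLA", "TESLA", "TESLA", "HONDA"],
    "Roadway Type": ["Highway / Freeway", "Street", "Intersection",
                     "Highway / Freeway"],
    "Posted Speed Limit (MPH)": [65, 25, 35, 70],
})

# Filter to Tesla incidents only (the data covers all OEMs).
tesla = df[df["Make"] == "TESLA"]

# Count incidents per roadway type (highway => likely AP/NOA;
# street/intersection => likely FSD Beta).
by_roadway = tesla.pivot_table(index="Roadway Type",
                               values="Make", aggfunc="count")

# Count incidents per posted speed limit; below 60 mph => likely FSDb.
by_speed = tesla.pivot_table(index="Posted Speed Limit (MPH)",
                             values="Make", aggfunc="count")
likely_fsdb = tesla[tesla["Posted Speed Limit (MPH)"] < 60]
```

Swapping the stand-in frame for the real CSV gives the same two tables shown in the screenshots.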
 
I think I read that the police confirmed she was asleep.
So, it seems likely that at least AP prevented a serious accident; surely, we'll see that in the ADAS data somewhere?
Source?

It’s very possible that she was and the system shut down as designed, but there’s so much rumor and misinformation out there that we shouldn’t take a forum post as anything but more conjecture without a source to back it up.
 
No idea how they managed to do that - or why AP/FSD was involved
I suppose you were right to question how Autopilot was involved in this case; as it turns out, the vehicle wasn't even equipped with it:


But that makes the NHTSA data even more curious: did Tesla report it because there were media articles speculating that Autopilot was involved (media reports being a Standing General Order reporting requirement)?
 
I suppose you were right to question how Autopilot was involved in this case; as it turns out, the vehicle wasn't even equipped with it:


But that makes the NHTSA data even more curious: did Tesla report it because there were media articles speculating that Autopilot was involved (media reports being a Standing General Order reporting requirement)?
The problem is there’s a lot of confusion about FSD(b), Autopilot, etc. There have also been some high-profile cases of people abusing the tech and a couple of accidents where it failed. Combine that with a flamboyant (and narcissistic) CEO who’s prone to, shall we say, optimistic predictions, and you have a perfect setup for headlines that jump to conclusions.
 
Here's an NHTSA report that just came out. It's almost entirely about Autopilot, but there is a mention of one FSD crash with a fatality.


The report mentions FSD on just one line, stating that between August 2022 and August 2023, 60 crashes were examined and one of those involved a fatality. This is apparently an at-fault crash, but there is no documentation on it in the report.
What makes you think it's a FSD Beta at-fault crash? Is it because if not, it would be "Other Vehicle Fault?"

Here are the entries for Tesla with a fatality from August 2022 to August 2023 in the Standing General Order ADAS data:
[Attached image: Tesla fatality entries, August 2022 to August 2023]


Unclear if these 12 include the 1 "FSD-Beta Crash" of the 14 Supplemental Crash Analysis fatalities. But I could only make it through looking at a few entries before deciding that's enough investigating for me.

Here are some articles written about the "Street" entries that are more likely to be FSD Beta:
 
Here is something we should all be discussing.

This analysis, conducted before Recall 23V838, indicated that drivers involved in the crashes were not sufficiently engaged in the driving task and that the warnings provided by Autopilot when Autosteer was engaged did not adequately ensure that drivers maintained their attention on the driving task. The drivers were involved in crashes while using Autopilot despite fulfilling Tesla’s pre-recall driver engagement monitoring criteria. Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances.

Maybe it needs a new thread.
 
What makes you think it's a FSD Beta at-fault crash? Is it because if not, it would be "Other Vehicle Fault?"
That was the way I read it, but "Other Vehicle Fault" could be specific to Autopilot cases, where they didn't bother breaking down FSD cases. Of course, a fatality is a fatality, and it may have been of interest to @aronth5.

But I could only make it through looking a few entries before deciding that's enough investigating for me.
Yeah, trying to sift through this stuff isn't fun - such as interpreting label meanings. Especially when reports lack supporting information. What I didn't want to do was try to figure out which event might be the one that NHTSA considered an FSD event with a fatality. That way lies madness.
 
Here is something we should all be discussing

did not adequately ensure that drivers maintained their attention on the driving task​
In the 3 examples of the "Street" Roadway Type, potentially the Tesla driver was under the influence of drugs, did not see a high-speed red-light runner, or did not expect an oncoming car to cross head-on. There's one more non-highway incident report, labeled "Intersection", in Brooklyn, NY:


These last 2 both involved 17-year-old drivers (in South Lake Tahoe it was the other driver), but the first and last seem to have been reckless driving (97 mph in a 45 and 44 mph in a 25). I suppose any at-fault accident could be considered the driver not maintaining enough attention to the driving task, even if Autopilot wasn't engaged. Presumably the NHTSA analysis has actual insight into whether Autopilot/FSD was engaged, as opposed to Tesla needing to report an incident /because/ media speculated Autopilot might have been engaged.
 
These last 2 both involved 17-year-old drivers (in South Lake Tahoe it was the other driver), but the first and last seem to have been reckless driving (97 mph in a 45 and 44 mph in a 25).
I hope AP/FSD was not involved. Tesla would be accused of "enabling" reckless behavior.

I hope they get to a point where AP/FSD refuses to engage if it detects reckless behavior. FSD is now good in the sense that it won't go beyond a reasonable speed. They could implement something like the posted limit + 10% as the cap on empty roads (and traffic speed as the cap elsewhere).
 
How about this one? Though the driver admitted he was inattentive while using Autopilot, the preventable death still sounds unnerving.


MALTBY — A Tesla driver who had set his car on Autopilot was “distracted” by his phone before reportedly hitting and killing a motorcyclist Friday on Highway 522, according to a new police report.

Around 3:45 p.m., a Snohomish man in a 2022 Tesla Model S was driving home behind a motorcyclist at Fales Road in Maltby, according to the report.

The man, 56, had activated Tesla’s Autopilot feature. He was using his phone when he heard a bang as his car lurched forward and crashed into the motorcycle in front of him, troopers wrote.

The motorcyclist, Jeffrey Nissen, was ejected. The Tesla was lodged on top of him, police said.
 
I think all deaths caused by distracted drivers are unnerving, regardless of ADAS use.
More importantly, we have no confirmation that the car was on AP at the time. There have been many accidents where the driver claims the car was on AP, but it turns out it wasn't. It's a common human "panic" statement when faced with possible criminal or civil penalties: "It wasn't my fault."
 
I don't think it's easy to avoid every death linked to inattentive driving. What worries me is that the press spins this as a failure of the technology. Voters may then press regulators to change their minds and make progress unreasonably hard.
If anything, I see regulations requiring driver monitoring systems in all vehicles at all times, even when no ADAS capability is in use. But it will be a long time coming, just like AEB won't be mandatory until September 2029. (And that assumes OEMs don't petition for, and get granted, a delay to that requirement.)
 
If anything, I see regulations requiring driver monitoring systems in all vehicles at all times, even when no ADAS capability is in use. But it will be a long time coming, just like AEB won't be mandatory until September 2029. (And that assumes OEMs don't petition for, and get granted, a delay to that requirement.)
Pretty sure nearly every new car already has AEB. Automakers fulfill autobrake pledge for light-duty vehicles
 