News

NHTSA Announces Autopilot Investigation

The National Highway Traffic Safety Administration (NHTSA) announced Monday that it has launched an investigation into Tesla’s Autopilot feature.

The agency pointed to 11 crashes since January 2018 in which Tesla models operating on Autopilot “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” The agency said the accidents caused 17 injuries and one death.

“Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones,” the investigation summary said. “The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

The investigation includes about 765,000 Tesla vehicles in the U.S., applying to the entire lineup since 2014.

NHTSA said its investigation “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

Dan D.

Member
Dec 7, 2020
698
797
Vancouver, BC

dingyibvs

Member
May 18, 2019
27
11
East Bay
The NHTSA should also publish how many traffic accidents have been prevented and how many lives have been saved.
NHTSA web site (PDF):
Quote:
The National Highway Traffic Safety Administration (NHTSA) said that since January 2018, it had identified 11 crashes in which Tesla models "have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes."

It said it had reports of 17 injuries and one death in those crashes.

The NHTSA said the 11 crashes included four this year, most recently one last month in San Diego, and it had opened a preliminary evaluation of Autopilot in 2014-2021 Tesla Models Y, X, S, and 3.

"The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes," the NHTSA said in a document opening the investigation.

I'm sure an attentive driver with active assistance features would result in fewer accidents. It's possible that the outcome of this will basically be forcing Tesla to have as much CYA stuff in terms of marketing and feature restrictions (e.g. driver attention monitoring) as other manufacturers.
 

bhzmark

Active Member
Jul 21, 2013
3,580
5,436
The issue is that standard procedure is to park a fire truck at an angle upstream of an accident scene. In doing so, crews often block the shoulder completely while sticking just a corner of the truck into the adjacent lane. That partial blockage was not enough for at least the older APs to sense, so the car kept to its lane, just as any lane-keeping technology likely would, and it is also often not enough lane blockage for any TACC technology to stop the car.

Hopefully this is resolved in the actual newer FSD technology that extends beyond the older AP lane-keeping/TACC package, and maybe even the older AP tech can be trained to fix this too.

Of course, the main solution is for the driver to pay attention, which is actually easier with AP: not being burdened by the constant lane-keeping and car-following driving functions, the driver can devote more attention to the surroundings, especially what is down the road.
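The partial-blockage failure mode described above can be sketched as a simple geometric check. This is a hypothetical illustration, not Tesla's actual logic: all names and thresholds are invented, but it shows how a system that only reacts when an obstacle covers "enough" of the lane would sail past a fire truck's corner.

```python
# Hypothetical sketch of the failure mode: a brake decision that only
# fires when an obstacle covers a minimum fraction of the travel lane.
# LANE_WIDTH_M and BLOCKAGE_THRESHOLD are illustrative, not real values.

LANE_WIDTH_M = 3.7          # typical US freeway lane width, in meters
BLOCKAGE_THRESHOLD = 0.5    # fraction of lane that must be blocked to react

def lane_overlap_fraction(obstacle_left: float, obstacle_right: float,
                          lane_left: float, lane_right: float) -> float:
    """Fraction of the lane's width covered by the obstacle (lateral meters)."""
    overlap = min(obstacle_right, lane_right) - max(obstacle_left, lane_left)
    return max(0.0, overlap) / (lane_right - lane_left)

def should_brake(obstacle_left: float, obstacle_right: float) -> bool:
    # Lane centered at 0: edges at -1.85 m and +1.85 m.
    frac = lane_overlap_fraction(obstacle_left, obstacle_right,
                                 -LANE_WIDTH_M / 2, LANE_WIDTH_M / 2)
    return frac >= BLOCKAGE_THRESHOLD

# A truck corner poking ~1 m into a 3.7 m lane covers only ~27% of it,
# below the threshold, so this naive logic would not brake:
print(should_brake(obstacle_left=0.85, obstacle_right=1.85))  # False
```

The point of the sketch is that any threshold-based reaction leaves a band of "partially in my lane" geometries that the system simply ignores, exactly the situation an angled fire truck creates.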
 

StellarRat

Active Member
Jan 8, 2014
1,520
1,405
Pacific
If AP can detect the status of a traffic light, I'm sure it can recognize emergency lights in use on public safety vehicles. This might be something they can fix in software. On the other hand, it's still a driver problem, because you're supposed to "be ready to take control of the vehicle at any time." It would be interesting to see whether the number of Tesla crashes into emergency vehicles is any higher than for vehicles in general. My guess is no. As usual, a few idiots are wrecking it for everyone else.
 

TSLY

Member
Jul 28, 2021
73
28
Los Angeles
The National Highway Traffic Safety Administration announced the investigation today, and it encompasses 765,000 Teslas sold in the US, a significant fraction of all of the company's sales in the country. The agency says the probe will cover 11 crashes since 2018; the crashes caused 17 injuries and one death. News reports have not yet focused on Tesla dropping LiDAR and radar in MY/M3s (I'm expecting a November EDD MY). The investigation may kibosh all upcoming EDDs in addition to triggering a massive Tesla recall. I ordered the MY prior to Tesla's announced change, which I found cavalier and self-serving. I know Elon has been trash-talking radar and LiDAR for a while, but the parallel component shortage and the timing of the announcement felt fishy to me. I'm still stoked to get my MY, but I would no more rely on Tesla's auto-driving system than I rely on autodrive in my 2019 BMW 540i M Sport, which sports the same feature set. What do you all think?
 

AndreP

Member
Apr 22, 2021
148
97
United States
Yes, that is the way I read it. I won't make any assumptions about the point you may be implying, so help me understand ;)
Mostly surprise at the lack of awareness around these crashes, considering how publicized autonomous-vehicle accidents are; I didn't even know crashes into first responders were a big issue.

Digging into the details and reading other well-written articles, this is happening in the context of a "Defect Investigation," where the NHTSA would likely be looking to initiate a recall of some kind. This is a snip of the actual NHTSA document:

[Image: excerpt from the NHTSA investigation-opening document]


I think this suggests the main takeaway will be around ensuring driver engagement when driver-assist technology is in use; we already know that all driver-assist systems can be easily fooled. I'd wonder what else can be done, though; people who work or do research in that corner of the industry likely have an idea.
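The engagement enforcement mentioned above is, at its simplest, a timer on detected driver input. The following toy model of a steering-torque "nag" is purely illustrative: the intervals and escalation steps are invented, and real systems (and camera-based monitoring) are more sophisticated, but it shows why torque-only monitoring is easy to fool with a well-placed weight on the wheel.

```python
# Toy model of a Level 2 "nag" timer: if no steering torque is detected
# for a while, warn, then disengage. All constants are invented.

NAG_INTERVAL_S = 30.0    # seconds without detected torque before warning
ESCALATION_S = 10.0      # warning duration before forced disengagement

def engagement_state(seconds_since_torque: float) -> str:
    """What a naive torque-based monitor would report."""
    if seconds_since_torque < NAG_INTERVAL_S:
        return "engaged"
    if seconds_since_torque < NAG_INTERVAL_S + ESCALATION_S:
        return "warn: apply steering torque"
    return "disengage: driver unresponsive"

print(engagement_state(5.0))    # engaged
print(engagement_state(35.0))   # warn: apply steering torque
print(engagement_state(45.0))   # disengage: driver unresponsive
```

Anything that fakes periodic torque resets the timer, which is why regulators have pushed toward gaze/attention cameras rather than wheel torque alone.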
 

DanCar

Active Member
Oct 2, 2013
2,023
1,775
SF Bay Area
A couple of questions:
  1. What percent of these accidents happened with Autopilot 1? Even if an accident happened in 2020, it could still have been caused by Autopilot 1.
  2. What percent of these accidents happened with Autopilot 2 but older firmware? In other words, any investigation into Autopilot 2 might be obsolete because of updates to driver attention monitoring.
 

Matias

Active Member
Apr 2, 2014
3,430
3,774
Finland
A couple of questions:
  1. What percent of these accidents happened with Autopilot 1? Even if an accident happened in 2020, it could still have been caused by Autopilot 1.
  2. What percent of these accidents happened with Autopilot 2 but older firmware? In other words, any investigation into Autopilot 2 might be obsolete because of updates to driver attention monitoring.
I’m sure the investigation will answer those questions.
 

linux-works

Active Member
Dec 23, 2019
2,174
3,759
mtn view, ca
The issue is that standard procedure is to park a fire truck at an angle upstream of an accident scene. In doing so, crews often block the shoulder completely while sticking just a corner of the truck into the adjacent lane. That partial blockage was not enough for at least the older APs to sense, so the car kept to its lane, just as any lane-keeping technology likely would, and it is also often not enough lane blockage for any TACC technology to stop the car.

Hopefully this is resolved in the actual newer FSD technology that extends beyond the older AP lane-keeping/TACC package, and maybe even the older AP tech can be trained to fix this too.

Of course, the main solution is for the driver to pay attention, which is actually easier with AP: not being burdened by the constant lane-keeping and car-following driving functions, the driver can devote more attention to the surroundings, especially what is down the road.
again, I will do my harping on v2x tech.

imagine if all "important, do not hit me, please!" vehicles were sending out beacons, or even master nodes that "know" about certain things sending out the beacons. any car nearby or even approaching could know of the presence of things like this.

all very do-able.

IF we would just decide we want it.

(sigh. I hate there being tech that people refuse to use. its like..... no.... I wont say it. nope, not here.)
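The V2X idea above can be sketched in a few lines: an emergency vehicle broadcasts a small hazard beacon, and an approaching car acts on it long before any camera or radar could see the scene. Everything here (message fields, alert radius, the flat-earth distance math) is made up for illustration; real V2X messaging, e.g. the SAE J2735 message set, is far richer.

```python
# Toy V2X hazard beacon: broadcast side and receive side.
# All field names and ranges are illustrative, not any real standard.
import json
import math

def make_beacon(vehicle_id: str, lat: float, lon: float, hazard: str) -> bytes:
    """Serialize a hazard beacon, as an emergency vehicle might broadcast it."""
    return json.dumps({"id": vehicle_id, "lat": lat, "lon": lon,
                       "hazard": hazard}).encode()

def distance_m(lat1, lon1, lat2, lon2):
    """Rough flat-earth distance in meters (fine at beacon ranges)."""
    dy = (lat2 - lat1) * 111_320
    dx = (lon2 - lon1) * 111_320 * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def handle_beacon(raw: bytes, ego_lat: float, ego_lon: float,
                  alert_radius_m: float = 500.0) -> str:
    """Receiving car's side: warn (or slow) if the hazard is close."""
    msg = json.loads(raw)
    d = distance_m(ego_lat, ego_lon, msg["lat"], msg["lon"])
    if d <= alert_radius_m:
        return f"ALERT: {msg['hazard']} ({msg['id']}) {d:.0f} m ahead"
    return "no action"

beacon = make_beacon("FIRE-17", 37.3894, -122.0819, "fire truck blocking lane")
print(handle_beacon(beacon, 37.3870, -122.0819))  # within range: ALERT
```

The appeal is exactly what the post says: no perception problem to solve at all, just a radio message, if everyone agreed to deploy it.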
 

ZilWin

Member
May 29, 2021
174
117
North America, Earth
The issue is that standard procedure is to park a fire truck at an angle upstream of an accident scene. In doing so, crews often block the shoulder completely while sticking just a corner of the truck into the adjacent lane. That partial blockage was not enough for at least the older APs to sense, so the car kept to its lane, just as any lane-keeping technology likely would, and it is also often not enough lane blockage for any TACC technology to stop the car.

Hopefully this is resolved in the actual newer FSD technology that extends beyond the older AP lane-keeping/TACC package, and maybe even the older AP tech can be trained to fix this too.

Of course, the main solution is for the driver to pay attention, which is actually easier with AP: not being burdened by the constant lane-keeping and car-following driving functions, the driver can devote more attention to the surroundings, especially what is down the road.
I have noticed the TACC and FSD visualizations do not show vehicles parked on the shoulder, yet they will display vehicles in adjacent lanes and traffic signs and cones on the shoulder. Maybe the car actually considers those objects and just does not display them, but if an emergency vehicle is partially obstructing my travel lane, I would think AP should pick that up and take appropriate measures to avoid or mitigate.
 

StellarRat

Active Member
Jan 8, 2014
1,520
1,405
Pacific
AP has a problem with stopped objects. It always has. If an object is not moving, it has to rely on what the object looks like to determine whether it's supposed to be there or not: "Is it an overpass or a semi across the road?" When something is moving on the road, you can pretty much assume it's something to avoid. It also has big problems when there are no lane markings. I worked as a programmer for years, and IMO this is a frighteningly complicated problem. The things we did were child's play compared to self-driving technology, and no one's life depended on our software being correct 100% of the time.
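One concrete reason driver-assist systems historically struggled with stopped objects: a radar reports range and relative (Doppler) velocity, so a stopped car returns the same signature as an overpass or sign gantry, and classic ACC implementations simply filter out targets whose speed over ground is near zero to avoid phantom braking. The sketch below is an illustration of that generic filtering idea, with invented numbers, not any specific manufacturer's code.

```python
# Illustrative sketch of why radar-based cruise control ignores stopped
# objects: targets are filtered by absolute speed, and a stationary
# obstacle is indistinguishable from harmless roadside clutter.

def absolute_speed(ego_speed_mps: float, relative_speed_mps: float) -> float:
    """Target's speed over ground, recovered from radar relative velocity."""
    return ego_speed_mps + relative_speed_mps

def is_tracked_target(ego_speed_mps: float, relative_speed_mps: float,
                      stationary_cutoff_mps: float = 1.0) -> bool:
    """Naive filter: only follow/brake for targets that are actually moving."""
    return abs(absolute_speed(ego_speed_mps, relative_speed_mps)) > stationary_cutoff_mps

ego = 31.0  # ~70 mph, in m/s

# A car ahead doing 25 m/s closes at -6 m/s: moving, so it is tracked.
print(is_tracked_target(ego, -6.0))   # True
# A stopped fire truck closes at -31 m/s: absolute speed 0, filtered out.
print(is_tracked_target(ego, -31.0))  # False
```

Telling the stopped truck apart from the overpass then falls entirely on vision, which is the "what does it look like" problem the post describes.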
 

sunfarm

2021M3LR, Blue, 19", FSD
Jun 21, 2021
144
81
Canada
In my opinion, Elon and Tesla are slowly but steadily heading toward a mass lawsuit. More and more customers are asking, "Why did I buy FSD when it doesn't work?" Testing it through beta testers is fine, but stop selling it publicly. Start selling it when it's ready, not while it's still in development and built on future promises.
 

daktari

Member
Jan 21, 2017
919
1,035
Norway
If you look at Euro NCAP videos, no cars are able to stop safely for the stationary car dummy at highway speeds. Radar and camera range seem too low. The system should still brake and reduce severity, though, not just plow on.

Then you have the complacency issue. That is Tesla's biggest problem. The system's user interface is faulty by design and makes the driver complacent. The car tells the driver "pling, I drive" until "plong, you drive." But the human can't drive together with the car; it is binary. You can't give small corrections, or the car will abort: "take your hands away or I will cancel." The car takes full control, and people trust it too much. That could work for a Level 3 system, not a Level 2.
 
