
U.S. opens formal safety probe into Autopilot - 2021 Aug 16


Terminator857

Similar articles:
  1. US opens formal probe into Tesla Autopilot system
  2. U.S. opens formal safety probe into some 765,000 Tesla vehicles
The NHTSA should also publish how many traffic accidents have been prevented and how many lives have been saved.
NHTSA web site (PDF):
Quote:
The National Highway Traffic Safety Administration (NHTSA) said that since January 2018, it had identified 11 crashes in which Tesla models "have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes."

It said it had reports of 17 injuries and one death in those crashes.

The NHTSA said the 11 crashes included four this year, most recently one last month in San Diego, and it had opened a preliminary evaluation of Autopilot in 2014-2021 Tesla Models Y, X, S, and 3.

"The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes," the NHTSA said in a document opening the investigation.
 
Wait, there have been 11 instances of vehicles on Autopilot/TACC crashing into first responder vehicles already dealing with an accident?
 

I'm sure an attentive driver with active assistance features would get into fewer accidents. It's possible that the outcome of this will basically be forcing Tesla to adopt as much CYA stuff in marketing and feature restrictions (e.g. driver attention monitoring) as other manufacturers.
 
The issue is that standard procedure is to park a fire truck at an angle upstream of an accident scene. In doing so, crews often block the shoulder completely but leave just a corner of the truck sticking partway into the adjacent lane: not enough intrusion for at least the older APs to sense a blocked lane, so the car keeps to its lane, just as any lane-keeping technology likely would, and not enough blockage for any TACC to decide it must stop the car.

Hopefully this is resolved in the newer FSD stack that extends beyond the older AP lane-keeping/TACC package, and maybe the older AP tech can be trained to fix this too.

Of course the main solution is for the driver to pay attention, which is actually easier with AP: freed from the constant lane-keeping and car-following tasks, the driver can devote more attention to their surroundings, especially what is down the road.
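To make the partial-blockage failure mode concrete, here is a toy Python sketch of the kind of heuristic a lane-keeping/TACC planner might apply; the lane width, threshold, and function names are invented for illustration and are not Tesla's actual logic:

```python
# Hypothetical: a planner that only reacts when an obstacle covers enough of
# the ego lane will sail past a fire truck whose corner barely intrudes.

LANE_WIDTH_M = 3.7        # typical US freeway lane width
OVERLAP_THRESHOLD = 0.5   # assumed fraction of lane coverage needed to react

def lane_overlap_fraction(obstacle_left_m: float, obstacle_right_m: float) -> float:
    """Fraction of the ego lane [0, LANE_WIDTH_M] covered by an obstacle,
    given its left/right edges in lane-relative coordinates (meters)."""
    overlap = min(obstacle_right_m, LANE_WIDTH_M) - max(obstacle_left_m, 0.0)
    return max(overlap, 0.0) / LANE_WIDTH_M

def should_brake(obstacle_left_m: float, obstacle_right_m: float) -> bool:
    return lane_overlap_fraction(obstacle_left_m, obstacle_right_m) >= OVERLAP_THRESHOLD

# A truck corner intruding 0.8 m into a 3.7 m lane covers only ~22% of it:
print(should_brake(-2.0, 0.8))  # False -- the planner keeps the lane
```

The threshold is the hard trade-off: react to every shoulder-parked car and you get phantom braking; don't, and you get exactly these crashes.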
 
If AP can detect the state of a traffic light, I'm sure it can recognize emergency lights in use on public safety vehicles. This might be something they can fix in software. On the other hand, it's still a driver problem, because you're supposed to "be ready to take control of the vehicle at any time." It would be interesting to see whether the rate of Teslas crashing into emergency vehicles is any higher than for vehicles in general. My guess is no. As usual, a few idiots are wrecking it for everyone else.
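On the "recognize emergency lights" idea: the flash pattern itself is a usable signal, since emergency strobes pulse at roughly 1-4 Hz. Here is a toy sketch of detecting that periodicity in the brightness of a candidate light region over time; the frame rate and band limits are assumptions, and this is not how Autopilot actually works:

```python
import numpy as np

FRAME_RATE_HZ = 36.0  # assumed camera frame rate

def looks_like_strobe(brightness: np.ndarray, lo_hz: float = 1.0, hi_hz: float = 4.0) -> bool:
    """True if the dominant non-DC frequency of a per-frame brightness trace
    falls in the typical emergency-strobe band."""
    spectrum = np.abs(np.fft.rfft(brightness - brightness.mean()))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / FRAME_RATE_HZ)
    dominant = freqs[np.argmax(spectrum)]
    return lo_hz <= dominant <= hi_hz

# Synthetic 2 Hz strobe observed for 3 seconds:
t = np.arange(0, 3, 1 / FRAME_RATE_HZ)
flash = (np.sin(2 * np.pi * 2.0 * t) > 0).astype(float)
print(looks_like_strobe(flash))  # True
```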
 
The National Highway Traffic Safety Administration announced the investigation today, and it encompasses 765,000 Teslas sold in the US, a significant fraction of all of the company's sales in the country. The agency says the probe will cover 11 crashes since 2018; the crashes caused 17 injuries and one death. News reports have not yet focused on Tesla dropping radar in the MY/M3 (I'm expecting a November EDD MY). The investigation may put the kibosh on all upcoming EDDs in addition to triggering a massive Tesla recall. I ordered the MY prior to Tesla's announced change, which I found cavalier and self-serving. I know Elon has been trash-talking radar and LiDAR for a while, but the contemporaneous component shortage and the timing of the announcement felt fishy to me. I'm still stoked to get my MY, but I would no more rely on Tesla's automated driving than I rely on the driver-assist suite in my 2019 BMW 540i M Sport, which sports a similar feature set. What do you all think?
 
Yes, that is the way I read it. I won't make any assumptions about the point you may be implying, so help me understand ;)
Mostly surprise at the lack of awareness around these crashes, considering how publicized autonomous vehicle accidents are. I didn't even know crashes into first responders were a big issue.

Digging into the details and reading other well-written articles, this is happening in the context of a "Defect Investigation," where NHTSA would likely be looking to initiate a recall of some kind. This is a snip of the actual NHTSA document:

[excerpt of the NHTSA investigation notice]


I think this suggests the main takeaway will be about ensuring driver engagement while driver-assist technology is in use; we already know that all driver-assist systems can be easily fooled. I wonder what else can be done, though. People who work or do research in that corner of the industry likely have ideas.
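For reference, one widely used pattern is escalating engagement checks: warnings grow as hands-off or eyes-off time accumulates, ending in disengagement. A minimal sketch, with the thresholds and the lockout step invented for illustration:

```python
from enum import Enum

class Alert(Enum):
    NONE = 0
    VISUAL_NAG = 1
    AUDIBLE_NAG = 2
    DISENGAGE_AND_LOCK_OUT = 3

def attention_alert(seconds_without_input: float) -> Alert:
    """Map time since the last torque/gaze confirmation to an escalation level."""
    if seconds_without_input < 15:
        return Alert.NONE
    if seconds_without_input < 30:
        return Alert.VISUAL_NAG
    if seconds_without_input < 45:
        return Alert.AUDIBLE_NAG
    return Alert.DISENGAGE_AND_LOCK_OUT

for t in (5, 20, 40, 60):
    print(t, attention_alert(t).name)  # NONE, VISUAL_NAG, AUDIBLE_NAG, DISENGAGE_AND_LOCK_OUT
```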
 
A couple of questions:
  1. What percentage of these accidents happened with Autopilot 1? Even if an accident happened in 2020, it could still have involved Autopilot 1.
  2. What percentage happened with Autopilot 2 but older firmware? In other words, any investigation into Autopilot 2 might already be obsolete because of updates to driver attention monitoring.
 
I'm sure the investigation will answer those questions.
 
Again, I will do my harping on V2X tech.

Imagine if all "important, do not hit me, please!" vehicles were sending out beacons, or if master nodes that "know" about scenes like this sent the beacons out for them. Any car nearby, or even just approaching, could know about the hazard ahead (see the sketch below).

All very doable.

IF we would just decide we want it.

(Sigh. I hate there being tech that people refuse to use. It's like... no... I won't say it. Nope, not here.)
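A toy sketch of what such a beacon could look like. Real V2X (DSRC / C-V2X) uses dedicated radios and signed SAE J2735 messages; this UDP/JSON stand-in, with an arbitrary port and made-up payload fields, is only meant to show how little data it takes:

```python
import json
import socket
import time

BEACON_PORT = 47000  # arbitrary port chosen for this illustration

def broadcast_beacon(lat: float, lon: float, kind: str = "fire_truck") -> None:
    """Broadcast a 'do not hit me' beacon to any listener on the local segment."""
    msg = json.dumps({
        "type": "emergency_scene",
        "vehicle": kind,
        "lat": lat,
        "lon": lon,
        "blocking": ["shoulder", "right_lane_partial"],  # hypothetical schema
        "timestamp": time.time(),
    }).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(msg, ("255.255.255.255", BEACON_PORT))

broadcast_beacon(32.7157, -117.1611)  # a parked fire truck announcing itself
```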
 
I have noticed the TACC and FSD visualizations do not show vehicles parked on the shoulder, yet they will display vehicles in adjacent lanes and traffic signs and cones on the shoulder. Maybe the car actually considers those objects and just does not display them, but if an emergency vehicle is partially obstructing my travel lane, I would think AP should pick that up and take appropriate measures to avoid or mitigate.
 
AP has a problem with stopped objects. It always has. If an object is not moving, the system has to rely on what it looks like to determine whether it is supposed to be there or not ("Is it an overpass or a semi across the road?"). When something is moving on the road, you can pretty much assume it's something to avoid. It also has big problems when there are no lane markings. I worked as a programmer for years, and IMO this is a frighteningly complicated problem. The things we did were child's play compared to self-driving technology, and no one's life depended on our software being correct 100% of the time.
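A minimal sketch of that stopped-object dilemma: a radar return whose over-the-ground speed is roughly zero looks exactly like an overpass or roadside sign, so a classic tracker drops it and leans on the camera. Numbers and names are illustrative, not anyone's production code:

```python
STATIONARY_CUTOFF_MPS = 0.5  # assumed clutter threshold

def ground_speed(radial_speed_mps: float, ego_speed_mps: float) -> float:
    """Approximate over-the-ground speed of a dead-ahead radar return;
    a stopped object closes at exactly -ego_speed."""
    return radial_speed_mps + ego_speed_mps

def keep_return(radial_speed_mps: float, ego_speed_mps: float) -> bool:
    # Moving targets are kept; stationary ones are dropped as likely clutter
    # (overpasses, signs) -- the exact failure mode for a stopped fire truck.
    return abs(ground_speed(radial_speed_mps, ego_speed_mps)) > STATIONARY_CUTOFF_MPS

print(keep_return(-20.0, 30.0))  # True: lead car still moving at 10 m/s
print(keep_return(-30.0, 30.0))  # False: stopped truck filtered out
```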
 
If you look at Euro NCAP videos, no cars are able to stop safely for the stationary car dummy at highway speeds; radar and camera range seem too limited. The system should still brake to reduce severity, though, not just plow on.
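Back-of-envelope numbers back up the severity point. Even when detection comes too late to stop, impact speed follows v_impact = sqrt(v0^2 - 2*a*d) and falls fast with every meter of braking; the 110 km/h approach speed and 0.8 g deceleration below are assumptions:

```python
import math

def impact_speed_kmh(v0_kmh: float, decel_mps2: float, braking_distance_m: float) -> float:
    """Speed remaining at the obstacle after braking over the given distance."""
    v0 = v0_kmh / 3.6
    v_squared = v0 * v0 - 2.0 * decel_mps2 * braking_distance_m
    return math.sqrt(max(v_squared, 0.0)) * 3.6

for d in (30, 50, 70):
    print(f"braking over {d} m: impact at {impact_speed_kmh(110, 7.8, d):.0f} km/h")
# braking over 30 m: impact at 78 km/h
# braking over 50 m: impact at 45 km/h
# braking over 70 m: impact at 0 km/h
```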

Then you have the complacency issue. That is Tesla's biggest problem. The user interface is designed in a way that makes the driver complacent. The car tells the driver "pling, I drive" until "plong, you drive." But the human can't drive together with the car; it is binary. You can't give small corrections, or the car will abort: "take your hands away or I will cancel." The car takes full control, and people trust it too much. That could work for a Level 3 system, not a Level 2.
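To illustrate that "binary" complaint: in a typical Level 2 handoff, driver torque above a small threshold cancels the system outright instead of blending with it. A sketch with invented torque values, contrasting the two policies:

```python
DISENGAGE_TORQUE_NM = 2.5  # assumed takeover threshold

def binary_l2_steering(ap_cmd_nm: float, driver_nm: float) -> tuple[float, bool]:
    """Returns (applied torque, still_engaged). Any firm driver correction
    cancels the system: the 'pling I drive / plong you drive' handoff."""
    if abs(driver_nm) > DISENGAGE_TORQUE_NM:
        return driver_nm, False
    return ap_cmd_nm + driver_nm, True

def blended_steering(ap_cmd_nm: float, driver_nm: float) -> tuple[float, bool]:
    """A shared-control alternative: driver input is always mixed in."""
    return ap_cmd_nm + driver_nm, True

print(binary_l2_steering(1.0, 3.0))  # (3.0, False): small correction, full handoff
print(blended_steering(1.0, 3.0))    # (4.0, True): correction without a handoff
```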