
NHTSA Asks Why Tesla Did Not Issue Recall Before Safety Updates



National Highway Traffic Safety Administration wants to know why Tesla didn’t issue a recall when it delivered a safety update to its software.

A letter published on the NHTSA website says the updates in question include “Tesla’s late September 2021 distribution of functionality to certain Tesla vehicle models intended to improve detection of emergency vehicle lights in low light conditions, and Tesla’s early October 2021 release of the Full Self-Driving Beta Request Menu option.”

NHTSA is currently investigating crashes in which Tesla drivers struck emergency vehicles parked on the side of the road. In every incident, Tesla's Autopilot was engaged.

NHTSA wants to know why Tesla delivered a safety fix without first notifying owners that a safety issue existed.

“As Tesla is aware, the Safety Act imposes an obligation on manufacturers of motor vehicles and motor vehicle equipment to initiate a recall by notifying NHTSA when they determine vehicles or equipment they produced contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard,” NHTSA’s note said.

Additionally, NHTSA said it wants more information on Tesla's recent software update for its Full Self-Driving Beta program. In particular, NHTSA wants to know how Tesla selects participants for FSD software testing.

 
The obvious answer is that Tesla does not consider it a safety defect. But NHTSA would get an easy "gotcha" if Tesla issued a recall on this, since doing so amounts to admitting it's a safety defect, and that would make NHTSA's most recent defect investigation an open-and-shut case.

NHTSA's previous investigation in 2016 already said that known limitations in AP and AEB (like not being able to stop for cross traffic or react consistently to stopped vehicles, something shared with practically all ADAS systems) are not safety defects, and that improvements Tesla made via OTA updates (which it also did back then) do not invalidate that.

I expect Tesla to fight this tooth and nail and make the point that if any OTA update that improves safety beyond what was originally designed must be treated as a recall (and as an admission of a safety defect in the first place), it will have a chilling effect on such improvements in the future.
I think the NHTSA is trying to walk Tesla into something of a corner with this. Have you read the letter and the questions posed?

Question 3 essentially asks whether Tesla believes the changes in this update would have had any material impact on the outcomes of the crashes into emergency vehicles already being investigated. If Tesla says yes, that would seem to be an admission of correcting what they'd call a "defect that pose(d) an unreasonable risk to motor vehicle safety". The other option is to argue that this update would actually have made no difference in those outcomes.

Question 4 asks whether Tesla plans to file recall documentation or explain why not, and it seems the consensus here is that a recall isn’t required because the driver is responsible at all times.

I don't believe the NHTSA has an issue with the software to begin with; I think they mostly have an issue with driver engagement while software like this is being used. It seems that the strike-out aspect of the steering-wheel nag might have been a development spurred by the ODI investigation you referenced here…
 
It has nothing to do with the feature itself. It’s that they did not follow protocol for a known safety issue.

Did you just skim the headline?

Other posters have quoted the NHTSA reg above. Try reading it.
I'm still struggling to understand what the "safety issue" is. The system was not designed to avoid emergency vehicles, just as no other car's is. Having Autopilot enabled is irrelevant when it's the driver who is clearly responsible for avoiding emergency vehicles. The NHTSA has gotten ridiculous.
 
The NHTSA won't argue the software should detect emergency vehicles and avoid them; they will use this as evidence that the system won't do those things. And it will all feed into a broader call for systems that do more to ensure drivers don't become complacent with software like this.

I don't think you guys are giving the NHTSA enough credit here; they know exactly what they're asking.
 
IMHO, avoiding emergency vehicles is not a bug fix, it is a new feature.
Does it avoid emergency vehicles? Or does the driver need to be fully engaged at all times because the manual clearly states the feature cannot be depended on to detect emergency vehicles in all situations?
 
Question 3 essentially asks whether Tesla believes the changes in this update would have had any material impact on the outcomes of the crashes into emergency vehicles already being investigated. If Tesla says yes, that would seem to be an admission of correcting what they'd call a "defect that pose(d) an unreasonable risk to motor vehicle safety". The other option is to argue that this update would actually have made no difference in those outcomes.
But that's a false dilemma: just because an update may change the outcomes of previous crashes doesn't mean there was a defect. For example, Tesla has previously updated AEB in various stages over OTA, which would have changed the outcomes of earlier crashes, but that doesn't mean the cars were defective.
Consumer Reports says Tesla's AEB software update is only a partial fix
I don't believe the NHTSA has an issue with the software to begin with; I think they mostly have an issue with driver engagement while software like this is being used. It seems that the strike-out aspect of the steering-wheel nag might have been a development spurred by the ODI investigation you referenced here…
Are you referring to the previous investigation or the new one? In the previous one, NHTSA did not find a defect in the software. In this new investigation, they may, regardless of driver engagement (which they previously determined was sufficient).
 
Reiterating that I'm laughing at this. Hilarious.

Attempting to equate the two takes some real pretzel logic.

This NHTSA v. Tesla case reminds me of how the legal system worked in the investigation of Clinton's Whitewater corruption (it was about money).

When the prosecutor could not prove anything about the money crime for years, he took another fork in the road with an event that happened long after the investigation started:

the Clinton-Lewinsky affair (it was about sex). But it's not a crime to have an affair in the US, so Clinton was prosecuted not for the affair itself but for lying about it.

If Clinton had respected the investigation and told the truth about the affair, there would have been no crime.

Because he didn't respect the investigation and lied, that's how he got into trouble with the law.

Tesla has a good history of respecting authority in China; we haven't seen any fights between Tesla and Chinese regulators at all.

It's another story in the US: fighting with the SEC, downplaying the NTSB, defying California's Covid-19 shutdown...

What I don't like about an investigation like this: a company can get into trouble not because of any safety violation in the first place, but because of a procedural one, for not following protocol during the investigation.

If there were no investigation, there would be no penalty. Defying the investigation's protocol becomes a newly created offense that did not exist before the investigation.

So how should Tesla act? Comply and cooperate with US authorities, just as it does with China, or keep raising hell in the US?
 
I'm still struggling to understand what the "safety issue" is. The system was not designed to avoid emergency vehicles, just as no other car's is. Having Autopilot enabled is irrelevant when it's the driver who is clearly responsible for avoiding emergency vehicles. The NHTSA has gotten ridiculous.
Really? You don’t see how plowing into the back of an emergency vehicle with bright flashing lights is a safety issue?

You say autopilot was not designed to avoid them. Okay. Was it also designed to ram into the back of a passenger vehicle it is following? Or a stopped vehicle on the road?

It really does not matter whether it is a safety vehicle, passenger vehicle, or a concrete barrier. The system should be avoiding these hazards when activated. If it cannot, that is a BIG problem.
 
Really? You don’t see how plowing into the back of an emergency vehicle with bright flashing lights is a safety issue?

You say autopilot was not designed to avoid them. Okay. Was it also designed to ram into the back of a passenger vehicle it is following? Or a stopped vehicle on the road?

It really does not matter whether it is a safety vehicle, passenger vehicle, or a concrete barrier. The system should be avoiding these hazards when activated. If it cannot, that is a BIG problem.
None of the L1/L2 ADAS systems can do that, nor are they designed to. Read the ODI report, section 5.0.
https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.PDF

AEB might be able to do it in some circumstances, but the standards out there specify only straight-line stops: a stationary target at city speeds (up to 37 mph) and a moving lead vehicle at higher speeds (up to 50 mph).
The newer tests introduce offsets, but they still don't test curves, and certainly not partial-lane objects offset by more than 50% (as emergency vehicles frequently are: partially in the lane, but overlapping less than half of it).
AEB Car-to-Car | Euro NCAP
And note that even with a low score, or a failure to activate in the real world, it's not considered a defect as long as the system works within its designed parameters.
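
To make that test envelope concrete, here's a rough sketch (my own illustration in Python, using the mph figures from this post rather than the official km/h protocol values; the names `Scenario` and `inside_tested_envelope` are invented):

```python
# Illustrative only: encodes the AEB test envelope as summarized above.
# Thresholds mirror this post (37 mph stationary target, 50 mph moving
# target, >=50% lane overlap, straight-line approach); the real Euro NCAP
# protocols are specified in km/h and cover more cases.
from dataclasses import dataclass

@dataclass
class Scenario:
    ego_speed_mph: float     # speed of the approaching (ego) car
    target_speed_mph: float  # 0 for a stopped vehicle
    lane_overlap: float      # fraction of the target in the ego lane, 0..1
    straight_road: bool      # the protocols test straight-line approaches

def inside_tested_envelope(s: Scenario) -> bool:
    """True if the scenario resembles what the AEB standards actually test."""
    if not s.straight_road or s.lane_overlap < 0.5:
        return False                  # curves and <50%-overlap targets: untested
    if s.target_speed_mph == 0:
        return s.ego_speed_mph <= 37  # stationary target: city speeds only
    return s.ego_speed_mph <= 50      # moving lead vehicle: higher speeds

# A fire truck 40% into the lane, approached at 65 mph, is outside the envelope:
print(inside_tested_envelope(Scenario(65, 0, 0.40, True)))  # False
```

The point of the sketch: a typical "plowed into a partially parked fire truck" scenario fails the overlap and speed checks, i.e. it sits outside anything the standards require AEB to handle.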

Also note that avoidance maneuvers are definitely not viable for the systems out there and may introduce even more danger (you definitely don't want the car suddenly changing lanes on its own or drifting partially into an adjacent lane). At most it will brake within the same lane or slow down (to reduce the speed of the crash). This has to be balanced against phantom braking and the rear-end risk from false positives.
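
In the same spirit, here's a toy sketch (again Python, and everything here is hypothetical, including the 0.9 confidence threshold) of why the response space is limited to in-lane braking and why the trigger threshold is the lever that trades missed stopped vehicles against phantom braking:

```python
# Toy illustration of the in-lane-braking-only point made above.
# The confidence threshold is invented; a real system tunes it against
# phantom-braking (false positive) and rear-end-collision risk.
BRAKE_CONFIDENCE = 0.9  # hypothetical: raising it cuts false positives
                        # but misses more marginal detections

def plan_response(obstacle_confidence: float, time_to_collision_s: float) -> str:
    # Steering around the obstacle is off the table: an automatic lane
    # change, or drifting partway into the next lane, adds its own dangers.
    if obstacle_confidence < BRAKE_CONFIDENCE:
        return "no action"           # uncertain detection: avoid phantom braking
    if time_to_collision_s < 1.5:
        return "full brake in lane"  # imminent: maximum in-lane braking
    return "slow down in lane"       # reduce crash speed, stay in the lane

print(plan_response(0.95, 1.0))  # full brake in lane
print(plan_response(0.50, 1.0))  # no action (below threshold)
```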
 
When are other car manufacturers going to recall their cars to enable this feature?
When? Right after ICE auto fires per million miles are held up against the better stats for EV fires.
;)
They should. But it's all about the mighty dollar. Why sue drivers for a measly couple thousand when you can sue the company for a couple million?!
Be assured, Tesla (having a vested interest in the possible outcome) would be right there anyway to protect its backside, and rightfully so.
 
Really? You don’t see how plowing into the back of an emergency vehicle with bright flashing lights is a safety issue?

You say autopilot was not designed to avoid them. Okay. Was it also designed to ram into the back of a passenger vehicle it is following? Or a stopped vehicle on the road?

It really does not matter whether it is a safety vehicle, passenger vehicle, or a concrete barrier. The system should be avoiding these hazards when activated. If it cannot, that is a BIG problem.
But still, it's the driver's fault; he should have taken control of the car. It's not as if you can't see the flashing lights on such a vehicle from a distance. The fact that he was even driving on AP in such a situation says a lot to me.

The system probably still makes a lot of yet-to-be-discovered errors. So what? You are responsible the moment you turn on Autopilot or decide to use FSD at this stage.

I don't know how many more times they need to state this in the manual.

And what about the constant nag in the car when Autopilot is activated? So you were driving with your hands on the wheel, focused on the road, and still hit an emergency vehicle?! Sorry, but that is just BS 😉

The only one to blame in this case is your fellow man. Or, as the NRA says: guns don't kill people, people kill people.
 
Tesla has really pushed the NHTSA to modernize and consider policy on two items: driver assistance/self-driving and OTA software updates. No other maker has pursued these as aggressively, though many are rapidly playing catch-up. The first mover on paradigm changes always takes the arrows. I hope the NHTSA recognizes this and is fair with them.
 
Really? You don’t see how plowing into the back of an emergency vehicle with bright flashing lights is a safety issue?

You say autopilot was not designed to avoid them. Okay. Was it also designed to ram into the back of a passenger vehicle it is following? Or a stopped vehicle on the road?

It really does not matter whether it is a safety vehicle, passenger vehicle, or a concrete barrier. The system should be avoiding these hazards when activated. If it cannot, that is a BIG problem.
Wrong. If the system were designed to avoid emergency vehicles, then I could see the point of a recall. No version of fancy cruise control out there avoids stopped emergency vehicles. Are you advocating they should all be recalled for being faulty too? Perhaps we should recall human drivers for being faulty as well, especially since they actually are supposed to avoid such scenarios.