Welcome to Tesla Motors Club

U.S. opens formal safety probe for Autopilot - 2021 Aug 16

It's really an extension of the existing investigation into Autopilot crashing into emergency vehicles.

This is the first letter, about the emergency light detection and some FSD Beta questions.


And this is the second letter, about any NDAs associated with FSD Beta enrolment.



Some of the questions in the first letter are interesting. Question 3 is asking for an assessment of whether the changes from the emergency light detection update would have altered the outcome of the previous crashes being investigated.

Question 5 asks to see the agreements between Tesla and vehicle owners covering repairs, access to software, OTA updates, and any compensation or goodwill available, including when resolving lawsuits or arbitration. Reading between the lines, this almost feels tied to limiting FSD Beta access.
 
It's really an extension of the existing investigation into Autopilot crashing into emergency vehicles …
Thanks for the link to the letters, it validates my previous thoughts:

The letter states:
"As Tesla is aware, the Safety Act imposes an obligation on manufacturers of motor vehicles and motor vehicle equipment to initiate a recall by notifying NHTSA when they determine vehicles or equipment they produced contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard."

An improvement to an existing feature set, such as detection of emergency vehicles, is in my opinion neither:
a/ a mitigation of a defect,
nor
b/ a remedy for non-compliance with an applicable safety standard.

Adding a feature that can assist a responsible driver further than any other vehicle of any brand is not a recall but an evolution of the technology.
 
We all need to continue spreading the word that Tesla cars are currently Level 2, requiring full attention by the driver at all times....
It's a shame that to do that, we have to push against Tesla's own naming and the videos on their Autopilot pages. You can't say "Full Self Driving" is "only a name" and at the same time insist the word be spread that "Full Self Driving requires driver attention at all times." A simple name change by Tesla here would be massively effective on the safety side, but tragic on the marketing side.

It's really an extension of the existing investigation into Autopilot crashing into emergency vehicles.
Not quite:

In a separate order to Tesla, NHTSA says that the company may be taking steps to hinder the agency’s access to safety information by requiring drivers who are testing “Full Self-Driving” software to sign non-disclosure agreements.

The order demands that Tesla describe the non-disclosure agreements and say whether the company requires owners of vehicles with Autopilot to agree “to any terms that would prevent or discourage vehicle owners from sharing information about or discussing any aspect of Autopilot with any person other than Tesla.”

They are also opening a new line of inquiry, asking Tesla about FSD, how the NDAs may be keeping NHTSA from monitoring it, and how Tesla picks its "beta" testers.
 
"contain defects related to motor vehicle safety or do not comply with an applicable motor vehicle safety standard."
There's an OR in there.
If the radio in the car suddenly starts playing a screeching tone at 100 dB that can't be turned off, that can be a safety defect even if a radio is not required by FMVSS.

The question here is if "Autopilot" can be safely used by the population of Tesla drivers. If it can't, it poses a public safety risk. The investigation is clearly focused on determining if the rate of accidents on AP is much higher than when the car is driven manually. If it is, then it is important to look into it, because it's not actually a safety system at that point. You can't just claim something is a safety enhancement because it's "Designed to be one". It actually has to work, and work with real human drivers using it, not just perfect ones.
 
The NHTSA has submitted an updated request as part of this investigation and now seems to be digging deeper into the cabin camera functionality.

The amount of detail they're requesting is insane, and the language feels like it's becoming more aggressive, with tweaks to the structure of previous requests from which one could infer some things about Tesla's responses to date.


Some big new questions I’m seeing:

Whether each VIN has a cabin camera installed.
Whether each VIN is radarless Tesla Vision.
Mileage with cabin camera data sharing enabled.
Whether FSD was purchased, and everything related: whether the owner has requested Beta access, whether they've been admitted, when, and their most recent Safety Score.

The NHTSA is now specifically questioning Tesla's Vehicle Safety Report statistics, which claim Autopilot accident rates far lower than the baseline Tesla compares against. They're asking about the road types included, data sources, assumptions, and all other relevant parameters.

And then asking a ton of questions about the driver attentiveness system, its design, and how it functioned leading up to the incidents.
 
And you know what? I have yet to hear of an autonomous elevator which crashed into another elevator...
Umm... There are elevator systems with more than one car. Or two cars one atop the other. Indeed, I agree I've never heard of them crashing into one another, but then, they are on rails and have rather sophisticated supervisory controls. So are trains (railroad vehicles). With the exception of some autonomous airport systems there is at least one trained, alert human at the controls. On the railroads, inattention for 20 seconds or more gets you at first a warning, then an emergency stop.

The railroad system works similarly to the Tesla system. Unless you have done something to indicate to the vehicle that you are sentient, like blowing the whistle, starting or stopping the bell, altering the throttle, moving a brake handle (there are two), even turning on the wipers or headlights, or of course actuating the alerter reset (a footswitch, pushbutton, switch stalk, or even a touch-sensitive spot on the desk), well, you get the picture. If you haven't performed one of these little actions, the air is dumped, the emergency brakes are applied, the engine throttle goes to idle, and a radio message is sent to a dispatcher/controller.
You can experiment with your Tesla; even moving one of the rollers on the wheel will reset the alerter.
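The alerter behavior described above can be sketched as a small timer state machine. This is purely an illustration; the timeout values and action names below are made up, not actual railroad or Tesla parameters:

```python
ALERTER_TIMEOUT = 20.0  # seconds of inactivity before a warning (illustrative value)
WARNING_GRACE = 5.0     # seconds after the warning before a penalty stop (illustrative)

class Alerter:
    """Dead-man style alerter: any control input resets the timer;
    silence past the timeout triggers a warning, then a penalty stop."""

    def __init__(self, now=0.0):
        self.last_input = now
        self.warned_at = None

    def on_input(self, now):
        # Whistle, bell, throttle, brake handle, wipers, reset switch: all count.
        self.last_input = now
        self.warned_at = None

    def tick(self, now):
        if self.warned_at is not None:
            if now - self.warned_at >= WARNING_GRACE:
                # Dump the air, apply emergency brakes, idle the throttle,
                # radio the dispatcher/controller.
                return "penalty_stop"
            return "warning"
        if now - self.last_input >= ALERTER_TIMEOUT:
            self.warned_at = now
            return "warning"
        return "ok"
```

In a Tesla the analogous reset inputs are a torque nudge on the wheel or a flick of a scroll wheel; the escalation (visual nag, then beeping, then AP lockout) plays the role of the warning and penalty stop.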

If you DO get a penalty stop, aside from some very indignant passengers (or at least the freight conductor), you get a chance to stand on a carpet and likely win a few days on the beach or a permanent stay on the beach. Try reading the funnies or playing with a cell phone as you run a red signal. No California stops allowed.

It's difficult to defeat the system in a railroad cab. Unfortunately, it's pretty easy to do in a Tesla, alas.
 
Although it doesn't work well, Tesla does have emergency light detection. Relatively speaking, it doesn't sound hard for a vision-based system: get it working, move to the outside lane as required by law, slow down, and notify the driver. If there is no outside lane, slow down a lot or request human assistance.

Should not be difficult, again relatively speaking.
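The response described above amounts to a simple decision rule. A minimal sketch, with the caveat that the function and action names are invented for illustration, this is not how Tesla's stack actually works, and real move-over laws vary by state:

```python
def respond_to_emergency_lights(lights_detected: bool,
                                adjacent_lane_clear: bool) -> list:
    """Illustrative move-over logic: change lanes away from the stopped
    vehicle when possible, otherwise slow down a lot and hand control back;
    always alert the driver."""
    if not lights_detected:
        return []
    actions = ["notify_driver"]
    if adjacent_lane_clear:
        actions += ["move_to_outer_lane", "reduce_speed"]
    else:
        actions += ["reduce_speed_significantly", "request_driver_takeover"]
    return actions
```

The hard part, of course, is not this rule but making `lights_detected` reliable, which is exactly what the investigation is probing.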
 
Although it doesn't work well, Tesla does have emergency light detection. …
Sadly, in my state, cheap emergency light bars have become very popular and every construction vehicle has one. Often they run them while just driving along. So, there's a challenge for Tesla to pick out the real emergency vehicles.
 
Although it doesn't work well, Tesla does have emergency light detection. …
Everything is simple until you have to code it.
The concept is easy; production is hard. Politics makes it all the worse.
 
Although it doesn't work well, Tesla does have emergency light detection. …
Pet peeve: move over OR slow down, not both, in most states I drive in.

Everyone cutting everyone else off while also braking to way more than 20 under (or whatever your state requires) is extremely dangerous, and annoying to boot.

+1 on annoying for passing a cop and then staying at -5 for the next few miles in the left lane.

Back to your regularly scheduled NHTSA programming....