Welcome to Tesla Motors Club
NHTSA Close to forcing recall of FSD after more crashes


2101Guy
  • Jan 2022 – Desert Center, CA
  • Sep 2021 – Petaluma, CA
  • Aug 2021 – Orlando, FL
  • Apr 2021 – Belmont, CA
  • Jan 2021 – Mount Pleasant, SC
  • Nov 2020 – Houston, TX



 
I don’t get it. I’ve been driving cars with ADAS systems since 2015, 25,000-30,000 miles a year (Teslas for the last two years). I haven’t rear-ended a bus, dump truck, low-boy trailer, fire truck, ice cream truck, etc. while using the ADAS systems. Maybe because I, I don’t know, PAY ATTENTION???
Issue seems to focus at least in part, on this:

“On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”
 
Issue seems to focus at least in part, on this:

“On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”
Not much warning time. Maybe they should have upgraded to high-resolution cameras sooner...

I'd be curious to see which of the cars involved in these crashes still had radar, and if it was working at the time of the crash. The best advice I ever heard on this forum is to assume AP is out to kill you. I would follow the same logic with FSD.
 
Not much warning time. Maybe they should have upgraded to high-resolution cameras sooner...

I'd be curious to see which of the cars involved in these crashes still had radar, and if it was working at the time of the crash. The best advice I ever heard on this forum is to assume AP is out to kill you. I would follow the same logic with FSD.
I assume AP is like an AI that might need to be put down at any second. I also assume every other driver on the road is out to kill me, too.
 



Clickbait FUD title.

Have you no shame, sir?

Start at 17:42 for non-clickbait, non-FUD coverage.

 



Worst case, they are forced to disable all autonomous features, including base AP/TACC. So it would be a nice EV with no driver-assist features. No refund.

Best case, they issue a recall like the seatbelt recall, which is just a software update to fix.
 
The NHTSA clearly does not want to stop Autopilot from functioning; they want to harness the benefits of ADAS while mitigating the risks of drivers becoming inattentive when such systems are in use. Those of us who follow AVs closely and discuss them a lot on here should all know the human-machine interface is a massive factor, and it would have been the focus of these investigations. The NHTSA doesn't expect the systems to flawlessly detect and navigate around stopped first responders; the technology is nowhere near perfect, and it won't be for a long time. But there are benefits, and there are currently risks.

The problem here, and what led to the upgrade from a Preliminary Evaluation (PE) to an Engineering Analysis (EA), is the system apparently believing drivers are sufficiently engaged right up until the moment they crash. In the broader Autopilot investigation, only two drivers had received any warnings in the five seconds prior to the crash, even though they were likely disengaged before, during, and after that window. The wheel-torque requirement registers drivers as having their hands on the wheel as they careen toward and eventually hit static objects.

Now another item to put into context here is Elon previously, and incorrectly, dismissing the use of cabin cameras for eye tracking -- that opinion very quietly slunk into the shadows, and Tesla began building in the functionality.



Many new cars nowadays are rolling off the assembly line with IR cameras that track driver eye movement even when ADAS are not in use. If you turn on ADAS in those vehicles, additional checks and balances are activated.

How does Tesla fix this? Are the current cabin cameras up to the task? What about earlier Tesla vehicles on the road with AP but without cabin cameras? These are the things I would be wondering, because the NHTSA won't try to stop Autopilot from functioning, but they will want to research and build in better ways to measure and enforce driver engagement, which is seriously lacking right now.

Despite the Tesla cabin cameras never being originally intended for this use, as Elon stated before in other tweets as well, their existence is a saving grace. If all Tesla vehicles were running Autopilot without cabin cameras, I think there would be a much higher risk of the functionality being severely hamstrung or disabled due to lack of an effective means of ensuring driver engagement.
 
I don’t get it. I’ve been driving cars with ADAS systems since 2015, 25,000-30,000 miles a year (Teslas for the last two years). I haven’t rear-ended a bus, dump truck, low-boy trailer, fire truck, ice cream truck, etc. while using the ADAS systems. Maybe because I, I don’t know, PAY ATTENTION???
That's at least part of the issue, though. Tesla's current approach to ensuring that drivers actually pay attention is easily defeated by hanging a weight on the steering wheel. We've all seen the stories (some well-documented, with video) of people who take naps on the highway while in the driver's seat of a Tesla, and at least some of the incidents that the NHTSA is investigating are probably caused by such inattention. Some people do actively attempt to defeat the system. Others become too reliant on it, letting their attention lapse when it shouldn't.

Yes, it sucks that the people who misuse Autopilot may (if things go very badly) cause it to be taken away from those who do use it properly. But do you know what sucks worse? Having your life taken away because somebody didn't use Autopilot properly. IMHO, the NHTSA's investigation is well-justified. Some other automakers have more sophisticated driver-monitoring technologies than what Tesla uses.

One potential piece of good news is that Tesla has developed a second attention-verifying technology, which uses the interior camera to monitor the driver. This is already being used, although if I understand correctly it's only being used in the latest FSD beta builds. I've seen complaints in the FSD beta threads that this camera-based monitor may sometimes produce false alarms, believing that drivers are not paying attention when in fact they are. It's also possible that there would be a significant miss problem, where the system believes drivers are paying attention when they aren't. Time will tell how effective this system is; but if it looks promising, and NHTSA agrees, it seems likely that Tesla will simply activate this system on all Autopilot-equipped vehicles. (At least, those with interior cameras; I don't know if they exist on older Model S and X vehicles.) This isn't the worst-case outcome of the NHTSA investigation, in terms of our access to Autopilot, but I'm skeptical that it would get worse than this.
 
We have the interior camera monitoring my gaze in one car (with FSD beta). I’d say the system works “okay”. All of these systems can be defeated, because… humans. 😂 I’d say go ahead and turn this on for all Teslas with cabin cameras (older S & X don’t), but that’s still not going to guarantee anything. Maybe it’ll help/improve attentiveness? Depends on how intent people are on defeating such babysitters: It's Not Just Tesla: All Other Driver-Assist Systems Work without Drivers, Too
 
The wheel tug only determines that the driver is awake. It does not prevent the driver from using a phone, eating lunch, turning around to yell at the kids, or even closing their eyes.

Assuming that regulators pressure Tesla into improving driver monitoring, I expect Tesla will integrate use of the interior camera for all assisted driving functions, as is done with FSD beta.
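As a thought experiment, here is roughly how a blended monitor might escalate. Everything in this sketch (the signal names, the thresholds, the action labels) is made up for illustration; it is not Tesla's actual logic:

```python
# Hypothetical sketch of blended driver monitoring. The thresholds and
# signal names are illustrative assumptions, not Tesla's implementation.
from dataclasses import dataclass


@dataclass
class MonitorState:
    seconds_since_torque: float  # time since steering-wheel torque was last detected
    gaze_on_road: bool           # camera's estimate that eyes are on the road
    gaze_confidence: float       # 0.0-1.0 confidence in that gaze estimate


def attention_action(state: MonitorState) -> str:
    """Escalate based on the stronger evidence of inattention.

    A weight on the wheel defeats the torque check, so a confident
    camera reading that eyes are off the road overrides it.
    """
    camera_says_distracted = (not state.gaze_on_road) and state.gaze_confidence >= 0.8
    torque_stale = state.seconds_since_torque > 30.0

    if camera_says_distracted and torque_stale:
        return "disengage"            # slow down and require driver takeover
    if camera_says_distracted or torque_stale:
        return "visual_and_audio_nag"
    if state.seconds_since_torque > 15.0:
        return "visual_nag"
    return "ok"
```

The point of blending the two signals is that a wheel weight defeats the torque check but not the camera, while a low-confidence camera frame (sunglasses, low light) falls back on the torque check rather than nagging an attentive driver.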
 
We have the interior camera monitoring my gaze in one car (with FSD beta). I’d say the system works “okay”. All of these systems can be defeated, because… humans. 😂 I’d say go ahead and turn this on for all Teslas with cabin cameras (older S & X don’t), but that’s still not going to guarantee anything. Maybe it’ll help/improve attentiveness? Depends on how intent people are on defeating such babysitters: It's Not Just Tesla: All Other Driver-Assist Systems Work without Drivers, Too
Humans can defeat any system, but the idea is to make it so onerous that almost nobody will bother, thus reducing risk. Wheel torque is obviously not sufficient: it does not accurately gauge driver attentiveness, and it is circumvented by something as simple as a little weight taped to the steering wheel.

All cars should have driver eye tracking even when ADAS is not in use; distracted driving is a massive problem nowadays, and a solution through truly autonomous vehicles is likely years away. And it's not about the risk to the driver -- people can risk their own lives all they want -- but other road users, pedestrians, etc. are a different story.

Tesla vehicles truly can be ahead of most others here and really improve safety because of the cabin cameras and software capabilities; the only downside is angering people who want to engage in risky behavior -- I'd say that's a small price to pay.
 
Humans can defeat any system, but the idea is to make it so onerous that almost nobody will bother, thus reducing risk. Wheel torque is obviously not sufficient: it does not accurately gauge driver attentiveness, and it is circumvented by something as simple as a little weight taped to the steering wheel.

All cars should have driver eye tracking even when ADAS is not in use; distracted driving is a massive problem nowadays, and a solution through truly autonomous vehicles is likely years away. And it's not about the risk to the driver -- people can risk their own lives all they want -- but other road users, pedestrians, etc. are a different story.

You mean people like this person who posted here, complaining that the car was nagging them to keep their eyes on the road, and also stated "I am a commercial driver, I am a good driver, I dont need to look at the road all the time"?

I have been driving for 40 years without an accident or ticket. I’m a commercial driver. Just because my eyes aren’t glued to the road while on Autopilot does not mean I will hurt anyone or crash. I can assure you that I am a very safe driver. I have about 10k miles on Autopilot and never once has it done anything dangerous or out of the ordinary. I have never had to take control.
 
The weird thing is cars have had dumb cruise control forever. With zero driver alertness monitoring. It will plow right into anything without even a care.

Why wouldn't NHTSA make that get removed from all cars? Where's the outrage?
It's not one extreme or the other; it doesn't need to be all or nothing. Cruise control obviously has big benefits in terms of reducing driver fatigue by simply modulating speed; nobody believes it will drive for them, and the benefits are worth the risks that do exist. But people are not activating regular dumb cruise control and then climbing into the back seat.

Some people become so overconfident and complacent with Autopilot that they feel comfortable disengaging from the driving task, and they'll even work to circumvent safeguards in place so they can disengage further while the vehicle drives. That's what the NHTSA is talking about in terms of exacerbating bad driver behavior, and that's what needs to be reined in.

It doesn't need to be all or nothing with Autopilot either, there exists a happy medium where we benefit from Autopilot while also not unduly risking drivers becoming so disengaged that they let the car drive itself into static objects.
 
The weird thing is cars have had dumb cruise control forever. With zero driver alertness monitoring. It will plow right into anything without even a care.

Why wouldn't NHTSA make that get removed from all cars? Where's the outrage?
"Dumb" cruise control does not make a driver "zone out" to the same extent as Autopilot/FSD can (although I've never zoned out with Autopilot on because it scares the crap out of me, puts me on high alert as a driver, and I don't use it very often). But I can say that I've driven cars with RADAR cruise control that automatically controls following speed (bear in mind that I still had to steer the vehicle, just didn't need to use the accelerator or brakes nor adjust the speed manually). That's mostly okay, but after about 35-40 miles of this, one time a vehicle started signaling into my lane and of course this "dumb" RADAR cruise control did not react to seeing this signal like a human would. And because I had become used to not controlling the speed of the vehicle, neither did I, until the vehicle started cutting into my lane and I had to slam on the brakes.

Would it have been my fault if there was a collision? No, the other driver was making the lateral movement. But the cruise control combined with about 30 minutes of automation dependency in speed control let me get into a situation that I would never have gotten myself into had I been driving manually.
 
The weird thing is cars have had dumb cruise control forever. With zero driver alertness monitoring. It will plow right into anything without even a care.

Why wouldn't NHTSA make that get removed from all cars? Where's the outrage?
The NHTSA should force Tesla to add dumb cruise control, and then we would be able to compare the collision rate of dumb cruise control vs. Autopilot.
I guess Tesla already has the collision rates for TACC alone vs. TACC plus Autosteer. That would be interesting to know.
 
So, is it the technology's fault, or the driver's fault? If the technology has a flaw that's causing accidents without the driver's ability to avoid it, then I'm behind the NHTSA investigation and a recall. For example: AP is engaged and heading directly at something, like a crashed vehicle or emergency vehicle, and the driver attempts to take over to avoid it but AP refuses to disengage... That's a serious problem.

If this is just stupid humans, not paying attention, then how is it different than any stupid human driving? Countless videos of idiot humans texting and driving, plowing into other cars. Hell, I personally got rear ended years ago at a red light by an idiot claiming he dropped jelly beans and bent over to pick them up and didn't see the red light or me. Why should Tesla be punished for those idiots?
 
So, is it the technology's fault, or the driver's fault? If the technology has a flaw that's causing accidents without the driver's ability to avoid it, then I'm behind the NHTSA investigation and a recall. For example: AP is engaged and heading directly at something, like a crashed vehicle or emergency vehicle, and the driver attempts to take over to avoid it but AP refuses to disengage... That's a serious problem.

If this is just stupid humans, not paying attention, then how is it different than any stupid human driving? Countless videos of idiot humans texting and driving, plowing into other cars. Hell, I personally got rear ended years ago at a red light by an idiot claiming he dropped jelly beans and bent over to pick them up and didn't see the red light or me. Why should Tesla be punished for those idiots?
What you're describing would 100% result in the technology being totally disabled, and likely other legal action taken against Tesla; that's not what the NHTSA wants here.

It should be noted that this original investigation into Autopilot crashing into first responders was likely a reaction to Tesla being sued by those first responders -- there are a few of these lawsuits in the system right now from police officers and others seeking compensation for damages/injuries sustained in crashes where drivers were not paying attention and Autopilot was active.

It's not "punishment" for Tesla, this is legitimately what the safety regulator's job is: to put a leash on the idiots and reasonably protect the broader public.