
Tesla recalls 2 million vehicles to limit use of Autopilot

It amazes me that, of the many crashes on roads and highways investigated by NHTSA, they only want Tesla to do recalls.

I don't believe any other automaker claims that its ADAS features work everywhere while simultaneously admitting, in its own manuals, that they actually do not.
Sooner or later, that marketing vs. reality bluff was going to get called.
I guess that moment has arrived.

As surprising as sunrise!

Oh no, they ticked off Musk's mom. Now it's the FCC.

Great.
Now we know from whom Elon gets his paranoid streak.

There are multiple reasons for this:
1) Tesla is the only automaker that instantly knows when a crash happens with ADAS engaged, for practically any incident.

That's false.
Virtually every OEM in the US has some form of telematics and connected car integration. Some are way more sophisticated than Tesla's, some less.
This capability started being deployed back in the early 2000s.

2) Tesla is the only (relatively large) automaker that does regular non-infotainment OTAs, so the ease of implementing a recall like this is very high. Thus NHTSA is more likely to call for one because it's not as hard for Tesla to do one.

I don't recall where it says NHTSA considers ease of remediation as a variable in deciding to initiate a safety recall (pun intended!).
If you have evidence of that, please do share.
Otherwise, let's not make stuff up.

 
There are multiple reasons for this:
1) Tesla is the only automaker that instantly knows when a crash happens with ADAS engaged, for practically any incident. Other automakers have to rely on other forms of reporting, and those do not always indicate whether ADAS was active. The only system that probably comes close is OnStar, but even then I'm not sure whether the system reports to GM if you don't buy the subscription, nor whether it reports that ADAS was active in a crash.

2) Tesla is the only (relatively large) automaker that does regular non-infotainment OTAs, so the ease of implementing a recall like this is very high. Thus NHTSA is more likely to call for one because it's not as hard for Tesla to do one.

Mo' money mo' problems... Mo' data mo' problems...

Maybe they should've just left it locally in a 'black box' on the car, for use in the event of an accident? Ignorance is bliss. Hard to provide data you don't have...
 

Whose fault is this?



 
I don't believe any other automaker claims that its ADAS features work everywhere while simultaneously admitting, in its own manuals, that they do not.
Sooner or later, that marketing vs. reality bluff was going to get called.
I guess that moment has arrived.

As surprising as sunrise!



Great.
Now we know from whom Elon gets his paranoid streak.
Except Tesla doesn't claim it works everywhere (as it relates to AP), and Tesla already restricts where you can activate it (the steering wheel icon is not available).

That's false.
Virtually every OEM in the US has some form of telematics integration.
Most do more emergency assistance than Tesla (pro-active emergency service notification, voice call routing to 911, etc.). Tesla can't.
That's false according to the NHTSA standing order submissions so far. They don't have the ability to instantly report to the manufacturer when a crash happens with ADAS on; the reports come in through much more roundabout ways (like police or owner reports). Calling 911 is a completely different thing. The issue is the ability to determine that ADAS was active in a crash and automatically report that to the automaker. Given the order, even if they can implement it, they most likely will not want to do so now (as another poster pointed out, ignorance is bliss).
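Just to illustrate the kind of data being argued about: below is a minimal, purely hypothetical sketch of an "ADAS was active at crash time" record of the sort the standing-order reporting implies. Every field name and value is invented for illustration; this is not Tesla's telemetry format or NHTSA's actual schema.

```python
# Hypothetical sketch only: the kind of "ADAS active at crash time" record
# discussed above. Field names and values are invented; this is not Tesla's
# telemetry format or NHTSA's standing-order schema.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class AdasCrashReport:
    vin: str                      # which vehicle
    crash_time_utc: str           # when the crash event was logged
    adas_engaged: bool            # was an ADAS feature active at the time?
    adas_feature: str             # which feature, if any (e.g., "Autosteer")
    speed_mph: float              # speed at the moment of the event
    reported_automatically: bool  # telemetry push vs. police/owner report


report = AdasCrashReport(
    vin="5YJ3E1EA0XX000000",
    crash_time_utc=datetime.now(timezone.utc).isoformat(),
    adas_engaged=True,
    adas_feature="Autosteer",
    speed_mph=42.0,
    reported_automatically=True,
)

# An automaker with this data could push it automatically; one without it is
# stuck waiting on police or owner reports, which is the point being made above.
print(json.dumps(asdict(report), indent=2))
```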
I don't recall where it says NHTSA considers ease of remediation as a variable in deciding to initiate a safety recall (pun intended!).
If you have evidence of that, please do share.
Otherwise, let's not make stuff up.

It's not an explicit deciding factor, but ease of implementation is a frequently cited factor in how recalls are carried out (see how the Bolt recall was handled, for example). It doesn't get much easier than an OTA.
There are plenty of cars with much laxer driver monitoring where people have done dangerous stunts, yet NHTSA does not call for a recall, an example being the Infiniti from years ago:
 
NHTSA. This isn't about safety. If it were, there is so much more to overregulate first: the oversized trucks exempt from certain safety standards. Why should phones let you use them while traveling at car speeds? Why do modern cars even allow speeding? With GPS and acceleration by wire, cars shouldn't allow you to speed, ever. Then on to the more ridiculous: why can I enable dumb cruise control on back streets? Why aren't there accelerator nags to make sure you aren't just putting a brick on the accelerator?

I know I'm being a bit hyperbolic, especially toward the end, and to be clear, I'm not pro-overregulation. I don't want speed governors, geofencing, etc. I'm mad that this is about politics: about applying rules unequally, with the excuse of protecting a handful of useful idiots, to distract us from what's actually happening here, from the rot infesting our government.
For a while now I haven't been convinced that NHTSA puts safety first.

 
FYI, you don't have to install the coming software update.

I still own a 2015 Model S P85D with the original Autopilot (version 1) that, when first enabled by software in late 2015, actually allowed HANDS-FREE driving. Unlike Tesla's V2 and later Autopilot, V1 behavior (based on a Mobileye chipset) was very predictable, so you learned where it drove well vs. where it couldn't follow the road very well (e.g., on tight curves or over hills).

After NHTSA forced Tesla to remove the hands-free feature of V1 via a software update, I managed to go 4 years without updating to it. New software updates automatically showed up on my Model S, but I refused to install any of them. I got one reminder every morning to install the software, but once I dismissed that message any further restarts that day occurred normally without any reminders.

Ultimately after 4 years I was forced to update my car's software because 1) Tesla changed the communications protocol between the car and its servers, and 2) I had a failure of my rear drive motor necessitating re-installation of the vehicle software by the Tesla service center (covered under warranty). Not only did newer software remove hands-free driving, but unfortunately Tesla also "nerfed" all Model S P85 software to reduce the charging rate at Superchargers (to make sure the battery lasted 8 years under their warranty).

Looks like I may again have to start ignoring Tesla software updates...
 
I wonder how this will be materially different than the current SOP:

If the little grey steering wheel icon (at the top, near the speed readout) shows up, that means I can double tap the shifter, which will turn that icon blue and get Autosteer.
My guess is that the "single pull to AP", bypassing TACC, is part of the solution. I know that I sometimes even missed that it only engaged TACC and not Autosteer/FSDb as well, until it didn't steer when it should have. "Single pull" completely resolves that, as long as people choose to use it.
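For anyone who hasn't hit that TACC-only surprise, here's a toy sketch of the two engagement behaviors described above; the function names and logic are invented purely for illustration, not Tesla's actual implementation.

```python
# Toy illustration of the engagement behaviors described above. Names and
# logic are invented; this is not Tesla's actual code.

def engage_two_pull(pulls: int) -> set:
    """Old behavior: the first pull engages only TACC; a second pull adds Autosteer."""
    active = set()
    if pulls >= 1:
        active.add("TACC")
    if pulls >= 2:
        active.add("Autosteer")
    return active


def engage_single_pull(pulls: int) -> set:
    """'Single pull' behavior: one pull engages both, so you can't end up
    with cruise control but no steering by mistake."""
    return {"TACC", "Autosteer"} if pulls >= 1 else set()


# One pull under the old behavior leaves you without Autosteer...
assert engage_two_pull(1) == {"TACC"}
# ...while one pull under 'single pull' gets you both.
assert engage_single_pull(1) == {"TACC", "Autosteer"}
```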

Tesla really needs to get a statement out clarifying what's going to happen, but they may not fully know yet, or might still be trying to negotiate behind the scenes.
Umm, they have been installing software since 12/7 that resolves the recall. (And the fleet of HW3 w/in-cabin camera cars were supposed to start getting 2023.44.30 starting yesterday.) So yes, Tesla knows what has changed.
 
It's not an explicit deciding factor, but ease of implementation is a frequently cited factor in how recalls are carried out (see how the Bolt recall was handled, for example). It doesn't get much easier than an OTA.
There are plenty of cars with much laxer driver monitoring where people have done dangerous stunts, yet NHTSA does not call for a recall, an example being the Infiniti from years ago:
Recalls are a regulatory process, a process Tesla has been involved with for years now. It's not like they were blindsided by this crash data (as you pointed out, they get and share this data). Tesla could have acted and either 1. admitted that AP is not suited for all the conditions it can be activated in, or 2. voluntarily made these changes years ago. But no, they stomped their feet and ignored it. Tesla motto: fake it until you make it. As long as the lie isn't big enough that people will stop buying the car, they feel like they can tell it. This goes for FSD being "available at the end of the year" for four years, or the range being purposely over-reported. Tesla is relying on people to argue "well, look at the big picture"; yeah, and that picture now includes over 1,000 AP crashes and multiple dead. So Tesla could have acted, they didn't, and a full recall was ordered.
 
I'm mostly concerned with the interior camera, which I have kept covered since my first Model 3 in 2019. If they are going to start monitoring faces to use AP, then I want nothing to do with it. I am ignoring all updates until we find out what behaviors it changes.
Agreed 100%. I also cover my camera. I have no intention of allowing the 20-somethings at Tesla to watch me whenever they want, regardless of what the official stance is from Tesla on these cameras. Additionally, I need AP on some non-highway roads that are technically not limited access but absolutely perfect for AP/autosteer in my usage. Losing those roads takes away a huge reason why I bought the car. I too will be ignoring all updates until I know exactly what's going on.
 
Agreed 100%. I also cover my camera. I have no intention of allowing the 20-somethings at Tesla to watch me whenever they want, regardless of what the official stance is from Tesla on these cameras. Additionally, I need AP on some non-highway roads that are technically not limited access but absolutely perfect for AP/autosteer in my usage. Losing those roads takes away a huge reason why I bought the car. I too will be ignoring all updates until I know exactly what's going on.
This was a factor in me choosing to buy a used 2017 MS this time instead of a new one. I had the interior camera in my previous M3, and I had it covered from day 1.
 
1,000? I'm only seeing 11.

Chronology:
- On August 13, 2021, NHTSA opened a Preliminary Evaluation (PE21-020) to investigate eleven incidents involving stationary first-responder vehicles and Tesla vehicles that were operating with Autosteer engaged.
 
They can and probably will force the update. I believe they did that in Hong Kong a few years back, but I don't recall the details. I've had updates applied that I am 100% sure I didn't approve.
My experience (owning 2 Teslas for nearly 13 years total) says otherwise. Since Tesla doesn't own my cars, it would seem to be illegal for Tesla to modify them without my permission as owner.

Legally, a recall is a voluntary action that takes place because manufacturers and distributors carry out their responsibility to protect the public health and well-being from products that present a risk of injury or gross deception or are otherwise defective. Statistics compiled by NHTSA show that only about 70% of recalled vehicles are actually brought in for the repair once the word gets out. That still means, however, that 30% of vehicles under a recall notice are never returned by their owners.
 
Sure. But the CNN headline as it is written is clickbait and misleading. It preys on fear.
The Feds are at fault with the semantics, not CNN. “Recall” is still the word for it. I think everyone here is pretty much agreed that OTA should have its own, less-alarming term in the law, but it doesn’t yet.
The semantics of this have been discussed endlessly on the forum. The point with this one is larger, though, than the typical minor OTA thing. This is a big limiter on our Autopilot and a huge shift within the industry, regulation, etc. I think at some point the feds, the judges, the industry, the lawyers, etc. are going to have to agree with Tesla's main point on this stuff: the driver has a responsibility to keep their hands on the wheel and stay alert.
If not, there can be and are accidents even on the finest divided interstate in broad daylight and perfect conditions.
If the feds and courts can’t agree on that, autopilot and even pretty basic lane keeping can’t really exist.
 
OTA is correct. See the DOT letter that’s linked in the article.

I think the news here is that Tesla is going to limit the use of FSD.
I don't see FSD called out - I see Autopilot, and specifically Autosteer (which is available in beta for free with AP, not FSD), as the culprit here. Perhaps I'm missing something, but I don't see how this impacts the full FSD stack per se.