Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Begrudgingly “Recalls” FSD Beta for NHTSA

I'm sure this will be a sticky on all of the vehicle forums shortly:


(moderator note: related threads here…)
FSD Recall? in Software
Recall FUD in UK


"Full Self Driving Tesla" by rulenumberone2 is licensed under CC BY 2.0.
Admin note: Image added for Blog Feed thumbnail
 
I interpret it the other way: FSD doesn't prevent the driver from exceeding the speed limit.
It's a designed-in issue that it doesn't slow down with FSD, even though it works great with AP.
AP has had the 5 mph over-limit offset for years, but FSD threw it away.

For good reason, of course. FSD appears to be much more aggressive at speed limit sign detection, and tends to get signs wrong with alarming regularity. If it didn't, we'd all be driving with our foot on the gas pedal because the AI hallucinated that it was driving a truck or pulling a trailer. 🙃

One of my biggest complaints about FSD beta is that it ignores school speed zones. That alone should trigger a NHTSA recall.

That is a nearly unsolvable problem. How do you handle signs that say "Speed limit when children are present"? Nope, not going to happen any time soon. Nor are cars realistically going to be able to handle "Speed limit from 7:30 to 8:30 and 2:30 to 3:30 on school days" signs, because the car has no idea whether school is in session or off for the holidays/summer/weekend/*.

School zones are one of those situations where, if cities want autonomous cars to follow those restrictions reliably, they're going to have to upgrade the signs with flashing yellow lights indicating that the speed limit is active. And really, such requirements ought to become federal road standards, with states, counties, and cities strongly encouraged (read "threatened with defunding") to follow them.
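For what it's worth, the clock half of a conditional school-zone limit is the easy part; it's the "school days" predicate that a car can't observe from the road. A minimal sketch (all names, times, and limits here are hypothetical):

```python
from datetime import datetime, time

# Hypothetical time windows from a "7:30-8:30 and 2:30-3:30 on school days" sign.
SCHOOL_WINDOWS = [(time(7, 30), time(8, 30)), (time(14, 30), time(15, 30))]

def school_zone_limit(now: datetime, base_limit: int, zone_limit: int,
                      school_in_session: bool) -> int:
    """Return the active limit. The clock check is trivial; `school_in_session`
    is the hard part: it depends on the district calendar, which the car
    cannot observe from the sign."""
    in_window = any(start <= now.time() <= end for start, end in SCHOOL_WINDOWS)
    if in_window and school_in_session:
        return zone_limit
    return base_limit

# A school-year morning vs. the same clock time during summer break:
print(school_zone_limit(datetime(2023, 3, 1, 8, 0), 40, 25, True))   # 25
print(school_zone_limit(datetime(2023, 7, 1, 8, 0), 40, 25, False))  # 40
```

The `school_in_session` flag is exactly the information that isn't on the sign, which is why a flashing-beacon upgrade sidesteps the whole problem.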


Musk has an uncontrollable propensity for exaggeration. FSD and Santa Claus have always been in the same league: myths. FSD has struggled -- and failed -- to achieve L3 autonomous driving. It will never attain L4 or L5 autonomy -- or "FSD" as Musk has peddled it. First, the entire traffic infrastructure (roads, traffic lights, etc.) would need to be updated to deploy geofencing, which in turn would enable vehicles to reliably communicate with each other -- and with the road.

Umm... no. Even ignoring the fact that GPS is more than adequate for geofencing (with the exception of tunnels, for which you kind of need other solutions), there is absolutely no benefit whatsoever from cars communicating with one another or with the road in the general case. The reason that the industry has repeatedly shot down proposals for such communication is that the whole idea is fundamentally contrary to safety and security.
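To illustrate the geofencing point: with nothing more than a GPS fix, a car can test whether it is inside a zone using a simple great-circle distance check, with no roadside hardware involved. A rough sketch (the zone center and radius below are made-up values):

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def inside_geofence(lat, lon, center_lat, center_lon, radius_m):
    """True if the GPS fix falls within a circular geofence."""
    return haversine_m(lat, lon, center_lat, center_lon) <= radius_m

# ~111 m per 0.001 degrees of latitude, so this point is inside a 500 m fence:
print(inside_geofence(37.001, -122.0, 37.0, -122.0, 500))  # True
```

Real geofences would be polygons rather than circles, but the point stands: the check needs only the car's own position fix.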

If you actually want any level of safety at all, cars have to be able to drive safely using only what they can see or otherwise perceive directly. Almost without exception, relying on any sort of communication with other cars or road infrastructure breaks that assumption and creates an unfixable security hole.

If you have vehicle-to-infrastructure communication, any bozo who could steal one of the boxes would be able to send out a fake signal that says "The speed limit is 90 miles per hour" right before a school, and cars would follow it. If you have vehicle-to-vehicle communication, any bozo could easily obtain one of those boxes and send out a signal that says "I'm braking hard," and cause traffic to pile up on the freeway. And there is absolutely no feasible way to prevent those sorts of tampering for precisely the same reason that unbreakable DRM is impossible.

The closest you will ever see to communication between vehicles would be pushing notifications about potholes, ice, and other problems up to a server, such that if a large enough number of vehicles report similar behavior, the server can recognize that the reports are probably legitimate and tell cars to slow down a little bit. And even then, it will involve heavy server-side filtering to require adequate concurrence between devices that show evidence of moving independently. Otherwise, you'll have incidents like that guy pulling a wagon full of cell phones who caused fake traffic jams on Google Maps.
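The server-side concurrence filter described above might look something like this in miniature: trust a hazard only once enough distinct devices report it in the same map cell. This is a hypothetical sketch (the field names, threshold, and grid-cell scheme are all assumptions, and a real system would also verify that the reporting devices show evidence of moving independently):

```python
from collections import defaultdict

# Hypothetical: minimum number of distinct reporters before a hazard is trusted.
MIN_INDEPENDENT_REPORTS = 5

def confirmed_hazards(reports):
    """reports: iterable of dicts like
    {"cell": (grid_x, grid_y), "kind": "pothole", "device_id": "abc"}.
    Counting *distinct* device IDs per map cell blunts a single spoofed
    sender; independence-of-motion checks would go on top of this."""
    seen = defaultdict(set)  # (cell, kind) -> set of device IDs
    for r in reports:
        seen[(r["cell"], r["kind"])].add(r["device_id"])
    return [key for key, devices in seen.items()
            if len(devices) >= MIN_INDEPENDENT_REPORTS]

# One device spamming 100 reports is ignored; five distinct devices are not:
spam = [{"cell": (1, 1), "kind": "ice", "device_id": "attacker"}] * 100
legit = [{"cell": (2, 2), "kind": "pothole", "device_id": f"car{i}"} for i in range(5)]
print(confirmed_hazards(spam + legit))  # [((2, 2), 'pothole')]
```

Note that without the independence check, the wagon-full-of-phones trick still works, since each phone is a distinct device moving along the same path.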


So, don't be so gullible when Musk makes claims like Robotaxi launching by the end of 2020, etc., etc., etc.
This part I agree with. Any attempt to nail down an exact date when things will be good enough is pretty much guaranteed to be wrong.

Tesla did do a recall to fix the rolling stop


However, that seems to have been another delay tactic, because rolling stops are still an issue in this latest recall.
It isn't a general problem, but there are specific stop signs mine tends not to stop for, usually right after turning onto the road in question -- e.g., turning out of a row of cars into a road-like area in a parking lot and immediately encountering a stop sign before leaving the lot for an actual street. My guess is that it incorrectly thinks those signs don't apply to it because of their position.
 
Last I heard, the California bridge crash was confirmed to be FSDb. The same thing could happen on a freeway overpass/underpass, and in a few other ways we probably haven't considered.
Link? Articles that made this claim last month incorrectly interpreted a routine NHTSA report, which only shows that some sort of ADAS was engaged within 30 seconds of the crash. So if the person had engaged TACC or AP within 30 seconds of the crash, it would still be in that report:
SF Bay Accident Surveillance Footage

Or are you referring to a newer report that says that FSD Beta specifically was engaged?
 
There is no beta of EAP.

Correction: they don't work FOR YOU. Obviously there are some people out there for whom FSD works. I have EAP, and while I don't use the NoA part of it, I REALLY appreciate having the lane-change capability and the (dumb) Summon feature to pull my car out of and into my garage. So, EAP works for me.
That's just Tesla's legal loophole to avoid litigation: everything is considered "Beta." This is different from FSDb.
I figured as much, but I was replying to Kev. My original question about the recall affecting EAP (whether beta or otherwise) seems to have answered itself, with no update showing on my car. Whew, crisis averted.
 
Link? Articles that made this claim last month incorrectly interpreted a routine NHTSA report, which only shows that some sort of ADAS was engaged within 30 seconds of the crash. So if the person had engaged TACC or AP within 30 seconds of the crash, it would still be in that report:
SF Bay Accident Surveillance Footage

Or are you referring to a newer report that says that FSD Beta specifically was engaged?
NotebookCheck reported "Full Self-Driving Beta (FSD)" but its abbreviation is missing the "b."

 
NotebookCheck reported "Full Self-Driving Beta (FSD)" but its abbreviation is missing the "b."

That's one of the incorrect articles... It referenced a CNN article that has since been corrected to say it was referencing the same standing-order NHTSA report I linked, which only means some sort of ADAS was active within 30 seconds of the crash. That says nothing about whether FSD Beta was active. Even if only TACC was active 30 seconds before the crash, it would still have been included in that report.

This shows how misinformation can spread on the internet and not get corrected.

You aren't the OP, however, so I'll wait for their response to see if they are referencing another, newer report based on newer investigation results.
 
Keep in mind that Tesla needs FSD to make mistakes in order to learn from them. Having highly accurate map data would just prevent the NN from learning how to deal without it.

As long as FSD is in beta (thus requiring L2 driver attention) there is no benefit to Tesla in adding better sensors, accurate maps, or other crutches.

There's a difference between a high-res map and a shitty map with outright wrong speed limits. On regular AP I would love to turn the whole feature off so I don't have to keep the accelerator pressed for long sections of my trip, because the limit in the map data is way lower than the actual posted limit.
 
That's one of the incorrect articles... It referenced a CNN article that has since been corrected to say it was referencing the same standing-order NHTSA report I linked, which only means some sort of ADAS was active within 30 seconds of the crash. That says nothing about whether FSD Beta was active. Even if only TACC was active 30 seconds before the crash, it would still have been included in that report.

This shows how misinformation can spread on the internet and not get corrected.

You aren't the OP, however, so I'll wait for their response to see if they are referencing another, newer report based on newer investigation results.
Good to know.

I tried to do some quick digging. Apparently NHTSA is still investigating, so no conclusions are available there. Given the scale of the accident and the Model S driver being a lawyer, there will likely be a few lawsuits, and one could imagine Tesla will try to keep the details and settlement info quiet. The only hope is that NHTSA's special investigation is outside the reach of the courts and Tesla, but I couldn't find anything even in NHTSA's investigation search.

Drawing an inference: The Model S driver states FSD was engaged -- check. Presumably Tesla has black-box data, yet Tesla/Elon haven't publicly stated FSD was not engaged -- check. The Model S behaved characteristically like a car in an FSD-initiated left turn, with a last-second hard brake, a last-second lane change, and a stop with its right-side wheels straddling the outer lane line -- check. My pure speculation is that FSD will be shown to be a factor in the crash.
 
Good to know.

I tried to do some quick digging. Apparently NHTSA is still investigating, so no conclusions are available there. Given the scale of the accident and the Model S driver being a lawyer, there will likely be a few lawsuits, and one could imagine Tesla will try to keep the details and settlement info quiet. The only hope is that NHTSA's special investigation is outside the reach of the courts and Tesla, but I couldn't find anything even in NHTSA's investigation search.

Drawing an inference: The Model S driver states FSD was engaged -- check. Presumably Tesla has black-box data, yet Tesla/Elon haven't publicly stated FSD was not engaged -- check. The Model S behaved characteristically like a car in an FSD-initiated left turn, with a last-second hard brake, a last-second lane change, and a stop with its right-side wheels straddling the outer lane line -- check. My pure speculation is that FSD will be shown to be a factor in the crash.
Well, we know at least AP or TACC was engaged leading up to the crash, so Elon can't exactly tell the public FSD was not involved if, for example, the driver had the FSD option package and used that to claim "FSD" was engaged.

This was discussed in other threads; there are a couple of options:
1) FSD Beta was engaged and stayed in that mode through the crash (not in the highway NoA mode). This is what all the articles implied, but it has not been proven.
2) FSD Beta was engaged before entering the bridge, and it was in NoA mode through the crash. A test by an owner for KTVU shows a Model 3 with FSD Beta switching to NoA mode on the bridge in the exact same lane (see the UI in the video). This is what most experienced Tesla owners argued even before that test: FSD Beta mode simply does not run on that bridge; it runs NoA, which is the same as if you had EAP and not FSD.
3) TACC / AP / NoA was engaged through the crash, but the owner had the FSD package, so he claimed "FSD" was engaged.
4) TACC / AP / NoA / FSD Beta was engaged but was disengaged (or partially disengaged, like doing a manual lane change with TACC still on) before the crash, either by the driver or automatically.
 
[funny] In all likelihood there have been more "characters" typed and wasted on this thread (including by me) than the number of characters that will need to be changed in the FSD Beta code to implement the recall. :eek: :D

Also, after it is pushed out, I bet that about 70% will notice no more than subtle differences and will be indifferent to the nuanced changes in driving behavior. 10% will have mixed feelings about the changes (love some, hate some), 10% will LOVE every change, and 10% will HATE them all. Of course the 20% minority will make the most noise, as usual. Wonder where I will fall? 🤔 Only time will tell. 🤣
 
This "recall" news seems negative, but it actually has positive effects. It makes the public (or at least me) feel more confident in FSD, because NHTSA and Tesla are working together to make FSD safer. There was other news from NHTSA last week that should also give the public confidence: NHTSA determined that the 2021 Texas accident that killed two people was not caused by Autopilot. It was caused by the driver's recklessness.
 
[funny] In all likelihood there have been more "characters" typed and wasted on this thread (including by me) than the number of characters that will need to be changed in the FSD Beta code to implement the recall. :eek: :D

Also, after it is pushed out, I bet that about 70% will notice no more than subtle differences and will be indifferent to the nuanced changes in driving behavior. 10% will have mixed feelings about the changes (love some, hate some), 10% will LOVE every change, and 10% will HATE them all. Of course the 20% minority will make the most noise, as usual. Wonder where I will fall? 🤔 Only time will tell. 🤣
You forgot to say that 99% will feel vindicated in whatever they thought before.