Tesla Begrudgingly “Recalls” FSD Beta for NHTSA

I'm sure this will be a sticky on all of the vehicle forums shortly:


(moderator note: related threads here…)
FSD Recall? in Software
Recall FUD in UK


"Full Self Driving Tesla" by rulenumberone2 is licensed under CC BY 2.0.
Admin note: Image added for Blog Feed thumbnail
 
Since FSD was recalled for not being perfect, should other driver assist tools also be recalled for not being perfect? For example none of the other autosteer systems work well in sharp turns. Should we demand the NHTSA play fair and recall those?
 
Since FSD was recalled for not being perfect, should other driver assist tools also be recalled for not being perfect? For example none of the other autosteer systems work well in sharp turns. Should we demand the NHTSA play fair and recall those?
For sure, if they're breaking the law. Some of the other lane-assist features aren't designed to steer the way a full self-driving system would, since they don't apply enough steering force or handle enough speed. I think Tesla designed Autosteer to handle normal roadway turns, although admittedly it doesn't always work.
 
Quite the opposite, actually. FSD benefits from trying to predict what it thinks it should do and having a trusted source of data let it know when it’s wrong. That kind of feedback is how the NNs learn.

Right now, FSD is always running, even when it’s not active, to see if the human driver disagrees with what it would have done. At the last Tesla Autonomy Day, they mentioned that one particular trigger for feedback was when FSD would have braked but the human driver maintained speed instead. That’s why we’ve been experiencing less phantom braking over time.
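As a toy sketch of how that kind of shadow-mode trigger could work (all names, thresholds, and the data shape here are hypothetical; Tesla's actual pipeline is not public), imagine logging what the shadow planner would have commanded alongside what the human actually did, and uploading only the frames where they disagree:

```python
from dataclasses import dataclass

# Hypothetical threshold: planned deceleration beyond this counts as "braking"
BRAKE_THRESHOLD_MPS2 = 1.5

@dataclass
class Frame:
    shadow_accel: float  # what shadow-mode FSD would have commanded (m/s^2)
    driver_accel: float  # what the human driver actually did (m/s^2)

def is_phantom_brake_candidate(frame: Frame) -> bool:
    """Flag frames where shadow FSD would have braked hard
    but the human driver maintained (or increased) speed."""
    return (frame.shadow_accel < -BRAKE_THRESHOLD_MPS2
            and frame.driver_accel >= 0.0)

def collect_training_triggers(frames):
    """Keep only the disagreement frames worth uploading for NN retraining."""
    return [f for f in frames if is_phantom_brake_candidate(f)]
```

The point of the filter is that agreement frames carry little training signal; only the cases where the human "corrected" the shadow planner get fed back into the next training round.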
And this is done at the vehicle level, correct? I read that the second on-board processor is used to monitor its success predicting what the driver will do.

Does this mean my own car is constantly learning, or is this sent back to the "mother ship" to train the whole system? (maybe both)
 
Because that's what it's called when the NHTSA compels an auto company to do something, software or not. How the recall is carried out is not material to the name of the process.

Why did you post this in this thread instead of the "very active, 4+ Page, this one is likely going to get merged into" thread?
Importantly, "recall" also means the manufacturer has to fix the problem at no cost to the customer. For example, if the fix requires a HW upgrade in some models for the SW to work, Tesla would have to provide the upgraded computers at no charge.

Sure, the term may be outdated (per Elon), but it's buried in existing law and regulation. If Elon thinks it inappropriate, he can ask his Communication team* to recommend to the feds a new term that would be more appropriate.

*sarcasm font?
 
And this is done at the vehicle level, correct? I read that the second on-board processor is used to monitor its success predicting what the driver will do.

Does this mean my own car is constantly learning, or is this sent back to the "mother ship" to train the whole system? (maybe both)
Sent back to Tesla for the next round of NN training.

I’m not sure what the second on-board processor is doing when running the Beta these days. They’ve mentioned in the past that the Beta uses a bit of the second processor for main processing, so I don’t think it’s a full validation of the first processor’s output. If it still does some validation, my assumption is that a disagreement either results in the red wheel of death, or is when you momentarily see no planned path and the car just continues on its current trajectory until the Beta recovers a second later (which is usually fine when going straight but detrimental during a turn).
 
For sure, if they're breaking the law. Some of the other lane-assist features aren't designed to steer the way a full self-driving system would, since they don't apply enough steering force or handle enough speed. I think Tesla designed Autosteer to handle normal roadway turns, although admittedly it doesn't always work.
The NHTSA's mandate isn't about enforcing traffic laws. It's about vehicle and road safety laws and regulations.

 
Since FSD was recalled for not being perfect, should other driver assist tools also be recalled for not being perfect? For example none of the other autosteer systems work well in sharp turns. Should we demand the NHTSA play fair and recall those?
For sure, if they're breaking the law. Some of the other lane-assist features aren't designed to steer the way a full self-driving system would, since they don't apply enough steering force or handle enough speed. I think Tesla designed Autosteer to handle normal roadway turns, although admittedly it doesn't always work.
What law is FSD breaking? I think it is just an opinionated determination that it is unsafe.
 
Assuming map data is being used, maybe add a school zone?
So now you've turned a problem that the car needs to solve into something that a human has to do offline, which if neglected or done incorrectly, could be dangerous.

You've also created a situation where the car can not know about those signs for up to a year and a half after they are installed. (Case in point, nearly all MCU1 cars were still running late-2020 builds with late-2020 map data until roughly mid-2022, IIRC.)

Map data is not the answer. Map data is the question. "No" is the answer. This sort of thing really needs to be handled by the car, and signage needs to be standardized enough that such a thing is possible.

Besides, beyond swapping one type of data problem for a different one, map data still doesn't let the car notice the complete lack of cars in the parking lot and realize that school won't be back in session until August. Having the sign switched off over the summer, by contrast, does solve that problem (assuming they remember to turn it off 🤣).


(Then there's the driver, who should be able to tell when a school is in session and when to turn off FSD. But I digress.)
And that's good enough for an ADAS. It isn't good enough if the goal is to exceed L2 autonomy at any point in the future.
 
Importantly, "recall" also means the manufacturer has to fix the problem at no cost to the customer. For example, if the fix requires a HW upgrade in some models for the SW to work, Tesla would have to provide the upgraded computers at no charge.

One option for Tesla. Give up on FSD for HW3 and claim "the regulators shut it down". Pay nothing back to customers because "it wasn't Tesla's fault". Then start again on HW4

I mean, it's kind of in their fine print: FSD is "subject to regulatory approval". It doesn't say they will get regulatory approval; it kind of says that if they don't get it, then no FSD. That would be major weasel behavior, but could be worth a try...
 
One option for Tesla. Give up on FSD for HW3 and claim "the regulators shut it down". Pay nothing back to customers because "it wasn't Tesla's fault". Then start again on HW4

I mean, it's kind of in their fine print: FSD is "subject to regulatory approval". It doesn't say they will get regulatory approval; it kind of says that if they don't get it, then no FSD. That would be major weasel behavior, but could be worth a try...
My Model S, bought in early 2018, was sold to me as hardware-complete for FSD. Software was all that was needed.
 
One option for Tesla. Give up on FSD for HW3 and claim "the regulators shut it down". Pay nothing back to customers because "it wasn't Tesla's fault". Then start again on HW4

I mean, it's kind of in their fine print: FSD is "subject to regulatory approval". It doesn't say they will get regulatory approval; it kind of says that if they don't get it, then no FSD. That would be major weasel behavior, but could be worth a try...
Doesn't really address the other markets where they haven't even launched yet with FSD Beta. Tesla is pretty close to a decent door-to-door L2 system which may be satisfactory to a lot of buyers, so I would think they would at least offer that to HW3 buyers, instead of giving up now.
 
Map data is not the answer. Map data is the question. "No" is the answer. This sort of thing really needs to be handled by the car, and signage needs to be standardized enough that such a thing is possible.
[Image: photo of a school-zone speed limit sign]


School-zone signage is usually very specific, and/or has flashing lights. Processing a sign like the one above is within Vision AI capabilities.

Of course, the car doesn’t know if “school is in today” but it knows the day of the week. The safe thing would be to use the lower speed M-F.
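That "assume school is in session on any weekday" heuristic is simple enough to sketch (the function name and limits below are made up for illustration, not anything from Tesla's stack):

```python
from datetime import datetime

def school_zone_limit(posted_limit: int, school_limit: int,
                      when: datetime) -> int:
    """Conservatively pick the speed limit for a 'M-F when school is in
    session' sign: without access to the school calendar, assume school
    is in session on every weekday."""
    is_weekday = when.weekday() < 5  # Mon=0 .. Fri=4
    return school_limit if is_weekday else posted_limit

# e.g. a 35 mph road with a 20 mph school-zone limit:
# weekdays -> 20, weekends -> 35
```

The trade-off is exactly the one described above: on summer weekdays the car would slow down unnecessarily, but erring low is the safe (and enforcement-proof) default.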

For FSD to be safe, while also satisfying NHTSA and local traffic laws, it must drive conservatively enough to really piss off the non-AI drivers.
 
FSD was not recalled for "not being perfect"... It was removed for being hazardous!
FYI, FSD was not removed; the NHTSA says it can remain running while Tesla works on an update to address 4 specific points. Just pointing this out because the claim keeps coming up (perhaps because much of the media, including major outlets, reported it incorrectly when the story first came out).
 
Doesn't really address the other markets where they haven't even launched yet with FSD Beta. Tesla is pretty close to a decent door-to-door L2 system which may be satisfactory to a lot of buyers, so I would think they would at least offer that to HW3 buyers, instead of giving up now.
If they can't "fix" FSDb to satisfy the requirements of the recall then they can't release FSD to any HW3 buyers. Plus there are many other bad things about FSDb that aren't even addressed in this recall. Maybe there will be other recalls issued to address those bad things too.

However, the recall doesn't really say anything about how good it has to be; it's a pretty lame recall from the NHTSA in terms of setting any standards. Even Tesla just says they are going to "improve" it, not that they are going to fix it. So who knows whether the NHTSA will say "great, at least you're trying," or be harsher and say "no, it's still doing those bad things."

Tesla could back down on the scope of FSD and just go with a basic L2 system that meets with the approval of the NHTSA, which IMO is all they will ever get from HW3. Or, they can keep trying with FSDb, but I'm still not sure what they are hoping to achieve anyway. If they make something that has many of the apparent features of an L4 system but is called an L2 system then that's just going to encourage people to drive with blind reliance on a system that is not intended for that.

Well, let's see what 11.3 does. [Which doesn't really seem like it's addressing the recall, that is just another update. Are they really making the update address the specific terms of the recall? Again, I guess we'll see.]