Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Begrudgingly “Recalls” FSD Beta for NHTSA

I'm sure this will be a sticky on all of the vehicle forums shortly:


(moderator note: related threads here…)
FSD Recall? in Software
Recall FUD in UK


"Full Self Driving Tesla" by rulenumberone2 is licensed under CC BY 2.0.
 
Sorry, Tesla / Elon says it is a lie / wrong.

Just wrong. No mention of lying.

The annual UK vehicle test is called the MOT. MOT stands for Ministry of Transport, which hasn't existed since 1970.

Seriously, people need to get over it.

Either that or change the name to f**k-up. "The NHTSA has given a f**k-up notice to Tesla. A spokesperson said that Tesla has f**ked up a few things in its software again and needed to fix them."
 
My primary issue remains: you can tell consumers all you want that this is beta software with no chance of reaching production due to hardware and software limitations; the fact remains that 400,000 people are driving cars that might injure or kill someone.

All kidding aside, think about this: it is not like a beta where your personal info is at risk, a risk you take on yourself. Here you are taking a risk that kills or harms ANOTHER person who did not sign up for it.

It should be done in a limited market and in a limited manner.

If it kills or permanently disables one person, that is one too many, if at the core it was truly the technology that harmed the person.

THIS IS COMING FROM A GUY WHO IS EXCITED ABOUT THIS TECHNOLOGY AND HAS BEEN IN TECH ALL HIS LIFE.
 
Think of "if a tree falls in the forest...". If you run a stop sign and no one is around to see it, and you could see that no one was around, then were you being unsafe?
I find this interesting as well. When there is no other traffic present, the real issue with a rolling stop is that it's habit-forming. Humans get into the habit of basically ignoring the sign because they drive there every day and usually it's not a problem. Then one day they're not paying much attention, they roll the stop as usual, and whoa! An accident or a near miss.

The thing about the computer (one of the less obvious "superhuman" features) is that it doesn't have those kinds of attention lapses. If it's properly programmed to yield or stop for cross-traffic, it will do so the 100th time the same as the first time.
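The habit-drift point can be illustrated with a toy probability calculation. Everything here is a made-up assumption for illustration (the lapse rate and stop count are not real data):

```python
# Toy illustration of habit-driven attention lapses vs. a deterministic
# controller. The lapse rate and stop count are made-up assumptions.
human_lapse_per_stop = 0.001   # assumed: driver blows the stop 1 time in 1,000
stops_per_year = 2_000         # assumed: rough count of stops at that sign per year

# Probability the human has at least one lapse over a year of stops
p_at_least_one = 1 - (1 - human_lapse_per_stop) ** stops_per_year
print(f"Chance of at least one human lapse in a year: {p_at_least_one:.0%}")

# A controller that executes the same programmed stop behavior every time
# has no habit drift: its behavior on stop #2,000 matches stop #1.
```

Even a tiny per-stop lapse rate compounds into a high yearly risk for a habitual human, which is exactly the failure mode a consistently programmed stop avoids.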

This is actually a key point, and I wish that NHTSA had considered it when the rolling-stop issue first came up. I believe Tesla did say they had reviewed the programming with the regulators and shown that the rolling stop would only be taken in safe situations, but failed to convince them.
 
The good for Tesla: just six years ago they were a novelty vehicle. You literally had to look for them to find them, even in South Florida, where nice cars are everywhere.
Now I cannot look out my window or drive anywhere without seeing a Tesla, literally every minute, so they have had explosive growth.

The bad: that explosive growth now makes them a MAJOR CAR MANUFACTURER, and as such they are no longer a tech company. With that comes the responsibility and the oversight that accompany that success.

They sold a little over 50K units in 2015, and in 2022 they delivered 1.3 MILLION units, clearly one of the largest manufacturers. Ford sold 13.8 million units with many more car lines and in multiple price ranges.

Tesla has to stop delivery, get it right, and go into a true BETA mode, as Volkswagen Group and so many others do.

I know this may be a bad forum for this, as many here want the tech so badly that human loss seems worth it to achieve.

PS: I drive a Model X, my wife a Model 3. Both of us are very happy with the cars and the tech.
 
....If it kills or permanently disables one person, that is one too many, if at the core it was truly the technology that killed the person....
In the US, humans KILL over 100 people EVERY day and disable and severely injure thousands every day driving cars. What if, statistically, FSD SAVES 10 people for every one it kills? Do you still prefer humans killing 10 people over FSD killing 1?
 
In the US, humans KILL over 100 people EVERY day and disable and severely injure thousands every day driving cars. What if, statistically, FSD SAVES 10 people for every one it kills? Do you still prefer humans killing 10 people over FSD killing 1?
This is the point I make frequently. The goal is to save lives. 42,000 people are killed in car crashes in a year. If ADAS can save some of them, it's worth it. Yes, the technology will kill people, but fewer than if humans are driving.
 
In the US, humans KILL over 100 people EVERY day and disable and severely injure thousands every day driving cars. What if, statistically, FSD SAVES 10 people for every one it kills? Do you still prefer humans killing 10 people over FSD killing 1?
Yes, it is an old argument. With that argument, you are conceding the tech can kill people. That does not need to be the case. Go to a true beta and get it working.

I would love to see the stats that show 10 to 1. I have my own experience: going straight toward a truck because it moved over, slamming the brakes, and almost getting creamed from behind at 70 MPH by a speeding car. The human reaction is to slowly move over and ease onto the brakes.

By the way, I would assume the number to be more like 40,000-50,000 deaths to 1 autonomous, based on the fact that there are 1.446 billion cars, mostly in Asia, the US, and Europe, and only around 300,000 drivers on and off running autonomous vehicles.

I cannot see how you can statistically prove that, given what I know about how FSD is actually working; as stated, "Lucy, we have a problem."

Let's go to beta in limited markets and get it right.
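The fleet-size comparison above (1.446 billion cars vs. roughly 300,000 autonomous testers) can be put into a quick sketch. Note this compares car counts only, not miles driven or hours on the road, so it is at best a rough exposure ratio, and both figures are the post's own estimates, not verified statistics:

```python
# Rough exposure ratio implied by the post's fleet estimates.
# Both numbers are the poster's claims, not verified statistics.
total_cars = 1_446_000_000      # claimed worldwide car fleet
autonomous_drivers = 300_000    # claimed count of drivers running autonomous features

ratio = total_cars / autonomous_drivers
print(f"~{ratio:,.0f} conventional cars per autonomous-capable car")
# If both fleets were equally safe per car, deaths would be expected in
# roughly this same proportion; per-mile exposure would change the figure.
```

A per-mile comparison would need fleet mileage data, which neither side of the thread provides.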
 
This is the point I make frequently. The goal is to save lives. 42,000 people killed in car crashes in a year. If ADAS can save some of them, it's worth it. Yes, the technology will kill people, but less than if humans are driving.
I agree with both of you. IT SHOULD save more lives; the issue is that IT DOES NOT save more lives, because it does not work as it should.
I BELIEVE IT WILL.

But to date there is ample evidence that it does not, and that is why the government is stepping in.

Put it in BETA, limited market, GET IT RIGHT.
 
I agree with both of you. IT SHOULD save more lives; the issue is that IT DOES NOT save more lives, because it does not work as it should.
I BELIEVE IT WILL.

But to date there is ample evidence that it does not, and that is why the government is stepping in.

Put it in BETA, limited market, GET IT RIGHT.
Let's set aside the anecdotal stories from others here about it reacting to VRUs (vulnerable road users) before the driver even saw them and focus on what you consider safe. Let's use your example of a beta in a limited market. When is it safe? What threshold do you have to reach to release it to a broader segment?

To be clear, I'm not disagreeing with you. I want to see lives saved. The issue for me is where the threshold is. Some people believe that if it kills one person, it's a failure and should be removed from the road. My thinking is that if it kills 40,000 people, that means 2,000 people who would have died if humans were driving are alive. Isn't that a win for now? Yes, the ultimate goal is to save them all, but that's nearly impossible. Let's try to save as many as we can as the tech improves. I want those 2,000 people to live.
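The arithmetic in that comparison, using the ~42,000-per-year US figure cited earlier in the thread and the post's hypothetical 40,000 ADAS figure, works out as follows:

```python
# Break-even arithmetic from the discussion above. The human baseline is
# the widely cited ~42,000 annual US road deaths; the ADAS figure is the
# post's hypothetical, not a measured statistic.
human_deaths_per_year = 42_000
hypothetical_adas_deaths = 40_000

lives_saved = human_deaths_per_year - hypothetical_adas_deaths
deaths_per_day = human_deaths_per_year / 365

print(lives_saved)            # 2000 fewer deaths under the hypothetical
print(round(deaths_per_day))  # 115, matching "over 100 EVERY day"
```

The per-day figure also shows the two statistics quoted in the thread (42,000 per year and "over 100 every day") are consistent with each other.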
 
Tesla has to stop delivery, get it right, and go into a true BETA mode, as Volkswagen Group and so many others do.

I know this may be a bad forum for this, as many here want the tech so badly that human loss seems worth it to achieve.

PS: I drive a Model X, my wife a Model 3. Both of us are very happy with the cars and the tech.

I used to have a black-and-white, negative opinion about FSD based on watching the videos. I bought it in case it would help me, then got nervous about it.

It's not as if FSDb is being unleashed on the general public. I think there's a SMALL population who buy it with NO knowledge of the potential issues. Most FSD users seem to get used to the quirks and find ways to make it useful. I read about it, watched videos (which I found distressing at first), and talked to my neighbor, who has been using it since the beginning. I'm finally starting to use it, and I can see there are locations where it slows down for no reason, and some circumstances that really flummox the robot.

Like anything, if people buy it and don't like it, it seems fair to ask for a refund. I imagine that's part of the negative feedback: you thought it would be great, but it scares you. Something Elon should consider, in my opinion.
 
My primary issue remains: you can tell consumers all you want that this is beta software with no chance of reaching production due to hardware and software limitations; the fact remains that 400,000 people are driving cars that might injure or kill someone.

All kidding aside, think about this: it is not like a beta where your personal info is at risk, a risk you take on yourself. Here you are taking a risk that kills or harms ANOTHER person who did not sign up for it.

It should be done in a limited market and in a limited manner.

If it kills or permanently disables one person, that is one too many, if at the core it was truly the technology that harmed the person.

THIS IS COMING FROM A GUY WHO IS EXCITED ABOUT THIS TECHNOLOGY AND HAS BEEN IN TECH ALL HIS LIFE.
Millions of people are driving millions of cars that can kill you. Did you sign up for 16-year-olds with learner's permits to be on the road? Did you sign up for people who might have a heart attack driving next to you? What about all the other distracted/drunk/stupid things people do? Ever see a Mustang leaving a Cars & Coffee? ;) People conveniently forget how risky driving is and somehow think they were perfectly safe until Elon came along and unleashed a bunch of cars with FSD Beta. FSD doesn't need to be perfect. It just needs to be better than human drivers, which means the bar is pretty low.
 
Millions of people are driving millions of cars that can kill you. Did you sign up for 16-year-olds with learner's permits to be on the road? Did you sign up for people who might have a heart attack driving next to you? What about all the other distracted/drunk/stupid things people do? Ever see a Mustang leaving a Cars & Coffee? ;) People conveniently forget how risky driving is and somehow think they were perfectly safe until Elon came along and unleashed a bunch of cars with FSD Beta. FSD doesn't need to be perfect. It just needs to be better than human drivers, which means the bar is pretty low.
And it has many flaws that in totality make it unsafe. We have time to get it right.

Nobody has answered my core issue: braking to near 0 on a highway.

Lucy, we've got issues; let's fix them in a true beta environment.
 
And it has many flaws that in totality make it unsafe. We have time to get it right.

Nobody has answered my core issue: braking to near 0 on a highway.

Lucy, we've got issues; let's fix them in a true beta environment.
There is NO core issue of FSD Beta braking to near 0 on a highway. Highway driving is almost always on the AP stack, NOT FSD Beta. And how can any car brake to near 0 on a highway unless the driver is COMPLETELY incompetent and inattentive?

If you are referring to the California incident, then we don't know yet what happened, and even if AP was engaged it was NOT FSD Beta. Remember the Texas crash, where supposedly there was NO driver in the seat so AP must have caused it, and even the investigators made incorrect preliminary statements?

EDIT: Just to add, all these systems are L2, and YOU ARE the driver, NOT the car.
 
There is NO core issue of FSD Beta braking to near 0 on a highway. Highway driving is almost always on the AP stack, NOT FSD Beta. And how can any car brake to near 0 on a highway unless the driver is COMPLETELY incompetent and inattentive?

If you are referring to the California incident, then we don't know yet what happened, and even if AP was engaged it was NOT FSD Beta. Remember the Texas crash, where supposedly there was NO driver in the seat so AP must have caused it, and even the investigators made incorrect preliminary statements?
Oh ok.
 
There is NO core issue of FSD Beta braking to near 0 on a highway. Highway driving is almost always on the AP stack, NOT FSD Beta. And how can any car brake to near 0 on a highway unless the driver is COMPLETELY incompetent and inattentive?

If you are referring to the California incident, then we don't know yet what happened, and even if AP was engaged it was NOT FSD Beta. Remember the Texas crash, where supposedly there was NO driver in the seat so AP must have caused it, and even the investigators made incorrect preliminary statements?

EDIT: Just to add, all these systems are L2, and YOU ARE the driver, NOT the car.
While this is technically correct, I have certainly seen 60-65 drop to 30 (on a highway, 280) very rapidly (HARD braking) and without any real or apparent reason: no shadow, traffic, vehicle in front, underpass, or anything else. I would consider this nearly as bad as "near 0" at highway speed.
 
Whether it is FSD or AP is a technical argument. The end user, the driver, can hardly distinguish between them. When I hard-stopped and nearly got rear-ended, clearly I did not try to analyze which module failed; I was too busy watching the car behind me slam on his brakes. With ANY RAIN, I would have been done for.

Maybe you are correct; in that case maybe AP should be turned off and recalled also, since it is core to both technologies, and now the consumer has to understand which feature of the car should be turned on.

You truly underestimate the Tesla end-user base and its core competency. That is an issue with the tech: it is complicated. So complicated that you have to explain which component FAILED, or RESPONDED in a very aggressive manner.

The module needs to be repaired, plain and simple.
 