One little but important detail: FSD is responding to a false target and is therefore creating a hazard.

Not always. Sometimes it's slowing down in areas where slowing is sensible, but where human drivers often wouldn't consider it.

For example, I've noticed that FSDb will consistently brake for large vehicles like buses blocking the sight-lines where side-streets join the main street. It makes perfect sense, because the bus is blocking both the turning traffic's view of me on the main road, and my view of them turning; but I've never once considered slowing down in a similar situation myself.
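You can put rough numbers on why that's sensible: the safe speed past an occlusion is whatever still lets you stop within the sight distance you have left. A toy sketch (made-up numbers and textbook kinematics, not anything from Tesla's stack):

```python
import math

def max_safe_speed(sight_distance_m: float, decel_mps2: float = 6.0) -> float:
    """Highest speed (m/s) from which you can stop within the visible gap,
    from v^2 = 2*a*d; ~6 m/s^2 is a firm but achievable stop."""
    return math.sqrt(2 * decel_mps2 * sight_distance_m)

# A bus cutting the sight line from 60 m down to 15 m halves the safe speed:
for d in (60, 15):
    v = max_safe_speed(d)
    print(f"{d:>2} m visible -> {v:4.1f} m/s ({v * 2.237:4.1f} mph)")
```

Quartering the visible distance halves the safe speed, which is roughly the slowdown people are noticing.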

And I think more of this non-obvious driving behavior will arise as more control is handed to neural nets. It's like when AlphaGo started developing novel strategies that human players hadn't thought about. FSDb ultimately doesn't care about driving like a human; it will optimize behavior for safety.
 
People keep saying FSDb has 90 million miles. The only Tesla sources I've ever found for the 90M number actually say "AP" has 90M miles.

Just a minor point, but it matters.
Does this help clarify?



While FSD Beta is about five times safer than the most recently available US average, the system’s safety stats are not as impressive as Autopilot’s recent results. As per Tesla’s Vehicle Safety Report for Q3 2022, the company reported one crash for every 6.26 million miles driven in which drivers were using Autopilot. Teslas operating with FSD Beta do seem to be safer than cars not using Autopilot, however; those vehicles recorded one crash for every 1.71 million miles driven.
 
Does this help clarify?



While FSD Beta is about five times safer than the most recently available US average, the system’s safety stats are not as impressive as Autopilot’s recent results. As per Tesla’s Vehicle Safety Report for Q3 2022, the company reported one crash for every 6.26 million miles driven in which drivers were using Autopilot. Teslas operating with FSD Beta do seem to be safer than cars not using Autopilot, however; those vehicles recorded one crash for every 1.71 million miles driven.
With only about 30 crashes, it's hard to draw anything but high-level conclusions. However, one would expect more crashes per mile on city streets than on highways because there are more intersections per mile. It would also be interesting to know the details of those 30 crashes and how many of them had the Tesla with a significant percentage of fault. Like, how many are cars rear-ending the Tesla stopped at a traffic light?
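To put the small-sample problem in perspective, here's a back-of-the-envelope Poisson interval. The figures are hypothetical; I'm assuming ~30 crashes over ~50M miles purely for illustration:

```python
import math

# Hypothetical figures for illustration only: ~30 crashes over ~50M miles.
crashes = 30
miles_m = 50.0  # millions of miles

# Rough 95% interval on a Poisson count: n +/- 1.96 * sqrt(n)
lo = crashes - 1.96 * math.sqrt(crashes)
hi = crashes + 1.96 * math.sqrt(crashes)

print(f"point estimate: one crash per {miles_m / crashes:.2f}M miles")
print(f"95% range: one per {miles_m / hi:.2f}M to one per {miles_m / lo:.2f}M miles")
```

With only ~30 events, the per-mile rate could plausibly sit anywhere within roughly a factor of two, before even getting into who was at fault.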
 
I don't have phantom braking that often.
Phantom braking doesn't happen for "no reason"; it happens because the car is detecting and reacting to a perceived obstacle. It's erring on the side of better safe than sorry.

And in my experience it doesn't "slam on the brakes!" It applies light to moderate braking which, because it is unexpected, surprises you and feels like heavy braking. Instead of overriding it, on roads with fewer cars, let it do its thing. It doesn't come to a stop in the middle of the road (like slamming on the brakes would); it feels a lot more like my foot slipping off the accelerator.
I am not talking about phantom braking but about the jerkiness/brake taps with FSDb. When I drive manually there is little to no use of the friction brakes once you learn how to manipulate/ease off the go pedal, even if regen is limited. There is no reason FSDb can't do the same or even better. The jerkiness is more apparent when you disable "Apply Brakes When Regenerative Braking is Limited" and hover your foot over the brake pedal. But the underlying issue is that Tesla Vision cannot see/interpret the scenario from a distance and waits until the last minute, hence the jerkiness/brake slamming.
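The physics supports that: the deceleration needed to stop scales with 1/distance, so the earlier the car reacts, the more easily it stays inside the gentle regen envelope. A quick illustration (the speed, the distances, and the ~2 m/s² regen ceiling are my assumptions):

```python
def required_decel(speed_mps: float, distance_m: float) -> float:
    """Constant deceleration needed to stop within distance_m: a = v^2 / (2d)."""
    return speed_mps ** 2 / (2 * distance_m)

v = 20.0           # ~45 mph
REGEN_LIMIT = 2.0  # ~0.2 g, an assumed comfortable regen ceiling

for d in (200, 50, 30):
    a = required_decel(v, d)
    mode = "regen alone" if a <= REGEN_LIMIT else "friction brakes / brake taps"
    print(f"react with {d:>3} m to spare -> {a:.1f} m/s^2 ({mode})")
```

React 200 m out and 1 m/s² of regen does the job; wait until 30 m and you need almost 7 m/s², which is exactly the late, jerky braking being described.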

Also, the speed limit data is awful for me, starting from the very first street off my driveway through most of the streets surrounding me. There is a right turn from a stop sign before leaving my neighborhood with no obstacle blocking the view (the easiest of right turns), yet FSDb creeps and crawls and pauses several times, as if there were heavy traffic, even though no other vehicles are crossing its path 😂🤷‍♂️
 
I am not talking about phantom braking but about the jerkiness/brake taps with FSDb. When I drive manually there is little to no use of the friction brakes once you learn how to manipulate/ease off the go pedal, even if regen is limited. There is no reason FSDb can't do the same or even better. The jerkiness is more apparent when you disable "Apply Brakes When Regenerative Braking is Limited" and hover your foot over the brake pedal. But the underlying issue is that Tesla Vision cannot see/interpret the scenario from a distance and waits until the last minute, hence the jerkiness/brake slamming.

Also, the speed limit data is awful for me, starting from the very first street off my driveway through most of the streets surrounding me. There is a right turn from a stop sign before leaving my neighborhood with no obstacle blocking the view (the easiest of right turns), yet FSDb creeps and crawls and pauses several times, as if there were heavy traffic, even though no other vehicles are crossing its path 😂🤷‍♂️

While it can indeed be happening, IMHO the car may not be braking; it just feels that way. My wife isn't that good at modulating the accelerator, and I get slammed toward the dashboard and then back into my seat all the time. And I watch, and her foot never hits the brake. It's actually the regen that feels like braking.

Speed limit data is often based on information from the GIS department of your county. I can swear that some of these folks don't know how to spell GIS, let alone use it correctly.

Turn at a stop sign: the car is definitely in an "I'm going to be really safe" mode now. It's like a 15-year-old with a learner's permit who is paranoid about learning to drive. It's not check once left, check once right, do it again and go; it's more like check left and right 40 times and go.
But think of it this way: is Tesla ready for the barrage of news reports if something bad does happen? So far most of the FSD failure reports have turned out not to be FSD, and no significant issues (except for that pesky little recall) have been reported.

How long has the 15-year-old been learning things on the way to driving? 15 years? Tesla hasn't taken that long yet.
 
While it can indeed be happening, IMHO the car may not be braking; it just feels that way. My wife isn't that good at modulating the accelerator, and I get slammed toward the dashboard and then back into my seat all the time. And I watch, and her foot never hits the brake. It's actually the regen that feels like braking.
Regen is still braking.
 
Funny thing, the other day I was picking my wife up from work in a heavily congested downtown setting. Another car was angling to cut us off, so I used a little bit of Tesla-torque to prevent that from happening. My wife actually said "Honey, could you let the car drive?" in response. So spousal approval of FSDb can vary from moment to moment.
Like everything!
 
And trust me, the car "sees" something when this happens.

There are a lot of things that humans first see but then realize aren't really there.
You can't trust what makes no sense. It's a bug that has improved in the past, regresses at times, and will hopefully improve in the future.

Sanity check: if what you suggest were true, human drivers would be phantom braking regularly. They would be slamming on the brakes for 2D mirages, dark sections of roadway, shadows, oil patches, and eye floaters. That doesn't happen in the real world. Period.
 
Sanity check: if what you suggest were true, human drivers would be phantom braking regularly. They would be slamming on the brakes for 2D mirages, dark sections of roadway, shadows, oil patches, and eye floaters. That doesn't happen in the real world. Period.

Humans also straight up run into things and make safety-critical mistakes. For the most part we tune our driving to do our best to avoid obstacles, but if we're unsure or only maybe-kinda see something, we default to just keeping going (and that causes crashes!). Have you ever seen a dark object moving near the edge of the road at night? Maybe a deer? How did you respond?

Computers have exactly the opposite logic: only keep going when confident it is safe; otherwise slow down. And being computers, they can apply that logic continuously, 100% of the time, whereas braking requires a conscious effort from humans.
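In toy pseudo-code, the asymmetry looks something like this (my own sketch of the decision logic being described, not anything from Tesla's actual stack; the thresholds are made up):

```python
def computer_policy(p_obstacle: float) -> str:
    """Toy version of 'only keep going when confident the path is clear'."""
    return "keep going" if p_obstacle < 0.05 else "slow down"

def typical_human_policy(p_obstacle: float) -> str:
    """Toy version of 'keep going unless sure something is really there'."""
    return "brake" if p_obstacle > 0.80 else "keep going"

# A murky maybe-a-deer at p = 0.30: the car slows (reads as phantom braking),
# while the typical human carries on (and is occasionally wrong).
for p in (0.02, 0.30, 0.90):
    print(f"p_obstacle={p:.2f}: computer -> {computer_policy(p)}, "
          f"human -> {typical_human_policy(p)}")
```

All the ambiguous middle ground gets a slowdown from the computer and a shrug from the human; that middle ground is where "phantom" braking lives.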

An example of this behavior is a crosswalk near where I live (it even has blinking lights to warn drivers). I routinely start walking through the crosswalk and cars just blast through it within a few feet of me. If I didn't stop, they would clearly be on track to just run me over.

My overall point is that humans do see things we aren't totally confident about, in both what we're seeing and how we should respond. The difference is our reaction to that uncertainty: we just ignore the risk and keep going.
 
You can't trust what makes no sense. It's a bug that has improved in the past, regresses at times, and will hopefully improve in the future.

Sanity check: if what you suggest were true, human drivers would be phantom braking regularly. They would be slamming on the brakes for 2D mirages, dark sections of roadway, shadows, oil patches, and eye floaters. That doesn't happen in the real world. Period.

I've braked at night for something I thought was an animal.
(I've also braked at night for an actual animal.)
But in the middle of the day, no.
 
While it can indeed be happening, IMHO the car may not be braking; it just feels that way. My wife isn't that good at modulating the accelerator, and I get slammed toward the dashboard and then back into my seat all the time. And I watch, and her foot never hits the brake. It's actually the regen that feels like braking.

Speed limit data is often based on information from the GIS department of your county. I can swear that some of these folks don't know how to spell GIS, let alone use it correctly.

Turn at a stop sign: the car is definitely in an "I'm going to be really safe" mode now. It's like a 15-year-old with a learner's permit who is paranoid about learning to drive. It's not check once left, check once right, do it again and go; it's more like check left and right 40 times and go.
But think of it this way: is Tesla ready for the barrage of news reports if something bad does happen? So far most of the FSD failure reports have turned out not to be FSD, and no significant issues (except for that pesky little recall) have been reported.

How long has the 15-year-old been learning things on the way to driving? 15 years? Tesla hasn't taken that long yet.
If you hover your foot over the brake pedal you can easily feel the brake taps and the reason for the jerkiness. All it needs to do is ease off the go pedal, but like I said, the underlying reason is that Tesla Vision just isn't able to see and interpret the scene from a distance.

There is another right turn from a stop sign I encounter near my work where most vehicles are traveling at 45-50 mph. Now this turn has some bushes blocking the view, but it's nothing complicated once you stop and creep. However, several times FSDb has tried to blindly merge and I had to brake manually to avoid a collision. So FSDb is a "blind 15-year-old" driver at this point.
 
While it can indeed be happening, IMHO the car may not be braking; it just feels that way. My wife isn't that good at modulating the accelerator, and I get slammed toward the dashboard and then back into my seat all the time. And I watch, and her foot never hits the brake. It's actually the regen that feels like braking.

Regen maxes out at about 0.2 g in normal use (IIRC it'll do 0.3 g in Track Mode, which isn't for use on public roads), while actually slamming the brakes is nearly 0.8 g. It should be VERY easy to tell the difference between those, and regen certainly should NOT be leaving you "slammed to the dashboard".

If it is, you should schedule a service visit; something's wrong with the car.
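For a sense of the gap between those two numbers (60 mph is just my example speed):

```python
G = 9.81  # m/s^2
v = 26.8  # ~60 mph in m/s

for label, g_frac in (("max regen, normal use", 0.2), ("slamming the brakes", 0.8)):
    a = g_frac * G
    t = v / a             # time to stop
    d = v ** 2 / (2 * a)  # stopping distance
    print(f"{label}: {a:.1f} m/s^2 -> stops in {t:.1f} s over {d:.0f} m")
```

A 4x difference in deceleration means a 4x difference in stopping time and distance, which is hard to confuse from the driver's seat.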
 
Sanity check: if what you suggest were true, human drivers would be phantom braking regularly. They would be slamming on the brakes for 2D mirages, dark sections of roadway, shadows, oil patches, and eye floaters. That doesn't happen in the real world. Period.
Read Nobel laureate Daniel Kahneman's book "Thinking, Fast and Slow" and you will learn that humans do, in fact, misinterpret situations with alarming frequency, but are also capable of reinterpreting quickly. It's a survival skill.