Welcome to Tesla Motors Club

Phantom braking so bad I want to return my car

My feeling is that a significant portion of PB isn't really PB at all. After all, how sure can any driver be that the car didn't see something they missed? I'm not saying all PB falls into this, but I wonder what the percentage is. There have certainly been occasions when I've had the car brake and it took me several seconds to puzzle out why it did so.
That's the question, isn't it? By definition, PB is the car braking when the driver sees no reason to brake. In these cases there are three broad scenarios:
  1. The driver completely missed an obstacle or danger (major or minor) that the system caught.
  2. The system was deceived by a shadow, etc., and perceived a problem that wasn't present, while the human was better able to accurately analyze the surroundings.
  3. The system simply braked for no reason at all.
We could also add a '1.5' condition where there was a minor obstacle (like a bird) that the car was more worried about than the driver.
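The taxonomy above can be sketched as a tiny classifier. Everything here is illustrative (the boolean inputs and labels are mine, not anything from actual car telemetry); the point is just that scenarios 2 and 3 are indistinguishable from the driver's seat:

```python
# Toy taxonomy of reported "phantom braking" events, following the three
# scenarios above plus the '1.5' minor-obstacle case. All names are
# illustrative, not anything from Tesla's software or logs.

def classify_event(real_hazard: bool, driver_saw_it: bool, minor: bool = False) -> str:
    """Return which scenario a braking event falls into."""
    if real_hazard and not driver_saw_it:
        return "1.5: minor obstacle" if minor else "1: driver missed a real hazard"
    if not real_hazard:
        # Either the system misread the scene (shadow etc.) or braked for
        # nothing at all; from outside the car these look identical.
        return "2 or 3: false positive"
    return "not phantom braking (driver saw it too)"

print(classify_event(real_hazard=False, driver_saw_it=False))
```

Note that the driver can only ever report "I saw no reason to brake," which collapses scenarios 2 and 3 into one bucket; that's why the split is unknowable from reports alone.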

Let's assume that #3 is not part of the problem. The question then becomes: how often is there really a danger that the human driver is missing? Ultimately it's impossible to say, because by definition the humans reporting the problem wouldn't report it, or view it as a problem, if they saw a reason to brake.

That means we're left using indirect evidence. First is the sheer number of reports. If there really were that many dangerous conditions that human drivers were missing, we should be seeing far more accidents among people not using TACC. A few months ago I was driving about 150 miles back from our cabin, and for the first half of the trip I got a PB event about every 5-10 minutes. I simply kept my foot on the accelerator and reflexively accelerated through them. Not one of those times did I hit anything or see anything. Moreover, my wife was sitting next to me and she didn't see anything either.
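A quick back-of-envelope check of that trip. Only the ~75 miles (half of 150) and the 5-10 minute event rate come from the post; the 65 mph highway speed is my assumption:

```python
# Back-of-envelope: ~75 miles of PB events roughly every 5-10 minutes.
# The 65 mph cruising speed is an assumption, not stated in the post.
speed_mph = 65
miles = 75
minutes = miles / speed_mph * 60          # ~69 minutes of driving
events_low = minutes / 10                 # one event per 10 min -> ~7 events
events_high = minutes / 5                 # one event per 5 min  -> ~14 events
print(round(minutes), round(events_low), round(events_high))
```

Seven to fourteen genuine hazards per hour on one stretch of highway would imply a crash rate for drivers not using TACC that nobody observes, which is exactly the indirect argument being made.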

I've also had several PB incidents on interstates with other cars around and I'm the only one to slow down. That would mean that the car saw a danger that every other driver around me missed. Possible? Yes, but not at all likely.

Finally, every time the car slows down I do a double take to make sure there wasn't anything I missed. (Well, I used to - at this point the false positive rate is so high that I've all but quit doing it, which is itself a problem.)

Ultimately, if the car is seeing something that it thinks is a danger but really isn't a danger, then it's still making mistakes. We can say "it was confused by this shadow," but the entire point of the system is to process the surroundings and figure out what's a shadow and what's a car. If it can't effectively do that, then it's failing at its job.
 
As I said, I'm not denying that PB is real, it's just hard to judge how much is #1 vs #2 (hopefully, as you note, we're not seeing any #3).

What's much more interesting to me is why some people seem to see more PB incidents than others. I have maybe one every 10 miles on FSD, and mostly these are minor slow-downs, certainly nothing to cause any adrenaline or WTF reaction. Only twice have I had a really major braking incident in three years. I'm not clear what causes such wide variance among people/cars/circumstances. Am I using AP/FSD more conservatively? Anticipating and disengaging where others do not? Is my car calibrated differently? Are the road conditions/weather/traffic so different?
 
Completely agree! Is it just different software versions? Differences in the roads that people drive? Or maybe some people just don't care and aren't bothered by it. Regardless, I've seen that pattern often: some people having persistent, regular issues with it, some having only sporadic issues, and others never having any (or at least claiming never to have any) problems.
 
I bet it does to NHTSA. If it's braking for a reason, that's very different from braking at random. The owner may not like it either way, but the former won't be recall-worthy; the latter might.
To some extent. For NHTSA, what matters is safety. 'Phantom braking' as it's discussed on forums like this encompasses everything from a minor 3-5 MPH slowdown, to a more significant 10-20 MPH slowdown, to a full-on false AEB event that locks up the brakes. The first is annoying and makes people sway to and fro but isn't dangerous (assuming it doesn't make you throw up all over the dash!). The second is potentially an issue depending on the traffic around you, and the third is most definitely a safety issue.
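Those tiers can be bucketed mechanically. The thresholds below are the ones from this paragraph, not any official NHTSA or Tesla definition:

```python
# Rough severity buckets for a braking event, mirroring the tiers described
# above. Thresholds come from the forum post, not from any official spec.
def severity(speed_drop_mph: float, brakes_locked: bool = False) -> str:
    if brakes_locked:
        return "false AEB event (safety issue)"
    if speed_drop_mph <= 5:
        return "minor slowdown (annoying)"
    if speed_drop_mph <= 20:
        return "significant slowdown (potentially unsafe in traffic)"
    return "hard braking (safety issue)"

print(severity(4))    # minor slowdown (annoying)
print(severity(15))   # significant slowdown (potentially unsafe in traffic)
```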

While all of these represent system failures in some capacity, the false AEB events fortunately seem to be much less common. In the 2 years I've owned the car I may have had one of these whereas the minor slowdowns seem to occur almost every time I drive the car and have a much bigger impact simply by virtue of their frequency.

I have no idea exactly what was reported to NHTSA or what triggered their investigation, but I would suspect it's the false AEB events. That would also fit with the investigations of other makes that I've seen.
 
From what I have seen of other AEB recalls, it is the latter you describe: hard braking that occurs for no reason and has led to crashes. So far (knock on wood) rear-end accidents from such braking haven't occurred with Tesla, AFAIK.

The current investigation seems to be most focused on the car not braking enough (for example, for emergency vehicles) and on whether the AP system does enough to force people to pay attention. Phantom braking doesn't seem to be the core part of the investigation.
 
Sorry - I just don’t get why ignoring an emergency vehicle is Tesla’s problem and not the drivers’. Ditto with the systems to force people to pay attention. They already have several systems in place. At some point people need to be responsible for their own inattentiveness.

I view AP like enhanced cruise control. It's pretty danged good, but I'd never do anything like…watch a movie on my iPad, for example.
 
It slowed down abruptly for an overhead bridge while driving on EAP. It hadn't done this for a long time before this occasion.
Are you sure there wasn’t a troll standing in the road that you missed? 😜
It matters because, if a significant percentage of "PB" events are the car reacting to genuine events that a human is missing, these will not (and should not) ever go away, since they are not PB at all.
Agree, but like I said above, the circumstantial evidence is that there is nothing to brake for in the majority of cases. The times I’ve looked down at the screen it hasn’t shown anything, either, for what that’s worth.

I had a real AEB event earlier today - I was driving through a town going 20-25 MPH. There were cars parallel parked next to me and a man came out to get into his car. As he exited the sidewalk and walked behind his car he looked up and saw me and proceeded to walk immediately next to his car but the Tesla freaked out and slammed on the brakes because all it saw was someone coming out from between the parked cars. I was going fairly slow, there was plenty of room between my car and his and since I knew he saw me I wasn’t worried but there’s no way for the AEB system to know his intentions so I can’t blame it at all. Had he been staring down at his phone the AEB could have saved his life. 👍

Overall, PB seemed to be better, at least for the first half of the trip. I had a lot of small, random 1 MPH 'mini' events - almost like it was still braking, but they sped up the algorithm so it figures out that it's a false alarm more quickly and resumes the original speed. Not quite perfect, but much less irritating and much improved from before. I actually found that I didn't need to keep my foot perched over the accelerator the way I've grown accustomed to. The second half of the trip was a bit worse, but still better. Hopefully the improvements continue!
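One speculative reading of those 1 MPH 'mini' events is a confirmation window: start braking immediately, but abort if the detection doesn't persist for a few frames. This is a generic debounce sketch and pure guesswork about Tesla's actual logic:

```python
# Hypothetical confirmation-window logic for a brake trigger. A detection
# that vanishes before 'confirm_frames' consecutive frames causes only a
# brief dip; a persistent one escalates to real braking.
def respond(detections, confirm_frames=3):
    """detections: per-frame booleans from the (assumed) perception stage."""
    streak = 0
    for seen in detections:
        streak = streak + 1 if seen else 0
        if streak >= confirm_frames:
            return "hard brake"
    return "mini dip" if any(detections) else "no action"

print(respond([True, True, False, False]))   # false alarm cleared -> mini dip
print(respond([True, True, True, True]))     # persistent detection -> hard brake
```

A faster perception loop shrinks the window, which would turn what used to be a 10 MPH scrub into the 1 MPH dips described here.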
 
Still on 12.3.2. PB is basically solved, but awareness of oncoming traffic in my lane is dangerously lacking.

Drove 1500 miles over the last 5 days, I-90 across WA State and back, and Spokane to Boise and back. Exactly two medium, semi-PB events while using TACC (slowed 15 mph; not AEB, felt like max regen).

One, bad pavement with lots of patches on a crest in the road. I forgive this one, since I was also kind of alert to the situation, looking for potholes.

Two was bad mirages for a half mile. I'd prefer these not to happen, but am OK taking over in these conditions.

Had probably dozens or more situations that previously would have triggered PB on the same roads.

Dangerous: on a city street at 25mph, driving manually, around a sweeping left, a motorcycle was entirely in my lane, about to hit me head on, maybe 40-60 meters ahead of me. I had to brake hard and swerve into the empty parking lane to miss him. He reacted after passing me, got back in his lane later. Zero action or warnings from the Tesla. Freaked out the family on the sidewalk with kids, dogs and strollers right next to me.

At 65 on a 4 lane highway, driving manually, a left turning car crossed in front of me so late I had to brake as hard as possible to miss a massive t-bone accident. No warnings or action from the Tesla. Missed them by under a car length.

I'm just assuming the AEB is non-functional at this point. It may be just these scenarios, but I don't have the mental space to understand when it might or might not work.

(no, I don't have video from either event. Dashcam had the red x in both situations.)
 
PB is a real issue. To deny it is just Tesla fanboi nonsense. Stop with the ignorance.

I get phantom braking in the same areas every single day. Some areas are under construction which is perfectly understandable. Two of the areas are on a 2 mile stretch of open highway with no on/off ramps or distractions of any kind. I have reported this to Tesla multiple times. I just don't use cruise or lane assist in those areas now BUT when it first happened it scared the crap out of me and definitely created a very unsafe situation.

Tesla is not infallible. They have an issue, and hopefully they address it voluntarily, or the NHTSA will force them to. And if that doesn't happen, they will lose major market share and then they'll address it for profit reasons. One way or another it will be addressed in time.
 
I was driving on a two-way local road on EAP, follow distance 4, following a pickup with a trailer. When my car approached tree shadows, it showed a person on the visualization and braked hard, and I mean hard, almost to a dead stop. Well, that was creepy for sure.
 
Yeah, if the visual processing software makes a mistake then it's like a person hallucinating and seeing a bear running at them - to them the bear is real and they're going to scream and run away. To the car the illusion was real, so the AEB acted accordingly. The problem was that the information sent to the AEB system by the visual analysis system was wrong.

It's like the question @drtimhill posted earlier about how much PB is due to artifacts like this. It's impossible to know, but ultimately it doesn't matter, because the entire job of the TACC system is to accurately process the data.
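A minimal sketch of that garbage-in, garbage-out point, assuming a generic perception-then-control split (none of these names, labels, or thresholds come from Tesla): the braking logic can only act on whatever the vision stage hands it.

```python
# Illustrative two-stage pipeline: a hypothetical perception stage feeds a
# hypothetical braking controller that has no independent view of the road.
def perception(frame):
    # Stand-in for the vision stack: returns (label, confidence).
    return frame["looks_like"], frame["confidence"]

def aeb_controller(frame, threshold=0.9):
    label, conf = perception(frame)
    # The controller can't second-guess the label: if perception calls a
    # tree shadow a person with high confidence, it brakes.
    return "BRAKE" if label == "person" and conf >= threshold else "cruise"

shadow = {"looks_like": "person", "confidence": 0.95}  # misclassified shadow
print(aeb_controller(shadow))   # BRAKE
```

In that framing, "the AEB acted correctly on wrong data" and "the system failed" are both true; the failure just lives upstream of the brakes.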

On a related note, I apparently parked on top of a trash can in our garage! The interesting thing is the car didn't yell at me as I was pulling in. I assume it's because I was going so slow.
 
Tesla trash compactor.
 

Probably not relevant to the conversation, but was inspired by 'hallucinating'. Not sure if any of you have seen the DeepDream algorithm, but it reiterates the images backwards through the network to see what a higher confidence output would see. Really trippy!
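For anyone curious, the core of DeepDream is gradient ascent on the input rather than the weights: you push the image in the direction that makes a chosen unit fire harder. A toy stand-in (a single linear unit in plain Python instead of a real conv net like Inception) shows the mechanic; everything here is illustrative:

```python
import random

# Toy DeepDream mechanic: modify the *input* by gradient ascent so that a
# chosen unit's activation grows. Real DeepDream iterates an image through
# a deep conv net and visualizes the result; a linear unit stands in here.
random.seed(0)
w = [random.gauss(0, 1) for _ in range(8)]   # fixed "network" weights
x = [random.gauss(0, 1) for _ in range(8)]   # the "image" we will modify

def activation(v):
    return sum(wi * vi for wi, vi in zip(w, v))

before = activation(x)
for _ in range(50):
    # Gradient of the activation w.r.t. the input is just w, so step along w.
    x = [xi + 0.1 * wi for xi, wi in zip(x, w)]
after = activation(x)

print(after > before)   # True: the input now excites the unit more strongly
```

On a real network the same loop amplifies whatever patterns a layer is tuned to, which is where the trippy dog-faces-everywhere images come from.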