Phantom braking explanation?

sleepydoc

I was driving on a state highway this morning and happened to look down at the display to see a truck in the other lane. The truck blinked on and off a couple of times and then disappeared. The problem is there was no truck. I hadn't recently passed one, either. The road was empty except for me.

I assume the screen visualizations simply reflect what the image processing system 'sees,' so it would make perfect sense that if the processor sees an illusion it would brake, then stop braking when the illusion disappears.

I know we've all hypothesized about this before, but this is the first actual evidence I've seen. In the end, it still doesn't matter - it's the job of the system to accurately interpret the visual input, so if it's hallucinating then it's failing - but it's interesting from an academic perspective.
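To make the hypothesis concrete, here's a toy sketch of a controller that brakes whenever the vision system reports a target and releases as soon as the detection vanishes - the flicker-then-brake pattern described above. Every name and threshold is invented for illustration; this is not Tesla's code.

```python
# Toy model of the hypothesis above: brake whenever the vision system reports
# a target, release as soon as the detection disappears. Every name and
# threshold here is invented for illustration; this is not Tesla's code.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float    # reported gap to the "vehicle"
    confidence: float    # 0..1 score from the vision network

def desired_accel(det: Optional[Detection], speed_mps: float) -> float:
    """Commanded acceleration in m/s^2; negative means braking."""
    if det is None:
        return 0.5                               # nothing ahead: resume speed
    if det.confidence > 0.6 and det.distance_m < 3.0 * speed_mps:
        return -3.0                              # target inside ~3 s headway: brake
    return 0.0

# A phantom target that flickers in for two frames and then vanishes
frames = [None, Detection(40.0, 0.7), Detection(38.0, 0.8), None, None]
print([desired_accel(f, speed_mps=29.0) for f in frames])   # 29 m/s ~ 65 MPH
# -> [0.5, -3.0, -3.0, 0.5, 0.5] : a brief, unexplained brake, then release
```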
 
Did your car also apply the brakes when this image of a truck flickered on and off?
 
The car also shows me a lot of false returns from the parking sensors when driving fast, even when there isn't a barricade or anything beside me. Just driving down the road it will flash the little grey curved lines from time to time. Those ultrasonic sensors are not good at high speed; they behave like my boat's depth sounder when running fast.
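For what it's worth, one common way to suppress that kind of transient false return is a simple persistence filter: only show or act on an object after it has appeared in several consecutive readings. A minimal sketch, with a made-up range threshold and a 3-reading window (not how Tesla actually filters its ultrasonics):

```python
# Minimal persistence filter for a noisy range sensor: an object only counts
# as present after it has been within range for `window` consecutive readings.
# The 2.0 m threshold and 3-reading window are arbitrary assumptions.

from collections import deque

class PersistenceFilter:
    def __init__(self, window: int = 3, max_range_m: float = 2.0):
        self.window = window
        self.max_range_m = max_range_m
        self.history = deque(maxlen=window)

    def update(self, reading_m: float) -> bool:
        """Feed one range reading; True only after `window` consecutive hits."""
        self.history.append(reading_m < self.max_range_m)
        return len(self.history) == self.window and all(self.history)

f = PersistenceFilter()
readings = [9.9, 1.2, 9.9, 1.1, 1.0, 0.9]   # two isolated blips, then a real object
print([f.update(r) for r in readings])       # [False, False, False, False, False, True]
```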
 
After I got access to the FSD beta a month ago, I took a couple of long road trips, 500+ miles each way. I had at least two or three phantom braking incidents, and that makes me really nervous because the car behind me might run into mine. I am not sure whether Tesla recognizes the issue, but they really need to fix the problem.
 
Out of curiosity, do you find it better, worse, or unchanged since getting FSDb? There's an open NHTSA investigation on the matter, so Tesla is definitely aware.

What was the nature of the events you experienced? Were they full-on emergency braking events (slamming on the brakes, alarms beeping, red warning on the screen)? Or simply braking moderately to hard and rapidly slowing?

People have described several types of phantom braking. Some appear to be false activations of the automatic emergency braking system, while others are simply random episodes of aggressive slowing - anywhere from 5-20 MPH, with a braking intensity similar to what you would use if a light turned yellow just soon enough that you couldn't make it through before it turned red. False activations have been a problem in many makes, Tesla included. The random slowdowns seem to be a problem with TACC/adaptive cruise, and they are much more common and more specific to Tesla.
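If you have a logged speed trace of an event, one hypothetical way to tell the two patterns apart is by peak deceleration and total speed drop. The 0.5 g cutoff, the 1 Hz sample rate, and the labels below are my own invention, not anything from Tesla or NHTSA:

```python
# Hypothetical classification of a logged braking event by total speed drop and
# peak deceleration. The 1 Hz sample rate, 0.5 g cutoff, and labels are all
# invented for illustration; they are not Tesla's or NHTSA's definitions.

MPH_TO_MPS = 0.44704
G = 9.81

def classify_event(speeds_mph: list[float], dt_s: float = 1.0) -> str:
    speeds = [v * MPH_TO_MPS for v in speeds_mph]
    drop_mph = speeds_mph[0] - min(speeds_mph)
    peak_decel_g = max(
        (speeds[i] - speeds[i + 1]) / dt_s / G for i in range(len(speeds) - 1)
    )
    label = "AEB-like event" if peak_decel_g > 0.5 else "TACC-style slowdown"
    return f"{label}: dropped {drop_mph:.0f} MPH, peak {peak_decel_g:.2f} g"

print(classify_event([70, 66, 61, 57, 55, 58]))   # gradual ~15 MPH slowdown
print(classify_event([70, 58, 45, 44, 50, 60]))   # hard, short braking spike
```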
 
I am not sure if it was emergency braking, but it was hard enough to make me and my wife very nervous. The bags on the back seat all fell down because of the braking. Quite often it brakes without any reason; the road was pretty clear.
 
This is the first time I've heard of a phantom braking event being involved in an accident (New York Times article published today, paywalled). It's been documented by Tesla data and video:


In July 2020, with a clear sky at about 2 PM, Tracy Forth was on Autopilot at about 77 MPH near Tampa, Fla., and made an Auto Lane Change.

The system sensed an obstacle (a human would not perceive or classify anything in that scenario as an obstruction). It aborted the Auto Lane Change and decelerated from 77 MPH to 55 MPH in a fraction of a second, and at that speed it was hit by the car behind.

The video shows a parked tow truck well inside the right shoulder, not protruding into the lane at all. The system might have mistakenly classified it as an obstacle and performed a phantom brake (a brake that makes perfect sense to the machine but not to humans).

[Image omitted] Photo: The New York Times

When there's a car behind you, it's doubtful anyone would have the superhuman reflexes needed to override a phantom brake that decelerates the car that rapidly before the car behind hits you in this scenario.
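Some back-of-envelope arithmetic on the figures quoted above; the braking duration is assumed, since "a fraction of a second" isn't specific, and nothing here comes from the car's actual log:

```python
# Back-of-envelope numbers from the figures quoted above. The braking duration
# is an assumption for "a fraction of a second"; nothing here is from the log.

MPH_TO_MPS = 0.44704
G = 9.81

v0 = 77 * MPH_TO_MPS        # ~34.4 m/s
v1 = 55 * MPH_TO_MPS        # ~24.6 m/s

for dt in (0.5, 1.0):       # assumed braking duration in seconds
    decel = (v0 - v1) / dt
    print(f"over {dt} s: {decel:.1f} m/s^2 = {decel / G:.2f} g")
# Roughly 2 g if it happened over half a second, 1 g over a full second -
# either way far above the ~0.2-0.3 g of normal comfortable braking.
```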
 
Mine was hard, aggressive braking that gave me whiplash, going from 60 mph to near a stop. I was IN MANUAL with my foot on the pedal; the road was downhill, approaching the end of the slope on a country B road.
I also get constant steering alarms. UK minor roads are possibly a bit too tight, and the lack of white lines confuses the car about where the road limits are; a lot of the time it locks on to tyre ruts rather than the edge of the road.
 
“Backed by data from her Tesla, Ms. Forth ultimately decided to sue the driver and the owner of the car that hit her, claiming that the car tried to pass hers at an unsafe speed. (A lawyer representing the other car’s owner declined to comment.)”

Fault has yet to be determined in this case.
 
The legal fault may be undecided, but as a practical matter it seems pretty clear: the car braked unnecessarily and caused an accident that would not have happened if it hadn't.
 
Driving 101 is to always assume the car in front of you is going to do something unexpected and go from speed to no speed suddenly. If you hit that car, you are at fault. If you tailgate another car inside your control distance, or let another car do that to you, then good luck. Why would you let FSD have control while being tailgated at highway speeds? That fails Driving 101 - repeat the course.
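To put the "control distance" point in numbers, here is a quick illustration; the reaction time is an assumed, commonly cited textbook value, not anything from a Tesla manual:

```python
# The "control distance" point in numbers. Reaction time is an assumed,
# commonly cited textbook value, not something from a Tesla manual.

MPH_TO_MPS = 0.44704

speed_mph = 70
v = speed_mph * MPH_TO_MPS            # ~31.3 m/s
reaction_s = 1.5                      # assumed perception-reaction time

print(f"At {speed_mph} MPH you cover {v * reaction_s:.0f} m before you even start braking.")
for gap_s in (2, 3):
    print(f"A {gap_s}-second following gap is about {v * gap_s:.0f} m.")
# At 70 MPH you cover 47 m before you even start braking.
# A 2-second following gap is about 63 m.
# A 3-second following gap is about 94 m.
```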
 
Another claim of an accident caused by phantom braking, with the car slowing from 55 MPH down to 20 MPH as reported by the driver (the car's log would be needed to verify):

Tesla ‘full self-driving’ triggered an eight-car crash, a driver tells police
Definitely need more details. Even if the car was on 'FSD' as the article claims, on the freeway it would have been running the Autopilot stack, so the 'FSD' claim is false. The other question is whether it was on TACC or AP, though that's not quite as critical a difference.