Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

AP/phantom braking.

One thing that has impressed me as I've read posts here and looked at Tesla's (and other carmakers') systems is just how much our brains do, and how good they are at it, without even 'thinking.' We instantly and subconsciously judge distance with both binocular vision and other depth cues. We keep track of a myriad of different objects and trajectories and plan our own, all while drinking our coffee and eating a burger!
 
I developed a hypothesis about phantom braking, as well as what I dubbed phantom lane change, where the car says "changing to faster lane" when there's nothing close ahead. My thought is that if it sees a phantom ahead and thinks it's going just a little slower than we are, it changes lanes. And if it sees a phantom that is scarier, it jams on the brakes. Either way, I think it's seeing something that's not there. Sort of a "duh" so far.

Here's what I noticed toward the end of our long drive home from Salt Lake City a couple of weeks ago. The freeway was straight as an arrow in the left-to-right sense, but had some gentle undulations. The result was that a car far ahead (maybe a quarter mile?) disappeared from view down a dip. As we and the other car moved over the terrain, the other car emerged back into view. Phantom braking occurred instantly. But there was no reason: the other car was much too far ahead to be any immediate threat to safety. So I thought: interesting. Maybe any time it first sees something ahead that it hasn't been seeing, it worries. Then, with better evaluation of the situation, it figures out how far away the other car is and resumes the set speed. Meanwhile, sleeping passengers awaken in abject fear.

So I tested my hypothesis. Sure enough, next time we were following a car at long distance, and it disappeared down a dip and then reappeared, we had a PB event. Alas, this didn't dawn on me until near the end of the freeway portion of our trip, so I have a shortage of data points. But I think it makes some sense.
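The "new detection, uncertain distance" hypothesis can be sketched as a toy planner. To be clear, everything below is invented for illustration - the function, the parameters, and the numbers make no claim about Tesla's actual logic. The idea is simply that a conservative controller reacting to the *worst-case* gap will brake on a fresh, high-uncertainty detection, then recover as repeated observations shrink the uncertainty.

```python
# Toy illustration of the hypothesis: a newly (re)appeared lead car starts
# with a huge distance uncertainty, so the worst-case gap looks dangerous
# even though the true gap (400 m) is enormous.

def plan_speed(set_speed, lead_distance, distance_sigma, safe_gap=50.0):
    """Pick a target speed from an uncertain lead-car distance estimate.

    A conservative planner reacts to the worst-case distance
    (estimate minus uncertainty), so a fresh, high-sigma detection
    triggers braking even when the true gap is large.
    """
    worst_case = lead_distance - distance_sigma
    if worst_case < safe_gap:
        return set_speed * max(worst_case, 0.0) / safe_gap  # brake
    return set_speed  # gap looks safe even in the worst case

# Car reappears 400 m ahead; the first estimate is very uncertain,
# and each new observation shrinks the uncertainty:
sigma = 380.0
for frame in range(5):
    target = plan_speed(set_speed=75.0, lead_distance=400.0, distance_sigma=sigma)
    print(f"frame {frame}: sigma={sigma:5.0f} m -> target {target:4.1f} mph")
    sigma *= 0.3
```

Run as written, the target speed dips hard on frame 0 and snaps back to the set speed within a frame or two - which is roughly the "brake, then resume" signature described above.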

And I blame (at least partly) Tesla removing radar from the mix. We had some PB back when the radar was in use, but it was usually the scary overpass syndrome, or something like that. Never on clear, straight freeway. Now with the radar shut off, we have this new phantom formula. And we had it a lot on our trip from OR to UT and back.

Tesla says humans do it with just vision, so cars can too. Well, yeah. Except we humans use binocular vision and parallax to judge distance, as well as other depth cues. As far as I know, with three forward-facing cameras of different focal lengths, Tesla isn't doing parallax computations.

Can I please have my radar back?
Great post!
 
I finally got a chance to go for a hike at a provincial park again yesterday. It's a 45 km drive each way, entirely on a two-lane 80 km/h road with lots of curves.

When I took the same trip earlier, on 2022.3.101.1 and mostly on AP, there were lots of PBs - perhaps 15 to 20 during the trip. They were predictable, but still annoying.

Yesterday, I experienced zero PBs on 2022.8.2. Some of the passing vehicles were quite big, including RVs and trucks, some very close to the center line. Zero PBs, not even a 1 km/h slowdown.

Today, I have to drive 380 km on mixed roads, including faster highways. I am cautiously optimistic about AP on 2022.8.2. I'll make sure AP is enabled whenever I feel there is a risk of PB, just to see what it does. I still need to do some manual driving to keep up my Safety Score...
 
Aaah, PB and all its reactions. I've had it since purchasing my Model 3 in 07/2018, and firmware 2022.4.5.16 was, for me, the best software version by far, with few and minor PB events. Not zero, but the best I've experienced. Then 2022.8.2 came along and wham! Very aggressive and frequent PB, back to where I was on some of the worst earlier versions. Passenger escalates her reactions to DEFCON 1. Things go bad from there. I just really don't get it. And again, the useless plea to Tesla for dumb cruise control. So once again, no TACC, no AP, no FSD use, at least with passengers/pets in the car.
 
So how does the software determine distance to an object?
Good question - my assumption is it relies on binocular vision, measuring differences between two camera views and using them to triangulate. Radar and ultrasound are good for distance, but Tesla has stated they are moving away from radar, and ultrasound is really only good at short range (like parking sensors).
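For what the triangulation step could look like: the textbook stereo-parallax relation gives depth from the pixel offset (disparity) of the same feature seen in two cameras. The focal length, baseline, and disparity values below are invented for illustration and are not Tesla camera specs.

```python
# Pinhole stereo model: depth = focal_length * baseline / disparity.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from the horizontal pixel offset of the same feature
    in two cameras separated by `baseline_m` metres."""
    if disparity_px <= 0:
        raise ValueError("object at infinity or mismatched features")
    return focal_px * baseline_m / disparity_px

# A 10-pixel disparity with a 0.3 m baseline and 1000 px focal length:
print(stereo_depth(1000.0, 0.3, 10.0))  # prints 30.0 (metres)
```

The formula also hints at why long-range vision estimates get noisy: at a quarter mile the disparity is only a pixel or two, so a one-pixel matching error can roughly double or halve the depth estimate.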

Humans use contextual cues as well - you know how big a car is, so you can guess how close it is by how much of your field of view it occupies, and if you see something move behind something else, you know it's farther away. I have no idea whether Tesla or other cars with optical systems use these techniques.

I suspect some of it is 'trade secret' territory, and I don't know if anyone here knows, but maybe someone with more intimate knowledge of Tesla's systems can comment.
 
And there’s the rub. Same software, two different cars and two dramatically different experiences.
I'm curious about this too.

After today's longer drive, I'm still not experiencing any PB. SalisburySam has a Model 3, and I have a Model Y. I doubt that makes a difference, but my experience on AP is FAR from useless.

I tried it on Highway 401 (a 12-lane highway within Toronto) with lots of traffic, and it handled fine. Lots of cars cut in front of me, and it gracefully slowed down and sped up. It even handled all the cloverleaf exits. I tried lane changes and had zero concerns about how it handled each one. But once, the detected speed limit went from 100 km/h to 50 km/h instantly, and it did precisely that. Tesla needs to handle this better.

Don Valley is a winding six-lane highway with some aggressive drivers. No problem there either.

I also tried it on 4-lane (very tight and busy) roads; that was OK too.

In all cases, my left hand was gently holding the wheel, with just enough arm weight to avoid triggering any warning.

For the heck of it, I turned it off once and manually focused hard on staying in the middle of the lane. My wife told me the AP is better than I am! 😅

Overall, it is now much more relaxing to drive. Even my wife told me that. I have to drive this route weekly, and I'm looking forward to using AP from now on.
 
So how does the software determine distance to an object?
Binocular vision is likely used to measure parallax, but in recent years there have been lots of advancements in measuring distance (and stuff in general) from a single video feed. Lots of mathematical models have been developed that can, to a high degree of precision, measure distance using simple object-reference analysis (the sizes of objects in frames over time), among other methods. Pretty sure Google's Measure app is based on some form of this. Math is not my focus, though, so please don't ask me to explain the nitty-gritty, but I'd expect Tesla, and all self-driving car companies, to use this in some form.

This type of thing is more a result of advancements in tech than in the math, but all the same. :)
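A minimal sketch of the "object size over time" idea mentioned above, using a pinhole-camera model: apparent width in pixels is inversely proportional to distance. The car width and focal length here are assumptions picked for round numbers, not measured values.

```python
# d = f * W / w_pixels: distance from apparent width under a pinhole model.

CAR_WIDTH_M = 1.8   # assumed physical width of a typical car (metres)
FOCAL_PX = 1000.0   # assumed focal length in pixels

def mono_distance(pixel_width):
    """Estimate distance from how many pixels a known-width object spans."""
    return FOCAL_PX * CAR_WIDTH_M / pixel_width

# The car ahead shrinking from 60 px to 30 px across frames means it is
# now twice as far away:
print(mono_distance(60.0))  # prints 30.0 (metres)
print(mono_distance(30.0))  # prints 60.0 (metres)
```

This only works because the system "knows" roughly how wide a car is - the same contextual cue humans use, as noted earlier in the thread.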
 
I'm curious about this too. After today's longer drive, I'm still not experiencing any PB. I tried it on Highway 401 (12-lane highway within Toronto) with lots of traffic, and it handled fine. [...]
Historically, PB has been much more of a problem on two-lane roads and less of one on the multi-lane highways you describe, so that may be part of the difference.
 
To me, the most obvious reason for the majority of PB scenarios is the "uncertain" nature of the camera sensor data. This uncertainty is best manifested as the jumping cars when you're stopped at a traffic light. A lot of the time you see cars "jumping" left and right or forward and back, sometimes even appearing and disappearing from one second to the next. The lane markings jump all the time too, though to a lesser degree. The rendering algorithm is just interpreting the sensor data, so I don't think the algorithm is introducing any randomness here; it can only be the inaccuracy of the sensors. If they can't consistently detect distance while you're stationary, it's hard to believe they'll do a better job when you're going 60 mph.

On two-lane roadways with an oncoming semi, even a slightly inaccurate sensor reading could result in the car "thinking" the semi has crossed the center line, and therefore slamming the brakes to avoid a potentially deadly collision. That we don't see the "jumping" when traveling at high speed is likely due to the algorithm smoothing out the rendering between consecutive sensor readings (purely a guess on my part).
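The smoothing guess above can be illustrated with the simplest possible filter: an exponential moving average over noisy per-frame distance readings. Real perception stacks use proper trackers (e.g. Kalman filters); this is only a sketch, with made-up numbers, of why filtered readings would stop "jumping".

```python
# Exponential moving average: blend each new reading with the running
# estimate so single-frame spikes get damped out.

def ema(readings, alpha=0.3):
    """Smooth a sequence; higher alpha trusts new readings more."""
    estimate = readings[0]
    smoothed = [estimate]
    for r in readings[1:]:
        estimate = alpha * r + (1 - alpha) * estimate
        smoothed.append(estimate)
    return smoothed

noisy = [50.0, 58.0, 44.0, 55.0, 47.0]  # jumpy raw distances (metres)
print([round(x, 1) for x in ema(noisy)])  # [50.0, 52.4, 49.9, 51.4, 50.1]
```

Note the trade-off baked into `alpha`: heavier smoothing means calmer estimates but slower reaction to a genuinely closing gap.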

TBH, I haven't researched the camera quality Tesla uses, but that's where I think FSD/AP could gain the most benefit: drastically improved cameras, like upgrading from a Kodak to a Nikon. Imagine if the cameras always returned consistent readings and the visualization of your car's surroundings were entirely stable - no jumpiness. Of course, any improvement has a cost.
 
To me, the most obvious reason for the majority of PB scenarios is the "uncertain" nature of the camera sensor data. This uncertainty is best manifested as the jumping cars when you're stopped at the traffic light. [...]
Keep in mind that what you see in the visualization is not what the computer sees; that's a rendering for your benefit by the infotainment computer. The raw output wouldn't display as a coherent image, and that translation leads to artifacts in the render, like cars jumping around. A little speculation there, but I really doubt the car thinks trucks are teleporting, haha.
 
keep in mind what you see in the visualization is not what the computer sees. that's a rendering for your benefit by the infotainment computer. [...]
The other thing it doesn't really explain is all the PB events where there is nothing on the road at all. Semis in the oncoming lane are annoying and still a failure (especially compared to other systems on the road), but at least you can prepare for them, as they're somewhat understandable.

I listened to a podcast discussing the upcoming changes in the FSD algorithms, and one of the presenters mentioned an issue where his car would somehow interpret tire skid marks as a bicycle. He would be driving down an empty road with skid marks at random spots, and he noticed that every time, the computer said there was a bicycle and started braking.

It's possible (actually likely) that a similar phenomenon is happening with other PB events. I'm typically looking at the road, so I wouldn't see whether a 'ghost bike' or other random object transiently pops up on the display, but as @OncomingStorm said, the display isn't necessarily what the computer is 'seeing' anyway.

Ultimately it still means the computer is interpreting its input incorrectly, and is therefore a failure of the system, but it's an interesting hypothesis.
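One standard guard against exactly this kind of single-frame ghost - purely hypothetical here, with no claim that Tesla does this - is to require a detection to persist for several consecutive frames before the planner reacts to it.

```python
# Temporal persistence filter: a detection only counts once it has
# appeared in n_frames consecutive frames.

def confirmed(detections, n_frames=3):
    """Return True once the same object has been seen in
    `n_frames` consecutive frames of the input sequence."""
    streak = 0
    for seen in detections:
        streak = streak + 1 if seen else 0
        if streak >= n_frames:
            return True
    return False

# A one-frame "bicycle" flicker is ignored; a stable detection is confirmed:
print(confirmed([False, True, False, False]))      # prints False
print(confirmed([False, True, True, True, True]))  # prints True
```

The trade-off is latency: the same filter that suppresses a skid-mark "bicycle" also delays reaction to a real obstacle by the length of the confirmation window.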
 
keep in mind what you see in the visualization is not what the computer sees. that's a rendering for your benefit by the infotainment computer. [...]
My point was just that between an algorithm introducing "randomness" into the data interpretation and low-quality sensors introducing false positives, I'll always lean toward the latter. The visualization is just the algorithm translating the sensor input (raw data) into a visual form for humans. Unless I see actual evidence otherwise, I wouldn't pin the jumpiness on the algorithm. Again, you may be better versed in this area than I am.

The other thing it doesn't really explain is all the PB events where there is nothing on the road at all. [...]
Very true. The semi example is just the most obvious one I can think of, the one most easily explained. It's much harder to explain ghost objects or even overpasses, but it's not out of the realm of possibility that low-quality sensors can generate false positives out of nothing. I really should research the camera specs myself so I'm not just pulling crap out of my behind, but given all of Tesla's constraints (age of company, investments, timelines, demands, etc.), it makes sense that the cameras may not be top-notch relative to the needs of a reliable FSD/AP.
 
The other thing it doesn't really explain is all the PB events where there is nothing on the road at all. [...]
I think classifying it as a failure on that basis is a bit of an overstatement, but at a minimum it's a limitation/shortcoming of the system, yes.

And I do agree with that line of thinking. The problem we have as humans when trying to figure out what the computer is "thinking" is that we don't think like a computer. A random patch of pixels that is in fact nothing but dead air with some confusing contextual lines behind it is easily dismissed by us, but may, even for a split second, look like something that needs attention to the computer, resulting in a braking event with no apparent cause.

The good and bad news is that this is a perception issue (AI understanding, not the vision-only configuration itself). Over time this will get resolved, but that doesn't bring us satisfaction now. I'm very encouraged by the most recent updates, though: I've experienced almost no braking events on 4.5.17, and the wife has been very happy, haha.
 
keep in mind what you see in the visualization is not what the computer sees. that's a rendering for your benefit by the infotainment computer. [...]
I think I'm only now seeing what you actually meant. I may be putting too much faith in the algorithm not being buggy. Something as complicated as interpreting and translating raw sensor data is very unlikely to be bug-free, as with all software, so you're right that the algorithm can play a major role in generating false positives. I was probably too idealistic in thinking of the "rendering" process as simple and straightforward.
 
I thought I was seeing an uptick in phantom braking - but it turns out there's nothing phantom about them at all.
One real trouble spot for me turned out to be a misinterpretation of the speed limit: it would detect a 35 mph sign on a road adjacent to the 70 mph highway and brake hard.
The other was a regular road with a particular patch where it would very often alert and brake hard. My passenger noticed the display briefly flashed a pedestrian before the alert sound and deceleration kicked in.
Now, in areas where I know it's going to happen, I'm checking the display much more than I normally would to catch why it's slowing down.
It would be nice if it left those messages on the screen when it happened.
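One hedged sketch of how a misread sign could be handled more gracefully - the function name, thresholds, and logic below are all invented for illustration, not anything Tesla ships - is to require a speed-limit drop to repeat across several consecutive readings before accepting it, then ramp the set speed down gradually instead of braking hard.

```python
# Accept a new speed limit only after `confirm` identical consecutive
# readings, then move toward it at most `max_step` mph per update.

def update_set_speed(current, sign_readings, max_step=5.0, confirm=3):
    """Return the next set speed given recent speed-sign readings."""
    target = current
    if len(sign_readings) >= confirm and len(set(sign_readings[-confirm:])) == 1:
        target = sign_readings[-1]  # reading is stable; accept it
    if target < current:
        return max(target, current - max_step)  # gradual ramp-down
    if target > current:
        return min(target, current + max_step)  # gradual ramp-up
    return current

# A single spurious 35 mph reading amid 70s doesn't move the set speed:
print(update_set_speed(70.0, [70.0, 35.0, 70.0]))  # prints 70.0
# Three consistent 35 readings start a gradual ramp-down:
print(update_set_speed(70.0, [35.0, 35.0, 35.0]))  # prints 65.0
```

The same confirmation-plus-ramp idea would also soften the "100 km/h to 50 km/h instantly" behavior mentioned earlier in the thread.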
 
I thought I was seeing an uptick on phantom braking - but turns out there's nothing phantom about them at all. [...]
I had a case where my car virtually stopped in the carpool lane of I-394 coming out of downtown Minneapolis. It turns out it was where the regular highway crosses under the carpool lanes. I think the car was using GPS and map data rather than sensor data. Issues like this won't resolve until Tesla switches away from relying primarily on map data. The good news is it sounds like that will happen reasonably soon.