Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla.com - "Transitioning to Tesla Vision"

You like to just drop "that isn't a phantom brake" in on videos that clearly show a car slowing down when no human would have. Most people tend to use it in that way, much like most people would assume "full self driving" means you don't need to drive the car. But we're in Elon's world, so words mean different things.

Can you please give us the definition of phantom brake so we can all be enlightened and use it correctly?
In that particular case it is very clear why the car slowed down. Phantom braking normally refers to it slowing down very quickly for no reason that can be determined.
 
Maybe I'm a fanboy or whatever, but my mind gushes in awe every time I watch Tesla Vision in rain. I don't know what you guys are seeing, but I can't believe it. It's too good. The feeling is similar to the birds-eye-view intersection predictions, although those can be rough.

The Tesla Vision predictions are stable, seem accurate, and just like whatttt?!? It can't be real. That's what's going through my mind because I've worked with NNs for text recognition, and they suck (even when done by Google or Apple) for something so easy with millions of examples. How can Tesla predict car positions in the rain? What is a car? How does it know? It's bananas.
 
This is *not* phantom braking (around 52 seconds), just for your reference:

I would classify that as a phantom brake even though a human might have yielded because of the confusing merge road sign, but there was no need to actually yield:
not a merge.jpg


It's somewhat unclear if that yellow merge sign is for the lane that just merged (very slender lane just behind the right wiper) or a warning for the current 4-lane highway that the incoming lane for the silver SUV is merging.

Map data most likely indicated that eastbound Briley Parkway merges onto northbound I-24 and stays at 4 lanes, while it also knew the westbound merge for the SUV results in 5 lanes. However, if you look at the location of the eastbound merge, it's 100 meters away from where the lane finally ends, and Tesla probably has various heuristics and buffers to estimate merge lane length, e.g., a fixed 200 meters. That length estimate happens to overlap with where the SUV's lane is added, and it is exceeded right at the Thompson Lane overpass, which is why Autopilot initially yielded (shaded the adjacent vehicle darker) and then quickly stopped yielding.

osm i24.jpg


This should get fixed by having "city streets" FSD behavior on the highways if it can visually determine whether a lane is merging instead of relying on maps and heuristics.
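For what it's worth, the buffer-overlap idea above can be sketched in a few lines. Everything here (the fixed 200 m buffer, the function names, the odometer-distance framing) is my guess at how such a heuristic might look, not anything Tesla has published:

```python
# Hypothetical sketch of the merge-length heuristic described above.
# The fixed 200 m buffer and all names are illustrative assumptions,
# not Tesla's actual implementation.

MERGE_BUFFER_M = 200.0  # assumed fixed buffer past the mapped merge point


def estimated_merge_zone(map_merge_start_m: float) -> tuple[float, float]:
    """Odometer range where the planner assumes a merge is still active."""
    return (map_merge_start_m, map_merge_start_m + MERGE_BUFFER_M)


def should_yield(ego_position_m: float, map_merge_start_m: float) -> bool:
    """Yield to adjacent traffic only while inside the estimated merge zone."""
    start, end = estimated_merge_zone(map_merge_start_m)
    return start <= ego_position_m <= end


# The mapped eastbound merge sits ~100 m before the lane actually ends, so a
# 200 m buffer overlaps the point where the SUV's lane is added: the car
# yields there, then stops yielding once the buffer is exceeded.
print(should_yield(ego_position_m=150.0, map_merge_start_m=0.0))  # True
print(should_yield(ego_position_m=250.0, map_merge_start_m=0.0))  # False
```

If something like this is in play, a buffer tuned for one merge geometry will misfire wherever an unrelated lane happens to appear inside the buffered zone, which matches the brief yield-then-unyield seen in the video.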
 
In that particular case it is very clear why the car slowed down.
And also, yes, some humans would slow down for that car.
What? Slow down for a car entering from a ramp to the right, that has its own lane? No wonder traffic is so bad in the USA if any human would slow down for that. The Tesla doesn't even slow down enough to avoid an impact, so what was the point?

Just because we can theorize what makes it slow down doesn't make it not a phantom brake. If all cars drove around like that constantly, traffic would be a mess.
 
I would classify that as a phantom brake even though a human might have yielded because of the confusing merge road sign, but there was no need to actually yield:
View attachment 680885

It's somewhat unclear if that yellow merge sign is for the lane that just merged (very slender lane just behind the right wiper) or a warning for the current 4-lane highway that the incoming lane for the silver SUV is merging.

Map data most likely indicated that eastbound Briley Parkway merges onto northbound I-24 and stays at 4 lanes, while it also knew the westbound merge for the SUV results in 5 lanes. However, if you look at the location of the eastbound merge, it's 100 meters away from where the lane finally ends, and Tesla probably has various heuristics and buffers to estimate merge lane length, e.g., a fixed 200 meters. That length estimate happens to overlap with where the SUV's lane is added, and it is exceeded right at the Thompson Lane overpass, which is why Autopilot initially yielded (shaded the adjacent vehicle darker) and then quickly stopped yielding.

View attachment 680895

This should get fixed by having "city streets" FSD behavior on the highways if it can visually determine whether a lane is merging instead of relying on maps and heuristics.


This is the problem with everything being classified as "phantom"
The video review shows that the car on the ramp was highlighted on the MCU as a lead car. That caused the Tesla to slow down.

Phantom braking is when you do not have any of the visual clues as to why your car slowed down.
 
What? Slow down for a car entering from a ramp to the right, that has its own lane? No wonder traffic is so bad in the USA if any human would slow down for that. The Tesla doesn't even slow down enough to avoid an impact, so what was the point?

Just because we can theorize what makes it slow down doesn't make it not a phantom brake. If all cars drove around like that constantly, traffic would be a mess.
There is no hard definition of phantom brake. If the driver at the time didn't feel it was a phantom brake (they understood at the time why it was slowing down or if they themselves would have slowed down), then it's not a phantom brake. Phantom brake usually refers to unexpected slowing which you can't determine a reason for.

I've suggested before that Tesla can do a better job of communicating what the car is doing, and that could be a way to eliminate what many may consider "phantom braking". For example, I've seen plenty of examples (even in other brands) of cars braking or slowing for a speed limit change and the driver not knowing that was the reason why (thus treating it as a "phantom brake"). But if the car announced beforehand that it was doing that, the driver would know to expect it (and might even be able to override it ahead of time).
 
I would classify that as a phantom brake even though a human might have yielded because the confusing merge road sign, but there was no need to actually yield

That's absolutely not phantom braking. I'm not sure if Tesla is using maps for every merge case or not, but when I drive somewhere new, I'm not sure if a car is merging or not, and on occasion, I will slow down for a "merging" car that isn't actually merging. Perhaps Tesla is using a NN to predict whether or not a car is merging, and sometimes it gets it wrong, just as humans do.

In this particular example, the car thought it was merging for about 2 seconds and then decided it wasn't merging anymore (goes from dark gray to light gray).

This actually happens often in my radar car. Anyone who's used AP in the right lanes will experience this. It's one of the annoying things about AP at the moment, but it's not phantom braking.
 
The video review shows that the car on the ramp was highlighted on the MCU as a lead car. That caused the Tesla to slow down.

Phantom braking is when you do not have any of the visual clues as to why your car slowed down.
Got it. So this was just AP driving like a complete amateur idiot. Do we have a category for that since we're not allowed to use phantom brake?
Someone should tweet a "will v9 fix the car believing someone merging in a lane next to you is a threat?" to Elon so we can get a "yes" ;)
 
Also, the owner says that he has not had a phantom braking incident in his 3k+ miles; that would have been the first, which he then classifies as bad merging logic.
I agree with the owner.

And this is a guy who's had a radar Model 3 prior to this vision-only car (check out his old AP videos). So he knows what he's talking about. He's also an engineer of some sort I believe.

Edit: anyone who's experienced phantom braking first hand will understand how confusing and unpredictable it is. Videos don't do it justice. It's a jab to your gut, lol. There's no way to avoid it or understand when to keep your foot ready. It's vastly different than "dumb" slowing down behavior by AP (wrong speeds / merging). The latter is predictable and easily mitigated by experienced AP-users.
 
Phantom braking is when you do not have any of the visual clues as to why your car slowed down.
Well, given that Autopilot is primarily a vision system and Karpathy even explained that bad radar returns are associated with some low-confidence visual output, there's always some visual clue, whether that's an overpass, some nearby vehicle, debris, or even a plant on the side of the road. It's been a while, but I remember some FSD beta videos where it briefly misclassified a cactus on the side of the road as a pedestrian or a small pile of snow as an animal, and in both cases Autopilot slowed and swerved -- are those phantom braking events? An average person probably wouldn't even know to look for those potential misclassifications, so I guess they could fall into "no visual clue"?

In any case, it sounds like your definition is stricter than "any undesired braking," and that's fine and quite reasonable, especially for the Autopilot team to have something concrete to address instead of a potentially unending list of braking-related bug reports.
 
Okay, just did a 160 mile highway trip at dusk. Definitely phantom braking twice. I say “phantom” but I know (I think) that my Y with no radar saw the highway signs above the highway as it came up over a small hill, and twice it went from 60 to about 43 mph very fast. The third time it was an overpass. So, depending on your definition of “phantom,” that’s three in 160 miles. There were NO other cars around me. Regardless, the car shouldn’t have slowed at all.

As for YouTube folk like Dirty Tesla, I like him a lot, but 98% of the time he doesn’t really say anything bad about a Tesla. Even in the summon video he just said it’s the same as radar cars. There’s a reason he got an invite to the Plaid event, was able to get a Y as soon as he wanted one, and got a response from Uncle Elon within minutes of a tweet. He’s not so much a fanboy, but he spends most of his time giving them very positive press. He’s informative, but not objective.

Final piece. TV definitely doesn’t “see” cars that are two ahead. Radar was much better at that. Not sure it matters, but even when I can see a portion of the vehicle two ahead of me, it’s rarely shown on my Y’s screen. Sometimes, but not often.

Okay, having said all of that, TV did an excellent job for the majority of the trip, including handling well when I was cut off and when someone sped up to stop the car from changing lanes. All smoother responses than most humans would have had, and it got on and off three different highways with no issues. I am impressed by much of TV, disappointed with the braking issues, but I guess I’d rather it be a bit conservative.
 
What I find kinda funny is that the prevailing consensus on TMC was that Tesla Vision would significantly reduce phantom braking, but the reports we have so far from people with Tesla Vision (without radar) suggest it's worse.

Hopefully it's better in the most recent update, if you've installed it.
I did install 4.18.10 on Tuesday or Wednesday. I then had to travel about 350 miles for work on Thursday, and I had several actual phantom braking events during that trip. No one in front of me, nothing on the side of the road, nothing I could see that I would think could confuse the system. I really wish I knew what it is seeing (or maybe not seeing) that is causing it. And to be clear, the majority of these phantom brake events I'm experiencing are the very momentary but quick ones: it will rapidly slow down like it tapped the brakes hard, but only 2-3 mph, and then go right back up to set speed.

BTW, that trip was my first time using a Supercharger. Seeing 250 kW being dumped in was pretty damn cool. Going from 21-63% in just 16 minutes was awesome. And it was so seamless. Seeing a lot of the CCS issues other youtubers have shown is partly what made me pick the M3 over the Mach-E.

Now I'm not in agreement that we should have dumb cruise control as an option; instead I think we need more granularity in which features to turn off. Like, I want a fairly simple adaptive cruise control. In fact I'd like to set it to "smoothness" over safety: I want it to optimize smooth driving over everything else.

That's not a bad idea either. A sensitivity slider could certainly help alleviate issues. I'm just hoping that the issue becomes non-existent within the next few months of updates and this discussion becomes obsolete. Ideally it would never happen, but I won't even be greedy; I'd take once every 100 miles on average.

And if I'm the only one having this many problems, then maybe my car specifically has issues. Either that or May 2021+ owners aren't being vocal about it yet.
 
Well, given that Autopilot is primarily a vision system and Karpathy even explained that bad radar returns are associated with some low-confidence visual output, there's always some visual clue, whether that's an overpass, some nearby vehicle, debris, or even a plant on the side of the road. It's been a while, but I remember some FSD beta videos where it briefly misclassified a cactus on the side of the road as a pedestrian or a small pile of snow as an animal, and in both cases Autopilot slowed and swerved -- are those phantom braking events? An average person probably wouldn't even know to look for those potential misclassifications, so I guess they could fall into "no visual clue"?

In any case, it sounds like your definition is stricter than "any undesired braking," and that's fine and quite reasonable, especially for the Autopilot team to have something concrete to address instead of a potentially unending list of braking-related bug reports.
Well, I would say if you as the driver were not looking at the visualization, or were unaware what to look for, it still qualifies as a "phantom brake" (because that's what the driver would classify it as, given they would have no clue of the cause). If however the car does a great job informing you it was going to brake for something it detected ahead, then that wouldn't qualify as phantom braking, given you expected it.
 
Well, I would say if you as the driver were not looking at the visualization, or were unaware what to look for, it still qualifies as a "phantom brake" (because that's what the driver would classify it as, given they would have no clue of the cause). If however the car does a great job informing you it was going to brake for something it detected ahead, then that wouldn't qualify as phantom braking, given you expected it.
That is a good point.
The closer you get to FSD the more a phantom braking event will be noticed by occupants because they will be paying less and less attention.
So, the only real solution is to not have phantom braking. :)
 
In that particular case it is very clear why the car slowed down. Phantom braking normally refers to it slowing down very quickly for no reason that can be determined.
On TMC I read a lot of anecdotes about phantom braking, and usually they end with "good thing no one was behind me or I probably would have been rear-ended". There are so many comments like this, and apparently phantom braking occurs so often around the world, it makes me wonder why such an accident hasn't actually occurred (to date).

Is it possible that the presence of other cars affects Autopilot's decision tree, based on multiple situational and confidence factors? For example, if the car detects a possible obstacle (or maybe one of those indecisive blip-trains that Karpathy described for radar), but assigns only 30% confidence that it's real, will it decide not to brake when there's a close-trailing car, but more likely decide it's prudent to brake when there isn't one? Avoid creating a known-likely accident scenario in the effort to mitigate a probably-unreal one. I'm not saying that it's ever a good thing to slam the brakes needlessly, but maybe there's a little more software discretion behind this behavior than it seems.
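That conjecture could be expressed as a simple decision rule. The thresholds and names below are purely illustrative assumptions, not anything known about Autopilot's actual logic:

```python
# Illustrative sketch of the conjecture above: a low-confidence obstacle
# triggers braking only when nobody is close behind. All thresholds and
# names are assumptions for illustration.

BRAKE_THRESHOLD = 0.9    # assumed: brake regardless above this confidence
IGNORE_THRESHOLD = 0.25  # assumed: below this, treat as noise and ignore


def decide_brake(obstacle_confidence: float, trailing_car_close: bool) -> bool:
    """Weigh a possible obstacle against the risk of being rear-ended."""
    if obstacle_confidence >= BRAKE_THRESHOLD:
        return True   # near-certain obstacle: brake either way
    if obstacle_confidence < IGNORE_THRESHOLD:
        return False  # almost certainly unreal: don't brake
    # Gray zone (e.g. the 30% case): brake only when braking is cheap,
    # i.e. there's no close-trailing car to rear-end us.
    return not trailing_car_close


print(decide_brake(0.3, trailing_car_close=True))   # False
print(decide_brake(0.3, trailing_car_close=False))  # True
```

A rule like this would neatly explain why so many reported phantom brakes happen on empty roads: with nobody behind, the gray-zone branch always chooses to brake.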
 
I did install 4.18.10 on Tuesday or Wednesday. I then had to travel about 350 miles for work on Thursday, and I had several actual phantom braking events during that trip. No one in front of me, nothing on the side of the road, nothing I could see that I would think could confuse the system. I really wish I knew what it is seeing (or maybe not seeing) that is causing it. And to be clear, the majority of these phantom brake events I'm experiencing are the very momentary but quick ones: it will rapidly slow down like it tapped the brakes hard, but only 2-3 mph, and then go right back up to set speed.

BTW, that trip was my first time using a Supercharger. Seeing 250 kW being dumped in was pretty damn cool. Going from 21-63% in just 16 minutes was awesome. And it was so seamless. Seeing a lot of the CCS issues other youtubers have shown is partly what made me pick the M3 over the Mach-E.



That's not a bad idea either. A sensitivity slider could certainly help alleviate issues. I'm just hoping that the issue becomes non-existent within the next few months of updates and this discussion becomes obsolete. Ideally it would never happen, but I won't even be greedy; I'd take once every 100 miles on average.

And if I'm the only one having this many problems, then maybe my car specifically has issues. Either that or May 2021+ owners aren't being vocal about it yet.
I can confirm. May 2021 vision car here.

I’ve put ~200 miles in for each software update.

I also had a radar car for 8 days/300 miles for a loaner getting some repairs done.

Phantom braking definitely exists on Tesla Vision. There have been a few times on each software version where a shadow in the road (I think) would confuse the car into slowing down (75->50). Not always in the same spot either; in various parts of the parkway when it was not busy.

Also, since .18.10, the car picked up an angled stop sign for another road and tried to slow me down from 55 to a complete stop in the middle of busy traffic (Rt 34, 2 lanes each way).

And finally, I’d like to confirm that even when I can see a second car in front of me, the CID often does not show it either.
 
I wonder what's behind the phantom braking reports from you guys vs Dirty Tesla / others with videos.

I don't think Dirty Tesla is biased enough to under-report phantom braking or issues with AP. I've been watching his videos for a long time, and he tends to be very picky about unnatural AP decisions or maneuvers.

Perhaps there is more phantom braking on city streets vs highway? It'd be great to see video examples.
 
Tesla's not the only company doing vision-only. Wayve is also doing it. Watch their videos with only 6 cameras as the car drives around the absolutely nutty city of London. Crazy that the thing did so well with just cameras and AI. It's all done on the fly like Tesla, with no HD mapping, lidar/radar, etc.


Go to the 1hr mark, where they demo the car going through various London scenarios. Worse than San Fran.
Wow. Tesla is woefully behind. 😂
 
... vs Dirty Tesla / others with videos.
He is a fanboy and wants to keep Tesla happy. He is good at hiding really bad stuff. For example, when Summon was about to run over a barricade, he just said "whoops, that was weird" and moved on. He spent a total of one second on a really bad issue.


I don't think Dirty Tesla is biased enough to under-report phantom braking or issues with AP.
I have seen him complaining about phantom braking. Don't know if it was in recent vision videos, but for example, when there are pedestrians on the side of the road, it has braked on him.

I've been watching his videos for a long time, and he tends to be very picky about unnatural AP decisions or maneuvers.
He will get picky about subtle stuff that should be corrected, but major stuff like phantom braking he will just quickly move on.