
Phantom Braking

Are you talking AP or TACC? Because AP is explicitly not intended to be used on such roads, and it assumes all traffic is going the same direction you are, so being confused by oncoming traffic would be expected.

The beta is the solution being worked on for that, and while it can slow for oncoming traffic (especially when hills or curves are involved), it's gotten extremely mild at this point when it does (2-3 mph typically).

When it comes to PB, I don't believe it will make any difference whether it's TACC or AP. In fact, TACC might have more PB events/intensity because the speed setting can be higher.

The whole "AP is explicitly not intended to be used" is kind of a joke because the car fully knows what you're doing by limiting the speed to 5mph over the speed limit while on undivided roads, but still allows you to use AP. Then they have the traffic light response for those of us with FSD, and unlike the FSD
beta we all have it.

Instead of "explicitly not intended," it's more of a wink-wink. A "the lawyers are making us write all these disclaimers" sort of thing.
 
I drive a lot of 2-lane highways in rural Alabama, and it seems to me that the phantom braking events happen more frequently when there is a dotted center line as opposed to a double solid center line. As if the car is unsure which direction the opposite lane is traveling when the center line isn't solid. Anyone else notice this?

Can you try a couple of experiments using the following settings?

Set FCW to early
Set following distance to maximum
Turn off FSD Beta
Turn off traffic light response

With those settings, see how PB compares to what it was before.

Use it at night to see if there is a pattern of PB with oncoming traffic versus when there is no traffic going the other direction.
 
To clarify, I was talking about while using in FSD beta. Does that make a difference?
It's not clear, but I don't think you can generalize from phantom braking with FSD to phantom braking with TACC. FSD is necessarily looking at many more parameters. I also suspect they have alarms (both the audible ones and the internal software 'alarms' that the computer sees but you don't) set much more conservatively. The related question is whether FSD affects TACC even when you're not using FSD. One wouldn't think so, but it's possible.
 
I purchased a 2022 Model 3 in December. My driving, until today, has been mainly around town. Today I had a 170 mile trip, with 80 miles of it on a 2-lane highway. During that 80 mile stretch, I had 3 incidents of phantom braking. Two of these occurred with AC and one with AP. During one incident, someone was tailgating me and almost collided with the back of the car when the emergency brakes kicked in. I won't be using AC or AP on two-lane roads until this is fixed.
Just a follow up to this post, I submitted this to Tesla. Here is the response received:

”Pulled logs and ran diagnostics - no faults or hardware problems found. Auto steer and Traffic Aware Cruise Control are both Beta features and may occasionally cause Model 3 to brake when not required or when you are not expecting it. These scenarios will improve over time with firmware updates. As firmware rolls out each update the system improves.”

I asked for timing on the firmware updates/resolution, but was not given an answer. I was thinking of upgrading to FSD, but given this issue, I am holding off on that.
 
I've a new poll on PB.

 
I know that can seem like a waste of resources; however, keep in mind that the guys creating Steam integrations are probably useless when it comes to programming their AI, so there really isn't a resource cost, since those guys couldn't help if they wanted to. Programming tends to be a highly specialized field.

There are a number of non-ML problems with the software that could be fixed. Additionally, most of the newer programmers have had classes/projects on AI, ML, big data, distributed systems, etc. I doubt they would be totally useless. Lots would be eager to work on something big. Even the older/wiser ones need to keep learning in order to stay relevant.

IMO the sw efforts could use some help from some of the people who have been around the block. They know about setting priorities, regression testing, usability, and so on.
 
There are a number of non-ML problems with the software that could be fixed. Additionally, most of the newer programmers have had classes/projects on AI, ML, big data, distributed systems, etc. I doubt they would be totally useless. Lots would be eager to work on something big. Even the older/wiser ones need to keep learning in order to stay relevant.

IMO the sw efforts could use some help from some of the people who have been around the block. We know about setting priorities, regression testing, usability, and so on.
OK, that's completely fair, they wouldn't be totally useless lol. I was more referring to the actual coding; I'm guessing that Steam games and Tesla's AI are completely different frameworks, methodologies, languages, and dev environments (at least I hope so! lol). Likely the Steam guys are just interacting with predefined APIs that Tesla provides for access to their systems (like an app on an iPhone).

There is an argument to be made that, if the software package is different enough, having experts can actually hinder progress: like all humans, they get stuck in their ways, and every problem becomes a nail that can be "solved" with their particular hammer, which sadly doesn't always end well.

That said, I do wish they could speed up the dev process haha, which is honestly a selfish thing. Software dev is sloooow; hell, I pre-ordered a video game once and it ended up being delayed like 5 years (pre-COVID) lol. (Not an excuse for Tesla, just the reality.)
 
Just a follow up to this post, I submitted this to Tesla. Here is the response received:

”Pulled logs and ran diagnostics - no faults or hardware problems found. Auto steer and Traffic Aware Cruise Control are both Beta features and may occasionally cause Model 3 to brake when not required or when you are not expecting it. These scenarios will improve over time with firmware updates. As firmware rolls out each update the system improves.”

I asked for timing on the firmware updates/resolution, but was not given an answer. I was thinking of upgrading to FSD, but given this issue, I am holding off on that.
So essentially: "no problems, it's working as designed, it should get better." Except it doesn't reliably get better with each update. :/
 
So I was just reviewing the MY manual and noticed something: Tesla puts TACC under the Autopilot section, and what most people consider 'Autopilot' Tesla calls Autosteer.

That actually explains a lot, among other things why they consider it a 'beta' feature when no other manufacturer does (or needs to), as well as the awful reliability. They took cruise control away and replaced it with 'TACC', but instead of making it a straightforward, functional, and reliable adaptive cruise, they made the algorithm orders of magnitude more complex by adding traffic analysis and other features needed for Autopilot. That adds data that isn't necessary for the core function but causes confusion and errors; TACC is functionally Autopilot that you have to steer yourself.

I'm trying to figure out why. As end users, it means we end up with a feature that functions poorly, is prone to error, and doesn't give any real benefit over the 'plain' adaptive cruise control that every other manufacturer has. I'm guessing Tesla was developing their AP algorithms and someone said "we can just take out the steering and use it for cruise control!" and Elon said "Great! It will be fully functional in a few months anyway!" That reduced the amount of coding they needed to do (and freed up programmers to work on adding more games). If it's using the same code, it also means it's harder or impossible to separate the features.

While this may explain the situation, it still doesn't excuse it IMO. Teslas are perfectly capable of having adaptive cruise that works as well as every other car's; Tesla has just chosen not to implement it, and obscures the picture with a term like 'TACC' to make it sound like you're getting something you're not.


[Attached screenshot of the Model Y manual, showing Traffic-Aware Cruise Control listed under the Autopilot section]
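For reference, by 'plain' adaptive cruise I mean something like the sketch below: a set speed plus a gap rule against a single measured lead vehicle, with no object classification involved. This is a generic illustration with made-up names and numbers, not Tesla's (or anyone's) actual controller.

Code:
# Generic sketch of a "plain" adaptive cruise controller: hold the set speed
# unless a measured lead vehicle forces a gap-based slowdown.
# Names and numbers are illustrative only; this is not Tesla's implementation.
from typing import Optional

def acc_target_speed(set_speed_mps: float,
                     lead_distance_m: Optional[float],
                     lead_speed_mps: Optional[float],
                     time_gap_s: float = 2.0) -> float:
    """Return the speed the cruise controller should track this cycle."""
    if lead_distance_m is None or lead_speed_mps is None:
        return set_speed_mps  # no lead vehicle: behave like dumb cruise control

    # Constant time-gap policy: desired gap grows with the lead vehicle's speed.
    desired_gap_m = max(5.0, time_gap_s * lead_speed_mps)

    if lead_distance_m >= desired_gap_m:
        # Enough room: run at the set speed, closing any surplus gap only gently.
        return min(set_speed_mps,
                   lead_speed_mps + 0.1 * (lead_distance_m - desired_gap_m))

    # Too close: match the lead and shed speed in proportion to the gap error.
    return max(0.0, lead_speed_mps - 0.5 * (desired_gap_m - lead_distance_m))

# Example: set speed ~65 mph (29 m/s), lead car 40 m ahead doing ~56 mph (25 m/s).
print(acc_target_speed(29.0, 40.0, 25.0))  # -> 20.0 m/s, easing off to rebuild the gap

Nothing in that loop needs to classify oncoming traffic, shadows, or overpasses, which is roughly what I mean by a straightforward adaptive cruise.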
 
Virtually nothing in that post made any sense or reflects how any of this works. But then you've repeatedly gotten upset any time someone tried to explain how it works, and insisted you refuse to read such things.

Though especially silly was the part where you think the guys doing AI and neural net programming are the same folks who make sure the Atari emulator works.
 
I am just not experiencing the same Phantom Braking problems in my 2018 M3 w/ radar on Navigate on AutoPilot (NoA) that others seem to or that I do on FSD Beta. This leads me to wonder: when the car switches from FSD Beta to NoA on the highway onramp, does it go back to using the radar as input to the speed control?

Also, @sleepydoc just a single downclick of your right stalk enables just TACC and it works just fine as a dynamic cruise control (unless you have a car without radar, I guess).
 
I am just not experiencing the same Phantom Braking problems in my 2018 M3 w/ radar on Navigate on AutoPilot (NoA) that others seem to or that I do on FSD Beta. This leads me to wonder: when the car switches from FSD Beta to NoA on the highway onramp, does it go back to using the radar as input to the speed control?


Nope.

One way you know this is that you're still limited to an 80 mph max speed and a minimum follow distance of 2, just like the cars that don't have radar at all.
 
Also, @sleepydoc just a single downclick of your right stalk enables just TACC and it works just fine as a dynamic cruise control (unless you have a car without radar, I guess).
No, that’s the problem. TACC doesn’t work fine as a dynamic cruise control. It’s constantly slowing down randomly for no reason, and that’s my complaint. Tesla could give us a decent adaptive cruise but they chose not to.
 
Phantom braking was a very small issue with my 2018 M3 Midrange with EAP, until recently. Suddenly it seems to occur very often in normal traffic on the freeway. As far as I know, this car still uses radar but the recent software updates seem to have made the problem very much worse.
 
Hi, I'm a new member here. I don't have a Tesla vehicle (I drive a BMW i3) but may get one soon. I work professionally in machine learning, though not on vision or robotics; more classical prediction.

I think a significant problem with the vision-only approach is likely to be the absence of stereo cameras. People are praising Subaru's EyeSight (which seems to work great without radar, but with stereo). I looked up its history: it's been in development since 2008, with multiple revisions. Apparently there was a singular Japanese researcher with some great ideas at its inception; as of 2018 he has a new startup with an improved algorithm and chipset.

He discusses the problems with various approaches, which I think are relevant here. He supported stereo cameras over mono plus radar. (Mono cameras alone weren't ever considered by anybody!!) I think it's safe to say he's an expert.

Mono vision relies on the neural networks to use illumination and shape to detect objects, so it is limited to detecting known, pre-trained examples and will be confused by less clear ones. This could cause 'phantom braking': the net assigns a non-trivial probability of danger to a weakly classified object, and because of risk mitigation (even a low-percent chance of a highly dangerous situation has to be taken into account), you get phantom braking until the object appears large enough to be classified as benign or as not an object at all.
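To make that risk-mitigation point concrete, here's a toy expected-cost comparison; the numbers are arbitrary and this has nothing to do with Tesla's actual planner.

Code:
# Toy illustration of the risk-mitigation argument: a weakly classified object
# with only a small probability of being a real hazard can still win an
# expected-cost comparison and trigger a brake. Costs are in arbitrary units.
def should_brake(p_hazard: float,
                 cost_of_collision: float = 1000.0,
                 cost_of_false_brake: float = 5.0) -> bool:
    """Brake when the expected cost of ignoring the detection exceeds the
    comparatively tiny cost of an unnecessary slowdown."""
    return p_hazard * cost_of_collision > cost_of_false_brake

# Even a 2% "maybe an obstacle" clears the threshold with these numbers,
# which is one way a marginal detection becomes a phantom braking event.
print(should_brake(0.02))   # True  -> brake
print(should_brake(0.001))  # False -> ignore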
ITD Lab’s Saneyoshi maintains that stereo-vision processing is ideal for detecting generic objects without training. In contrast, monaural vision can fall short of detecting objects when it encounters something that it has not been trained on, he noted. “For example, Volvo’s self-driving technology reportedly struggled to identify kangaroos in the road,” he said. Kangaroos’ movements in mid-jump confused the mono camera’s vision processing.

On the other side, safety could be better with stereo, as it gives a direct physical prediction of an obstruction even if it can't be classified as to 'what it is' until later. The article gives the example of the fatal Uber accident: in a nutshell, a mono image-recognition neural net detected the crossing bicycle 1.29 seconds in advance (too late), whereas his new stereo chipset would have done so 2.23 seconds in advance.


The chipsets appear to use classical, non-neural signal processing for direct distance calculation, which is then fed into the later phases of the driver assistance stack.
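The underlying range calculation for a rectified stereo pair is simple enough to sketch; the focal length and baseline below are made-up example values, not EyeSight or ITD Lab specs.

Code:
# Classical stereo ranging: depth falls out of similar triangles, Z = f * B / d,
# with no need to first classify what the object is. Parameters are examples only.
def stereo_depth_m(disparity_px: float,
                   focal_length_px: float = 1000.0,
                   baseline_m: float = 0.35) -> float:
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# A 7-pixel disparity with these parameters puts the object at 50 m.
print(stereo_depth_m(7.0))  # 50.0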

Remember that the existing mono camera sets were designed with the assumption that they would be supplemented by radar. Elon is out of his depth on machine learning and bullshitting here. If he were going by "first principles" and analogizing to human performance, the cameras would be stereo, mounted well behind the windscreen (for more weather resistance), multiple, gimballed, and much higher resolution. (The existing cameras are 1280x960, not very high at all when dealing with far-off objects moving at high speed.)
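As a back-of-the-envelope check on that resolution point (the 50-degree horizontal field of view below is my assumption for illustration, not a published spec):

Code:
# Roughly how many horizontal pixels a 1280x960 camera puts on a car-width
# object at range, using a simple pinhole model. The 50-degree HFOV is assumed.
import math

def pixels_on_target(object_width_m: float,
                     distance_m: float,
                     image_width_px: int = 1280,
                     hfov_deg: float = 50.0) -> float:
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return image_width_px * object_width_m / scene_width_m

# A 1.8 m wide vehicle at 200 m spans only about a dozen pixels at this FOV,
# which isn't much signal for classifying fast-approaching objects.
print(round(pixels_on_target(1.8, 200.0), 1))  # ~12.4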

The performance of the Subaru indicates that stereo cameras are a valid and successful approach, though personally I would want stereo vision plus high-resolution 77 GHz imaging radar. I think it's fine to ignore lidar.

I think Tesla has a very good neural network perception stack, with the great advantage of being able to push out updates to the fleet and gather data, but it's made to do too much work because of the camera limitations. (Their route planning code isn't great as far as I can tell, though that's more relevant to FSD than straight AP, and it hasn't yet been turned into a fundamentally machine-learning solution.)

If they were to deploy some thousands of cars with good stereo cameras and collect data for a retrain, I bet their solution would be excellent and the power of fleet-wide machine learning would shine.