
Phantom Braking

FWIW I drove a bit north of 100 miles today, and paid specific attention to speed.

Excluding braking when the speed limit legit changed (because that's the opposite of "phantom") I observed:

No unexplained slowdowns whatsoever on highways (about 65-70% of the miles travelled), which is consistent with all my previous highway driving going back years. Not for shadows, not for overpasses, not for "nobody around me"- it brakes, but only when there's actually a reason to.

On non-highways (bear in mind this is FSDBeta 10.8.1 running) I got slowdowns moderately often when either cresting a hill or taking a curve with oncoming traffic in the opposing lane... but I made a special point of observing the degree of braking. It was typically 2-3 mph- so not remotely dangerous- with one exception--

On a 2-lane road, approaching what would be a right turn, there was an oncoming car that I guess was slowing down a bit, and FSD seemed to think it might be planning to turn left at the same place I was going to be turning right.... so even though I would've had the right of way, it actually came to a stop right before the right turn just as this guy was reaching the intersection coming the other way.

As it turned out, after slowing down a bit, the guy sped up again continuing straight.... so that wasn't "phantom" braking, just a wrong guess, nor was it dangerous since the car was gonna need to slow down to make the turn anyway.



Anyway, for anyone curious- out of the 100+ miles today, not counting disengagements to enter/exit/drive around parking lots, I had a total of 3 interventions.

One is a T-stop where you have to crest a bit of a hill that is also a railroad crossing, which FSDBeta has never handled well, so I manually take over for it.

One is another T-stop where there's actually a light (I'm approaching the long leg of the T, needing to turn left) where for some reason it always requires me to tell it it's ok to go even though it correctly displays when the light turns green (it prompts to tap the accelerator or stalk to continue, then proceeds through just fine).

One was a legit disengagement on the highway because I had to get over to the right at a highway interchange, and it's one of those where you pass the on-ramp to your right with cars coming in from the other highway, but that same on-ramp lane is also the exit lane back to the highway they came from, so you have quite a short amount of space/time to take the interchange. With little or no traffic NoA handles this. When it's busy it needs help, which it did today.
 
Most of mine are mild and annoying. Had a severe one for a flashing yellow the other day; the car behind me almost ended up in my back seat 😅. Luckily I was paying enough attention to floor the accelerator.
Was this under TACC/AP or FSD Beta?

I hope that people aren't including FSD Beta phantom braking. It sorta feels like they are, since they're reporting the kinds of annoying slight ups and downs in speed that FSD Beta has.
 
No, it doesn't help at all because it's not a matter of perspective. It's a matter of a flawed system.

Tell me what 'perspective' I'm missing when I'm driving down an empty road in broad daylight and the car suddenly slows down 3-5 MPH then speeds back up. Tell me what 'perspective' I'm missing when I'm on a straight interstate under similar conditions, with minimal traffic and no one in front of me for 300-400 feet, and the car suddenly drops its speed by 10+ MPH, causing the cars behind me to flash their brights.

If I had a chauffeur that randomly stepped on the brakes and slowed the car 5 MPH then accelerated again to the speed limit I'd fire him. "not causing accidents" is an incredibly low bar. Like I've repeated 100 times, my bar is the standard that every other car with adaptive cruise control has achieved. It's not that low.

Please stop making excuses for a flawed system. It's incredibly tiring to have people keep saying "you're driving it wrong" when the real problem is "they're programming it wrong."
You can cry as much as you want. It's not going to change the fact that the “machine” will have its own behavior, which may or may not fit your style of driving. For the record, it is not flawed the way you claim; you keep emphasizing your style of driving as if it were the gold standard. It is NOT. The system will continually keep changing too. Some will like it, others may not, while some may altogether hate it.

Other cruise controls have fewer sensors. The more sensors, the more data the system gets, and it has to evaluate all of that, not just what is in front of me.
 
You can cry as much as you want. It's not going to change the fact that the “machine” will have its own behavior, which may or may not fit your style of driving. For the record, it is not flawed the way you claim; you keep emphasizing your style of driving as if it were the gold standard. It is NOT. The system will continually keep changing too. Some will like it, others may not, while some may altogether hate it.
It’s not *my* style of driving - it’s the style of every other car with adaptive cruise.

Please explain how the examples I and others gave above are ‘appropriate.’ What ‘style’ of driving randomly brakes and then accelerates for no reason? (And no, ‘the computer must have a reason if it slowed down’ isn’t an answer. Of course it had a reason: it’s a flawed algorithm.)


You can deny all you want but it doesn’t change the fact that the programming is flawed.
 
It’s not *my* style of driving - it’s the style of every other car with adaptive cruise.

Why do you keep telling this lie?

Every time a car brand was named I linked to a bunch of posts/stories found in seconds for that brand with SUDDEN UNEXPECTED BRAKING from the automated/adaptive braking systems.

This is an issue for every brand of car

Yet you keep pretending it's not.
 
Especially if it is false.
Exactly.
Why do you keep telling this lie?

Every time a car brand was named I linked to a bunch of posts/stories found in seconds for that brand with SUDDEN UNEXPECTED BRAKING from the automated/adaptive braking systems.

This is an issue for every brand of car

Yet you keep pretending it's not.
You gave cases of emergency braking systems being activated. That's not what we're talking about. Unless you can understand the topic, don't bother to comment.
 
As it turned out, after slowing down a bit, the guy sped up again continuing straight.... so that wasn't "phantom" braking, just a wrong guess, nor was it dangerous since the car was gonna need to slow down to make the turn anyway.
My suspicion is that some not-insignificant amount of PB is actually not phantom, but the car reacting to a real potential danger that the driver did not notice (and thus assigns as PB). Certainly I've had FSD beta trip a few times on me and it took me a second to realize what triggered the (valid) slow-down.
 
I just want them to fix the fricking unintended slowdowns. No more excuses or people saying my experience is fine and would be no better in other cars. In our Tesla only, it's a pain in the a&& and really sours the driving experience, so much so that my wife hates using cruise control at this point. They have to know it sucks. Just fix it like all my other cars that have radar cruise, or throw in the towel and give me the option for another Beta function: stupid cruise that just controls speed and never slows on its own. That would be a more useful beta than, say, the Beta high beams that stink or the beta auto wipers that are terrible.
 
“It almost never happens to me anymore”? It should NEVER happen. Happened to me yesterday at 35 mph on a snowy road with no one in front of me, no oncoming traffic, and no Autopilot. If it had happened 500 yards later I would have been off the road. I loved the car, until yesterday.
" I should NEVER happent." ???? It happend just yesterday on two of my friends one is a Volvo the other one is a Mercedes
 
Exactly.

You gave cases of emergency braking systems being activated. That's not what we're talking about. Unless you can understand the topic, don't bother to comment.


No, in many cases it was the active cruise system doing it as well. You just didn't bother actually reading the sources. Which you admitted at the time.

In fact, LAST time I called you out on telling this lie, your defense was that at least THOSE cars can switch to dumb cruise. So it's weird you'd now pretend you didn't remember some of them had the problem on active cruise.


But in short: if you're unwilling to actually read any sources shown to you beyond the headlines, don't bother to comment.


Couple of examples thereof:

TLDR - if you can’t say something in less than 30 pages then you’re probably not going to get your point across.

This was after I debunked some of your claims with a long list of sources; you admitted you couldn't be bothered to read them.


TLDR

I did read the manual (2020, the model of our Forester)

This was when I caught you lying by claiming Subaru does NOT have the SAME types of "car might brake unexpectedly and incorrectly" warnings in its manual. I even linked you to the manual debunking your claim.





I don’t get why justifying a flaw with “well others have a similar flaw” is an acceptable form of rebuttal 🤔


Because it's not a flaw; it's inherent to the type of system.

Which is why every car maker has the same types of MIGHT BRAKE WHEN IT SHOULD NOT warnings in their manuals.

This is not something exclusive to Tesla, as repeatedly proven with source after source that apparently nobody can be bothered to read, so they keep repeating the nonsense claim that it's some Tesla flaw.
 
False positives are, as I understand it, mainly an issue with world modelling, and they have been a known problem in the AV space for a long time, along with things like static object detection. The degree of the issue between different brands, different models, and even model years is something I imagine the NHTSA would be looking into, and the manufacturer response to the problem is important.
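
Purely as a toy illustration of why false positives are so hard to eliminate in this kind of system (the score distributions and threshold numbers below are invented for the sake of the example, not anything from a real perception stack): a braking decision ultimately comes down to some confidence threshold, and moving that threshold trades missed real hazards against phantom events.

# Toy model only -- the score distributions are made up for illustration.
# A perception stack scores each detected "obstacle"; braking fires when the
# score crosses a threshold. Raising the threshold cuts phantom braking but
# misses more real hazards, and vice versa.
import random

random.seed(0)
real_hazards = [random.gauss(0.8, 0.15) for _ in range(10_000)]  # scores for real obstacles
ghosts       = [random.gauss(0.4, 0.15) for _ in range(10_000)]  # shadows, overpasses, crests...

for threshold in (0.5, 0.6, 0.7):
    missed  = sum(s < threshold for s in real_hazards) / len(real_hazards)
    phantom = sum(s >= threshold for s in ghosts) / len(ghosts)
    print(f"threshold {threshold}: missed real hazards {missed:.1%}, phantom brakes {phantom:.1%}")

The point isn't the specific numbers, just that you can't drive one error rate to zero without paying for it in the other.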

Tesla vehicles might experience the issues more often because the systems are trying to do so much everywhere rather than focusing on a narrower set of functions, and more complexity tends to leave more room for problems. I would personally not buy a vehicle that doesn't come with regular dumb cruise control; even great adaptive cruise isn't without quirks that I'm not a fan of.

Seems like not offering regular dumb cruise control is shoehorning people into bad experiences. How about offering regular dumb cruise control and identifying which roads the system doesn't function well on, possibly alerting drivers that they're approaching a spot where L2 ADAS can create a bad experience?

How about building a three-layer system with L2 ADAS functional where it works well, L3 Traffic Jam pilot in slow-moving traffic, and regular dumb cruise control everywhere else? And then seamlessly integrate them with take-over warnings etc.


There are so many good ways to do this, I don't think we've scratched the surface yet.
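
A rough sketch of the three-layer idea above, just to make it concrete (the mode names, speed threshold, and the per-road "validated" flag are my own assumptions for illustration, not any real Tesla or SAE interface):

# Hypothetical sketch, not a real vehicle API: pick the most capable mode the
# current road segment and traffic state support, and warn on every transition.
from enum import Enum, auto

class Mode(Enum):
    L3_TRAFFIC_JAM = auto()  # eyes-off, low speed, dense traffic only
    L2_ADAS = auto()         # hands-on steering + speed assist, validated roads
    DUMB_CRUISE = auto()     # speed hold only, never brakes for objects

def select_mode(road_validated: bool, speed_mph: float, traffic_dense: bool) -> Mode:
    """Choose the highest automation level the conditions support (assumed thresholds)."""
    if traffic_dense and speed_mph <= 40:
        return Mode.L3_TRAFFIC_JAM
    if road_validated:
        return Mode.L2_ADAS
    return Mode.DUMB_CRUISE

def announce_transition(current: Mode, proposed: Mode) -> None:
    """Prompt the driver whenever the active mode is about to change."""
    if current is not proposed:
        print(f"Take-over notice: switching from {current.name} to {proposed.name}")

# Example: approaching an unvalidated road at 55 mph in light traffic
announce_transition(Mode.L2_ADAS,
                    select_mode(road_validated=False, speed_mph=55, traffic_dense=False))

The arbitration itself is the easy part; the hard part, as the replies below argue, is keeping the driver aware at every moment of which mode is actually active.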
 
Seems like not offering regular dumb cruise control is shoehorning people into bad experiences.

Tesla leading the industry in every major survey of owner satisfaction appears to dispute that conclusion.


How about offering regular dumb cruise control and identifying which roads the system doesn't function well on, possibly alerting drivers that they're approaching a spot where L2 ADAS can create a bad experience?

How about building a three-layer system with L2 ADAS functional where it works well, L3 Traffic Jam pilot in slow-moving traffic, and regular dumb cruise control everywhere else? And then seamlessly integrate them with take-over warnings etc.


Sounds like a perfect recipe for mode confusion and accidents caused thereby.

Owners ALREADY repeatedly show they can't be bothered to understand the SINGLE ODD for regular Autopilot that's clearly written in the manual- and routinely complain when it doesn't "work right" someplace Tesla already told them it's not intended to work right.

A 3-mode system would be worse (and more dangerous)
 
Tesla leading the industry in every major survey of owner satisfaction appears to dispute that conclusion.





Sounds like a perfect recipe for mode confusion and accidents caused thereby.

Owners ALREADY repeatedly show they can't be bothered to understand the SINGLE ODD for regular Autopilot that's clearly written in the manual- and routinely complain when it doesn't "work right" someplace Tesla already told them it's not intended to work right.

A 3-mode system would be worse (and more dangerous)
Relying on people reading blurbs buried in the manual is asking for bad experiences; driver prompts and seamless integration with mapped/identified roads are likely the smarter way to go. Level 3 will require driver prompts and take-over warnings regardless, and that seems to be the likely next step for these systems outside of heavily geofenced Level 4.

The human-machine interface and clear communication will be key here until we're ready for generalized Level 4+, which doesn't seem anywhere close to reality yet.
 
Relying on people reading blurbs buried in the manual is asking for bad experiences; driver prompts and seamless integration with mapped/identified roads are likely the smarter way to go. Level 3 will require driver prompts and take-over warnings regardless, and that seems to be the likely next step for these systems outside of heavily geofenced Level 4.

The human-machine interface and clear communication will be key here until we're ready for generalized Level 4+, which doesn't seem anywhere close to reality yet.

Honestly I'm unsure we're going to see any widespread use of L3 in consumer vehicles, for exactly the reasons mentioned.

Mode confusion, and the lack of any standard for takeover time. It's an open question whether any such system CAN recognize that it's about to be unable to drive on its own soon enough to alert the driver in time to take over safely, and do so a high enough percentage of the time to make such a system safe.

From the current state of FSDBeta (or any of the robotaxi-only systems we've seen in action) it's pretty clear nobody has a system that can see into the future in a way that'd make L3 make any sense in almost any circumstance.

Even the "only in really slow traffic jam" L3 systems have largely been vaporware (or in the case of Honda, limited to 100 total cars and with video evidence it's not safe even then)
 
Honestly I'm unsure we're going to see any widespread use of L3 in consumer vehicles, for exactly the reasons mentioned.

Mode confusion, and the lack of any standard for takeover time. It's an open question whether any such system CAN recognize that it's about to be unable to drive on its own soon enough to alert the driver in time to take over safely, and do so a high enough percentage of the time to make such a system safe.

From the current state of FSDBeta (or any of the robotaxi-only systems we've seen in action) it's pretty clear nobody has a system that can see into the future in a way that'd make L3 make any sense in almost any circumstance.

Even the "only in really slow traffic jam" L3 systems have largely been vaporware (or in the case of Honda, limited to 100 total cars and with video evidence it's not safe even then)
Apparently Tesla is discussing Level 3 capability in some capacity, if we believe the leaked internal communications regarding the redundant steering control modules omitted from MIC vehicles -- the context being an employee discussing how the redundant modules will be required to push out Level 3 abilities.

If the goal is to flip from Level 2 ADAS to Level 4, I think it'll be many years before most of us get to experience anything beyond driver assist that requires constant attention. I'd pay good money for a Level 3 system that will do my highway hauls while I check out until warned to take over, but would also need to have a lot of confidence in the system.
 
Apparently Tesla is discussing Level 3 capability in some capacity, if we believe the leaked internal communications regarding the redundant steering control modules omitted from MIC vehicles -- the context being an employee discussing how the redundant modules will be required to push out Level 3 abilities.

"unnamed sources" especially from china, aren't likely to be reliable.

To my knowledge, on the rare occasions anyone at Tesla has used SAE levels to discuss things publicly, they've never mentioned 3 at all... just 2 (the current system) and 4/5 for future plans.

And the omission would be equally relevant for any level above 2.

I can't really see any reason Tesla would release anything L3. On a practical level it's almost indistinguishable from L2 to the user (they still have to be present in the driver's seat and ready to take over when asked; there's just some theoretical delay in takeover time available to them, with no evidence of what the "safe" time limit is).

So it'd be almost entirely downside on the liability/bad PR front, with little benefit so long as nobody else is seriously offering L3 either (and Honda's is not a serious offering, limited to 100 total cars, at low speed, and working poorly even then).