Tesla.com - "Transitioning to Tesla Vision"

On TMC I read a lot of anecdotes about phantom braking, and usually they end with "good thing no one was behind me or I probably would have been rear-ended." There are so many comments like this, and phantom braking apparently occurs so often around the world, that it makes me wonder why such an accident hasn't actually occurred (to date).

Is it possible that the presence of other cars affects AutoPilot's decision tree, based on multiple situational and confidence factors? For example, if the car detects a possible obstacle (or maybe one of those indecisive blip-trains that Karpathy described for radar), but assigns only 30% confidence that it's real, it will decide not to brake when there's a close-trailing car, but more likely decide it's prudent to brake when there isn't one? The idea would be to avoid creating a known-likely accident scenario in the effort to mitigate a probably-unreal one. I'm not saying that it's ever a good thing to slam the brakes needlessly, but maybe there's a little more software discretion behind this behavior than it seems.
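Purely as a thought experiment, the kind of logic I'm imagining might look like the toy sketch below. Every name and threshold here is invented by me for illustration; none of it comes from Tesla's actual stack.

```python
# Hypothetical sketch -- NOT Tesla's actual logic. Names and thresholds
# are made up to illustrate weighing obstacle confidence against the
# risk that hard braking itself causes a rear-end collision.

def should_brake(obstacle_confidence: float, trailing_car_close: bool) -> bool:
    """Decide whether to brake for a possible obstacle.

    obstacle_confidence: 0.0-1.0 estimate that the detection is real.
    trailing_car_close:  True if a vehicle is following closely.
    """
    # Demand more evidence before braking when a close follower means
    # sudden deceleration could itself create an accident.
    threshold = 0.7 if trailing_car_close else 0.3
    return obstacle_confidence >= threshold

# A 30%-confidence radar blip triggers braking only with no close follower:
print(should_brake(0.30, trailing_car_close=True))   # False
print(should_brake(0.30, trailing_car_close=False))  # True
```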
This is an interesting argument, but the fact remains that phantom braking is startling. When you're rolling along on the highway and all of a sudden your car starts to slow down with no indication of why it is doing that, it is weird at best and frightening at worst. What are you supposed to do?

A similar conditioning happened to me with aborted lane changes. For a while I got tons of aborted lane changes, and I'd choose to wrest control from AP and finish the lane change. The car would be about halfway through the lane change and then "whoosh!" swerve back. I thought it was dangerous. Well, one time there really was a car in my blind spot, and I almost hit him when I took control. The point is that reacting to these unexplained weird behaviors (braking, lane aborts) makes it hard to know if you're overriding a mistake or a genuine risk.
 
This is an interesting argument, but the fact remains that phantom braking is startling. When you're rolling along on the highway and all of a sudden your car starts to slow down with no indication of why it is doing that, it is weird at best and frightening at worst. What are you supposed to do?

A similar conditioning happened to me with aborted lane changes. For a while I got tons of aborted lane changes, and I'd choose to wrest control from AP and finish the lane change. The car would be about halfway through the lane change and then "whoosh!" swerve back. I thought it was dangerous. Well, one time there really was a car in my blind spot, and I almost hit him when I took control. The point is that reacting to these unexplained weird behaviors (braking, lane aborts) makes it hard to know if you're overriding a mistake or a genuine risk.
Really no disagreement at all. The purpose of my post was not at all to excuse or defend phantom braking, or any of the other incorrect/indecisive behaviors. I am trying to understand, given the undeniably poor and dangerous nature of these unwanted maneuvers, how is it they haven't actually caused many more real accidents on the road.

But yes even if we were to figure this out, and the answer were reassuring in that phantom braking happens a lot less with close-trailing cars present, I certainly agree it's a serious issue. Tesla shouldn't rest until their cars achieve a low startle-quotient along with a low accident rate.
 
I am trying to understand, given the undeniably poor and dangerous nature of these unwanted maneuvers, how is it they haven't actually caused many more real accidents on the road.
Just to ask it the other way: how do we know they haven't? Tesla is pretty tight-lipped and opaque with their safety data, and generally only uses airbag deployments as a threshold, which a rear-end collision rarely causes.

To any cop this just looks like someone getting rear-ended (which is basically always the fault of the driver behind).

Where would you expect this data to show up, and why are we so sure they aren't happening? We really have no idea either way. All we know is that automotive system safety assessments rate unintended deceleration as a pretty high hazard (unintended full brake application is even higher).
 
Really no disagreement at all. The purpose of my post was not at all to excuse or defend phantom braking, or any of the other incorrect/indecisive behaviors. I am trying to understand, given the undeniably poor and dangerous nature of these unwanted maneuvers, how is it they haven't actually caused many more real accidents on the road.

But yes even if we were to figure this out, and the answer were reassuring in that phantom braking happens a lot less with close-trailing cars present, I certainly agree it's a serious issue. Tesla shouldn't rest until their cars achieve a low startle-quotient along with a low accident rate.
I'd imagine most of the reactions to phantom braking, etc. are amplified by the fact that they're surprises. They're startling, so they will seem more dangerous than they really are. Not that they're not dangerous in and of themselves, but I'd imagine a lot of "slammed on the brakes" reports are actually "applied brakes and slowed down way more than anyone would or should," not tires-screeching, ABS-pumping brake-slamming.
 
Just to ask it the other way: how do we know they haven't? Tesla is pretty tight-lipped and opaque with their safety data, and generally only uses airbag deployments as a threshold, which a rear-end collision rarely causes.

To any cop this just looks like someone getting rear-ended (which is basically always the fault of the driver behind).

Where would you expect this data to show up, and why are we so sure they aren't happening? We really have no idea either way. All we know is that automotive system safety assessments rate unintended deceleration as a pretty high hazard (unintended full brake application is even higher).
Further to this: now that NHTSA requires everyone (including Tesla) to report accidents involving Level 2+ features, is Tesla going to self-report data that it has sourced from the cars, or will it wait until it is "made aware" of the accident by police, insurance, or drivers? They certainly could self-report certain kinds of incidents, but are they under an obligation to do so? What if they just "look the other way" on incoming data that shows an incident?

As someone pointed out earlier, other companies do not have the same over-the-air data that Tesla has. How are they going to get notified of incidents? One would assume through police, insurance, or drivers. Should Tesla do more than that just because they can? It's probably not to their advantage to self-report a higher frequency of incidents than they have "been informed" of. Are we concerned about Tesla snooping into our car data for accident statistics, or are we concerned if they don't self-report?

We know that Tesla does self-report or snoop when there is an airbag deployment or other severe accident, because they have done so in the past. The car will push the data after a bad crash. Does it push data for all incidents, and/or can Tesla alter the reporting level of the car so it reports less serious (but still Level 2) reportable incidents?
 
Really no disagreement at all. The purpose of my post was not at all to excuse or defend phantom braking, or any of the other incorrect/indecisive behaviors. I am trying to understand, given the undeniably poor and dangerous nature of these unwanted maneuvers, how is it they haven't actually caused many more real accidents on the road.

But yes even if we were to figure this out, and the answer were reassuring in that phantom braking happens a lot less with close-trailing cars present, I certainly agree it's a serious issue. Tesla shouldn't rest until their cars achieve a low startle-quotient along with a low accident rate.

What I find disturbing is that these events change the behavior of the person behind the wheel.

With TACC engaged a person should be ready to hit the brake, because the expectation is that you'll need to slow down, but phantom braking teaches a person to keep their foot ready to hit the accelerator.

Phantom braking also means I can't use features I'd like to use. As an example, I don't use Traffic Sign/Signal response because it increases phantom braking frequency.

With my vehicle (FSD with radar), the most common phantom braking event is a maps issue. The slowdown is about what you'd expect if the car thinks a 40 mph turn is ahead while you're driving 70 mph. I don't think it will cause an accident, but it is embarrassing to slow down for no reason at all.
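For a rough sense of scale (my numbers, not Tesla's): if the car believes a 40 mph turn begins, say, 150 m ahead while you're doing 70 mph, constant-deceleration kinematics puts the braking at a firm-but-not-violent level, which matches the behavior I see:

```python
# Back-of-the-envelope check. The 150 m distance is an assumption I
# picked for illustration; only the 70 mph and 40 mph speeds come from
# the scenario above.

MPH_TO_MS = 0.44704            # miles per hour -> meters per second

v_now  = 70 * MPH_TO_MS        # ~31.3 m/s
v_turn = 40 * MPH_TO_MS        # ~17.9 m/s
dist   = 150.0                 # assumed distance to the phantom turn, m

# Constant deceleration: v_turn^2 = v_now^2 - 2 * a * dist
a = (v_now**2 - v_turn**2) / (2 * dist)
print(f"required deceleration: {a:.1f} m/s^2 (~{a / 9.81:.2f} g)")
# -> about 2.2 m/s^2 (~0.22 g): a firm slowdown, not a panic stop.
```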

Until Tesla gives us the ability to report issues through a system that tracks them, I won't consider Tesla serious about actually fixing phantom braking.
 
Further to this: now that NHTSA requires everyone (including Tesla) to report accidents involving Level 2+ features, is Tesla going to self-report data that it has sourced from the cars, or will it wait until it is "made aware" of the accident by police, insurance, or drivers? They certainly could self-report certain kinds of incidents, but are they under an obligation to do so? What if they just "look the other way" on incoming data that shows an incident?

As someone pointed out earlier, other companies do not have the same over-the-air data that Tesla has. How are they going to get notified of incidents? One would assume through police, insurance, or drivers. Should Tesla do more than that just because they can? It's probably not to their advantage to self-report a higher frequency of incidents than they have "been informed" of. Are we concerned about Tesla snooping into our car data for accident statistics, or are we concerned if they don't self-report?

We know that Tesla does self-report or snoop when there is an airbag deployment or other severe accident, because they have done so in the past. The car will push the data after a bad crash. Does it push data for all incidents, and/or can Tesla alter the reporting level of the car so it reports less serious (but still Level 2) reportable incidents?
Tesla has to report the data because they use it in their own AP safety reports, which they issue quarterly. As much as I hate the conclusions Tesla draws from this data, I have to give them credit for publishing it. There is no way the NHTSA is not going to cross-check the data.

My expectation is that every car company with connected vehicles will comply, because there is too much at stake not to. Cadillac Supercruise, for example, doesn't even work unless the person pays the monthly subscription, and that's really for connectivity.

I do expect fairly low compliance for non-connected cars, because I'm not aware of any mechanism by which the manufacturer is told about every accident involving an airbag going off. But I don't know if there is a non-connected L2 vehicle that is the least bit interesting to track safety data on.
 
Tesla has to report the data because they use it in their own AP safety reports, which they issue quarterly.
and we count all crashes in which the crash alert indicated an airbag or other active restraint deployed.
Tesla only includes airbag deployments in their AP safety reports. There are a lot of accidents in which the airbag does not deploy. The NHTSA requires reporting of a broader set of accidents than airbag deployments.
 
Tesla has to report the data because they use it in their own AP safety reports, which they issue quarterly. As much as I hate the conclusions Tesla draws from this data, I have to give them credit for publishing it. There is no way the NHTSA is not going to cross-check the data.

My expectation is that every car company with connected vehicles will comply, because there is too much at stake not to. Cadillac Supercruise, for example, doesn't even work unless the person pays the monthly subscription, and that's really for connectivity.

I do expect fairly low compliance for non-connected cars, because I'm not aware of any mechanism by which the manufacturer is told about every accident involving an airbag going off. But I don't know if there is a non-connected L2 vehicle that is the least bit interesting to track safety data on.
Yeah, there's probably a 50/50 chance that Tesla was the whole reason for the NHTSA coming out with this reporting requirement anyway.
 
Tesla only includes airbag deployments in their AP safety reports.
No, that isn't true:

we count all crashes in which the crash alert indicated an airbag or other active restraint deployed. In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated.
 
Just to ask it the other way: how do we know they haven't? Tesla is pretty tight-lipped and opaque with their safety data, and generally only uses airbag deployments as a threshold, which a rear-end collision rarely causes.

To any cop this just looks like someone getting rear-ended (which is basically always the fault of the driver behind).

Where would you expect this data to show up, and why are we so sure they aren't happening? We really have no idea either way. All we know is that automotive system safety assessments rate unintended deceleration as a pretty high hazard (unintended full brake application is even higher).

Tesla only includes airbag deployments in their AP safety reports. There are a lot of accidents in which the airbag does not deploy. The NHTSA requires reporting of a broader set of accidents than airbag deployments.

That's incorrect; Tesla says explicitly that their report includes crashes where the "crash alert indicated an airbag or other active restraint deployed". That means it includes things like seat belt pretensioners. Also they say: "more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle". That means they are capturing a good number of rear-end crashes.

As for NHTSA, Tesla makes a good point that many accidents may never be reported to them in the first place, given that not every accident has a police report filed (which may actually be the real reason NHTSA pushed the recent rule: the automakers have better crash data available than police reports, not the other way around!).

As for the whole talk of phantom braking causing rear-end accidents, I'm reminded of the noisemaker discussion and how pedestrians are "startled" by a silent EV. There's still a big gap between startling people and causing an accident.
 
That's incorrect; Tesla says explicitly that their report includes crashes where the "crash alert indicated an airbag or other active restraint deployed".
I like how everyone quotes one part of my post without quoting the other part, which quotes exactly what you're quoting. You are correct, of course: there is a narrow set of cases where a seatbelt pretensioner will fire but the airbags won't. But just as Tesla says, that threshold is about 12 mph. Plenty of accidents occur when vehicle speeds are below this, and serious damage occurs even at this speed (especially if what you hit was a person).

Also they say: "more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle". That means they are capturing a good number of rear-end crashes.
Cool. So they should be able to tell us why those rear-end accidents happened. But they don't, so we have no idea one way or another.

But now I am really interested in what passive devices deploy when a Tesla is rear-ended.

I'm reminded of the noisemaker discussion and how pedestrians are "startled" by a silent EV. There's still a big gap between startling people and causing an accident.
And I'm reminded that when a Tesla on AP gets into an accident, we blame the driver for not paying enough attention; we're all supposed to know it needs constant monitoring to avoid accidents, because the manual says so. But then, when we suggest it could cause an accident, it's dismissed.

All I know is that NHTSA's ISO 26262 FHA for braking systems assigns ASIL D (literally the most severe level) to unintended braking (page 49), but I guess we should dismiss those safety experts; clearly it's no big deal, it just startles people. When Tesla does it, it's QM (the lowest classification) because the manual says to pay attention.
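For anyone curious how a hazard lands at ASIL D: ISO 26262-3 derives the ASIL from Severity (S1-S3), Exposure (E1-E4), and Controllability (C1-C3) via a fixed risk graph. The table below is that standard mapping; the specific S3/E4/C3 scoring for unintended full braking is my illustration of how you reach ASIL D, not a line quoted from the NHTSA document.

```python
# The ASIL risk graph from ISO 26262-3, keyed by severity and exposure;
# each tuple holds the ASIL for C1, C2, C3. S0/E0/C0 cases are out of
# scope (QM) and omitted.

ASIL_TABLE = {
    ("S1", "E1"): ("QM", "QM", "QM"),
    ("S1", "E2"): ("QM", "QM", "QM"),
    ("S1", "E3"): ("QM", "QM", "A"),
    ("S1", "E4"): ("QM", "A",  "B"),
    ("S2", "E1"): ("QM", "QM", "QM"),
    ("S2", "E2"): ("QM", "QM", "A"),
    ("S2", "E3"): ("QM", "A",  "B"),
    ("S2", "E4"): ("A",  "B",  "C"),
    ("S3", "E1"): ("QM", "QM", "A"),
    ("S3", "E2"): ("QM", "A",  "B"),
    ("S3", "E3"): ("A",  "B",  "C"),
    ("S3", "E4"): ("B",  "C",  "D"),
}

def asil(s: str, e: str, c: str) -> str:
    """Look up the ASIL for a severity/exposure/controllability triple."""
    return ASIL_TABLE[(s, e)][int(c[1]) - 1]

# Unintended full braking at highway speed: life-threatening severity (S3),
# high exposure (E4), hard for a following driver to control (C3):
print(asil("S3", "E4", "C3"))  # -> "D"
```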
 
Tesla only includes airbag deployments in their AP safety reports. There are a lot of accidents in which the airbag does not deploy. The NHTSA requires reporting of a broader set of accidents than airbag deployments.

They do require a broader set of accidents, but those aren't things you can pull from the logs the way an airbag deployment is. So having an entirely connected fleet doesn't put Tesla at a disadvantage with respect to the additional data required.

Here is the list:
  • Within one day of learning of a crash, companies must report crashes involving a Level 2 ADAS or Levels 3-5 ADS-equipped vehicle that also involve a hospital-treated injury, a fatality, a vehicle tow-away, an air bag deployment, or a vulnerable road user such as a pedestrian or bicyclist. An updated report is due 10 days after learning of the crash.
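To make those trigger conditions concrete, here is the list above as a predicate. This is just a sketch: the field names are mine, and the criteria are paraphrased from the order as quoted.

```python
# Sketch of the NHTSA Standing General Order reporting trigger.
# Field names are invented; criteria paraphrased from the order.

from dataclasses import dataclass

@dataclass
class Crash:
    adas_or_ads_involved: bool      # Level 2 ADAS or Level 3-5 ADS vehicle
    hospital_treated_injury: bool
    fatality: bool
    vehicle_towed: bool
    airbag_deployed: bool
    vulnerable_road_user: bool      # pedestrian, bicyclist, etc.

def report_within_one_day(crash: Crash) -> bool:
    """True if the crash must be reported within one day of learning of it
    (with an updated report due 10 days after)."""
    if not crash.adas_or_ads_involved:
        return False
    return any([
        crash.hospital_treated_injury,
        crash.fatality,
        crash.vehicle_towed,
        crash.airbag_deployed,
        crash.vulnerable_road_user,
    ])

# Note what doesn't trip it: a low-speed phantom-braking rear-ending with
# no injury, no tow, and no airbag.
print(report_within_one_day(Crash(True, False, False, False, False, False)))  # False
```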
 
Okay, back to some legit thread talk. I have an X P90D with AP 1.0, and for this post I'm comparing a few items against a new FSD Y with no radar. NOT apples to apples, so I get it. Please don't hate yet.

1) Speed Limit Recognition - I've driven 1,900 miles on the Y and 67,000 on the X. The Y sees a sign with a speed limit, almost ANY sign, and identifies it as THE speed limit. Excellent example: "TRUCKS 25 mph" on a highway downhill, and the Y thinks "25" is the limit even though there's a stacked sign with a regular "55" underneath. The AP 1.0 X knows the difference almost every time. AP 1.0 never takes the "School Zone" speed limit as the official limit. The Y does every time. Never seen the conditional speed example shown by Dirty Tesla. FSD is hard; this type of recognition and training is not. Disappointing that TV performs poorly here (see the sketch at the end of this post).

2) Phantom Braking - 1.0 might do it once on a 200-mile trip, but I never would've brought it up as a top-10 AP issue for 1.0. It's definitely a top-three issue for TV. Don't know the reason, not smart enough, but it's real and it sucks. That said, it really only happens when no other cars are around.

3) Lane changing - Clearly 1.0 won't do it on its own. The Y is odd: it changes lanes well, but often for no reason.

4) Random Bugs - Not fair since 1.0 is old, but it was definitely less buggy. I get the "AP Features Unavailable" message every drive, even WHEN I am using it. Photo below. It still works fine, but what's up?

They just need to push more updates quickly, as they did in the early days of the S and AP 1.0. Seems like they're slow to push software updates now.

As an aside, Sentry mode is still “iffy” at best and I am on .10. It probably records 50% of the time.
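To illustrate my point in #1: the filtering AP 1.0 seems to get right is conceptually simple. The toy sketch below is entirely my own invention (the markers, the rules, the names), not how either stack actually works:

```python
# Toy illustration of ignoring conditional speed signs. All markers and
# logic here are invented for this example.

from typing import List, Optional

CONDITIONAL_MARKERS = ("TRUCKS", "SCHOOL", "WHEN FLASHING", "EXIT", "RAMP")

def official_limit(signs: List[str]) -> Optional[int]:
    """Return the first unconditional limit among detected sign texts."""
    for text in signs:
        if any(marker in text.upper() for marker in CONDITIONAL_MARKERS):
            continue  # conditional plaque -- not the general limit
        digits = "".join(ch for ch in text if ch.isdigit())
        if digits:
            return int(digits)
    return None

# The stacked-sign example from #1: "TRUCKS 25 mph" over a regular "55".
print(official_limit(["TRUCKS 25 MPH", "55"]))  # -> 55, not 25
```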
 

Attachments: 1259FFBA-29B3-45FF-B783-575DA2FE8321.jpeg
Yeah, there's probably a 50/50 chance that Tesla was the whole reason for the NHTSA coming out with this reporting requirement anyway.

I like the idea that they share my concern about a human driver's ability to oversee an advanced L2 vehicle.

I don't know if it really makes that much difference who the manufacturer is, what driver monitoring it has, etc. The key difference with Tesla is that they're the ones pushing advanced L2 feature sets like auto lane change and stoplight response. Everyone else seems to be following closely, so the NHTSA needs to get ahead of it.

I can't answer the question of whether AP dulls my reflexes for avoiding an accident. The other day I was pretty amazed at how well AP was doing on a mountain road. But then I saw a massive rock. The car took no evasive action, and by the time I saw the rock it was too late. I felt like I was paying attention, but it just didn't register. The car didn't hit the rock, but it got awfully close. It wouldn't have been an airbag-deployment accident, just some body damage. Even if I had been driving manually I might not have done any better, so I can't say whether it was AP-induced or not.
 
Why isn't there a lot of video of this supposed phantom braking? Sure, there are one or two videos from 2018 out there...
Because I am not a YouTuber and am not going to record, edit and post. Tesla knows about the issue. I don’t see it as my responsibility to put myself in harm’s way to record for folks online. I just don’t use it if there are no cars around. For me, phantom braking only occurs when there are no cars around and the Tesla has to rely 100% on its own “vision” with no “help” from other objects.