Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Phantom braking is the biggest issue with AutoPilot.

Agree with the OP, phantom braking and my biggest other complaint (SpeedAssist issues with TACC) are potentially dangerous and need fixing.

On the other hand, it shows how far Autopilot has come over the years. These problems are clearly addressable and it will make an already impressive system that much closer to perfection.
 
So far the majority of the things people are complaining about (cross-traffic, merging in, merging out) are completely legitimate brake use, and the driving instructors these people had should feel bad for sucking huge granite rocks through a wet paper straw.

Just because you wouldn't brake in that situation doesn't mean people shouldn't brake in that situation. This shows why "you" (people in general) average about half a million miles between accidents and the car averages over a million or several million.

Yep, generally the car is as cautious as we "should" be, just way more cautious than we are. My wife is frequently bugging me about assuming too much about what the car in front of me is going to do, especially when they're turning. Where the Tesla would slow to ensure the car was completely gone and everything was safe, I/we tend to assume the driver ahead will get out of the way in the nick of time, or we'll be able to swerve around them. AP assumes the driver ahead may be a complete moron and may come to a complete stop at any point. :)
 
So far the majority of the things people are complaining about (cross-traffic, merging in, merging out) are completely legitimate brake use, and the driving instructors these people had should feel bad for sucking huge granite rocks through a wet paper straw.

Just because you wouldn't brake in that situation doesn't mean people shouldn't brake in that situation. This shows why "you" (people in general) average about half a million miles between accidents and the car averages over a million or several million.

I agree. There's a fundamental difference between phantom braking and "the car slowed down when/where I wouldn't for merging traffic" etc.
 
Yep, generally the car is as cautious as we "should" be, just way more cautious than we are.

I get your point, but this is not the core complaint of phantom braking of the OP. Phantom braking is defined as a braking action where even the most 'safe' driver would've never braked. For example: nobody would ever brake for a shadow of an overpass. At least, I hope not...
 
Agreed. These two issues have basically led to me permanently disabling NOAP. It would be great if these two issues were improved, but until then it's extremely dangerous and anxiety-inducing in situations with anything more than light traffic.

EDIT: NOAP or AP.
Yes!

I’m also irritated by NOAP’s tendency to brake needlessly after I gesture to change lanes. As the car begins to cross the line into the new lane, it often applies the brakes for no reason.
Indeed!

Agree with the OP, phantom braking and my biggest other complaint (SpeedAssist issues with TACC) are potentially dangerous and need fixing.

On the other hand, it shows how far Autopilot has come over the years. These problems are clearly addressable and it will make an already impressive system that much closer to perfection.
Let's hope so!


I can keep quoting other people, but the Phantom Braking is really my biggest problem with my S100D on HW2.5

For no reason it hits the brakes hard or just randomly slows down. Sometimes for 1 second, sometimes even longer.
 
So far the majority of the things people are complaining about (cross-traffic, merging in, merging out) are completely legitimate brake use, and the driving instructors these people had should feel bad for sucking huge granite rocks through a wet paper straw.

Just because you wouldn't brake in that situation doesn't mean people shouldn't brake in that situation. This shows why "you" (people in general) average about half a million miles between accidents and the car averages over a million or several million.

Maybe most, but certainly not all. Braking on sweepers (on a highway) for no other reason than god knows why is a thing. I don't think you're thinking this out with a logical mind. Braking for no reason is a good way to get rear-ended and feel bad about it.
 
Braking on sweepers (on a highway) for no other reason than god knows why is a thing. I don't think you're thinking this out with a logical mind. Braking for no reason is a good way to get rear-ended and feel bad about it.

Yeah, there's quite a long list where even the safest driver would never apply brakes, especially not in the abrupt manner AP does. It seems to boil down to situations where the computer judges a situation as dangerous for just a split-second. It's almost as if you can feel the car: this is really dangerous!!! Oh no, it isn't, sorry...

Phantom braking also occurs quite often with oncoming traffic in shallow curves. Every now and then the computer will misjudge an oncoming vehicle, even if it's clearly not a threat at all. It rarely happens if there's a clear center divider, but on roads without a center line phantom braking will happen in virtually any corner with oncoming traffic mid-corner. It appears V10 will make a lot of improvements here.
 
I find that Autopilot constantly swerves at the wide spot for merges onto the highway. I got pulled over once for the car not tracking well. I didn't tell him it wasn't me, it was the car. No ticket. He just wanted to see if I was drunk. I wonder if using Autopilot will in the end invalidate pulling over drivers for "swerving in their lane". It's not a crime, but can indicate intoxication. But not if the car is doing it. This may invalidate probable cause for such traffic stops.

AP was, is, and always will be considered an assist system in which the driver is accountable for the action, speeds, and overall behaviour of the vehicle. There is no grey area here, the car itself warns you about this. Whatever "FSD" becomes may be different, but both its capabilities and liabilities will differ from location to location (and I suspect at the pace laws move, the responsibility will belong to the driver for quite some time).

Probable cause is still applicable, as there is no external way of knowing if "the car is driving". Even if it is a fully autonomous vehicle, dangerous or erratic driving behaviour should and will be investigated (imagine faulty systems, damaged sensors, obstructed sensors, etc. could lead to poor driving behaviour). Now, who is at fault is the interesting part still under much debate, but a cop should certainly be able to pull over an erratic autonomous car for everyone's safety.

Elon Musk on Twitter

That being said, what version are you on? Worked for me on 2019.32.2.1 as long as it recognized the words "Bug report" and didn't hear something else, and there must be no pause after "bug report". It then says "Thank you for your report" or something similar on the screen and goes away with no talkback from the car. It also flags the log on the system for sending and if you WiFi well, it will flag recently-saved teslacam deets for sending.

Thanks! I wish it was more official/discoverable than Twitter, but it's something and gives confidence it at least worked at some point, perhaps even still today.

Unsure of firmware at the time, but I've always been fairly up to date. Currently on 2019.32.2.2. I'll try again soon. Maybe I simply didn't notice the "thank you" message, or perhaps my enunciation and pace wasn't great for the car to understand me.

Drove 850 miles around SoCal this weekend.

3 events of unwarranted braking/slowing due to car on onramp. The cars weren't intending or needing to merge, but my S slowed for them anyway.

1 event of hard overpass braking.

3-5 micro-events of momentary stutter-braking due to unknown causes.

With 100% reproducibility, I can confirm that on-ramps that form a new lane (in which drivers continue) cause the car to brake hard, even if the other vehicle is behind (but still overlapped with) the Tesla using AP. Do we consider this phantom braking? It's certainly inappropriate braking, but the cause is known.
 
Elon Musk on Twitter

That being said, what version are you on? Worked for me on 2019.32.2.1 as long as it recognized the words "Bug report" and didn't hear something else, and there must be no pause after "bug report". It then says "Thank you for your report" or something similar on the screen and goes away with no talkback from the car. It also flags the log on the system for sending and if you WiFi well, it will flag recently-saved teslacam deets for sending.

Sorry, I'm having trouble following all that. Was that a "yes" or a "no" to the question of "Are we sure this does something?" The service center has told me these do not get sent anywhere. There is an approximately two-week window for the service center to download them before they overflow the end of the buffer. Are you saying a bug report will be sent to someone? Who?

What is a "deet"?
 
So far the majority of the things people are complaining about (cross-traffic, merging in, merging out) are completely legitimate brake use, and the driving instructors these people had should feel bad for sucking huge granite rocks through a wet paper straw.

Just because you wouldn't brake in that situation doesn't mean people shouldn't brake in that situation. This shows why "you" (people in general) average about half a million miles between accidents and the car averages over a million or several million.

Not many actual facts there. When a car crosses the road some distance in front of me doing 25 mph, it has a braking distance longer than the vehicle or the lane, so it can't possibly stop and become a hazard for me. That's why I don't hit my brakes. What is Tesla's excuse for hitting the brakes irrationally?

When a car is leaving the lane for a turn lane and is slowing down with a few inches still over the line, it is no more dangerous than the cars about to run into my rear bumper because they have no idea why I am still riding my brake pedal. Even a couple of seconds after the turning vehicle leaves the lane, my Tesla won't pick up the pace and not only continues going slow, but continues to slow down further, nearly to a stop. Yeah, these are dangerous issues.

The numbers you cite above are not the issue because while people have a known accident rate, autonomous vehicles simply have not been on the roads enough to have good statistics given the moving target of what software is being measured. Then there is the issue that the Tesla software is claimed to be dependent on humans as a back up which is a very flawed approach. Humans are very poor at situations where they are a passive observer expected to take over. We won't have anything to actually compare to a human driver until the system is fully autonomous and a human operator is not required.
 
Not many actual facts there. When a car crosses the road some distance in front of me doing 25 mph, it has a braking distance longer than the vehicle or the lane, so it can't possibly stop and become a hazard for me. That's why I don't hit my brakes. What is Tesla's excuse for hitting the brakes irrationally?

The fact, already explained exhaustively, that the system is not intended to be used with cross traffic.

The default assumption of TACC/AP today is that all cars are going the same direction as you are.

So it has no understanding whatsoever of what a car turning across your lane is.

It just sees that there's suddenly a huge radar signature in front of it that you are approaching at speed.

So it hitting the brakes when it sees that is 100% rational.
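To make that per-frame, radar-only reasoning concrete, here's a rough sketch in Python. Everything here is invented for illustration (the function name, the thresholds, the constant-deceleration model); the actual Autopilot logic is proprietary and far more complex:

```python
# Hypothetical sketch of a per-frame, radar-only threat check.
# All names and thresholds are made up; the real firmware logic
# is proprietary and certainly far more sophisticated.

def should_brake(closing_speed_mps: float, range_m: float,
                 max_comfort_decel: float = 3.0) -> bool:
    """Brake if constant-deceleration stopping math says we can't
    shed the closing speed before reaching the radar return.

    closing_speed_mps: how fast the gap to the return is shrinking.
    range_m: current distance to the return.
    max_comfort_decel: assumed comfortable deceleration (m/s^2).
    """
    if closing_speed_mps <= 0:
        return False  # gap is opening; no threat
    # Distance needed to cancel the closing speed: v^2 / (2a)
    needed = closing_speed_mps ** 2 / (2 * max_comfort_decel)
    return needed >= range_m

# A car turning across the lane looks identical to a stopped
# obstacle: a large return, closing fast. With no cross-traffic
# model, braking is the "rational" response either way.
print(should_brake(closing_speed_mps=20.0, range_m=40.0))  # True
```

The point of the sketch: with only closing speed and range as inputs, there is no way to encode "that car will be gone before I get there", which is exactly the judgment a human driver makes with cross traffic.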


When a car is leaving the lane for a turn lane and is slowing down with a few inches still over the line, it is no more dangerous than the cars about to run into my rear bumper because they have no idea why I am still riding my brake pedal. Even a couple of seconds after the turning vehicle leaves the lane, my Tesla won't pick up the pace and not only continues going slow, but continues to slow down further, nearly to a stop. Yeah, these are dangerous issues.

I agree- people who keep insisting on using drivers aids in situations the manual explicitly says not to use them are very dangerous.
 
Sorry, I'm having trouble following all that. Was that a "yes" or a "no" to the question of "Are we sure this does something?" The service center has told me these do not get sent anywhere. There is an approximately two-week window for the service center to download them before they overflow the end of the buffer. Are you saying a bug report will be sent to someone? Who?

What is a "deet"?

Bug reports are automated WiFi upload and bypass the service center. Set up a packet sniff and MITM (Man In The Middle - Intermediary certificate) the secure connection on the WiFi and it's possible to see the data that gets sent. Spotted a squelch flag too in the response, so I'd guess that some people abuse the voice command bug report function. These reports go into a system that the service center may or may not be able to see. They GO, but as to who looks at them? No idea. :)

So the answer to "Are we sure this does something?" is "No", since we have no way of knowing if any given person is squelched, or if they have their WiFi set up correctly. However there is a sufficiently-reasonable possibility that it does, so if you make a good faith effort to provide high quality data, then you might be helping, and that can't hurt.

'Deet' is slang for 'detail' when it's not referring to the insect repellent diethyltoluamide.

Not many actual facts there. When a car crosses the road some distance in front of me doing 25 mph, it has a braking distance longer than the vehicle or the lane, so it can't possibly stop and become a hazard for me. That's why I don't hit my brakes. What is Tesla's excuse for hitting the brakes irrationally?

Unless it hits something and comes to an unexpected standstill. Please also remember that radar doesn't provide anything other than the relative speed of everything in the cone of detection and the optical data is currently frame-based. The car literally has zero concept of "time". It does not know what happened a frame ago, nor can it anticipate what will happen in the future.

But in general it's user error in activating TACC at all in that environment.
From the manual:
"Traffic-Aware Cruise Control is primarily intended for driving on dry, straight roads, such as highways and freeways. It should not be used on city streets."

Most vehicles that use radar-based TACC will exhibit this behavior if TACC is used in an inappropriate situation. With the advent of Teslas, we're having more opportunities for people who don't understand to become statistics and stories, like the guy who turned on cruise control in the motor home and then went to make a sandwich because he thought "cruise control" meant the car would completely drive for him. It appears that a lot of Tesla owners think the car has superpowers that it doesn't. Tesla is working toward this, but until Level 5 autonomy is complete, RTFM (Read The Flipping Manual) and pay attention to the limitations. You're getting partial autonomy.

When a car is leaving the lane for a turn lane and is slowing down with a few inches still over the line it is no more dangerous than the cars about to run into my rear bumper because they have no idea why I am still riding my brake pedal. Even a couple of seconds after the turning vehicle leaves the lane my Tesla won't pickup the pace and not only continues going slow, but continues to slow down further, nearly to a stop. Yeah, these are dangerous issues.

Absolutely! Dangerous issues that are easily solved by not using the TACC in the places it's not meant for. This is why the manual makes it clear that the driver is in charge and shouldn't do the thing. If you're using the handle of a screwdriver to hammer in a screw and it slips and you cut your hand, don't blame the screwdriver. Misapplication of a tool is user error.

The numbers you cite above are not the issue because while people have a known accident rate, autonomous vehicles simply have not been on the roads enough to have good statistics given the moving target of what software is being measured. Then there is the issue that the Tesla software is claimed to be dependent on humans as a back up which is a very flawed approach. Humans are very poor at situations where they are a passive observer expected to take over. We won't have anything to actually compare to a human driver until the system is fully autonomous and a human operator is not required.

I'll summarize that there are some logical fallacies in that paragraph and this is both objectively incorrect and subjectively questionable as a result. I don't want to go into the details on that as it may come across as patronizing or lecturing and I like people to have the option to introspect retrospectively and consider what might be improved before pointing it out directly.

===========

In general, I'm trying to figure out why people think that...
  • Teslas, which use the exact same well-known technology as other cars, are expected to act differently than those other cars in the same situations
  • The car should be blamed when the driver is misusing a tool it provides
  • Incomplete software isn't incomplete despite being told many, many times in the manual and the UI that it's beta/incomplete/subject to limitations
So far one person has shown me two videos of unexpected braking, saying the car "slammed on the brakes". Actual events:

Video 1: The car went over a seam in a bridge that caused a forward drop and a gravitational impression of braking. Based on frame-by-frame (FxF) analysis, no actual braking occurred. You can see cars approaching the bridge braking before hitting the seam, so those drivers anticipated the buck, while the Tesla had clear road ahead, so the buck was substantially worse since it didn't brake. That's an opportunity for improvement once video data is temporally analyzed.

Video 2: The leading car goes into the left turn lane and the Tesla continues to slow after that car is fully in the left turn lane. FxF shows a deceleration of approximately 2 ft/s² after the lead vehicle is clear, and speed matching the lead vehicle until it is clear. Matching the lead vehicle is fully expected. The continued braking afterward was because the traffic in front of the left-turner was also stopped at a signal, and there was no logical reason to accelerate toward the stopped car like a bad driver might. That's very light braking, not even full regen. But it still comes back to: a) don't use TACC there; and b) it operated safely, just not the way you'd expect a human driver to.
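For anyone who wants to run this kind of frame-by-frame check on their own footage, here's a rough sketch. The speed-extraction step (OCR of the speed overlay, tracking lane-marker spacing, whatever works) is assumed to be done already; the names and frame rate are made up for illustration:

```python
# Rough sketch: estimate deceleration from per-frame speed samples
# pulled out of dashcam footage. This just turns speed-vs-frame
# data into an average deceleration; getting the speeds is up to you.

def avg_decel(speeds_fps: list[float], frame_rate: float = 30.0) -> float:
    """Average deceleration in ft/s^2 over the sampled frames.

    speeds_fps: per-frame speeds in ft/s (one sample per frame).
    frame_rate: video frame rate in frames per second.
    Positive result means the car is slowing down.
    """
    if len(speeds_fps) < 2:
        raise ValueError("need at least two samples")
    dt = (len(speeds_fps) - 1) / frame_rate  # elapsed time, seconds
    return (speeds_fps[0] - speeds_fps[-1]) / dt

# e.g. 60 ft/s dropping to 58 ft/s over 31 frames (1 s at 30 fps)
# works out to about 2 ft/s^2, the "very light braking" range
# described above.
samples = [60 - 2 * i / 30 for i in range(31)]
print(avg_decel(samples))  # 2.0
```

Averaging over the whole clip hides short spikes, so for "slammed on the brakes" claims it's worth also computing the decel over each short window and looking at the maximum.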

I still welcome people to provide video of unexpected braking events. ^.^ Helping folks understand the technology, its limitations, and how it responds is always a good thing.
 
... Even a couple of seconds after the turning vehicle leaves the lane my Tesla won't pickup the pace and not only continues going slow, but continues to slow down further, nearly to a stop. Yeah, these are dangerous issues ...

I want to highlight this clipped portion of your quote. In the 1h30 drive I did recently, specifically to evaluate AP and TACC, there were a few situations where the driver directly in front slowed down to take a right turn. Sometimes the car reacted perfectly fine, albeit with that tail lag you mention, continuing to slow down. However, other times the Model 3 had no reaction to the car taking the right-turn lane until it had actually mostly completed its right turn and was very much out of the way of both radar and cameras. This led to an annoying 1-2 second slowdown after the other driver had completely cleared the road, with no slowing before that.

This lag is concerning. I wonder if some phantom braking events are explainable once we consider what the car saw 2 seconds ago that is already long gone.

Most vehicles that use radar-based TACC will exhibit this behavior if TACC is used in an inappropriate situation. With the advent of Teslas, we're having more opportunities for people who don't understand to become statistics and stories, like the guy who turned on cruise control in the motor home and then went to make a sandwich because he thought "cruise control" meant the car would completely drive for him. It appears that a lot of Tesla owners think the car has superpowers that it doesn't. Tesla is working toward this, but until Level 5 autonomy is complete, RTFM (Read The Flipping Manual) and pay attention to the limitations. You're getting partial autonomy.

This is an unfair statement. Tesla is one of the slower adopters of TACC, and it's still a "beta" feature. Other common manufacturers not only have this feature, but do not claim it to be "beta" software. They do make their liability statements (you are the driver, be in control, etc.), of course. TACC from other manufacturers does work better (w.r.t. phantom braking), likely because those manufacturers cannot rely on their customers accepting beta features and have no way to provide updates. It needs to behave well before they ship it.

In general, I'm trying to figure out why people think that...
  • Teslas, which use the exact same well-known technology as other cars, are expected to act differently than those other cars in the same situations
  • The car should be blamed when the driver is misusing a tool it provides
  • Incomplete software isn't incomplete despite being told many, many times in the manual and the UI that it's beta/incomplete/subject to limitations
...

I still welcome people to provide video of unexpected braking events. ^.^ Helping folks understand the technology, its limitations, and how it responds is always a good thing.

Au contraire, we're expecting Tesla's TACC, which uses the same well-known technology as other cars, to behave the same as other manufacturers'. Participants (who otherwise adore their Model 3s) in this thread have chimed in saying their previous vehicles' TACC behaved much better. I'm sure other manufacturers had years to figure out their TACC systems before they made them available. Tesla's pretty busy in general, but why can't we even have basic non-adaptive cruise that doesn't need a beta? My 2019 vehicle is lacking a non-beta implementation of cruise, something vehicles have had for decades!

Let's take an example of the very keyboard I am typing on. It's a well-known technology. This one works great. I used my coworker's keyboard once, and while I had to get used to it at first it served the purpose quite well. However, another keyboard I once had was verifiably absolute garbage. Keys randomly failed to "type" at times. Other times, keys would stick until hit again. It was incredibly frustrating even though maybe 1/1000 key presses had an issue. And you know what? It wasn't even a bad keyboard. It was the USB port that was crapped out and led to this behaviour. My point is that the technology is effective and can work great, but there are various variables (signal update frequency, software handling, drivetrain, etc.) after that technology that make each whole system behave differently from another. You can buy good and bad keyboards, and their price may not even be different. Bad keyboards will be replaced with better ones. There are also different types of keyboard switches if we want to continue the analogy, but as with all analogies it eventually gets ridiculous :)

I have a ton of footage to comb through on my recent trip (both phantom braking and autosteer issues). It'll take a while to sort out which is which, but I plan to have some videos put together in the coming weeks showing the behaviour. Keep in mind it's just Tesla dash cam footage and portrays deceleration from braking extremely poorly.

EDIT: At the risk of this being a terrible idea, I think it's fair to further highlight the absurdity that is not having a non-beta version of cruise. Cruise isn't a legal requirement, but how far are we willing to excuse missing common features in a very expensive vehicle just because "beta" is slapped on? Wipers are a current example. Auto wipers are beta, but Tesla provides no physical control for a non-beta way to turn on the wipers beyond a single wipe (button on the stalk). I almost didn't buy the car because of the wiper situation. The wipers are in beta, with no safe, well-known, non-beta control method.
 
The Auto wipers are in Beta. The settings of "Off, I, II (slow and fast intermittent), III, and IIII (slow and fast constant)" are not, and work perfectly as designed. The MFD will even helpfully bring up the windscreen controls for you if you invoke the one physical control you mentioned.

Yes, which is why the second part of that statement is important. Non-tactile touchscreen controls are not safe, since they require visual location and confirmation to interact with. This means you are briefly not looking at the road, and we know mere seconds of doing so are dangerous (see: all smartphone usage in cars studies and reports). Therefore, when using non-beta features, the only common and well-known method to actuate the wipers that is safe (doesn't require you to look elsewhere) is the single-wipe press on the stalk. I've been known to look completely silly and keep pressing the button rhythmically until I have a safe enough period to look at the screen and select the appropriate speed (when auto isn't working right for the current situation).

Perhaps I'm just too old-fashioned for this and I just want a car that has some basic features that work well without being a danger or an inconvenience. I'm sincere about this -- I know I tend to prefer things like physical switches for the reason I mentioned above, and I also enjoy it when things work well/reliably even if their functionality is basic.
 
This lag is concerning. I wonder if some phantom braking events are explainable once we consider what the car saw 2 seconds ago that is already long gone.

What happened was that someone used a feature in a situation where it's explicitly not intended to be used, so the feature being confused shouldn't surprise anyone who understands this.


TACC does work better (w.r.t. phantom braking) from other manufacturers

My own experience is that other manufacturers' TACC equivalents are terrible compared to Tesla's.

Most of the ones I experienced before buying my car wouldn't work at all below about 30 mph, or couldn't come to full stops or resume from stops, making them utterly useless in stop-and-go traffic on a highway.

And even then they usually had a minimum follow distance so far away I would constantly get cut off and dropped even further back over and over again.

Test driving a Tesla and seeing how much better their system was than anybody else's I'd used (Infiniti, Lexus, GM (non-Supercruise), etc.) was literally the thing that made me want to buy the car.


But again I actually paid attention to where the system is supposed to work and where it's not.




why can't we even have basic non-adaptive cruise that doesn't need a beta?

You can. Buy one without Autopilot and that's exactly what you get.

I think the owner's manual mentions that fact too. Lotsa neat stuff you seem to be missing out on by not reading it :)
 
I want to highlight this clipped portion of your quote. In the 1h30 drive I did specifically to evaluate AP and TACC recently, there were a few situations where the driver directly in front slowed down to take a right turn.

Translation: You were using it where you are not supposed to. This is entirely your fault and you should feel bad for it.

"But other cars do this just fine!"

Every other car I have ever read the manual of says TACC on access-controlled roadways only. And no, they don't do this just fine. The difference is that their drivers either don't try to use it in an improper place, or don't complain as much if they do and it doesn't do well.

This is an unfair statement. Tesla one is one of the slower adopters for TACC, and it's still a "beta" feature. Other common manufacturers not only have this feature, but do not claim it to be "beta" software. They do make their liability statements (you are the driver, be in control, etc.) of course. TACC does work better (w.r.t. phantom braking) from other manufacturers, likely because these manufacturers cannot rely on their customers accepting beta features with no way to provide updates. It needs to behave well before they ship it.

So far I have yet to see any claims of unexpected braking be legitimized. All this bark, no videos, and a ton of people admitting it's their own fault.

Au contraire, we're expecting Tesla's TACC, which uses the same well-known technology as other cars, to behave the same as other manufacturers. Participants (who otherwise adore their Model 3s) in this thread have chimed in saying their previous vehicles' TACC behaved much better.

Wholly anecdotal. Where are the videos, or the accounts free of self-incrimination? And what about the people who say theirs works perfectly fine or better than their prior vehicles'? Human tribalism suggests you will only accept anecdotal evidence that supports your view.

I'm sure other manufacturers had years to figure out their TACC systems before they made them available. Tesla's pretty busy in general, but why can't we even have basic non-adaptive cruise that doesn't need a beta?

You can. Don't buy the FSD package and it's regular CC, not TACC.

I have a ton of footage to comb through on my recent trip (both phantom braking and autosteer issues). It'll take a while to sort out which is which, but I plan to have some videos put together in the coming weeks showing the behaviour. Keep in mind it's just Tesla dash cam footage and portrays deceleration from braking extremely poorly.

Please do. Dash cam footage is fine. It's completely analyzable. I do want to see actual misbehavior. It's very difficult to fix actual problems when there are a lot of people crowing about problems that are not actual problems. Let me know when it's available. Quick advisory though: Any use in situations where it's not proper (city streets, non-access-controlled roadways) is already an automatic fail on your part and will be pointed out as such if you bother to post them. No TACC is meant for city streets on any cars.

EDIT: At this risk of this being a terrible idea, I think it's fair to further highlight the absurdity that is not having a non-beta version of cruise. Cruise isn't a legal requirement, but how far are we willing to excuse missing common features in a very expensive vehicle just because "beta" is slapped on?

I have absolutely no problem with it because I use it in places where it's expected to work better. Just like every other car out there's TACC: Access-controlled roadways. (Freeways and highways that don't have cross-traffic of any sort.)

Non-tactile touchscreen controls are not safe, since they require visual location and confirmation to interact with.

Not sure about Canada, but the US has a plethora of cars with only touch-screen controls for a lot of things. The Prius Prime is an example, and they do MUCH worse for controls: precision finger-stabbing is necessary, occasionally four or more steps to get what you need, plus much less-responsive controls. Teslas are hugely better in the UI department.

Two touches to turn the wipers on. Also notable is that the technology behind both TACC and Auto Wipers is actually not 100% the same as in other vehicles. TACC has more data inputs and Auto Wipers have different data inputs. So yes, some of this is being built from the ground up. That extra input is why the TACC and safety features of a Tesla are safer than the TACC of other vehicles.
 
I think the owner's manual mentions that fact too. Lotsa neat stuff you seem to be missing out on by not reading it :)

It does. RTFM applies to a lot of the folks on this thread. I do wish a lot of them would have a better understanding of how the technology works too, and actually drove other vehicles with similar features before. I was lucky that my Prius Prime's level of tech had "Full Range TACC". The lower trim levels didn't and their TACC would cut off under 30 kph.
 
Phantom braking is the reason I didn't buy Autopilot after free trial period.
I had about 5-10 phantom braking occurrences during my short free trial period.
Even during free trial period, I became very hesitant to use it because I didn't know when it was going to happen.
 
The fact, already explained exhaustively, that the system is not intended to be used with cross traffic.

The default assumption of TACC/AP today is that all cars are going the same direction as you are.

So it has no understanding whatsoever of what a car turning across your lane is.

It just sees that there's suddenly a huge radar signature in front of it that you are approaching at speed.

So it hitting the brakes when it sees that is 100% rational.


Not really. All that is saying is that this is an aspect of driving the software can't cope with. I get that; in fact, that is my entire point! So you are agreeing with me. The difference is you seem to think this is somehow acceptable because Tesla acknowledges that they aren't on the ball with this aspect. It has many other shortcomings too. Do you feel the same way about them as well?


I agree- people who keep insisting on using drivers aids in situations the manual explicitly says not to use them are very dangerous.

So you are saying Autopilot with auto braking should not be used on public roads??? That is the environment we call "roads". Where does the manual say not to use Autopilot on roads?