
AutoPilot Almost Killed Me!

So I am cruising down the freeway on Autopilot at the posted 70 mph. As I approach a giant sign above the freeway showing which lane to be in to pay the toll in cash, my Y suddenly starts to rapidly slow down. The car behind me swerved to avoid hitting me. I jammed the throttle immediately to get back up to speed. WOW!!!! I am PISSED! Now I am afraid to use my Autopilot! I also paid for FSD to the tune of $7k, and this is what I get? I am told that the cameras misinterpreted the big sign over the freeway. If you want FSD or even Autopilot to be safe, there is no room for "misinterpretation" at all. This never happens in any car with adaptive cruise control. It seems like lately I keep running into real-world scenarios where Tesla, for all its hype, is substandard to most other cars with current tech. Sadly, I am losing my ability to say "it's ok, it's a Tesla." (rant over, drops mic)

Well, the fact still remains: every single driver on the road must be ready at ANY moment to apply full braking, with enough distance to not rear-end the person braking in front of them. While this was caused by 'phantom braking', you could very well have a situation where you choose to brake urgently for a reason the person behind you is unaware of. You being endangered is more on the person behind you than it is on Autopilot.
 

As does the fact that every single driver on the road must be ready at ANY moment to apply full braking with enough distance to not hit a pedestrian, but I'm still not going to play soccer in the middle of the freeway. Putting yourself in harm's way is never a good idea, regardless of who is ultimately at fault.

I don't deny that Autopilot can save lives. Nevertheless, it shouldn't endanger them either. Tesla AP has a phantom braking problem, and it needs to be fixed ASAP before it gets someone injured or killed (even if it's the inattentive person behind me).
 
You have to set expectations correctly. I've had a couple of those, but I'm not scared of it; you gotta learn it. I've seen one where a yellow sign with a flashing yellow light near an intersection made FSD want to treat it like a yellow traffic signal, so it really hesitates, brakes, and then continues.

what?
You gotta learn to live with a $60,000 car braking suddenly in the middle of the road for nothing?

that’s a new low, Tesla boy. Now you're talking about killing people to lick Elon's arse.
 
Honestly, I'm frustrated that people don't figure out how to use the tech, then blame the tech.

When AP is on, just hover your foot over the GO pedal; you should be good at one-pedal driving by now, right?

Then, if you were paying attention like you should be, the instant you feel a phantom brake you override it by stepping on the GO pedal. Problem solved.

On some wavy roads and up-and-down mountain roads, AP tends to phantom brake a little, and I never freaked out. Dude, if you are "freaked out" by a little braking, you likely aren't paying attention. Just give up your licence already. Take Uber, then robotaxi.
 
This wins the stupid comment award for this thread. Tell him what he's won, Bob...
 
Well, the fact still remains: every single driver on the road must be ready at ANY moment to apply full braking ...

In a perfect world we would all be ready to react at a moment's notice. The reality, however, is that there will always be people following too closely, not paying attention, or lacking the experience or reflexes to avoid a collision even at the recommended distance (e.g., the old couple in the car behind you). You can safely consider that a constant. The question is, do you really want one of those people running into you, or possibly getting hurt or seriously injured? It would certainly not sit right with me if my car phantom braked and some family behind me ended up in a serious accident. Or if they plowed into me and sent me off into a pole or another vehicle.

The systems aren't perfect, understood, but phantom braking should explicitly be mentioned as a possible issue if it's going to persist for a prolonged period, as seems to be the case. It's one thing to claim the software is beta and the driver is responsible for safety, etc.; it's another to have your car literally slam on the brakes (in some cases) at freeway speeds because it's scared of a shadow or an overpass.
 
I use AP exclusively on long, sparsely populated stretches of divided four-lane highway. Very happy with it. As others stated, it is foolish to leave AP on in bumper-to-bumper traffic on busy highways. Summon and neighborhood driving are just gimmicks. I believe in Autopark, but only if the curb is tall.

Again, if full self-driving were a reality, most of us would not have jobs, as AI would have developed complex decision-making capabilities to satisfy corporate globalization needs.
 
Clearly you've never been brake-checked... and yes, ideally you can keep a safe distance, but there are times when people move over on you and then instantly brake-check you, before you have time to ease back to a safe distance.

Of course I have, mainly when I was young and dumb and would often tailgate. After driving hundreds of thousands of miles as an adult, I have learned to ease back and relax, and to keep plenty of distance between me and the driver in front of me. Pay attention and anticipate the stupid things other drivers might do. And always leave yourself a way out.
 
Exactly! So on my Cybertruck reservation I did not select FSD. Not worth the money considering all the flaws. Perhaps in the future.

Keep in mind that the phantom braking you experienced likely wasn't caused by FSD.

I'm certainly not saying you should get FSD, but simply that your experience wasn't caused by it.

In my experience with cars that have adaptive cruise control, phantom braking does happen occasionally, but it happens way more often with a HW2/2.5/3 Tesla (basically anything modern) than with anything else.

In my 2015 Model S I would rarely get a phantom braking event, and when I did it was always at an overpass. Like one time it phantom braked when there was a dip in the freeway right before the overpass, so the overpass looked like it was right in front of me. The Model S had HW1, which was provided by Mobileye, so it was pretty well proven out.

In my 2018 Model 3 with HW3 I experience a lot more events, and they can come from all sorts of causes. Like this past weekend I was on a freeway heavily traveled by Model 3s, and it still phantom braked because of an overpass ahead. Yet I've been in that same spot numerous times before without phantom braking happening.

Usually the biggest cause of phantom braking in my Model 3 is semi trailers in the lane next to me when I'm coming up on them. Now, some braking is expected if the speed differential is high (this is actually a feature they enabled), but most of it isn't caused by a high speed differential.
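Just to be clear about what I mean by that intended behavior, here is a toy sketch. It is only my guess at the rule; the function name and threshold are invented for illustration, not Tesla's actual logic:

```python
# Toy illustration of the intended "ease off when passing much slower
# traffic in the adjacent lane" behavior. All names and thresholds are
# invented for illustration; this is not Tesla's actual code.
def adjacent_lane_caution(my_speed_mph: float, neighbor_speed_mph: float,
                          differential_limit_mph: float = 25.0) -> bool:
    """True if the closing speed on a vehicle in the next lane is high
    enough that slowing down is the intended behavior rather than a bug."""
    return (my_speed_mph - neighbor_speed_mph) > differential_limit_mph

print(adjacent_lane_caution(70, 35))  # True: expected slowdown while passing slow traffic
print(adjacent_lane_caution(70, 62))  # False: braking here would be a phantom event
```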

If that wasn't bad enough, there are specific phantom braking events that happen if you turn on NoA (an EAP and FSD feature), and additional ones if you turn on traffic light response (an FSD feature).

Basically, driving with it is an exercise in "No, don't stop. I want to go."

It's gotten to the point where they absolutely have to fix this or they can't move forward. Tons of people are going to pass on FSD when basic AP doesn't work right.
 
Honestly, I'm frustrated that people don't figure out how to use the tech, then blame the tech. ...

Exactly. The OP just does not understand the technology, expects it to be something it is not, and fails to exercise his responsibility to monitor the vehicle at all times. Unless he understands the capabilities and limitations of FSD, he should refrain from using it, for his own sake and that of all other road users.
 

I'm not the OP, so I can't speak to his/her situation. I can speak to mine.

I understand the tech. I understand the need to be attentive and supervise AP/EAP/FSD. I understand the need to have hands on the wheel and my foot ready to immediately move to the accelerator pedal or brake pedal as necessary.

But since Tesla has no option for non-AP cruise control, I'm left with a binary choice.

1) Use AP and play phantom-braking roulette. Mentally ready to be slammed into my seatbelt without warning. Ready to jam my foot on the accelerator before I get rear-ended. Making sure I stay acutely aware of my surroundings so I can be absolutely sure I am seeing EXACTLY what AP/EAP/FSD is seeing (or more), and can immediately and without delay tell a phantom braking event from an emergency braking event and react accordingly.

2) Not use AP/EAP/FSD at all since there is no option for a dumb, speed-only cruise control.

If I use AP/EAP/FSD as many here appear to suggest, as described in option 1, performing actions such as "foot hovering over the accelerator" as Brown1428 commented, I struggle to find what value AP/EAP/FSD provides. I might as well just drive at that point. If you can't trust the computer, you can't trust the computer.

The irony is that AP/EAP is just good enough to lull you into trusting it, only to thrust you into a panic when the car suddenly slams the brakes on for some unknown reason.

There is no question that AP/EAP/FSD can provide added safety most of the time. The problem is it does so at the added risk of being rear-ended due to a random phantom braking event some of the time.

I get OP's situation. I've been there myself on multiple occasions. It's scary when it happens - especially in heavy traffic.

After having EAP/FSD on our Model 3, we had to ask ourselves why we were paying $8K for the privilege of helping train the algorithm. We opted not to purchase it on the Y, and I would not recommend anyone else do so until Tesla gets the kinks worked out. I'm not willing to serve as a crash-test dummy while the software goes through yet another rewrite, nor do I suggest you do either.
 
Also not the OP, so I am talking about my own situation.

Whenever I drive with AP/EAP/FSD engaged, I drive as if the thing is going to kill me. Actually, there are large areas of road where I simply don't drive with any of it engaged. If you think it's fun going from 70 to 50 in about a second in heavy traffic, leave it on. I don't. If the car behind you is following too closely or not paying attention, just remember that as the driver you're at fault. More importantly, I don't need my wife yelling at me because it scared the hell out of her. Her yelling at the top of her lungs makes the situation a lot worse. But I can't exactly train her not to panic in the passenger seat.

To say there is no problem just means you're lucky enough to live in an area where there isn't one. For me, since these stretches of road are part of my normal driving routine, I often simply leave AP/EAP/FSD off and do the driving myself. Not really what I have the feature for, but hopefully it will improve over time.

Now I'm going to say something I'm sure will enrage people, but I much prefer driving with the assistive technologies on my wife's Honda Pilot. It doesn't try to kill us by slamming on the brakes for no apparent reason (though it will brake if a car unexpectedly swerves into your lane). And more importantly, it doesn't simply turn itself off because I touch the steering wheel a little more than it's expecting while I'm trying to avoid an object in the road, give a truck some more space, or simply apex a corner correctly.

I love my MY, but I'm also realistic about the problems that exist with it.
 
The title of this thread is a little misleading. The phantom braking is a function of TACC, not Autopilot. When Autopilot is engaged, so also is TACC. But TACC can be engaged without Autopilot, and the same phantom braking events occur.
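A toy way to picture that relationship (illustrative only; this is obviously not Tesla's actual software, and the names are mine):

```python
# Illustrative toy model of the mode relationship, not Tesla's software.
# The point: TACC's speed control (and its phantom braking) is active in
# both modes, so disengaging Autosteer alone doesn't avoid the problem.
from enum import Enum

class DriveMode(Enum):
    MANUAL = 0     # no cruise assistance
    TACC = 1       # traffic-aware cruise control alone
    AUTOPILOT = 2  # Autosteer, which runs on top of TACC

def tacc_active(mode: DriveMode) -> bool:
    """TACC, and therefore any phantom braking, is in play whenever
    TACC or Autopilot is engaged."""
    return mode in (DriveMode.TACC, DriveMode.AUTOPILOT)

assert tacc_active(DriveMode.AUTOPILOT)  # engaging Autopilot also engages TACC
assert tacc_active(DriveMode.TACC)       # TACC can run without Autosteer
assert not tacc_active(DriveMode.MANUAL)
```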

It's easy to say that the Tesla driver should always be in control, but phantom braking doesn't announce itself. It just applies the brakes without warning. Yes, you can override it, but not before the braking has already begun.

I have experienced one very bad situation where the car in front of me slowed and then moved into an exit lane. The Tesla and the motorcycle behind me both began accelerating as the lane ahead was now open. Just as we were almost even with the exiting car, the Tesla jammed on the brakes for probably half a second. The motorcycle behind me almost lost control and nearly rear-ended me. He then entered road-rage mode. I really don't blame him; he could have been killed. He got over it after a few minutes and went on his way, but I know he still thinks I did that on purpose.

My BMW and Toyota both have adaptive cruise control and they do not do this.

I love my Model Y. It's my second Tesla, and I'm sure there is a third in my future. But this has been going on for years. Tesla has to fix this.
 
Honestly, I'm frustrated that people don't figure out how to use the tech, then blame the tech. ...
Agreed. Well spoken.
 
I'd love to hear the views of a software engineer, or Elon himself, on phantom braking.

Well, I'm not Elon, and I do not work for Tesla or have any special inside knowledge of how their automated driving systems work, but I am a software engineer. So I am going to speculate about what is going on; please take what I have to say with a grain of salt.

From what I know, the smarts in Tesla's systems are based on neural networks, complemented by some degree of image processing. I've worked with neural networks in the past for defect detection in factories, and if properly sized, designed, and trained to handle enough variance, they can do a very good job of recognizing what they have been taught. But that can require a lot of data to capture all the subtle variances, depending on what information is fed into the neural network for training. The more inputs, and the more ways those inputs can vary, the more data you need to properly condition and train; for vision-based driving systems with multiple cameras and sensors, this can mean petabytes or even exabytes of data. That is probably one of the reasons all Teslas have the FSD sensors, cameras, and cellular radios: to send real-world driving data back to the mothership for training. Every newly captured sequence will be slightly different and will help fill in the voids and better train the neural networks, making them more robust. This is a huge advantage Tesla has over the other manufacturers, one that will take the competition years to match. There is only so much variation you can generate and capture from a limited set of vehicles driving around the simulated cities some manufacturers use for their AI and FSD development.

Sounds great, right? With all the data Tesla already has, it should just work? Well, the problem with neural networks is that they can be unpredictable when you give them something they have not seen in their training. So there are decisions to make and outcomes to bias for the circumstances where the input is not close to anything the network has seen before and its decision is not 100 percent certain one way or the other. How the software handles those scenarios, and whether it should err on the side of caution (and brake) or assume it is nothing (and keep on driving), is most likely what this phantom braking is all about. What the car is sensing and seeing is not close enough to anything in its training data, so what it will decide to do is unknown, and I can only assume that the software biases toward braking in those scenarios.
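To make that concrete, here is a minimal sketch of the kind of "err on the side of caution" bias I mean. Everything here is hypothetical and invented for illustration, not Tesla's actual code or thresholds:

```python
# Hypothetical sketch of a safety-biased decision rule; the function name,
# threshold, and numbers are all invented for illustration.
def plan_action(obstacle_confidence: float, brake_threshold: float = 0.3) -> str:
    """Given a perception network's confidence (0..1) that something is in
    the driving path, decide whether to brake or keep going. Biasing toward
    safety means braking even at fairly low confidence, which is exactly
    what turns false positives (signs, overpasses, shadows) into phantom
    braking events."""
    return "brake" if obstacle_confidence >= brake_threshold else "continue"

# A sign gantry the network has rarely seen might score 0.4: not clearly an
# obstacle, but above the cautious threshold, so the car brakes anyway.
print(plan_action(0.4))   # brake (a phantom event, if it was just a sign)
print(plan_action(0.05))  # continue
```

Raise the threshold and you get fewer phantom events but more missed real obstacles; lower it and you get the opposite. That trade-off is the whole game.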

The good news is, the more Teslas there are out there sending back this data, the more robust and complete the training set should become over time, and future versions of the software should better handle these previously unseen cases. There are also tricks you can play, like pre-processing the data before sending it to the neural network to reduce the number of metrics and variations to train against, which hopefully gives better coverage and more predictable outcomes, as long as you do not throw away too much in that reduction. And there is the possibility of having the car learn while you drive which events are real braking events and which should be ignored, much like the way we already press the accelerator to continue at a green light. But what Tesla is actually doing is probably a closely guarded secret, and we may never know the exact details, so it is a waiting game for the next software version; hopefully that fixes some of these issues and makes it better and better...
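As a hand-wavy illustration of that pre-processing idea (all names, crops, and sizes are my invention, not anything Tesla actually does):

```python
# Toy example: collapse a raw, highly variable camera frame into a small,
# normalized feature map before it reaches the network, so less training
# data is needed to cover the input space. Invented for illustration only.
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Reduce one raw camera frame to a small, normalized feature map."""
    crop = frame[40:200, :, :]      # drop sky/hood regions that only add variance
    gray = crop.mean(axis=2)        # discard color if the task doesn't need it
    small = gray[::8, ::8]          # downsample to cut dimensionality
    return (small - small.mean()) / (small.std() + 1e-8)  # normalize exposure

raw = np.random.rand(240, 320, 3)   # stand-in for a single camera frame
print(preprocess(raw).shape)        # (20, 40): far fewer values to train against
```

Throw away too much in that step, though, and the network never sees the cue it needed, which is the "as long as you do not throw away too much" caveat above.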
 
I'm no stranger to Autopilot, having driven an M3P, but the family recently went with me to pick up a MY in Carlsbad. On the way home, while using Autopilot, I signaled a lane change. Now, yes, I was aware there was a car next to me; I wanted to see how the Y would respond. Instead of gradually slowing down, it violently engaged the brakes. We were all wearing seatbelts, of course, but my 9-year-old complained for 3 days that his neck hurt. Minor whiplash, probably. Very disturbing. I didn't report it, but I certainly won't do that again.