AutoPilot Almost Killed Me!

Well, I'm not Elon and I do not work for Tesla or have any special inside knowledge of how their automated driving systems work, but I am a software engineer. So I am going to speculate about what is going on; please take what I have to say with a grain of salt.

From what I know, the smarts in Tesla's systems are based on neural networks, with some degree of image processing to complement them. I've worked with neural networks in the past for defect detection in factories, and if they are properly sized, designed and trained to handle enough variance, they can do a very good job of recognizing what they have been taught. But that can require a lot of data to capture all the subtle variances, depending on what information is being fed into the neural network for training. The more inputs there are, and the more ways those inputs can vary, the more data you need to properly condition and train, and when it comes to vision-based driving systems with multiple cameras and sensors, that can mean petabytes or even exabytes of data. That is probably one of the reasons all Teslas have the FSD sensors, cameras and cellular radios: to send real-world driving data back to the mother ship for training purposes. Every newly captured sequence will be slightly different and will help fill in the gaps and better train the neural networks, making them more robust. This is a huge advantage that Tesla has over the other manufacturers, something that will take the competition years to match. There is only so much variation you can generate and capture from a limited set of vehicles driving around a simulated city, which is how some manufacturers do their AI and FSD development.

Sounds great, right? With all the data Tesla already has, it should just work? Well, the problem with neural networks is that they can be unpredictable when you give them something they have not seen in their training. So there are decisions to make and outcomes to bias toward for the circumstances where the input is not something the network has seen before and its decision is not 100 percent certain one way or the other. How the software handles these scenarios, and whether it should err on the side of caution (and brake) or assume it is nothing (and keep on driving), is most likely what this phantom braking is all about. What the car is sensing and seeing is not close enough to anything in its training data, so how it will handle the situation and what it will decide to do is unknown, and I can only assume the software is biased toward braking in those scenarios.
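To make that concrete, here is a minimal sketch of the kind of "bias toward caution" decision I am imagining. Every name and threshold in it is invented for illustration; it is not how Tesla's software actually works.

```python
# Hypothetical "err on the side of caution" decision rule.
# Names and thresholds are made up for illustration only.

def plan_action(obstacle_prob: float) -> str:
    """Map a neural net's 'obstacle ahead' probability to a driving action."""
    if obstacle_prob >= 0.9:
        return "brake"      # clear detection: something really is there
    if obstacle_prob >= 0.5:
        return "brake"      # uncertain band: bias toward caution
    return "continue"       # confidently clear road

print(plan_action(0.97))  # brake (real obstacle)
print(plan_action(0.62))  # brake (the phantom-braking case: unsure, so it brakes)
print(plan_action(0.12))  # continue
```

The middle band is where phantom braking would come from: the network is not sure what it is seeing, so the safer of the two outcomes wins.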

The good news is, the more Teslas out there sending back this data, the more robust and complete the training set should become over time, and future versions of the software should better handle these previously unseen cases. There are also tricks you can play, like pre-processing the data before sending it to the neural network, to reduce the number of metrics and variations to train against and hopefully get better coverage and more predictable outcomes, as long as you do not throw away too much in that reduction. And there is the possibility of having the car learn while you drive which events are real braking events and which should be ignored, similar to how we press the accelerator at a green light today to confirm the car should continue. But what Tesla is actually doing is probably a closely guarded secret and we may never know the exact details, so it is a waiting game for the next software version, and hopefully that fixes some of these issues and makes things better and better...
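As an example of what I mean by that pre-processing trick, here is a toy version: grayscale, downsample and normalize a camera frame before it ever reaches the network, so there are fewer ways the same scene can look different. This is purely illustrative; I have no idea what Tesla's real pipeline does.

```python
# Toy pre-processing step that strips out some sources of variation
# (color, resolution, brightness) before the data reaches a network.
import numpy as np

def preprocess(frame: np.ndarray, size: int = 64) -> np.ndarray:
    """Reduce a (H, W, 3) uint8 camera frame to a normalized (size, size) array."""
    gray = frame.mean(axis=2)                        # drop color variation
    h, w = gray.shape
    rows = np.linspace(0, h - 1, size).astype(int)   # crude nearest-neighbor downsample
    cols = np.linspace(0, w - 1, size).astype(int)
    small = gray[np.ix_(rows, cols)]
    return (small - small.mean()) / (small.std() + 1e-6)  # brightness/contrast invariance

frame = np.random.randint(0, 256, (720, 1280, 3), dtype=np.uint8)
print(preprocess(frame).shape)  # (64, 64)
```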

I do appreciate you taking the time to write down your thoughts, and I agree for the most part. However, there are some objects the neural net should be pretty familiar with by now; for example, overpasses. It seems very random in how it reacts for many people. In my case, it hardly ever slows for them anymore (which I assume is a result of recent updates), but for others, it still brakes. These are the kinds of questions I would love to get answers to. And like you mentioned, it could just be a simple case of the neural net not being properly trained and erring on the side of caution, but why would it take so long to train it to react appropriately to an object like an overpass? It would seem like a simple enough object to recognize.
 
While there seems to be a difference of opinion as to how serious this issue is in terms of safety, far less serious issues on other vehicles have resulted in full-blown DOT recalls. I am surprised at how few recalls Tesla has had. I wonder how they stay below the DOT's radar?
 
I think the difference is remote updates and company culture.

In this case, the phantom braking fix would be done over the air by Tesla, whereas a legacy automaker would need a formal recall to force everyone to visit a service department.

The other issue is culture. There isn't a recall because there isn't a fix. Tesla is far more willing to release software with known issues because it believes it can just push the fixes remotely when they're ready. I'm confident many automakers have something similar to AP but have not released it because they embrace a more conservative approach, waiting until it is 99.999% proven in controlled testing. Some of that may be driven by liability avoidance, but the rest is likely financial, considering how expensive a recall campaign would be if everyone had to visit a dealership for an update because the software wasn't right.

Nevertheless, Tesla has a known bug in the software but still doesn't have a fix -- even after many years of trying. In hindsight, perhaps they were overconfident in pushing it out. I think we're now on the third complete rewrite of the software? For me, I would prefer Tesla push the software out in shadow mode, using the neural net to learn across the entire fleet. Then, once the software reaches five 9's reliability in shadow mode, move it to production. I don't appreciate my family being put in the role of test dummy while Tesla works on a fix. Like all of you I love the concept, but I need it to work at a level that doesn't increase the risk of being rear-ended.
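Roughly what I mean by shadow mode, sketched in Python: the candidate build decides silently in the background, its decision is compared with what the human driver actually did, and the build is only promoted once fleet-wide agreement clears the reliability bar. The names and the five-9's number here are my own; this is not a description of how Tesla actually gates releases.

```python
# Hypothetical shadow-mode bookkeeping: compare the candidate software's
# decision with the human driver's action and track agreement.
from dataclasses import dataclass

@dataclass
class ShadowStats:
    agree: int = 0
    total: int = 0

    def record(self, shadow_action: str, driver_action: str) -> None:
        self.total += 1
        if shadow_action == driver_action:
            self.agree += 1

    def ready_for_production(self, bar: float = 0.99999) -> bool:
        return self.total > 0 and self.agree / self.total >= bar

stats = ShadowStats()
stats.record("continue", "continue")   # shadow build agreed with the driver
stats.record("brake", "continue")      # a would-be phantom brake, caught harmlessly
print(stats.ready_for_production())    # False -- nowhere near five 9's yet
```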
 
The disappointment for me is that we traded a perfectly good 2012 Prius V for the MY because I wanted adaptive cruise control, lane assist and lane change assist. We had used adaptive cruise control on a Volvo about 7 years ago and I loved it. I foolishly assumed Tesla would have this 'basic' feature down pat. It never occurred to me to read forums to see if it was flawed.

Not getting adaptive cruise control (and no AM radio) are real disappointments for me. Oh, and the fact that my front passenger door is within spec but looks open all the time.
 
For the people who think paying attention is the key to this problem, I give you this. I will let you be actively driving (and hence you have to be paying attention, since you're actively controlling the car) at 70 miles an hour, and all you have to do to overcome my challenge is push the accelerator down harder. At some random point, without warning, I am going to slam on the brakes as hard as I can (i.e., a full panic stop). If you think you can override the braking with less than a 15-20 MPH decrease in speed, then you're a better driver than I am.

The non-adaptive cruise control in my 2009 Mazda 6, heck, the non-adaptive cruise control on the 1998 Subaru Legacy GT I used to drive was a safer driving experience than what is in my MY.

If you haven't encountered this, I am happy for you, I really am. But for those of us that have, we're not talking about coming off the motor, using regenerative braking, or even adding a little brake to the regen. We're basically talking about full-on panic braking.

The first time, I thought it was something I missed (and my wife screamed so loud my ears rang for hours). The second time, thankfully my wife wasn't in the car (but it still resulted in a road rage incident that went on for a couple of miles before the other driver finally acknowledged my apology, or at least drove off). After the third time, I just gave up on using any of the assistive driving features and simply decided I'd drive myself from now on.

I love my MY. It is a great car. But people who put Tesla on a pedestal and say there is nothing wrong are simply ignoring the reality of the situation. Tesla has some real issues in build quality and in keeping the driver-assist technologies working consistently well. I'm lucky, and my problems are only with the software, which thankfully can be updated (knocks on wood). I will keep trying the updates while I'm alone in the car and not in traffic, but it's going to be a while before I trust my car enough to even use adaptive cruise while any of my family are in the car.
 
The situation you describe would of course be scary; however, if you notice, the OP said his car began to rapidly slow down, not that it slammed on the brakes. Neither that nor sudden swerving is enjoyable or acceptable (I've experienced both), and I don't see where anyone here has said it is - most have experienced these things. I think perhaps the title of the thread, and the flair for the dramatic if you will, is what elicited the most comments. Most sane people's taste for hyperbole has been sorely tested for some time now; I'll leave it at that...
 
The disappointment for me is that we traded a perfectly good 2012 Prius V for the MY because I wanted adaptive cruise control, lane assist and lane change assist. We had used adaptive cruise control on a Volvo about 7 years ago and I loved it. I foolishly assumed Tesla would have this 'basic' feature down pat. It never occurred to me to read forums to see if it was flawed.

Not getting adaptive cruise control (and no AM radio) are real disappointments for me. Oh, and the fact that my front passenger door is within spec but looks open all the time.

Maybe I can help with one thing - the AM radio. I am in Toronto and have found most of the AM stations that I want are on the HD extensions of local FM stations. For instance, in Ottawa, CFRA and TSN 1200 were broadcast on 100.3 HD2 and HD3. I'm not sure if that's still the case. Some are also on TuneIn, but the FM-HD ones work better and aren't dependent on a cell connection. The bonus is that if you find them, they sound better on HD than on AM.

Worth a try.
 

I've experienced two types of AP/EAP/FSD anomalies - both scary.

The first is full-on emergency braking. Random shadows or whatever... the computer freaks out and goes into emergency braking mode. It's panic-inducing because you believe you are milliseconds away from impact. It's a full Code Brown moment.

The second is rapid deceleration due to speed limit changes as coded in the navigation software (which may be what the OP experienced). If the speed limit drops dramatically, such as from 70 MPH on the highway to 25 MPH in a toll booth zone, the vehicle will rapidly decelerate from 70 to 25 within a second or two. It's not emergency braking, but it's still heavy and can be scary in traffic.

There are some locations on northbound US-290 in Houston that the map thinks are 45 MPH zones (leftovers from old construction zones). I've learned that the vehicle will drop speed VERY quickly at the same spots every time. The speed on the display shows 45 even though the signage is posted at 65, so it's clearly a map issue. This becomes predictable (once you know it's there). It's also fixable by Tesla, albeit still unresolved a year or more later.

If the OP was entering a toll zone in the "fast" lanes, the computer likely slowed quickly, assuming the OP was in the "cash" lanes that are marked as 25 MPH (or whatever). Thus the vehicle quickly slows to 25 even though the fast lanes are still 70.

The software should at least anticipate the lower speed limit and start to slow ahead of time -- something like a Chill Mode for slowing down. The other fix would be to use computer vision to determine which lane the vehicle is in and then assign the lane-dependent speed limit as appropriate.
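To put numbers on that "slow ahead of time" idea, here is a back-of-the-envelope sketch. The 1.5 m/s² comfort limit and everything else in it are my own assumed figures, not anything Tesla has published.

```python
# Hypothetical anticipatory slowdown: given a mapped speed change, work out
# how far in advance the car should start easing off so the slowdown stays gentle.

def start_slowing_distance(v_now_mph: float, v_target_mph: float,
                           comfort_decel_ms2: float = 1.5) -> float:
    """Distance in meters needed to go from v_now to v_target at a gentle rate."""
    mph_to_ms = 0.44704
    v0 = v_now_mph * mph_to_ms
    v1 = v_target_mph * mph_to_ms
    if v1 >= v0:
        return 0.0
    return (v0 ** 2 - v1 ** 2) / (2 * comfort_decel_ms2)  # from v1^2 = v0^2 - 2*a*d

# Dropping from 70 to 25 MPH needs roughly 285 m of gentle braking,
# instead of being crammed into a second or two at the map boundary.
print(round(start_slowing_distance(70, 25)))
```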
 
The first time, I thought it was something I missed (and my wife screamed so loud my ears rang for hours). The second time, thankfully my wife wasn't in the car (but it still resulted in a road rage incident that went on for a couple of miles before the other driver finally acknowledged my apology, or at least drove off). After the third time, I just gave up on using any of the assistive driving features and simply decided I'd drive myself from now on.

I love my MY. It is a great car. But people who put Tesla on a pedestal and say there is nothing wrong are simply ignoring the reality of the situation. Tesla has some real issues in build quality and in keeping the driver-assist technologies working consistently well. I'm lucky, and my problems are only with the software, which thankfully can be updated (knocks on wood). I will keep trying the updates while I'm alone in the car and not in traffic, but it's going to be a while before I trust my car enough to even use adaptive cruise while any of my family are in the car.

This. Thanks for your rational and logical explanation of a legitimate safety issue. We took a road trip last weekend with 2 other adults and had 4 phantom braking incidents. The 1st was the absolute worst and really shook my wife pretty badly. I told them I would only use it again when there was no one behind me. Well, we had 3 more, albeit none as bad as the 1st. That one was on a freeway in L.A., and thank God no one was behind me at the time. I would have caused a 10-car pileup with the number of tailgaters.
 
For the people who think paying attention is the key to this problem, I give you this. I will let you be actively driving (and hence you have to be paying attention, since you're actively controlling the car) at 70 miles an hour, and all you have to do to overcome my challenge is push the accelerator down harder. At some random point, without warning, I am going to slam on the brakes as hard as I can (i.e., a full panic stop). If you think you can override the braking with less than a 15-20 MPH decrease in speed, then you're a better driver than I am.


I use AP with Autosteer a lot. I agree with many of your points and concerns, but I disagree about not being able to react to phantom braking quickly enough - this is highly dependent on the individual and the level of attentiveness behind the wheel while using these features.

I always have my foot on the accelerator, ready to press when phantom braking occurs. It's predictable when it'll occur, and if you're paying attention, it's easy to overcome and easy to spot where it'll probably happen.

To be honest, it's happening less and less. There used to be two overpasses on my way to work and a shadow spot on my way home that caused my 2019 Model 3 to brake - my 2020 on the most recent software no longer brakes at two of those spots and only sometimes catches the others.

It’s getting better, but I think many would appreciate a “standard cruise control” setting under the autopilot tab.
 
The purpose of autopilot is to make driving more convenient, less stressful and safer. Having to stay hyper-alert with your foot hovering over the accelerator adds stress, and it is less convenient and less safe than normal driving.

Just a tip to make it more bearable: try simply resting your foot on the accelerator instead of hovering over it (you must have calves of steel). Unless you have Shaq's feet, I haven't found it to be a nuisance or any less convenient. If anything, it's a nice spot to place your right foot.

It's unfortunate phantom braking exists - Tesla really needs to figure this out.
 
I have had issues with Automatic Emergency Braking. I have owned my car less than 2 weeks. I experienced 3 phantom braking events driving home from the Tesla center when approaching overpasses. But more troubling is what happened 3 times while driving in town at around 40 mph: a car or truck would turn ahead of me, crossing my path. I saw them all three times, and I would not even have lifted my foot off the accelerator, knowing they would clear my path. But AEB slammed on the brakes hard while I was still a good 100-150 feet away. This has happened while manually driving the car, not on traffic-aware cruise control. So the software needs to be tweaked. What it should do is let off the accelerator first, then analyze whether the object is moving out of the way before slamming on the brakes. My Cadillac before I got my MS would do this.
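Something like the following ordering is what I have in mind. The names and thresholds are hypothetical, just to show the idea of coasting and re-checking before any hard braking; it is not a claim about how AEB is actually implemented.

```python
# Hypothetical "coast first, brake only if really needed" logic.
# Thresholds are invented for illustration only.

def respond_to_crossing_vehicle(time_to_collision_s: float,
                                target_is_clearing: bool) -> str:
    """Escalate gradually instead of jumping straight to full braking."""
    if time_to_collision_s > 4.0:
        return "no action"          # plenty of margin, keep driving
    if target_is_clearing:
        return "lift accelerator"   # coast and re-check; the other car is leaving the path
    if time_to_collision_s > 2.0:
        return "moderate braking"
    return "full braking"           # last resort only

print(respond_to_crossing_vehicle(3.0, target_is_clearing=True))   # lift accelerator
print(respond_to_crossing_vehicle(1.5, target_is_clearing=False))  # full braking
```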

After driving my new MS and experiencing FSD, TACC with Autosteer, and just the Automatic Emergency Braking... I think Elon is years away from a car that will drive itself anywhere without help. Shoot, trains ride on tracks and they aren't fully autonomous. I think Elon will have to swallow his pride and add lidar in order to get where he wants. I do like the automation and driving assists, but as of now I'm having a little buyer's remorse about paying the extra $8000.
 
- Phantom braking suuucks.
- On the other hand, there are quite a few videos on YouTube showing the auto-braking feature preventing accidents the driver would not have anticipated. So if that applied to you, you would not be complaining.
- I have driven numerous cars with adaptive cruise control in the past few years - Tesla's is by far superior to the systems I have driven.
- The difference Tesla has compared to all of the competition is that they keep improving, and these changes are implemented in their existing cars via OTA software updates.
(On a side note, I saw a video in this forum where a gentleman reviewed his one-year experience with the Audi e-tron. Key points: the paint job was horrendous, water would leak inside the car when it rained, there was no support from Audi or its dealers, and the driving range was not so good. Imagine the feedback you would get if this were a new-model Tesla.)