
AP Changing lanes into adjacent vehicle

I have a Tesla Model 3 with FSD and HW3. I have a little over 9,000 miles on it. On Jan 11, 2020, I was westbound on the San Mateo bridge with NoA (Navigate on Autopilot) engaged. Autopilot just decided to change lanes (without indicating), causing the vehicle in the adjacent lane to swerve to avoid me. I only got a "Take over immediately" once my vehicle was halfway into the adjacent lane. I realize I have no proof that AP was engaged, and this just looks like I'm driving like an idiot.

I contacted Tesla about this. They told me that the lane markings were unclear and I shouldn't rely on the system. I totally get this, and had my hands on the steering wheel, and took over within a second or so. Had the guy next to me not swerved, this would've been a collision that would've been my fault.

When I told Tesla I don't agree with their assessment of unclear lane markings, and that I would post the video online, they phoned me and told me to stop threatening their employees, and if I have a problem with their response, I should contact their legal department.

I used to be a huge Tesla advocate. After this incident, and the way they've treated me, I'm posting this as a warning to everyone else.

Take a look at the dashcam footage yourself, and see if you think the lane markings are unclear.

 
...Autopilot just decided to change lanes (without indicating), causing the vehicle in the adjacent lane to swerve to avoid me... Take a look at the dashcam footage yourself, and see if you think the lane markings are unclear.

Good example why we NEVER use NoAP. We have experienced enough flaky moves like this.
 
Lane markings look fine to me; however, I don't think this was NOA changing lanes. There weren't any reflections showing the turn signal. Also, NOA won't change to the left lane when there aren't any slow cars in front of you. NOA doesn't like the left lane even when AP is set to 90.

Looks like AP went from following the white lines to the dark black tire marks as it tried to center the car in between. Maybe that is why the red hand of death popped up on your screen.

Also, based on my experience, I don't think your hands were on the wheel, or else you have slow reflexes. At least for me, when AP does stupid stuff like this it disengages right away, as my hands provide enough grip before the car veers too far. My car does stupid stuff like this too, where it will try to curb the car on a left-side island, but my hands always break AP out of engagement as soon as it veers off an inch.

Not saying it's your fault. I'm mad as well that AP is not perfect, so be careful. It works 98% of the time but that 2% could hurt or kill you. Keep an eye on the road and both hands on the wheel when using AP.
 
...They told me that the lane markings were unclear...

That sounds like a generic reply because they didn't see the video.

...had my hands on the steering wheel, and took over within a second or so...

It's good that your hands were on the wheel.

However, your response was too slow.

I always apply a light counter-torque so I can monitor the steering through tactile feedback; I feel when the torque is going wrong and can correct it seamlessly, without letting the car touch the left lane.

Maybe you were waiting to see whether the system would cross the lane or not. It's a beta system, and I would not wait to find out whether it can kill me. Yes, it can!

Maybe you weren't monitoring the torque, so you had no continuous tactile feedback, which would explain why you had to wait for the lane bumps to prompt you that it was time to take back control.

...I would post the video online, they phoned me and told me to stop threatening their employees, and if I have a problem with their response, I should contact their legal department...

It does sound like a threat to go public.

I've been using Autopilot since 2017, more than 50,000 miles over the past three years, and there have been countless incidents where it did not work as I wanted, including the one mentioned in this thread. But I am fine with it because it's beta, and I quickly learned to use torque monitoring to anticipate any wrong steering after taking delivery.

It's fine to go public to educate drivers that they should be in control of the car at all times and not rely on Autopilot, as in this incident, because after all, it's still beta.

But threatening is not cool!
 
That sounds like a generic reply because they didn't see the video.
I received this reply after they confirmed they had reviewed the video.

It does sound like a threat to go public.
The woman who called me was a manager at Burlingame service department. She told me to stop threatening her employees, implying I had threatened the employees themselves.

If you want to take the action of "going public" as a threat, then so be it. I've done that by talking about the incident here.

What I would have expected was a civilized response such as, "Thanks for your feedback. We will do our best to use the information you have provided to further improve our Autopilot system." Instead, all they've done is anger a customer and destroy an advocate. What possible benefit can come of them treating me belligerently?
 
Any chance you bumped it out of auto-steer? It will also give the "take over immediately" warning if auto-steer is disengaged and the car immediately starts drifting into a neighboring lane.

The videos really don't give enough info to tell what is going on. They need to show time, speed, AP/TACC status, hand recognition, etc.
 
...The woman who called me was a manager at Burlingame service department. She told me to stop threatening her employees... What possible benefit can come of them treating me belligerently?

They are just trying to intimidate you to keep you from exposing them, the same way Enron would do to analysts right before their crash. If your supercharging stops working, you'll know why.
 
What speed were you going there, and what was the speed limit? Judging from the other cars in front of you and to your left, you were going at quite a good clip, it seems.

Also, I strongly suspect you were not on AP at that time... just speculation. The lines look good, and I never had a single issue when the lines were half decent.
 
Any chance you bumped it out of auto-steer? It will also give the "take over immediately" warning if auto-steer is disengaged and the car immediately starts drifting into a neighboring lane.

The road is turning left.
If AP/AS were disengaged, the car would have steered straight, and drifted right (relative to the lane markings). Instead, it drifted left.
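
A rough back-of-the-envelope check supports this. The numbers below are assumptions for illustration only, since we don't know the actual speed, curve radius, or reaction time from the video:

```python
# Back-of-the-envelope check: a car that stops steering on a left curve
# follows the tangent and drifts RIGHT relative to the lane markings.
# All numbers are assumptions for illustration, not taken from the video.

v = 29.0         # speed in m/s (about 65 mph), assumed
radius = 1000.0  # curve radius in m, assumed gentle highway curve
t = 2.0          # seconds before the driver reacts, assumed

d = v * t                       # distance traveled along the tangent
offset = d ** 2 / (2 * radius)  # small-angle lateral drift toward the outside

print(f"~{offset:.1f} m of rightward drift after {t:.0f} s")
# Prints ~1.7 m, most of a lane width, and to the RIGHT:
# the opposite of the leftward move seen in the video.
```

Even a gentle curve produces most of a lane width of rightward drift within a couple of seconds, so a leftward move is hard to explain by simple disengagement.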


thardie said:
...I would post the video online, they phoned me and told me to stop threatening their employees, and if I have a problem with their response, I should contact their legal department...

It does sound like a threat to go public.

What threat?
OP got brushed off by a Tesla rep with a generic response.
OP gave them a heads-up that such a response is inadequate, and that he intends to warn others about the error condition.

I certainly see no threat to anyone, least of all to Tesla employees.

O.P. - thanks for publicly sharing the video!


I've been using Autopilot since 2017, more than 50,000 miles over the past three years, and there have been countless incidents where it did not work as I wanted... but I am fine with it because it's beta

I've used AP for the past 12K miles, and have learned to expect the scenarios in which it is likely to fail.
Of which there are too many.
I am still fooling around with AP, but my wife has sworn off engaging it, as she deems it to be a net safety hazard.
Sadly, I've got no data with which I could change her mind.

Calling AP "beta" does not make anyone feel any better about its shortcomings.

...not rely on Autopilot, as in this incident, because after all, it's still beta.
But threatening is not cool!

Sentence #1 is politically correct B.S.
Sentence #2 is paranoid B.S.

Cheers!
 
man, that other guy was flying.

Lane markings look decent to me. Can't get any clearer than a retaining wall to the right.

Which other guy? The first car? That one looked like he might have been doing 5 mph more than the OP.

The second car looks like he punched it to get past the obviously drunk-driving Tesla as quickly as possible. I always want to get past bad drivers as quickly as possible too.
 
...Sentence #1 is politically correct B.S...

I consider "beta" a scientific fact because it's reproducible.

Beta Autopilot has proven that it can kill its drivers, not just once but multiple times; it's reproducible, with many collisions as well, so there is nothing political about it.

Drivers may expect that Autopilot should not steer wrongly, but it has well proven that it does: right into trucks, gore points, concrete median dividers, and now in this thread...

However, because I know Autopilot can kill, I constantly monitor its torque feedback with my own counter-torque from my hands, and it has made my driving much safer than before I had Autopilot.
 
The road is turning left.
If AP/AS were disengaged, the car would have steered straight, and drifted right (relative to the lane markings). Instead, it drifted left.
Not necessarily. Weight of a hand on the wheel could pull it one way. We also don't know about the alignment in this car.

Doesn't look like the case here, but crown in the road or grooves could also pull the car one way or another. I can take my hands off the wheel in my car without AP and sometimes it will pull one way or another.

It just seems to drift right after what I believe is hitting a bump, as I can see the horizon bounce a bit.

In any case, I use AP all the time and I keep my hands on the wheel ready for anything. I would not trust it with my life.
 
Like @brkaus, I would not trust AP with my life, but I use it all the time as an assist (and sometimes I assist it). The one feature I have set, which would probably have avoided this particular situation (though I too am skeptical as to whether auto-steer was actually active in this case), is "Require lane change confirmation?" set to ON. Then you don't have to worry about "Autopilot just decided to change lanes (without indicating)".
 
That doesn't look like an Autopilot lane change. Plus, Autopilot cannot change lanes without indicating.

I'm sure you were upset about this. But getting into a heated argument with Tesla employees, to the point that they ask you to stop harassing them, doesn't look good. I'm inclined to give Tesla the benefit of the doubt. There are specific reasons why managers say "contact legal": usually it's a threat of legal action. You threaten to sue me, and I just say, "Go talk to legal. This conversation is over."
 
...That doesn't look like an Autopilot lane change... I'm inclined to give Tesla the benefit of the doubt...
Agreed. There seems to be more to the story of how it got to this point.
 
To the OP: instead of contacting the service center (who wouldn't have any influence over AP decision issues), you should use the form on the tesla.com website to submit a bug. Based on what you're describing (moving to a different lane without a turn signal, and also the red hands warning), it seems like AP was thrown off by that huge bump in the highway. The lane markings were fine, but the car might log the event as "I couldn't see the lines properly," and that's what the service tech saw when they pulled the logs.

That way your issue will get escalated to the AP dept at headquarters and they can use your incident as training data for the NN.

I had an experience a year ago where I was in bumper-to-bumper traffic, and the truck in front of me was invisible to the car, so my car kept trying to accelerate into it. I recorded a video of it. Tesla followed up asking me for the approximate time this was happening and also asked me to send them the video. Here's a screenshot of that vid:

[Screenshot: the truck ahead, blending into the sky, tree line, and road]


As it turns out, the coloring of the truck just happened to match the background sky, tree line and road, so when I got to the right distance from that truck, it became invisible to the cameras. I guess the cameras trumped the radar, but they told me this was a very interesting situation and would use it to make AP better.
 
...instead of contacting the service center (who wouldn't have any influence over AP decision issues), you should use the form on the tesla.com website to submit a bug...

...the coloring of the truck just happened to match the background sky, tree line and road... it became invisible to the cameras. I guess the cameras trumped the radar...

Pretty sad if the Service Center can't be your interface for reporting bugs. That doesn't show a very good commitment by the organization to improving AP. It doesn't matter where an owner interfaces with Tesla; they should be ready to help with these issues, or at least take the info and relay it. Just another example of bad leadership at Tesla.

And your case shows how poorly the code is written: it wasn't using the cameras to look at the things we look at, like brake lights, and it wasn't trusting its own radar.
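
To make that criticism concrete, here is a toy sketch of the kind of camera-vs-radar arbitration being described. This is purely illustrative and assumes nothing about Tesla's actual software; every name, confidence value, and threshold below is made up:

```python
# Toy sketch of camera-vs-radar arbitration. NOT Tesla's code:
# every name, confidence, and threshold here is invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    distance_m: float   # range to the object ahead
    confidence: float   # detector confidence, 0..1

def fused_lead_distance(camera: Optional[Detection],
                        radar: Optional[Detection],
                        vision_veto: bool) -> Optional[float]:
    """Distance to the lead vehicle, or None meaning 'road clear'."""
    if vision_veto:
        # Camera-first policy: no visual target means no target at all,
        # even if radar reports a return. This is the failure mode described above.
        if camera is None or camera.confidence < 0.5:
            return None
        return camera.distance_m
    # Radar-respecting policy: trust whichever sensor sees something.
    hits = [d for d in (camera, radar) if d is not None and d.confidence >= 0.5]
    return min((d.distance_m for d in hits), default=None)

# The truck blends into the sky and tree line: camera loses it, radar does not.
camera, radar = None, Detection(distance_m=8.0, confidence=0.9)
print(fused_lead_distance(camera, radar, vision_veto=True))   # None: car creeps forward
print(fused_lead_distance(camera, radar, vision_veto=False))  # 8.0: car holds its distance
```

If the shipped behavior is anything like the first policy, a camouflaged truck vanishes from the planner even while radar still reports a return, which matches what was described above.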