Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot nearly allows side collision on highway


On the drive home after work, I had AP engaged in my Model 3 (EAP on HW2.5). After confirming a lane change to the right to get around a slower semi, I passed it, while the other traffic passing on the semi's left maintained about the same relative position to me.

The tan Honda which passed me in the first 10 seconds of the clip is mostly out of view, off my front left corner. It begins changing lanes to the right, from two lanes over to one lane over, but then apparently doesn't see me and begins entering my lane. When the Honda first began changing lanes, I was poised to take over, but I was curious to see how conservatively the autopilot would act, and whether it would give any breathing room to the car veering towards me.

Much to my surprise, it did absolutely nothing, even as the car entered my lane. At that point I immediately swerved to avoid what would otherwise have been a sideswipe collision at highway speed, and laid on the horn for a few solid seconds after making sure I had control of the vehicle. The Honda slowly retreated back to its lane, and then, after a few seconds, completed its originally intended lane change in front of me, with plenty of space this time.

I took away a couple things from this:
  • 1.) Autopilot is far from perfect. If this is any indication of Tesla's Full Self Driving, they have a lot of work to do. You really, really do have to be prepared to take over in an instant at any time.
  • 2.) The front-facing side cameras should be viewable in dashcam downloads. The Honda spends way too much time out of view of the 3 cameras I can access, even though it was plainly visible to me. As far as Sentry Mode goes, if an entire car can hide in the 3-cam blind spot, a person easily can too. If the other cameras are accessible or viewable, it's not apparent where that footage can be obtained.
 

Attachments

  • Untitled.png
From what I've found in this forum, this is how the AP side collision avoidance is meant to work:

"In addition to the warnings previously described, Lane Assist may provide steering interventions if Model 3 drifts into (or close to) an adjacent lane in which an object, such as a vehicle, is detected. In these situations, Model 3 automatically steers to a safer position in its driving lane. This steering is applied only when Model 3 is traveling between 30 and 85 mph (48 and 140 km/h) on major roadways with clearly visible lane markings. When Lane Assist applies a steering intervention, the touchscreen briefly displays a warning message"
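The preconditions in that quoted manual text can be sketched as a simple gate. This is a hypothetical illustration of the documented conditions only; the function and parameter names are made up and do not reflect Tesla's actual implementation.

```python
# Illustrative sketch of the Lane Assist steering-intervention preconditions
# quoted above. Names and structure are hypothetical, not Tesla's code.

MPH_MIN, MPH_MAX = 30, 85  # documented speed window for interventions

def may_intervene(speed_mph, object_in_adjacent_lane,
                  lane_markings_visible, on_major_roadway):
    """Return True only if every documented precondition holds."""
    return (
        MPH_MIN <= speed_mph <= MPH_MAX
        and object_in_adjacent_lane
        and lane_markings_visible
        and on_major_roadway
    )

print(may_intervene(65, True, True, True))   # all conditions met
print(may_intervene(90, True, True, True))   # above the 85 mph window
```

The point of the gate is that an intervention can legitimately fail to fire if any single condition is unmet, which is why the shoulder question below matters.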

How close was the shoulder on the right? Maybe the Model 3 thought it had no additional room to move over?

As a side note, I've heard a few people say that AP near-misses don't bode well for FSD. I think FSD will consist of completely different neural networks and much better processing, so I'm not sure if we can really say that until we see FSD in the wild.

Glad everything turned out alright for you and your car!
 
1.) Autopilot is far from perfect. If this is any indication of Tesla's Full Self Driving, they have a lot of work to do.

FSD code and logic is unrelated to AP. It's a completely different product. Why would you think one has any indication on the other?


You really, really do have to be prepared to take over in an instant at any time.

That's literally what the car tells you when you enable AP.
 


A couple of comments... if the videos are lined up in time correctly, it looks like you swerved before or just as the other car touched the lane line. Just because you think the Tesla should have acted sooner doesn't mean that's how it was designed to act.

If you take over at any point, you CANNOT definitively say that there is any kind of deficiency in the Tesla's programming. I'm sorry but it's true.

At that point the Tesla had no radar coverage on that object and had to rely on vision and ultrasonics. I think the threshold for side-impact intervention is probably a lot closer than what you, as an attentive human, feel comfortable with.

Edit: The car has 8 external cameras. You can't project a "3-camera blind spot" onto what your car is actually processing.
 
With your reasoning, you would have said the same thing if the guy who got killed had swerved 10 yards before going under the semi.
 

To an extent yes. You have to look at the specific collision avoidance/reduction features individually though.

As far as AEB goes, yes, you wouldn't be able to say that there was a failure there, UNLESS you can prove that all the factors the car looks at for AEB initiation were present at 10 yards. But we don't know the programming, right?

As far as TACC goes, well we already know that TACC isn't explicitly designed to recognize or react to stopped objects.

I'm just trying to say you can't say there was a failure when you disengage the system, unless you can prove that the system SHOULD have reacted (based on its programming as designed) before you did.

Yes, you can say that's a "technicality," but it is an important technicality.
 


And since the guy who got killed did go under the semi, there clearly was some kind of deficiency that needs to be corrected for. Was there a technical failure with the Tesla? We don't know; there could have been. But we don't know the specifics of what the Tesla saw, or how it viewed the world, in those 10 seconds after activation of AP.
 
From what I've found in this forum, this is how the AP side collision avoidance is meant to work:

"In addition to the warnings previously described, Lane Assist may provide steering interventions if Model 3 drifts into (or close to) an adjacent lane in which an object, such as a vehicle, is detected. In these situations, Model 3 automatically steers to a safer position in its driving lane. This steering is applied only when Model 3 is traveling between 30 and 85 mph (48 and 140 km/h) on major roadways with clearly visible lane markings. When Lane Assist applies a steering intervention, the touchscreen briefly displays a warning message"

How close was the shoulder on the right? Maybe the Model 3 thought it had no additional room to move over?

It seems like I ticked all the boxes for that to work. It was a beautiful, clear evening, with a well-lit road and not even any direct sun glare, as it was late in the evening. I was travelling on a very well-striped highway at reasonable highway speeds (definitely not close to or over 85 mph). It's pretty much the perfect testing environment, with pretty limited variables.

There was plenty of room to the right, which in the video you can see I ended up using. The shoulder (on both sides) of that stretch of highway is effectively an additional lane, probably to help emergency vehicles in heavy traffic.

In a couple of instances on different roads, I've noticed AP seems to treat solid white lines as concrete barriers, and will absolutely not cross them. There's one particular intersection I turn right at most mornings that has a nice long approach where the right turn lane starts with a solid white line, and even though other cars switch to that lane, AP doesn't draw an extra lane on the screen or allow my car to switch to it. I'll turn on the blinker with AP activated, and it just sticks to the lane I'm in, regardless of the amount (or absence) of other traffic. So, knowing how it treats solid lines, it seems there's a priority to respect them over an encroaching vehicle, which is obviously not desirable, at least in my experience.
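The solid-line behavior described above amounts to a priority rule: the lane boundary wins over the encroaching vehicle. A minimal sketch of that observed behavior, assuming a simple two-input policy (hypothetical names, not Tesla's logic):

```python
# Hypothetical sketch of the lane-keeping priority the poster observed:
# a solid boundary line is treated as uncrossable, so the car holds its
# lane even when a vehicle encroaches from the other side.

def choose_lateral_action(encroaching_left, right_boundary_solid):
    """Return the lateral action for a car being encroached on from the left."""
    if encroaching_left and not right_boundary_solid:
        # Room to the right and a crossable (dashed) line: shift away.
        return "shift_right"
    # Either no threat, or the solid line takes priority over evasion.
    return "hold_lane"

print(choose_lateral_action(True, True))   # solid line wins: hold_lane
print(choose_lateral_action(True, False))  # dashed line: shift_right
```

If this is roughly how the priority works, the undesirable case is exactly the first call: a threat exists but the solid line suppresses the evasive shift.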
 
As far as TACC goes, well we already know that TACC isn't explicitly designed to recognize or react to stopped objects.

While I agree with most of your statement, I feel TACC has gotten much better. Originally, my AP1 would have gladly slammed into anything stopped at an intersection. Sometime in 2018 the problem was fixed to my satisfaction. May not be perfect but I trust it now.

And to clarify, I don’t mean vehicles in front of me that slowed to a stop, I mean vehicles that were at a dead stop when I came over a hill or around a curve or when the vehicle in front of me suddenly swerved to another lane to avoid a stopped vehicle.
 
FSD code and logic is unrelated to AP. It's a completely different product. Why would you think one has any indication on the other?

Because the programming is all written by the same company for one product, and as we're learning from Boeing, safety is not supposed to be an add-on option. My car frequently alerts me when I'm driving with no automation if it thinks I'm not slowing down fast enough to avoid the car in front of me (my foot is already on the brake in many instances). This suggests it's projecting several seconds forward what may happen, and is at least alerting to a potential crash. There was absolutely nothing in this instance: no alert, no intervention, nothing. I'm not expecting it to navigate streets; I'm expecting it to keep me safe in what's supposed to be the easiest, most predictable driving scenario.
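The "projecting several seconds forward" idea is commonly implemented as a time-to-collision (TTC) check: gap divided by closing speed, compared against a warning threshold. A minimal sketch, assuming made-up numbers and a hypothetical 2.5 s threshold (not Tesla's actual parameters):

```python
# Hypothetical time-to-collision (TTC) sketch of a forward collision
# warning. The threshold and function names are illustrative only.

def time_to_collision(gap_m, closing_speed_mps):
    """Seconds until impact at the current closing speed; inf if opening."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the lead car
    return gap_m / closing_speed_mps

def forward_collision_warning(gap_m, closing_speed_mps, threshold_s=2.5):
    """Warn when projected impact is sooner than the threshold."""
    return time_to_collision(gap_m, closing_speed_mps) < threshold_s

print(forward_collision_warning(30.0, 15.0))  # 2.0 s TTC: warn
print(forward_collision_warning(60.0, 10.0))  # 6.0 s TTC: no warning
```

A purely longitudinal check like this would say nothing about a car drifting in laterally, which may be one reason a forward warning fires often while a side encroachment produces nothing.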



That's literally what the car tells you when you enable AP.
Yes, and too many people treat that as a suggestion and put too much trust in it. This is a reality check that should hopefully warn others not to let their guard down as much as they may be doing. I hope that my reactive experience helps others be more proactive. Besides all of that, what's the point if I can't relax a bit while AP is engaged? What part of the driving process am I delegating if I'm still fully engaged, and now not only making driving decisions but also evaluating whether or not AP is making the correct decisions on top of that?
 
It amazes me that people think they are so tapped out mentally by their normal driving that AP makes them work more. Take an experienced driver and a new 16-year-old driver. Do you really think that you, as an experienced driver, are using just as much brain power to process your environment as that 16-year-old? How about between you as an experienced driver and a NASCAR driver, an Indy car driver, or someone driving a military vehicle in a convoy in a war zone?

My car frequently alerts me when I'm driving with no automation if it thinks I'm not slowing down fast enough to avoid the car in front of me (my foot is already on the brake in many instances)

Then either your settings for the alert are too sensitive, or you "frequently" wait too long to meaningfully brake, or you are just following too closely in general.
 
While I agree with most of your statement, I feel TACC has gotten much better. Originally, my AP1 would have gladly slammed into anything stopped at an intersection. Sometime in 2018 the problem was fixed to my satisfaction. May not be perfect but I trust it now.

And to clarify, I don’t mean vehicles in front of me that slowed to a stop, I mean vehicles that were at a dead stop when I came over a hill or around a curve or when the vehicle in front of me suddenly swerved to another lane to avoid a stopped vehicle.
Sheesh!! Glad I just now got my M3 then! I find TACC to be super helpful and I use it all the time. I DO trust it to come to a stop, but there are certainly moments where it makes me nervous, and those typically involve coming over a crest or around a corner with stopped traffic on the other side. It's never failed to stop, but sometimes it takes big brass balls to wait for it to catch up.

I feel like it's relying on the radar more than the cameras, because the cameras can easily see the cars in front, and they show on the display, but then it reacts suddenly and violently (ok... that may be too strongly worded) a few seconds later, as if it's surprised that something is stopped ahead and it needs to slow down.
 
And since the guy who got killed did go under the semi, there clearly was some kind of deficiency that needs to be corrected for. Was there a technical failure with the Tesla? We don't know; there could have been. But we don't know the specifics of what the Tesla saw, or how it viewed the world, in those 10 seconds after activation of AP.

Please read your manual. AP is in Beta, and AP has nothing to do with FSD. NOTHING. You're driving the car.

AP is assistive only, to take the pressure off, maybe. Please read your manual. FSD is not approved on any road or car on this whole planet. What you're saying has everything to do with FSD.

There are more than likely no deficiencies. The only one I see is people not understanding that these are two different systems, and perhaps nobody reading and comprehending the manual. Elon Musk is not helping either, with his hands not on the wheel.

This message is not only intended for you.
 
FSD code and logic is unrelated to AP. It's a completely different product. Why would you think one has any indication on the other?
“Well, we already have Full Self-Driving capability on highways. So from highway on-ramp to highway exiting, including passing cars and going from one highway interchange to another, Full Self-Driving capability is there. In a few weeks, we’ll be pushing an update that will allow the option of removing stalk confirm (for Navigate on Autopilot) in markets where regulators will approve it, which would be the case in the US, for example."
- Elon Musk
 

That statement doesn't make much sense in light of HW3. What's the purpose of the new hardware if the software is just incremental improvements to Navigate on Autopilot?

Instead of interpreting that statement as "FSD is already deployed on highways in the form of NOA," it makes more sense to interpret it as "We've already got the concept of highway FSD down with NOA." We know Karpathy has been working on new neural networks just for the FSD computer, so the behavior between NOA and FSD will presumably be different.
 