
Autopilot appears to turn car against incoming car

In this scenario the last 4 sentences and the "note" in the release notes for Autopilot apply. This is not a highway (which it states is best). It is a road with lots of curves (and hills, though the release notes don't state that specifically). There is no car that it is following, so it is depending solely on road markings. And the driver does not have his hands on the steering wheel. This adds up to a risky situation. Would I have done something similar if I had AP? Maybe. But I would also acknowledge that it was a bad and risky thing to do.

Tesla expects people to use common sense with this technology. It is going to be hard for them to put 10 pages of legal disclaimers in the release notes. Nor should they have to.
 
English is not my first language, but I searched several sources for a definition of "highway". For instance, from Wikipedia:

"In American law, the word "highway" is sometimes used to denote any public way used for travel, whether major highway, freeway, turnpike, street, lane, alley, pathway, dirt track, footpaths, and trails, and navigable waterways;[SUP][4][/SUP] however, in practical and useful meaning, a "highway" is a major and significant, well-constructed road that is capable of carrying reasonably heavy to extremely heavy traffic.[SUP][citation needed][/SUP] Highways generally have a route number designated by the state and federal departments of transportation.[SUP][clarification needed][/SUP]"

Source https://en.wikipedia.org/wiki/Highway#United_States

This is a picture from M-185 (a Michigan highway):

https://upload.wikimedia.org/wikipe...jpg/1280px-M-185_Biking_near_Mile_Marker1.jpg

Another definition of highway:

https://en.wikipedia.org/wiki/Highway_systems_by_country#United_States

In the United States, "highway" is a general term for denoting a public way, including the entire area within the right-of-way, and includes many forms:

  1. a high-speed, limited-access road like expressways, freeways, and large toll highways.
  2. an important road that connects cities and large towns.
  3. any road or street, or a travel way of any kind, including pedestrian ways, trails, and navigable waterways, to which the public has a perpetual right of use
 
What I don't understand is why anyone would think a car would stay in the lane when auto-steer is disengaged and the driver isn't touching the steering wheel. When auto-steer is completely disengaged, as it was after the 10-second mark in this video, the steering is entirely dependent on the human, and irregularities in a road WILL pull the car out of its lane if no one is steering. The only difference at that point between the Tesla and any other car on the road is that the Tesla was at least audibly and visually warning the driver that nothing was driving the car.
 
The sun blocked the camera. It wasn't the collision warning but the "I can't see, take the wheel now" alarm.

No, the only audio alert the car issued was the "collision imminent" fast-beep, which started at the same instant the blue Autosteer icon blinked off and the red "collision imminent" warning and graphic were displayed, at 0:10.

So Autosteer did not turn the car into oncoming traffic: it just chose to give up control at an instant when external forces were acting to turn the car left. To me the worrying thing is not that it couldn't handle the situation, but that it didn't orchestrate the handoff to manual control in a more timely and orderly fashion: Autosteer was clearly struggling (note the steering input toward and then away from the center line at 0:02-0:03) and didn't completely shut down for another six or seven seconds. The only things I can tell for sure from the video are that 1) there was, early on, a visual indication that it was not confident of the lane markings (no gray or blue lane lines displayed) and 2) that there was no audible indication prior to Autosteer disengaging and the "collision imminent" warnings.
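To make concrete what I mean by a more orderly handoff, here is a purely hypothetical sketch (my own invention, not Tesla's logic; the confidence thresholds and state names are made up): warn the driver as soon as lane-tracking confidence starts to degrade, and only disengage after that warning has been given.

```python
# Purely hypothetical sketch of a graduated handoff policy -- not Tesla's code.
# The thresholds and the lane_confidence input are invented for illustration.

WARN_THRESHOLD = 0.6       # start the "hold the wheel" chime here
DISENGAGE_THRESHOLD = 0.3  # below this, hand control back to the driver

def handoff_state(lane_confidence: float) -> str:
    """Map lane-tracking confidence to a driver-facing state."""
    if lane_confidence >= WARN_THRESHOLD:
        return "AUTOSTEER_ACTIVE"   # blue icon, no alert
    if lane_confidence >= DISENGAGE_THRESHOLD:
        return "WARN_DRIVER"        # chime early, while still steering
    return "DISENGAGE"              # give up only after the driver was warned

# In the video the car appears to jump straight from ACTIVE to DISENGAGE;
# an intermediate WARN_DRIVER state is the "orderly handoff" I'm arguing for.
```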

Moderator request: please update the thread title. Autopilot did not steer the car into oncoming traffic: Autopilot just punted at a bad time.
 
Sufficiently good "lane markings" (plural) are required, and this road appeared to have only the center double yellow line, with no well-defined road-edge indication.

The unexpected shadow might have been the cause of the collision warning,
and perhaps that warning turned off the auto-steering?

Using a beta feature in a potentially deadly situation (a head-on high-speed crash, not just a fender scrape) is ... perhaps exciting, but probably unwise.
 
Moderator request: please update the thread title. Autopilot did not steer the car into oncoming traffic: Autopilot just punted at a bad time.

How do you know collision avoidance did not turn the wheel? Maybe AP mistook the tree shadow for an object on the road.

- - - Updated - - -

Using a beta feature in a potentially deadly situation (a head-on high-speed crash, not just a fender scrape) is ... perhaps exciting, but probably unwise.

How would you describe releasing beta software that controls how cars drive to the general public?
 
How do you know collision avoidance did not turn the wheel? Maybe AP mistook the tree shadow for an object on the road.

Matias - asking for an accurate thread title is not unreasonable. We don't know either way (though it seems reasonable that AP did not steer into oncoming traffic). It was definitely an adrenaline-inducing situation, and I'd guess the AP engineering team is looking at the video. But to title the thread to indicate that AP turned the car into oncoming traffic is drawing a conclusion without the facts.

You don't have an issue with a factual thread title, do you?
 
In the video the lane-steering indicator goes from blue to off, so it likely lost tracking well before that point.

The user was just completely ignorant of the warnings the car was giving, and completely ignorant of the previous history just a few seconds before, when it was obvious the car wasn't tracking the lanes.
Yes. It's the user's fault. Clearly. It always is. Because they turned on autosteer.
The problem is, everything comes with dozens of warnings and caveats. And no one reads them. And things work great and users try the new exciting feature and get comfortable. Until something scary happens.
I think this will turn into a PR disaster for Tesla, and we will see AP enforce "only on access-controlled freeways" before we get to 7.1.

I have no intention of upgrading to v7 (the UI is way too craptastic). But in part this decision is easier for me because I have absolutely ZERO intention of using autosteer. Certainly not in the current pre-alpha implementation. Not when it eventually reaches beta. And likely not when it reaches "ready for general population use" in about 2020. I worry how long Tesla will let me reject "upgrades", but I'll cross that bridge when I get there. For now videos like this just confirm what I felt about v7 all along.

- - - Updated - - -

How would you describe releasing beta software that controls how cars drive to the general public?
I call it reckless endangerment. But I'm not a lawyer.
And I'm sure Tesla's lawyers are comfortable that the disclaimer and the lack of precedent will protect them in the inevitable lawsuits once we see the first crashes.
In the meantime I'm thinking of adding a "still on FW6.2" sticker to my car to assure drivers around me that I'm not being driven around by pre-alpha quality software...
 
How do you know collision avoidance did not turn the wheel? Maybe AP mistook the tree shadow for an object on the road.


Matias, I watched the video frame by frame, and noted that the Autosteer icon disappeared before the car turned toward the oncoming traffic: hence my conclusion that it was not an Autosteer-commanded turn. If the car is still actively steering after the UI reports it has disengaged...well, I'd be very surprised if that's the way Tesla programmed it.
 
Matias, I watched the video frame by frame, and noted that the Autosteer icon disappeared before the car turned toward the oncoming traffic: hence my conclusion that it was not an Autosteer-commanded turn. If the car is still actively steering after the UI reports it has disengaged...well, I'd be very surprised if that's the way Tesla programmed it.

I slowed it down to 25% right before auto-steer disengages, if anyone is interested. What's notable when you watch it slower is that it seems clear the cars would not have collided even if he hadn't grabbed the wheel. The other car was already alongside when his hand touched the wheel and well clear by the time the driver turned the steering wheel. Not sure that means much, other than that the Model S in this particular case did not steer "into" the other vehicle in a manner that would have actually caused a collision.

25percent speed - this is not my video - YouTube
 
Well, it is made for highway driving with hands on the wheel, from on-ramp to off-ramp. Quite explicit. The use case in that video was with oncoming traffic, not a highway... I guess the temptation to use it in all kinds of situations is large, but just as with TACC - which mostly works - it's not the best idea; more Darwin Award territory.

If it can only be used on the freeway, then why does Tesla allow its use when not on the freeway? Also, when did Autopilot come with a list of EXCEPTION scenarios where it should not be used?

- - - Updated - - -

In this scenario the last 4 sentences and the "note" in the release notes for Autopilot apply. This is not a highway (which it states is best). It is a road with lots of curves (and hills, though the release notes don't state that specifically). There is no car that it is following, so it is depending solely on road markings. And the driver does not have his hands on the steering wheel. This adds up to a risky situation. Would I have done something similar if I had AP? Maybe. But I would also acknowledge that it was a bad and risky thing to do.

Tesla expects people to use common sense with this technology. It is going to be hard for them to put 10 pages of legal disclaimers in the release notes. Nor should they have to.

Do you honestly believe a jury of 12 will let Tesla off the hook because of their fine print and disclaimers when vehicle fatalities are involved? Do you think it's the driver's job to figure out which definition of "highway" Tesla was using when they wrote the release notes? My gosh.

- - - Updated - - -

Using a beta feature in a potentially deadly situation (a head-on high-speed crash, not just a fender scrape) is ... perhaps exciting, but probably unwise.

What is unwise is releasing this feature as a beta in the first place.
 
I call it reckless endangerment. But I'm not a lawyer.
And I'm sure Tesla's lawyers are comfortable that the disclaimer and the lack of precedent will protect them in the inevitable lawsuits once we see the first crashes.
In the meantime I'm thinking of adding a "still on FW6.2" sticker to my car to assure drivers around me that I'm not being driven around by pre-alpha quality software...

I tend to agree with you. Tesla's lawyers are not the ones on the hook, it's Tesla. The lawyers will find other jobs when Tesla goes bankrupt over a class action lawsuit or a NHTSA recall to disable Autopilot. The rest of us will be left holding $100,000 bags of worthless. Tesla will not survive the sh*tstorm, and I fear it's coming. I love Tesla, but releasing this feature in this state is irresponsible.

- - - Updated - - -

Matias, I watched the video frame by frame, and noted that the Autosteer icon disappeared before the car turned toward the oncoming traffic: hence my conclusion that it was not an Autosteer-commanded turn. If the car is still actively steering after the UI reports it has disengaged...well, I'd be very surprised if that's the way Tesla programmed it.

Isn't Autopilot supposed to pull you safely over to the side if you do not take control after it tells you to? Why did AP simply disengage in this case and not pull over to the side of the road, like it is supposed to?
 
I tend to agree with you. Tesla's lawyers are not the ones on the hook, it's Tesla. The lawyers will find other jobs when Tesla goes bankrupt over a class action lawsuit or a NHTSA recall to disable Autopilot. The rest of us will be left holding $100,000 bags of worthless. Tesla will not survive the sh*tstorm, and I fear it's coming. I love Tesla, but releasing this feature in this state is irresponsible.

- - - Updated - - -



Isn't Autopilot supposed to pull you safely over to the side if you do not take control after it tells you to? Why did AP simply disengage in this case and not pull over to the side of the road, like it is supposed to?

Because before it could stop, it saw something and tried to avoid it?

Oh, and it's worth pointing out that nobody was hurt, because the meatsack did what it was supposed to do and corrected the car.

Feel free not to use it, if you're not comfortable correcting the mistakes it makes.
 
Do you honestly believe a jury of 12 will let Tesla off the hook because of their fine print and disclaimers when vehicle fatalities are involved? Do you think it's the driver's job to figure out which definition of "highway" Tesla was using when they wrote the release notes? My gosh.

---

If you ignore everything else listed in the release notes and play semantics with the word "highway", I guess I can see where you are going... However, this is reality, so you can't ignore the other details in the release notes.
 
If you ignore everything else listed in the release notes and play semantics with the word "highway", I guess I can see where you are going... However, this is reality, so you can't ignore the other details in the release notes.

People are going to use it how they are going to use it. If Tesla truly wanted us using this feature only on the highway, it would geofence the feature appropriately and only allow it to function when the car is actually on the freeway. But that's not the case, is it?
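Just to illustrate: that kind of restriction is technically straightforward to sketch. Purely hypothetical Python below; the road-class lookup and the whole interface are made up, and nothing says Tesla's software is structured anything like this.

```python
# Hypothetical geofence check -- illustration only, not how Tesla actually does it.
# Assumes some map lookup can classify the current road; that lookup is invented.

LIMITED_ACCESS_CLASSES = {"motorway", "freeway", "expressway"}

def autosteer_allowed(road_class: str, divided: bool) -> bool:
    """Allow Autosteer only on divided, limited-access highways."""
    return divided and road_class in LIMITED_ACCESS_CLASSES

print(autosteer_allowed("secondary", divided=False))  # False: the two-lane road in the video
print(autosteer_allowed("freeway", divided=True))     # True: on-ramp to off-ramp use
```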
 
Nice job Zarwin!!! I drove 200 freeway miles yesterday on AP and it performed near flawlessly. Very good job by Tesla.

I slowed it down to 25% right before auto-steer disengages, if anyone is interested. What's notable when you watch it slower is that it seems clear the cars would not have collided even if he hadn't grabbed the wheel. The other car was already alongside when his hand touched the wheel and well clear by the time the driver turned the steering wheel. Not sure that means much, other than that the Model S in this particular case did not steer "into" the other vehicle in a manner that would have actually caused a collision.

25percent speed - this is not my video - YouTube