
What are the guidelines for using Autopilot on roads with cross traffic?

I'm just saying, it's working EXACTLY as they said it would, nothing more. So until/unless they change the stated expectation, we really shouldn't be upset if it doesn't exceed that stated expectation.

I totally agree. I wasn't up in arms about your post, but if someone (not you) says that, as a user group, we should not even be activating AP on non-limited-access roads, where it isn't intended to be used, I will disagree.

Yes, there is an intention that it will work best in that narrow domain, but for me, it works pretty well everywhere. I am fully aware of the limitations and expect it to do something funky at any time. But I still marvel at how well it does in many situations that I think it should fail and I celebrate the fact that it has gotten way better in the last six months.
 
Ah, gotcha. Yes... I often do the "let's see what happens here" approach :D
 
This scenario is quite different than your examples.

This is something that you have to actively engage AND the car has to allow you to engage it.


Err... that's exactly the same as several of my examples.

Pushing the accelerator to override TACC braking, for example. You have to actively engage the pedal to do it- and the car has to allow that input to override its own sensors saying you're about to hit something.

Exceeding the posted speed limit (which on most roads the car not only knows- but uses as the default set-speed for cruise control) requires the human actively engaging the accelerator to do it- and the car allowing you to do so since it knows the speed limit in most places.

Driving the wrong way down a 1-way street requires you actively turning the wrong way down such a street- and requires the car to allow you to do it (since in most cases the car KNOWS you're doing that based on map data).


All of those, just like engaging AP in places it's not intended to be used, require you to actively engage in the behavior AND require the car to allow you to engage in that behavior.


It does let you do them though because it (and systems on cars in general) assume the human knows better than it does- and so human input is ultimately what rules.



By allowing us to engage autopilot in these situations one could reasonably assume that Tesla intends for you to do this.



Only if, by the same "logic", you assume that Tesla "intends" for you to (among many other examples) speed, drive without a seatbelt, drive the wrong way down a one-way street, and be able to crash into other cars if you choose to do so.

Because those are also all things you can actively choose to do that the car "allows" you to do even though its automatic systems will tell you doing so is a bad idea.
 
Unless you are working for Tesla as a programmer, we do not know.

We absolutely know.

Because some hackers have root access to the driving computer and have posted exactly what it is "seeing" and doing.

Drive on a 2-lane road (one lane each way) and the computer sees BOTH lanes as "driveable".

Because it assumes all traffic is always going the same direction.

Thus the idea of "cross traffic" makes no sense to the system.

In a 1-way world there's never an oncoming car turning ahead of you across your lane.

Such a car MUST be another car going your way that spun sideways and is now either stuck blocking your lane (BRAKE!) or in the middle of a spinout across your lane right now (BRAKE!).

Either way- braking is the correct decision based on the fact the system assumes everyone is going the same direction inherently.
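
To make that concrete, here's a toy sketch in Python of the decision rule being described. This is NOT Tesla's actual code- the names and thresholds are made up purely to illustrate the logic:

```python
from dataclasses import dataclass

@dataclass
class DetectedVehicle:
    heading_offset_deg: float  # heading relative to our direction of travel
    in_drivable_lane: bool     # inside a lane the system marked "driveable"

def should_brake(v: DetectedVehicle) -> bool:
    """In a model where ALL lanes flow our way, a vehicle sitting roughly
    perpendicular across a driveable lane can't be cross traffic (the model
    has no such concept)- it must be stalled or spinning out, so: brake."""
    is_sideways = 60.0 <= abs(v.heading_offset_deg) <= 120.0
    return v.in_drivable_lane and is_sideways

# A car legally crossing the intersection ahead looks identical to a
# spun-out car under this assumption, so the "correct" output is to brake:
print(should_brake(DetectedVehicle(heading_offset_deg=90.0, in_drivable_lane=True)))  # True
```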


So getting mad at the system for working correctly is user error- not software error.
 
Again I disagree.

If they TRULY didn't want us to use autopilot on city streets they could easily not allow us to do so. You know as well as I do that they want us to do this so that they can help teach the NN how to handle all sorts of situations.
 
Again I disagree.

If they TRULY didn't want us to use autopilot on city streets they could easily not allow us to do so.

If they didn't want us to be able to override TACC braking and slam into another car they could easily not allow us to.

If they didn't want us to exceed the known speed limit, they could easily not allow us to do so.

If they didn't want us to drive the wrong way down a known 1-way street they could easily not allow us to do so.

If they didn't want us to drive without a seatbelt on they could easily not allow us to do so.

And on and on.

They don't prevent any of it. But they easily could.

So according to you Tesla "wants" us to do those things.

That's nonsensical of course but you seem really committed to that line of thinking for some reason.


You know as well as I do that they want us to do this so that they can help teach the NN how to handle all sorts of situations.

That's not how any of that works though.

They can collect all the AP data about what the sensors see (and think they see) without AP being engaged (and do so in fact).

Here's Green (a well-known Tesla hacker who has direct access to what the AP computer is seeing) on Twitter-

greentheonly said:
Tesla cars don't learn from mistakes and interventions. Do not try to engage the system where it does not work and hope it'll generate some useful data - it won't.
 
That seems to contradict what Tesla has said themselves but who knows.

Either way - I'm going to keep using Autopilot on City Streets and I'm still going to grumble when it makes mistakes about which lane to stay in. You haven't convinced me otherwise and it's highly unlikely that you will. I appreciate your points and you do make some good ones but they mostly equate to semantics and technicalities as far as I'm concerned.
 
That seems to contradict what Tesla has said themselves but who knows.

Yes- Tesla has not been entirely transparent on how "shadow" mode works.

Thankfully Green has, based on actually seeing what the computer is doing and what data it's sending where. So once again we do actually know how this works. And it's not how you seem to think.

green on Twitter

Read through his comments where he explains that mostly it's Tesla just collecting passive data- and sometimes sending out things like "Capture pictures of motorcycles" so it can better train the NN to know what a motorcycle is.

That doesn't require turning AP on in places it's not intended to work- it just requires you driving (manually or not) any place there's motorcycles- and you don't even know if your car was picked to be on the lookout for them.
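
For illustration, a capture campaign like that could plausibly be as simple as the following sketch. Every name here is invented- Green's tweets describe the behavior, not this code:

```python
# Hypothetical sketch of a fleet "capture" campaign. Note it runs passively
# on every camera frame, whether or not AP is engaged- which is the point.
def on_frame(detected_objects: set[str],
             campaign_targets: set[str],
             frame_jpeg: bytes,
             upload_queue: list[bytes]) -> None:
    """If anything the vision NN detected matches an active campaign target
    (e.g. "motorcycle"), queue the frame for upload as training data."""
    if detected_objects & campaign_targets:
        upload_queue.append(frame_jpeg)

queue: list[bytes] = []
on_frame({"car", "motorcycle"}, {"motorcycle"}, b"<jpeg bytes>", queue)
print(len(queue))  # 1 -- captured without AP ever being engaged
```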


Either way - I'm going to keep using Autopilot on City Streets and I'm still going to grumble when it makes mistakes about which lane to stay in. You haven't convinced me otherwise

That's weird- because I've explicitly told you, for a fact, that as far as AP is concerned all lanes are valid lanes, because it's inherently assuming ALL lanes are going the same direction.

And we have factual, visual proof of this from the AP computer itself thanks to hackers like Green- which shows that it sees all lanes, even those going "the wrong way", as "driveable" when used on an undivided road where it's not intended to be used.

Indeed he's even posted a video of his car moving into the "wrong way" lane while on AP as a great example of why he suggests owners do not use it where it's not intended to be used- because the car isn't intended to operate there and you can count on it making "mistakes" eventually since it's operating on assumptions that aren't true in that domain.




and it's highly unlikely that you will.

Clearly. No amount of facts seem compelling to you on this.
 
Interesting information from Green. You put a lot of faith in a hacker, that's for certain! He may be right. I'm sure I'm wrong about everything. But I'm going to keep doing what I'm doing anyway. I'm just weird like that I guess.
 
Here's Green (a well-known Tesla hacker who has direct access to what the AP computer is seeing) on Twitter-
Tesla cars don't learn from mistakes and interventions. Do not try to engage the system where it does not work and hope it'll generate some useful data - it won't.
He also found Autopilot trip logs that he noted were quite boring and don't contain images, but this data can be quite useful in aggregate. As noted in his tweets below, this data is always sent to Tesla over the cell connection even if data collection is turned off, so Tesla probably thinks this data is quite valuable too.

green on Twitter

This telemetry data includes when and where Autopilot is available or active as well as locations of where and how the driver disengaged Autopilot. This means across many users using Autopilot on city streets, Tesla can figure out which intersections are problematic and if they're improving or not.
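
As a rough sketch of how that aggregation could work (this is a guess at the idea, not Tesla's actual pipeline- the field names are inferred from Green's description of the log contents):

```python
from collections import Counter

def problem_spots(disengagements: list[dict], min_count: int = 5):
    """Bucket disengagement events onto a coarse lat/lon grid (~100 m cells)
    and return the cells with the most driver takeovers- i.e. candidate
    problem intersections worth a closer look."""
    grid = Counter()
    for event in disengagements:
        cell = (round(event["lat"], 3), round(event["lon"], 3))
        grid[cell] += 1
    return [(cell, n) for cell, n in grid.most_common() if n >= min_count]

events = [{"lat": 37.427, "lon": -122.170, "how": "steering_override"}] * 6
print(problem_spots(events))  # [((37.427, -122.17), 6)]
```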

Tesla knows that owners use Autopilot on city streets, and they actively leverage the data from explicitly not "intended" usage. Quite a few images and videos shown on Autonomy day of fleet data collection were on city streets, so Tesla is collecting rich data in addition to telemetry data too.
 
I read through the CBS news article that talked about the three recent crashes. I post it here since it does mention the push by Jason Levine from the Center for Auto Safety who has "called on the agency to require Tesla to limit the use of Autopilot to mainly four-lane divided highways without cross traffic."

I get that this article is supposed to generate news traffic, talking about 13 crashes since 2016 where Autopilot was supposedly engaged.

But who is collecting the data on the number of crashes prevented and the number of lives potentially saved BECAUSE of Autopilot? I have to guess that it is far more than 13.

There will be those who put too much reliance on Autopilot as the article states, but I see people every day who choose to text, put on makeup or read the newspaper while driving who rely only on themselves. That's when I get out of their way.
 
Interesting information from Green. You put a lot of faith in a hacker that's for certain! He may be right. I'm sure I'm wrong about everything. But I'm going to keep doing what I'm doing anyway. I'm just weird like that I guess.

If it were just him claiming "I think it works like this" that'd be one thing.


But he has repeatedly, for years, not only revealed info on many topics that was confirmed accurate over and over elsewhere, he has also posted direct captures of the computers' output and shown actual video of the overlays, labeling, and other elements of what the AP systems are doing.



This telemetry data includes when and where Autopilot is available or active as well as locations of where and how the driver disengaged Autopilot. This means across many users using Autopilot on city streets, Tesla can figure out which intersections are problematic and if they're improving or not.

All the report contains, as your link tells us, is coordinates, how you disengaged, speed, heading, and time.

No video or pictures are included.



Quite a few images and videos shown on Autonomy day of fleet data collection were on city streets, so Tesla is collecting rich data in addition to telemetry data too.

But none of that collection requires AP to be "on" ever.

It collects that data passively regardless of AP being active or not as Green describes.

And indeed when it DOES log disengagement, such video/images are not part of the data sent. Just where, when, and how you disengaged the system.
 
40.50.7 still unnecessarily brakes for cross traffic, which makes cruise control unsafe around town. If they can't figure out how to keep this from happening, they need to let me select a dumb cruise setting so I can use cruise control just like the one in the 25-year-old truck I had in high school.
 
40.50.7 still unnecessarily brakes for cross traffic

No, it doesn't.

It necessarily brakes for what the system has to assume is another driver either stalled across your lane sideways, or currently spinning out across your lane sideways.

The fundamental assumption of the system is that everyone is driving in the same direction and there is no cross traffic.

Your inability to accept this fact doesn't make what it's doing incorrect- it makes your assumptions about the system incorrect.
 
No, it doesn't.

If you read Diezel_Dave's statement, what he says is not wrong from the perspective of someone who is expecting the car to operate as other cars already do. My Jeep and my wife's Subaru with adaptive cruise do not brake for cross traffic.

40.50.7 still unnecessarily brakes for cross traffic

His statement that the software does brake and his opinion (and mine) that it is unnecessary are both true. You can say that the software as written is operating correctly according to its code, but you cannot deny a) it does brake and b) it is unnecessary in the real world. It doesn't mean that there is anything wrong with the software.

which makes cruise control unsafe around town.

This is also his opinion - I happen to expect that braking action so I will react to it before my wife gets too upset with me. I will let it brake sometimes to gauge whether this is being improved in the software.

But we can state the obvious and it shouldn't cause a response that we are wrong. I can have my opinion even if we don't agree.
 
If you read Diezel_Dave's statement, what he says is not wrong from the perspective of someone who is expecting the car to operate as other cars already do.


That's an unreasonable expectation.

Different cars operate differently.

Especially driver aid systems- where if you read some of the cross-reviews of various makers systems against each other you see a wide range of different behaviors and abilities.

If you're going to own such a car it is incumbent on you to learn how your car's system works, not operate assumptively based on how you IMAGINE it PROBABLY works from how some other maker's system worked.




His statement that the software does brake and his opinion (and mine) that it is unnecessary are both true.

They're not though. Because it's absolutely necessary and correct for the domain the system is intended to operate in.



You can say that the software as written is operating correctly according to its code, but you cannot deny a) it does brake and b) it is unnecessary in the real world.

Again- I can. It's necessary because, based on the design of the system, if it sees a car sideways across its path, braking is absolutely necessary.

In a domain where all cars are going the same direction and there's no cross traffic, seeing such a thing requires braking because it means something up ahead is going or has gone terribly wrong, and slowing down is what you ought to be doing.


But we can state the obvious and it shouldn't cause a response that we are wrong. I can have my opinion even if we don't agree.


You can have an opinion. But this is not an opinion.

It's a fact the AP system is designed and intended to be used where all traffic is going the same direction and there's no cross traffic.

So it's a fact that "something is sideways ahead in my lane" means "brake" is the correct response to that input.


The fact you wish to use the system in places where the fundamental assumptions of the system are not true doesn't change these facts.
 
That's an unreasonable expectation.

You need to re-read my post more slowly. Maybe we just have divergent interpretations of the English language. If I buy something that operates correctly as expected, but still does something that I judge is unnecessary, I can still call it unnecessary. You can call it whatever you want, but don't tell me how I see it.

If you're going to own such a car it is incumbent on you to learn how your car's system works, not operate assumptively based on how you IMAGINE it PROBABLY works from how some other maker's system worked.

I don't get why you are saying that - no one has ever said we don't know how the car is operating - we are just saying it is doing something that is unnecessary. We are not imagining anything other than making our own subjective determination of how we would like it to work.

Because it's absolutely necessary and correct for the domain the system is intended to operate in.

Being "necessary" and operating as programmed are too different things. Are you a programmer? I am and I can certainly program something that is unnecessary to an objective. Maybe it is due to limitations I have as a programmer or because of system components. But it can both be unnecessary and operate as I intended. If you disagree, please look up the word necessary.

So it's a fact that "something is sideways ahead in my lane" means "brake" is the correct response to that input.

I agree it is a correct response as programmed, but it can still be unnecessary and the fact is, it still braked. So the statement made is still true on both accounts.
 
You need to re-read my post more slowly. Maybe we just have divergent interpretations of the English language. If I buy something that operates correctly as expected, but still does something that I judge is unnecessary, I can still call it unnecessary. You can call it whatever you want, but don't tell me how I see it.

You can see it however you like, but unnecessary is factually wrong based on the design and intent of the system.




I don't get why you are saying that - no one has ever said we don't know how the car is operating

Actually yes they have. In this very thread if you go back and read some of the posts.

Your own post, for example, where you said

YOU said:
Unless you are working for Tesla as a programmer, we do not know.

Now you're claiming "nobody" said we don't know.

Apparently even you don't agree with you.


Anyway, we DO know how it operates, and thus why such braking is necessary for the operational domain of the system.

Choosing to use it outside that domain, then being surprised it doesn't work how you wish it would for a purpose it's explicitly not intended for, doesn't make anything "unnecessary" except said complaint :)



Being "necessary" and operating as programmed are too different things. Are you a programmer?

Yup, among other engineering work I've done.


I am and I can certainly program something that is unnecessary to an objective.

You can, but this is the opposite of that.

This is you not understanding the objective and why braking in that situation is necessary.

The OBJECTIVE is to safely operate on limited-access divided freeways.

One way they do that is they program on the assumption all lanes are going the same direction. This greatly reduces the complexity of the solution.

As a result, in such a system it is necessary to brake when you see a car sideways across your driving lane because that is the only safe and correct behavior in the situation the system is explicitly intended to operate in.

Failing to brake on such input would be dangerous. In fact- two of the most famous deaths on Autopilot were exactly the case of older AP1 systems failing to brake for cross traffic, for users who chose to engage the system someplace it was explicitly not intended to be used.
 
Now that we are hopefully back on topic and past those silly side-bars, what behavior would you guys like to see in these situations? Personally, I wish the car took the crossing vehicle's speed and trajectory into account and, if it calculated with reasonable certainty that it won't be a collision hazard, simply colored that car yellow on the screen to let the driver know the car sees what is going on, then continued on its normal course exactly like a human would do.
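
Something like this back-of-the-envelope check is what I have in mind- purely illustrative, with constant-speed assumptions and a made-up safety margin:

```python
def crossing_is_hazard(our_speed_mps: float, our_dist_m: float,
                       their_speed_mps: float, their_dist_m: float,
                       margin_s: float = 2.0) -> bool:
    """Compare when each vehicle reaches the conflict point. If the arrival
    times differ by more than margin_s, the crossing car will be clear and
    we can just highlight it on screen instead of braking."""
    if our_speed_mps <= 0 or their_speed_mps <= 0:
        return True  # can't predict- treat as a hazard and brake
    t_us = our_dist_m / our_speed_mps
    t_them = their_dist_m / their_speed_mps
    return abs(t_us - t_them) < margin_s

# Crossing car clears the intersection ~4 s before we arrive: no hazard,
# so color it yellow and carry on.
print(crossing_is_hazard(20.0, 120.0, 15.0, 30.0))  # False (6.0 s vs 2.0 s)
```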
 