
Autopilot disengagement, driving etc. out of main.

I wish the car would stay a bit closer to the one in front on the most aggressive setting for dense highway driving.

Yes, but I think the issue then, if you recall the guy doing 80 mph on the PA Turnpike with it set to "1" following distance, is that the car will not have enough time to react if the guy in front of you locks up the brakes.
Personally I think that until the majority of cars have FSD, or communicate with each other, there is no proper way to deal with people who will cut right over onto your bumper. It seems to be the new way of driving, and it's spreading across America. I hate it. The "2 second rule" they taught us in Driver Ed 45 years ago is a joke today.
 

Isn't the following distance proportional to speed?
Radar could detect the car in front braking before the incandescent brake lights even have a chance to illuminate (possibly an overestimate, but the initial release of the accelerator would start the response).
As long as there are human drivers, or communication failures/interference/mechanical failures, V2V is only a way to make implementation easier in some cases.
Yeah, there is fail safe driving, and then there is practical driving...
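To put numbers on the time-gap discussion above, here is a quick back-of-the-envelope sketch in Python (the mapping of the follow setting "1" to roughly a 0.7-second gap is my assumption; Tesla does not publish the actual values):

```python
# Rough stopping-margin arithmetic for highway following distances.
# Assumption: follow setting "1" corresponds to roughly a 0.7 s time
# gap; the real mapping is not published by Tesla.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def gap_distance_m(speed_mph: float, gap_s: float) -> float:
    """Distance to the car ahead for a given time gap."""
    return speed_mph * MPH_TO_MS * gap_s

def reaction_distance_m(speed_mph: float, reaction_s: float = 1.5) -> float:
    """Distance covered before braking even begins (typical human ~1.5 s)."""
    return speed_mph * MPH_TO_MS * reaction_s

speed = 80  # mph, as in the PA Turnpike example
print(f"Gap at assumed setting '1' (~0.7 s): {gap_distance_m(speed, 0.7):.0f} m")
print(f"Distance covered during 1.5 s reaction: {reaction_distance_m(speed):.0f} m")
print(f"Gap with the 2-second rule: {gap_distance_m(speed, 2.0):.0f} m")
```

At 80 mph the car covers more ground during a typical human reaction time (~54 m) than the entire assumed gap (~25 m), which is why the aggressive setting leaves no margin if the car ahead locks up the brakes.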
 

Couldn't agree more.

There seems to be some huge gap in understanding basic physics, as if driving is just operating a video game where all the cartoon cars have zero mass. And, good grief, hanging back a safe distance gets you there at almost the same time, and at a much lower fatality rate.

So you have the fool cutting in front of you at 80mph a foot or two in front of your bumper, and the fool following behind you at a foot or two while texting, so if you or your car brakes at all you have a nice chain reaction. And all of these folks think their driving skills are “better than average”.
 
Even the two-second rule isn't enough unless your reflexes are equivalent to a professional prize fighter's. Three seconds gives you a slim chance. Who cares if others pass you? If you are traveling at a set speed, you will get to your destination at the same time regardless of those who can't stand to have a car in front of them.
 
The interior camera should be able to tell.
How? What does a picture tell you about the attentiveness of a driver? A video doesn't really add anything. I think it's Cadillac that is doing eye monitoring, but no matter what system you devise you have to have a tolerance, a threshold, some arbitrary cutoff. You can throw a neural network at it to avoid specifically designating the threshold, but then you are left with a probability.

There really isn't a good solution.
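To illustrate the "left with a probability" point, here is a hypothetical sketch (the monitor and its thresholds are illustrative, not any manufacturer's actual system): even with a neural network producing an attentiveness score, some cutoff still has to be chosen, and hysteresis is typically added so the alert doesn't flicker near the boundary.

```python
# Sketch of why an attention monitor always needs an arbitrary cutoff.
# The score source and threshold values here are hypothetical.

class AttentionMonitor:
    def __init__(self, alert_below: float = 0.4, clear_above: float = 0.6):
        # Two thresholds (hysteresis) so the alert doesn't flicker
        # when the score hovers near a single cutoff.
        self.alert_below = alert_below
        self.clear_above = clear_above
        self.alerting = False

    def update(self, p_attentive: float) -> bool:
        """Feed one probability-of-attentiveness; return whether to nag."""
        if not self.alerting and p_attentive < self.alert_below:
            self.alerting = True
        elif self.alerting and p_attentive > self.clear_above:
            self.alerting = False
        return self.alerting

m = AttentionMonitor()
for p in [0.9, 0.5, 0.35, 0.5, 0.7]:
    print(p, m.update(p))
```

Note that at p = 0.5 the answer depends entirely on the alert's recent history, which is exactly the arbitrariness being described: the network only moves the cutoff problem, it doesn't remove it.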
 
FYI, my solution, since I figured out that it works, is: keep only one hand on the wheel, at ~9 o'clock. The system just needs to detect an unbalanced torque on the wheel, so anything with weight on one side but not the other will keep it happy indefinitely.
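A minimal sketch of the torque-based detection being described (the threshold and timeout values are guesses for illustration, not Tesla's actual parameters):

```python
# Hand-detection via wheel torque: a steady one-sided torque above a
# small threshold resets the nag timer. All values are illustrative.

NAG_TORQUE_NM = 0.3   # assumed minimum detectable imbalance, N*m
NAG_TIMEOUT_S = 30.0  # assumed time allowed between detections

def needs_nag(torque_samples_nm, dt_s: float) -> bool:
    """Return True if no sufficient torque was seen within the timeout."""
    since_detect = 0.0
    for t in torque_samples_nm:
        if abs(t) >= NAG_TORQUE_NM:
            since_detect = 0.0   # hand weight detected, timer resets
        else:
            since_detect += dt_s
        if since_detect >= NAG_TIMEOUT_S:
            return True
    return False

# A hand resting at 9 o'clock applies a constant small torque, so the
# timer keeps resetting and no nag ever fires:
print(needs_nag([0.5] * 100, dt_s=1.0))   # steady one-sided weight
print(needs_nag([0.0] * 100, dt_s=1.0))   # hands off
```

The point is that the detector only needs some sustained one-sided torque above a small threshold, which the resting weight of one hand provides continuously.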
That doesn't work for me. In fact, it is much more likely to complain when I only have one hand on the wheel.

Don't get me wrong, I'm glad you have a solution that works for you -- but I haven't found anything that reliably keeps the nagger at bay on a long drive -- particularly, but not necessarily, with extended straight stretches. Think Oklahoma and Texas.
 
To answer @KarenRei's original question, I drive with NoA as much as I can. I want to contribute to the learning. On an average commute day I disengage twice per 30 miles. Here are the reasons (from most to least common):

1. There's a place when merging from CA-52 to I-5, where if I take control I can get over to the HOV express lanes, but AP won't even try (yes, I have it selected). This happens on every trip.

2. Once in the express lanes, there is a movable center barrier. Autopilot recognizes that there are two lanes, but sometimes it will want to change into the left lane, try, ping-pong, try again, then cancel. I have to disengage to actually make the lane change, then re-engage. Usually this happens on sunny days; I think the shadow of the movable barrier confuses it. I've reported it.

3. Sometimes too timid.

3.5. If my wife is in the car, she won't let me use it because it leaves more space behind the car in front and this offends her, especially in stop-start traffic.

4. It often wants me in the right lane for the exit miles before the actual exit, so I either cancel its change, initiate a change back, or override and disengage.

5. Twice in my memory (about 2000 miles) it has failed to take an exit.

6. Once near Disneyland southbound, there is a carpool lane that splits, and part goes elevated for a couple of miles before exiting onto (I think) CA-55. This is the only time it has totally freaked out on me: it couldn't decide which side of the split to take. I was wondering what it would do... so it took no time for me to resume control.
 
On a recent trip I had many disengagements and overall the worst NoA performance I've ever had. Too many issues to enumerate; the best-case explanation is that I was on roads/traffic they lacked much training on, but is there really any excuse for capping maximum speed at 60 mph when the posted (and acknowledged/displayed by Tesla) speed limit is 75 mph?

Fortunately there was no one beside me, in front of me, or behind me for hundreds of yards when my car did a nose dive from 80mph to 60mph. More than once.
 
How recent and what software version?
Almost said it; knew I should've. 2019.8.5 -- AFAIK that is the latest version.

Mostly, this was on roads I've never been on, and to the extent it was on previously driven roads it was better -- but that isn't saying much, because that was in December (a few versions back) and I had to turn it off that time. Each successive version has mostly gotten better, sometimes significantly. This one has -- for the driving I've done -- not been an improvement.

The impression (right or wrong) that I've gotten is that it performs better where more Teslas have driven, and this may in part be an issue of uneven distribution of their miles driven, but the reason I mentioned the speed being capped is that I can't think of a single reason for that one. Even if it was just a matter of the roads not being well driven by Teslas, it's still bad. If AP/NoA can't handle interstates across the country, that isn't good.

It's one thing when Tesla thinks the speed limit is 45 mph (or less) when it is 55 mph -- a situation I've encountered frequently off the interstate. Kind of silly when it reaches 55 mph leaving town and then drops to 45 mph when you get farther out, but at least you know why Autosteer is capping the speed. But capping speed at 60 mph when the speed limit is being (correctly) displayed as 75 mph? That boggles.

It's also pretty bad when it can't change lanes to pass because the slow lane (to the right of the current lane) is ending and the display of the lane lines is doing a spirograph gone mad. IIRC it didn't even disengage on its own; I took over.

Or when it firmly tries to keep me from taking the exit correctly indicated by the navigation. Moving into the exit lane triggered an immediate return to the traffic lane. Repeatedly. In the end I disengaged it because, after the lanes split, it still tried to veer left to return to the traffic lane. In order to take the exit that I was already on. SMH.

I know others have said this update has been a significant improvement for them -- so I think it is important to realize that that is not a consistent experience.

For what it's worth, I make extensive use of Autosteer and use NoA anytime I'm on a highway. Well, when it lets me. I was on one that was too hilly and it disengaged cresting every hill due to cruise control; I gave up after the third hill. Another case is that the lane centering often ends up crowding left, and on narrow two-lane highways I don't accept the risk -- it's bad enough with oncoming traffic crossing the line, I don't need to be crowding it. But despite not having quite reached 4,000 miles yet -- around half of that on the interstate -- I still have a good amount of exposure on an individual basis.
 
That's definitely our experience as well. Interventions are probably now 95%+ for overly conservative actions rather than incorrect decisions/actions.

That was true for me until 2019.8.x. Now, it is mostly because the car did something excessively stupid. :(

In particular, since that "upgrade", my car is freaking out at exits a lot more often (wide-lane bugs), suddenly thinking the speed limit is 20+ mph slower than it actually is and slamming on the brakes, hugging walls, crossing the double yellow line, and exhibiting various other behaviors that I had not experienced in a long while.
 
Do you have Use HOV selected in the navigation menu?

Dan

Yes, I do have that enabled. I have to clarify my statement: I do not mean it never uses the HOV lane, but that it rarely uses it and prefers to leave when I would have stayed in the HOV.

To further elaborate on my problems (including the use of the HOV lane), I have documented my problems with NoA on my trip this morning using Tesla dashcam clips:

1. The first clip shows that I am traveling in the left-most main lane of the busy 3-lane highway when the HOV lane becomes available on the left. NoA makes an attempt to switch (about 30 s into the clip), but it is too slow and does not accelerate; a car comes up (the HOV lane is moving significantly faster), so it cancels the attempt. I see it give up on the lane change, so I take over (at about 37 s) and make the change manually at the next opportunity.
Front camera clip.
Left camera of same.

2. A couple of minutes later a highway interchange is coming up, for which I need to leave the HOV lane. NoA indicates that it wants to switch, but shows a red line as another car travels parallel to me in the next lane. There is space both in front of and behind that car, so NoA would only need to accelerate or decelerate a bit to make the switch, but it fails and keeps driving side by side. When the dashed section between the HOV and main lanes is about to end, I take over and quickly make the switch manually. NoA would have stayed in the HOV lane and missed the upcoming highway interchange.
Front camera footage.

3. The next mistake is still visible in the same clip above (at around 20 seconds in). I have turned NoA back on and am now traveling in the leftmost of the 3 lanes, but for the highway interchange I need to be at least in the middle lane (which splits). I am watching NoA and see no indication that it plans to make the switch, so I take over and switch manually. This mistake would have sent me further down the wrong highway, causing a significant detour to get back to my destination.

4. The fourth mistake of NoA is still in the above clip (at around 39 seconds in). After the highway split, a new HOV lane starts on the left, but NoA gives no indication that it would make the switch; it wants to stay in the busy main lane while the HOV lane is sparse, so I take over and manually change to the HOV lane.

5. The next time there is a dashed line between the HOV and main lanes, NoA indicates that it wants to switch to the main lanes, leaving the empty HOV lane for the busy main lanes, as shown in the clip here. I manually cancelled this lane change to stay in the HOV lane.

6. At the next opportunity, NoA again wants to leave the HOV lane. This time the main lanes are not so busy (but the HOV lane is completely empty), as shown in this clip, and this time I allowed NoA to make the change (at about 50 s into the video) even though I could have traveled another 10 km in the HOV lane before needing to leave for my destination.

7. Further down, the main lanes got busier again while the HOV lane was clear (which is why I would have preferred to stay in the HOV lane; I know this traffic pattern from experience). The next clip shows NoA deciding to switch into the busier and slower middle lane from the empty left lane at around 38 seconds into the clip. This is still about 6 km before my exit, so it is an unnecessarily early move to the right.

So that is 7 bad decisions by NoA within roughly half an hour. None of them was a safety concern, but two of them (#2 and #3) would have caused me to stay on the wrong highway and miss my interchange.

PS: My software version is 2019.8.5 3aaa23d.
 
My take on all of this is that Tesla will address these issues (directly or indirectly) in a future update. I remember seeing comments about regressions in autopilot before I bought a Tesla so I'm not exactly surprised, just disappointed to have finally experienced one myself.

I still think that AP is a significant driver aid and my driving is no longer limited to my daily (small town) commute, but I am (with AP/NoA) the primary driver for any trip out of town. So I'm just looking forward to the next version and the improvements it will bring.
 
That doesn't work for me. In fact, it is much more likely to complain when I only have one hand on the wheel.

Don't get me wrong, I'm glad you have a solution that works for you -- but I haven't found anything that reliably keeps the nagger at bay on a long drive -- particularly, but not necessarily, with extended straight stretches. Think Oklahoma and Texas.
That really sounds weird to me. Not saying you aren't experiencing it, just saying that it shouldn't be that way. Putting two hands on the wheel often does not do away with the nags, since the load on the wheel is even on both sides. One hand on the wheel, though (assuming you allow the weight of your arm to pull down slightly), should be enough to eliminate any nags. I have driven indefinitely this way with no nags whatsoever. What I had to get over was how weird it felt to actually trust putting weight on the wheel. You intuitively think it is going to cause the car to turn. Takes some getting used to.

If that method is not working for you I would think it is time to call Tesla. Something doesn't sound right.

Dan
 
I may be in the minority, but I appreciate that Tesla has the nags. In previous years there was no nag, or the nag time was greater, and people took advantage and did dumb things (remember the orange/grapefruit and other nag-defeating devices). As the system improves and there's less need for a driver to take over, the time between nags will increase. It's a safety feature that may be annoying to those of us using AP responsibly, but there are far too many new owners out there who may not be as responsible.
 
There is lots of talk about NoA performance on HW2. I think this will be irrelevant shortly.

The new chipset will allow vastly more neurons in the network, vastly larger matrices to solve, and give dramatically improved performance.

What’s neat is that none of us have seen HW3 perform yet.

I don’t expect the investor event to show a polished, fully automated experience. But I believe it will be able to show general handling of stoplights, intersections, and be able to navigate from one place to another (perhaps with some safety driver intervention).

Even if Tesla can do a demo with a few interventions for unusual cases while getting from one end of town to the other, this will show how far along Tesla is. Zero intervention should not be expected or needed at this point. It’s the pace of innovation that matters. Tesla has gotten where they are, from scratch, in just a few short years. That’s amazing.

Tesla already has the system in a deployable package. It’s already on cars being sold. Tesla is collecting or has the potential to collect more data than anyone else.

Tesla’s solution is vision-based and therefore cheaper than LIDAR systems.

If they can even demonstrate that they are roughly on a path to autonomy within a few years, I think the stock will see a big uptick.

Look at what the hugely-valued Waymo has going against it (and I don’t mean to pick on Waymo, their research has obviously pushed the boundaries of autonomous driving):

1. Their system is not packaged for production. Sensors hanging all over the roof. Not reasonably deployable in a production vehicle without significant repackaging and a deal with auto manufacturers.

2. Includes LIDAR, which raises the price of the system.

3. Relies on geofenced, detailed mapping data in a small geographic area which means it doesn’t scale and can only be used in a limited area.

NONE of these problems apply to Tesla. Their system is already packaged and deployed, they avoid the extra cost of LIDAR, and it’s a general solution.

If Tesla can even show that they’re getting close—even if they have to take over a few times during a demo—that will still be worth a lot to the share price, I think.

Based on Mr. Musk's recent interview, it appears there is at least one vehicle that can do it ;0)
 
Well, to be honest, I am not a huge fan of Navigate on Autopilot. I have a 2-hour commute per work day (1 each direction), most of it (~90%) on the highway, where I use Autopilot extensively. But NoA I do not like, mostly due to its choice of lanes. It does not want to use the HOV lane, which is dumb when the traffic is heavy (usual) in the main lanes. It also wants to get over to the slowest lane ~3 km before the exit, which is again dumb as it gets behind slow trucks. Then it starts overtaking them, sometimes very close to the exit, yet another dumb thing.

While AP itself is getting better -- even lane changes requested by the indicator are quite good now (they used to be finicky, undecided, taking too long) -- the lane-choice logic of NoA is far too primitive. I am sure it is more useful for driving routes that you do not know well, where you would need to rely on the navigation to know which interchange to take. However, for routine daily driving I can make much better lane choices based on experience. For this reason I rarely use NoA -- once in a while after each software update to see how they improved it -- but so far I am not impressed, so I stick to basic AP for daily use.

I agree with the first post. Biggest complaints: always trying to get out of the "passing" (HOV) lane even if no cars are behind; moving over to the right-hand lane when there is a freeway on-ramp within a quarter of a mile or so; moving over behind slow trucks for no apparent reason, or because it wants to speed up, which then causes it to change lanes again within 30 seconds or so. Had a 50-mile drive south on 101 yesterday and finally disengaged and went back to "regular" Autopilot.
 
How? What does a picture tell you about the attentiveness of a driver? A video doesn't really add anything. I think it's Cadillac that is doing eye monitoring, but no matter what system you devise you have to have a tolerance, a threshold, some arbitrary cutoff. You can throw a neural network at it to avoid specifically designating the threshold, but then you are left with a probability.

There really isn't a good solution.

I am holding out hope that the ramp-up to FSD and beyond is the solution, as far as accidents, etc. are concerned.

I'm not sure the non-believers in AI travel will adapt as quickly. Any protection from a speeding, 2-ton projectile will be utilized, by me anyway. Once we prove it can work with others, the surviving car manufacturers will begin the rollout of their versions of the FSD solution.
 
I have at least 6,000 miles on EAP specifically, so I'm not the seasoned vet here. Failures of the autonomous technology should include the newly annoying lane changes mentioned... I didn't read it all, but assume someone mentioned its new habit of thinking it needs to change lanes to the exit side of the highway 2 miles from your exit.

How many times I take over for "auto failures" on long trips (>1,000 mi) depends on the traffic and terrain. At least once between charges, which is anywhere from 180 to 270 miles each.

Coming into a city, the frequency increases dramatically depending on exits, traffic, and conditions, sometimes to more than once per mile. Because of all the conservative maneuvers AP thinks it needs to "suggest", at times it quickly becomes safer to turn off the pilot and just keep TACC running.
 