
7.1 AutoPilot Nag

On my recent ~1700 mile road trip (1500 trip, 200 destination driving) I used AP extensively. The nag was annoying, but not aggravating enough to stop me from using it. What is aggravating is knowing that the vehicle is capable of much more, and that it has been crippled because there are idiots out there who had to post YouTube videos of themselves doing idiotic things. Tesla had to do something. I would much prefer a waiver that I have to OK each time I get into the car over the nags. Don't let the .001% of owners ruin it for the rest of us. I hate to say it, but I suspect the only way we're going to get Tesla to change the nag behavior is to start a letter/feedback campaign. Enough owners have to speak up and be vocal enough about their disappointment that Tesla has to respond.

Another letter! I'm excited! :cool:
 
... personally be willing to sign a waiver absolving Tesla of any and all liability regarding autopilot since I'm confident enough in the tech to believe that it's not going to override me and drive me into a wall or anything crazy and unavoidable.

This (over)confidence is exactly where the problem lies.

This Beta AutoSteering software APPEARS to work rather well, as long as it has good input ... and many will too easily start to "trust" this Beta AS (and TACC) not to get confused and make serious mistakes.

However, the reason, for now, to tell you, beg you, remind you, and even force you to keep your hands on the steering wheel is simple: the software is not yet sufficiently bug-free to make hands-free driving failsafe. You need to be able to sense a too-large twitch of the steering wheel and immediately take over, not wait to feel the car swerve dangerously and then try, too late, to regain control of the car.

Occasionally, rarely, there might be an AP software failure such that the car just abruptly tries to veer to the side, possibly into some obstruction.

During your AP driving you might never have experienced this kind of software failure (abort, freeze, or crash), but it is extremely rare for complex software not to have these "features", and you have been lucky.

I, personally, have experienced a "going out of lane" swerve-type failure twice: once on the original 7.0 software, and once with the newer 7.1 revision. The first time there was no other traffic nearby on the freeway, and the second time was while using one of two long, parallel, exit-only lanes. Having my hands on the wheel enough to sense the unusual, and unexpected, steering maneuver ... that is what saved us, I am convinced.

So, be safe, and keep at least one hand on the wheel to learn to sense what the AP is doing normally, and to learn when it is not doing the right thing ... and instantly be able to take over. If the software is broken, it will not be able to ask you to take over, or give you time to find your hands and get them on the wheel.

At around 80 feet per second (or faster), one does not have much time to recognize a disaster, react to it, regain control, and avoid the collision, wall, cliff, or obstacle. Please choose safety for the sake of the entire EV movement.
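To put rough numbers on that (my own back-of-the-envelope figures, nothing official; the 1.5-second recognize-and-react time is just a typical assumed value):

```python
# Rough reaction-distance arithmetic (illustrative assumptions only).
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600
REACTION_TIME_S = 1.5  # assumed time to recognize the problem and react

for speed_fps in (80, 100, 120):  # feet per second
    speed_mph = speed_fps * SEC_PER_HOUR / FT_PER_MILE
    distance_ft = speed_fps * REACTION_TIME_S
    print(f"{speed_fps} ft/s ~ {speed_mph:.0f} mph: "
          f"about {distance_ft:.0f} ft covered before you even begin to correct")
```

At 80 ft/s (about 55 mph) that is roughly 120 feet gone before your hands ever start to move the wheel.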

At this point in its development, safer use of AS requires MORE attention and awareness, not less.

Eventually, we might be able to trust the AP more.
 
Actually, Tesla seems to have already thought about potentially abrupt steering. In short, the car doesn't allow it, even when autopilot thinks that's what it needs to do.

The dev settings include a steering rate limit, which is basically how much steering change autopilot is allowed to make in a given amount of time. If the value is exceeded, the car softens the steering rate for a timeout period while displaying a dev error, "Steering rate limited." It happens any time the car makes a jerky motion. It isn't displayed to normal users, though. But it seems to be a front-line defense against the car doing any crazy steering, which is a great plan.

So autopilot isn't capable of making a very abrupt steering change. In every video you've seen of the car supposedly steering into things, like the stupidly labeled "autopilot tried to kill me" one, autopilot had already disengaged steering control and the road surface was what caused the direction change.
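Roughly, the behavior I'm describing works like this (purely my own illustrative sketch, not Tesla's actual firmware; the limit, soften factor, and 2-second timeout are made-up placeholders):

```python
import time

class SteeringRateLimiter:
    """Sketch of a steering-rate limiter; NOT Tesla's code.

    If the commanded steering angle changes faster than max_deg_per_s,
    log a dev-style warning and soften steering commands for timeout_s.
    """

    def __init__(self, max_deg_per_s=10.0, soften_factor=0.3, timeout_s=2.0):
        self.max_deg_per_s = max_deg_per_s  # assumed limit, degrees/second
        self.soften_factor = soften_factor  # fraction of the change allowed while softened
        self.timeout_s = timeout_s          # softened-mode duration
        self._softened_until = 0.0
        self._last_angle = None
        self._last_time = None

    def filter(self, commanded_deg, now=None):
        now = time.monotonic() if now is None else now
        if self._last_time is None:
            self._last_time, self._last_angle = now, commanded_deg
            return commanded_deg

        dt = max(now - self._last_time, 1e-3)
        rate = abs(commanded_deg - self._last_angle) / dt

        if rate > self.max_deg_per_s:
            print("DEV: Steering rate limited")  # hidden from normal users
            self._softened_until = now + self.timeout_s

        if now < self._softened_until:
            # While softened, pass through only a fraction of the requested change.
            allowed = self._last_angle + self.soften_factor * (commanded_deg - self._last_angle)
        else:
            allowed = commanded_deg

        self._last_time, self._last_angle = now, allowed
        return allowed
```

Something with roughly that shape (detect an excessive rate, warn, and back off for a couple of seconds) would match the behavior the dev error suggests.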

In any case, *I* know how much I can trust it.

@garygid: You don't have to hit enter or make a new line when you reach the end of the text box. It makes your posts difficult to read, especially on a smaller device. Word wrap has been around for a long time; just keep typing, and hit enter twice only for paragraph breaks. This is what your post looks like to me on my phone:

[Attached screenshot: Screenshot_2016-01-29-15-23-13.png]
 
I will toss in one situation where the Model X with 7.1 suddenly slowed on the eastbound 91 Express Lanes, with a curve to the left in front of me and normal heavy traffic in the non-express lanes. The AP apparently thought that cars were stopped ahead. The only problem: those cars were in a lane I could not access. With the Express Lanes fairly empty, it was easy to push the accelerator in time and not have someone crash into the back of the car.

One other anomaly occurs when there are two yellow lines on the left with a white line on the far left. AP follows the white line, so the tires end up hitting the reflectors on the yellow line. Not very intelligent, if you ask me!
 

Sorry to disagree with you, garygid.

wk057 is right on target.

I am perfectly capable of monitoring and reacting to any abnormal AP steering changes without having to have my hands on the steering wheel at all times. In fact, requiring this completely defeats the primary purpose of AP, which is to reduce the workload on the driver so that he/she can more closely monitor the vehicle and its vector/path.
 
As long as the AP software is working properly, wk057 is probably correct, unless he is looking at features that are not yet activated. However, his view appears to be based partly on the assumption that the software never fails. A poor assumption, I believe.

He might say that he believes that the car cannot swerve abruptly, but my car HAS swerved abruptly enough to be a problem.

Have you ever had software that you are using unexpectedly pause, freeze, crash, or fail to do what you expected it to do?

I have about 54 years of experience with software. Subtle timing interactions with hardware or other software are hard to debug.

Even in this 21st century, large "vehicles" on tracks still crash. But, believe what you must. Cheers, Gary
 
The car can still swerve to the left on an undivided road right when it crests a hill on 7.1 and does not see the road ahead. I have had that happen multiple times, but it happens less than it did on 7.0.

Putting a small weight on the side of the wheel disables the nag. Hopefully Tesla does it on its own in 7.2.
 
When the car can't see the lane and doesn't disengage, it falls back on the onboard nav maps to decide where to steer. It tries to track its offset from the center of the road to place itself in a lane relative to the nav maps... but it's always wrong, because the built-in maps are terrible and outdated, so it ends up veering too far to one side or the other. This triggers the steering rate reduction I mentioned above, which lasts around 2 seconds in my experience.

Whether or not it goes to "Take over immediately" seems to depend on its confidence prior to the hill.
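For what it's worth, the fallback logic I'm describing looks roughly like this (purely my own illustration; the confidence threshold, function names, and offsets are invented, and the real system is obviously far more involved):

```python
def lane_offset_estimate(lane_confidence, camera_offset_ft, map_offset_ft,
                         min_confidence=0.3):
    """Illustrative lane-keeping fallback, not Tesla's code.

    Use the camera's lane offset while confidence is good; otherwise dead-reckon
    against the onboard nav map's road centerline, which may be stale and
    therefore wrong by several feet.
    """
    if lane_confidence >= min_confidence:
        return camera_offset_ft, "camera"
    if map_offset_ft is not None:
        return map_offset_ft, "nav-map fallback (steering rate limited)"
    return None, "take over immediately"

# Example: cresting a hill, the camera briefly loses the lane lines.
print(lane_offset_estimate(0.1, camera_offset_ft=None, map_offset_ft=1.8))
```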
 

EDIT: n/m I see you answered my question on a different thread - thanks!

wk057 - how fast do you think Tesla can build these much-talked-about "high precision maps" that are supposedly being built by the actions of the entire customer fleet in their normal daily driving?
 

It doesn't appear that Tesla is currently collecting sufficient data from production vehicles to produce such maps. I could be wrong, but I don't see it.
 

But when it crests the hill, it seems to jerk the wheel and the car to the left. It does not just loosen the steering and let the car go straight ahead.
 
I just got through driving about 2 hours back and forth to a new location I'll occasionally be working at. I figured this would be a good time to see whether the nag was there or not, since the majority of the time is spent on the Ohio Turnpike. Sure enough, it seems to nag about every 5 minutes, which was awfully disappointing! It regularly nagged when I was driving on a straight portion of road with perfect lane markings and no apparent reason to nag aside from being timed. It sucks that Tesla has degraded the quality of autopilot based on a few idiots. I'm usually looking at the road, not down at the screen, so I was typically reminded every 5 minutes (at least) to touch the wheel by an annoying pause in whatever I was listening to. It also seems to have a lower tolerance for which curves it can handle without a nag. I understand the nag if the car isn't sure what it's doing, but there was no reason for the nag in the vast majority of the curves, which were really very mild; the car had no trouble navigating them on its own immediately before it started beeping at me, or after the approximately 1 second I actually had my hand on the wheel.

Given the lack of utility of the summon feature for me, the new 5 minute nag, and the speed-limit-plus-5-mph cap on non-highway roads (I occasionally drive on long, straight country roads with >50 mph speed limits where there's no reason not to use autopilot), I'm extremely disappointed in the 7.1 update and wish I could go back! Tesla added none of the promised UI changes, seemingly pointlessly removed a UI element I frequently used (source selection), degraded the autopilot functionality pretty significantly, and added only a beta feature that hardly anyone needs. Pretty damn upsetting.

wk - any chance you'd be willing to help a few of us get rid of these restrictions while Tesla gets things straightened out and stops degrading already-released features? It would be much appreciated!
 
There is one simple but very effective way to keep from being nagged ... just keep your hands on the steering wheel.

Is it just possible that Tesla knows more than we do about the reliability of the current Beta Auto-Steering software, and that there are very good reasons to keep telling, requesting, and reminding us to keep our hands on the wheel?
 
Elon doesn't think so; he admits he emails while on AP... I'm sure he has the nag disabled on his car.
 
If the passenger is touching the touchscreen, we'll assume the driver is still awake. One of the great things about Tesla USED TO BE that they trusted the driver, hence the lack of lawyer dialogs every time you use the nav.


They likely did... until videos of people backseat driving and driving without hands for extended periods started popping up. They have their interests to protect, and people are already suit-happy as it is. "People" here including lawyers and other influential persons, as well as the government and organizations like the Department of Transportation, NHTSA, etc.