Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Don't take your hands off the wheel

It's a fact of life with autopilot that eventually you're going to have to take over instantaneously. And that's why you get the warning right off the bat to keep your hand(s) on the wheel when you engage autopilot. I'm always astonished at the people who complain about the steering wheel nag - properly used, there's no nag.

I'm glad you were able to recover from your bad situation with only minor damage!
Really, what if that had been a person... Hands on the wheel, folks.
 
My Model S is AP1. I've found that I need to focus more intently when in AP mode than when I'm doing the driving. The reason is, when AP makes a mistake, you've got very little time to recognize the mistake and seize control to prevent a crash. AP can get you into situations that you wouldn't get into in the first place if you were doing the driving.
 
It's a fact of life with autopilot that eventually you're going to have to take over instantaneously. And that's why you get the warning right off the bat to keep your hand(s) on the wheel when you engage autopilot. I'm always astonished at the people who complain about the steering wheel nag - properly used, there's no nag.

I disagree that there is no nag when used properly. Some roads are dead straight, and Autopilot gets so locked onto the lane that I will repeatedly get warnings. Both hands are on the wheel, but no input has been needed (or my guidance has 100% matched the Autopilot steering action).
 
I drove on AP through some road works today with concrete barriers less than 18 inches from the car on both sides. I held the steering wheel firmly, but annoyingly that is just the sort of occasion when I get the nag, right when I least want to turn the wheel. As for Autopilot, it performed perfectly, keeping me far more securely centred between the walls than I could have manually.

Maybe there was some marking on the road that confused Autopilot for the OP?
 
I drove on AP through some road works today with concrete barriers less than 18 inches from the car on both sides. I held the steering wheel firmly, but annoyingly that is just the sort of occasion when I get the nag, right when I least want to turn the wheel. As for Autopilot, it performed perfectly, keeping me far more securely centred between the walls than I could have manually.

Maybe there was some marking on the road that confused Autopilot for the OP?

Rather than turn the wheel, flick the volume knob up and then down. That'll do the trick. But please, please keep your hands on the wheel. Especially running through the cattle chutes!
 
I learned early on that it is only highly reliable on nicely marked, relatively straight roads, like I-5.

We're seeing a lot of Phase X people here, who are into a second doubt phase. NOBODY is telling you to close your eyes and expect the car to always do all the work.

But, and this is what a lot of the "haha FSD, sure" folks are ignoring, the Tesla NOA is built on a neural net that gradually adjusts with every error. It's not so much that it understands anything; it's just that the number of test data points increases. Initially a neural net gives equal value to, say, "put hand on hot stove" vs. "do not put it there". As the results accrue, it favors "not". As the database grows, the improvements go up, non-linearly. That's why each update is noticeably better. That's totally different from an "algorithmic" approach that improves only as the programmers learn and refine the program.
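The "adjusts with every error" idea above can be sketched in a few lines. This is a deliberately tiny illustration of error-driven weight updates; the function, data, and learning rate are made up for the example, and it is not a claim about how Tesla's actual fleet-training pipeline works:

```python
# Toy sketch of learning from accumulated corrections: a one-weight
# model nudged toward the right answer every time it makes an error.
# (Illustration only -- not Tesla's actual training system.)

def train(samples, lr=0.1):
    """Fit y = w * x by stochastic gradient descent; return final w."""
    w = 0.0
    for x, y in samples:
        error = w * x - y      # how wrong the current guess is
        w -= lr * error * x    # nudge the weight against the error
    return w

# The "true" relationship the model is trying to discover: y = 2x.
data = [(x, 2.0 * x) for x in (1.0, 0.5, 1.5, 1.0, 0.8)]

w_few = train(data[:2])    # only a couple of corrections
w_many = train(data * 20)  # many accumulated corrections

# More accumulated corrections -> weight closer to the true 2.0.
print(abs(2.0 - w_many) < abs(2.0 - w_few))  # True
```

The point of the toy: nothing in `train` "understands" anything; accuracy improves only because corrections accumulate, which is the poster's argument in miniature.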

Tesla's huge lead over other car makers lies in the size of the database, the number of miles driven on Autopilot. Looking at how it's acting now, and how a neural net operates, it's 100% certain that it will keep improving at an ever faster rate. It's never "done", so it's not something with limitations you "learn early on".

Speculating on how soon it will perfectly handle which cases is rather pointless, and more likely to be wrong than right. It will keep getting better regardless.

Meanwhile, understand that we are part of an actively self-training system, not the beneficiaries of a clever program that you should bless or bash. Every time you manually take over, you're slapping the baby's butt, saying WRONG, and that's exactly the correct and necessary thing to do.

It's unfortunate that Elon Musk doesn't communicate more and better with the naysayers, most of whom have never driven a NOA Tesla, or even enjoyed the handling, power and joy of driving one manually.
 
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the rightmost lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed-up section of my aero hubcap.

This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED." The next morning I received 2019.12.1.2, and AutoPilot handled the same tunnel perfectly on Saturday.

I love my car, but I try to keep at least one hand on the wheel 99% of the time.

There's a road I drive on every morning on my way to work. When I go by this one gas station, the line in front of the entrance is dashed instead of solid. Every single time, the car tries to swerve into that entrance, even if there is a car sitting there. I'm not sure why it wants to do that. But I did notice that in front of driveways the line is solid, and it's only dashed at cross streets. I wonder if the line is painted wrong in front of the gas station entrance and should actually be solid. Either way, the pressure from my hand on the wheel always causes it to drop out of Autopilot. I keep thinking that one day it will learn from its mistakes, but I've driven it five days a week since Oct 17, 2018 and it still hasn't learned.
 
In my 2015 Model S, there is a left curve on Interstate 5 north of Burbank, where AP tries to shove me into the second lane if I'm in the first (left) lane. I think this is due to a groove in the concrete that migrates left to right. AP seems to try following that groove, despite the lane lines. That's when I overpower AP to stay in the lane. As I said in my post earlier, a human would have no problem staying in lane, so a human would not get into the situation above. But when AP messes up, you need to be instantly ready to take over. Either that or bitch to CalTrans about their grooves. LOL
 
OP, thanks for sharing and letting us learn from your experiences.

I've used Autopilot (autosteering) on my AP1 Model S from the first day that software was enabled (October 2015). Over the last few years I've made it a habit to understand under what circumstances I'm comfortable letting AP control the car and when I want to do it all myself. The eastbound Caldecott Tunnel bores are definitely "do it myself" territory. The westbound bores, being somewhat wider, are "maybe", but in general I will opt to disengage AP before getting to the tunnel. I've concluded that tunnels and bridges are not strong points of the AP1 system (or rather, the potential for something bad to happen is higher). So if I see a narrow bridge or tunnel coming up, I am more than likely just going to turn off AP to prevent the possibility of any problems. This also means that I get to choose when to take over manual control, and I don't have to do it on the spur of the moment because I was surprised.

It might be the case that AP "should" handle those scenarios better. That's a different discussion. Knowing the limitations of the system, due diligence means I should take over manual control in some circumstances (tunnels, narrow bridges, construction zones, emergency vehicles/workers, etc.), in order to maintain safety.

(AP2/2.5/3 works differently, but the principles for driving it are the same.)

I'll just leave this pointer to something in the Model S forum...this thread is older but still applicable IMHO.

A flight instructor teaches Tesla Autopilot

Bruce.
 
Musk had an interesting interview with Lex Fridman. I think Fridman is too invested in his vigilance research and general-AI extrapolation. In practice, people DO pay attention on AP. Musk is probably right in saying that a Tesla is a smart toaster, not an AI, and that at some point humans contribute more negative than positive involvement.
Sure, I got the impression that the findings surprised Fridman, that he was expecting much worse. Maybe in the future it will get worse, too. That's pretty hazy right now, as improvement in AP ability is something that can drive [over]confidence. He's also got a wider interest than just automobiles, obviously.

However I watched that interview and found it really good. Whatever your take on Lex Fridman's opinions, he knows enough and has enough insight to formulate really good questions for drawing out Musk's technical knowledge and thinking on the subject.
 
This is why many of us AP users report after a trip that the car did 90% of the driving. It's that 10% where hypervigilance made us decide to take over, or not engage it to begin with. Compared to when I got my car last June, it's vastly improved. This is a growing tech that learns over time, which explains why it keeps getting better. For those who don't own a Tesla yet criticize this tech, such criticism only shows its own ignorance.
Reasons I "take over" and don't use AP, complete list:
1) I see a gap in multilane traffic that AP isn't aggressive enough to make, especially at the normal constant speed set point. So I disengage, maneuver into place, and re-engage.
2) No lane lines! And I had to turn/brake to get to this place, so I couldn't just let AP go on and sort out the driving using the edge of pavement (which it can sometimes do, if you don't do something to disengage). So these are parking lots, tiny side roads, etc.
3) Lots of cornering, turning off the road, and such where I don't really have time to engage between each maneuver that requires manual intervention.
4) Road is getting really curvy, fun time! :p
5) AP flunks out and forces me to take over, or in the very rare case it is in the process of doing something very stupid and dangerous. I try to give it time to sort out its alarms and "red hands" warning, and it will often recover as it regains enough situational awareness to be confident and make good decisions, but at times it will also flunk out for one of the two reasons above.

I might add "snow" if I'm ever lucky enough to drive in it. Partially for #4, but it also sounds like #2 applies to AP in certain types of snow conditions. In rain, however, I most definitely use AP. I'm actually a lot more comfortable now with AP in rain and heavy traffic because it clearly has better "vision" than me.
 
@bmah

I've used Autopilot (autosteering) on my AP1 Model S from the first day that software was enabled (October 2015)

In your honest opinion, do you think AP/EAP/FSD has made progress since you first used AP1 in 2015? I know your opinion will be anecdotal because you do not have AP2/AP2.5, but as a mod here you must read a lot of first-hand accounts of these systems....

Does it seem like people are looking out for the same things you have to look out for?

Thank you for your time
 
Last Thursday, I was headed home from San Francisco on 24 Eastbound. Went through the Caldecott tunnels. Was in the rightmost lane of the right tunnel. A couple of hundred feet before the end of the tunnel, AutoPilot suddenly swerved right and hit the curb. I had my hand on the wheel and reacted quickly. Quick enough that the only damage was a curbed rim and a messed-up section of my aero hubcap.

This was on 2019.12.1.1. I forgot to hit the steering wheel button and say "Bug Report WTFU HAPPENED." The next morning I received 2019.12.1.2, and AutoPilot handled the same tunnel perfectly on Saturday.

I love my car, but I try to keep at least one hand on the wheel 99% of the time.

I live on the west side of the Caldecott and often drive through it exactly like you do. Glad that the accident was not more severe than it was and thank you for posting this information. I've had my Tesla since July 2015 and have always been concerned about driving through this tunnel. One of the first times I did so, the AP swerved as we were about to enter the tunnel going west. That alerted me to the unreliability of AP in tunnels so I have used 2 hands ever since, even if I was using AP. Hopefully your post will alert others in our area of the dangers of relaxing and using AP in the tunnel even with one or both hands on the wheel.
 
I have found that the hands-on requirement gives me carpal tunnel, in that the pressure required is far more than if I were driving the car. Also, once while in Nav on AP, taking the exit off the interstate, it totally missed a car parked half in the lane, half in the median. Glad my hands were on the wheel. It woulda' been wicked nasty.
 
There's a road I drive on every morning on my way to work. When I go by this one gas station, the line in front of the entrance is dashed instead of solid. Every single time, the car tries to swerve into that entrance, even if there is a car sitting there. I'm not sure why it wants to do that. But I did notice that in front of driveways the line is solid, and it's only dashed at cross streets. I wonder if the line is painted wrong in front of the gas station entrance and should actually be solid. Either way, the pressure from my hand on the wheel always causes it to drop out of Autopilot. I keep thinking that one day it will learn from its mistakes, but I've driven it five days a week since Oct 17, 2018 and it still hasn't learned.
I have a nearly identical situation on my daily commute to work. I've often wondered if it would still try to slam me into the other lane if there were a car there.

The part that surprises me the most, however, is that it will still do this even while tracking a car in front of me.
 
@bmah



In your honest opinion, do you think AP/EAP/FSD has made progress since you first used AP1 in 2015? I know your opinion will be anecdotal because you do not have AP2/AP2.5, but as a mod here you must read a lot of first-hand accounts of these systems....

Does it seem like people are looking out for the same things you have to look out for?

Thank you for your time

I can say that at least for me, AP1 is way better than it was in 2015. Earlier this year, I drove an AP2.5 service loaner for a few days (with EAP) and I thought that its lane-keeping quality was generally on par with my AP1 car (not dramatically better or worse). This was a pleasant surprise to me, to be honest. I would have been comfortable taking that loaner on a long road trip and driving it the same as my usual car. Also, this was one of the first releases that had Nav on Autopilot, and even though I still had to confirm every lane change, the potential in this system was pretty exciting.

Things you don't hear much about anymore that were once commonplace for AP1: exit diving (where AP1 cars used to prefer taking exit lanes rather than staying on the freeway) and truck lust (the tendency of AP1 cars to sometimes scoot closer to semi-trucks they're passing). Not saying these never happen, but in general they seem to happen much, much less for both AP1 and AP2/2.5/3.

I'm trying to think of problems nowadays that we didn't have before...phantom braking is one that comes to mind.

I remember when AP2 cars first got autosteering capability. If I could characterize the general opinion at the time: it sucked. It was weavy and speed-limited. What you guys are driving nowadays in the Model 3 is way better than that.

I'd say that when AP1 was actively being worked on, there were various releases that were better in some ways, worse in others. We sometimes had a feeling of regression with some releases. It seems that's what AP2/2.5/3 is going through now: progress is being made, but it's not a perfect process of improvement either. It's difficult to generalize because, across the population of Tesla owners, people can have widely differing experiences, and it's difficult to exactly characterize the system's behavior as a whole ("the plural of anecdote is not data").

Bruce.
 