Welcome to Tesla Motors Club

Model 3 Autopilot First Impressions


Ben W

Feb 27, 2009
Santa Barbara, CA
I've now logged about 50 miles on the freeway with Autopilot, which is a new experience for me. (Coming from a 2012 pre-Autopilot Model S.)

My first impression is that the driving style of Autopilot is still squarely on the robotic side of the Uncanny Valley. Though the driving was safe and adequate overall, I encountered many "no human would ever drive this way" moments. It remains to be seen whether some of the Autopilot-relevant FSD-type refinements will make their way into standard Autopilot: detecting when cars are trying to merge into your lane ahead of you and politely slowing down to let them in, braking as soon as brake lights appear ahead rather than continuing at full speed until the last minute, keeping toward the left side of the lane when there's a big truck close on your right, and minimizing time spent in other cars' blind spots.

This evening I ran into two Autopilot failure cases in quick succession. I was curious whether Autopilot could handle the 270-degree offramp from 101W to 405S in moderate traffic. The whole turn is confined to a single lane, so I expected it to be a straightforward "lane-keeping" task, thus falling under the domain of Autopilot rather than FSD. Here's how it looked on dashcam (KDLinks DX2, powered from the console 12V outlet):


You'll see that Autopilot first tried to drive me into the offramp guardrail (it either didn't see the curvature of the road, lost track of the car in front of it, or didn't recognize the lower speed limit), causing me to yank the wheel and slam the brakes. Then seconds later, getting onto the 405, it had trouble recognizing the lane lines and almost ran me into a series of guard posts. (The buzzing camera is due to its mount coming slightly loose, possibly as a result of these hijinks.)

I'm curious what you more experienced Autopiloters think of this. Was I expecting too much? Obviously the system is still "in Beta", but I wonder whether Tesla would consider these incidents to be bugs that need fixing, or cases of driver over-expectation? (I really do hope it's the former.)

All that said, I love the car intensely, and look forward to seeing how it develops over the next few years. It's such an exciting time for Tesla and for the industry!
 

You should not expect it to work well on extreme curves, nor to handle the complexities of on/off ramps yet. Wait until you are settled on the highway.
 
The first incident is hard to explain. I think you're right about losing the lead car; it might have confused the guardrail with the lead car. The second incident clearly seems to be due to the coloration of the lanes. I'm certain Tesla does not consider either to be normal behavior, so they're surely working to improve these behaviors.
 
I think you were expecting too much: in California, where the lanes are often re-striped and wiped, and the road is mixed concrete and asphalt, the Model S AP2 car will still struggle. And a 270-degree turn is probably next to impossible.

Autopilot works best in freeway lanes. I pretty much always have to turn it off on any exit ramp.
 
I'm also new to AP with the 3 - my 2014 S just missed the AP hardware install date. Doh.

The radar cruise control is the star, although it can be a little surge-prone. For example: I'm in traffic, the car in front of me changes lanes, AND the car in front of him is beyond my follow distance, so the 3 thinks it's time to step on it, only to immediately jam on the brakes once we get close to the traffic again. My wife is not a fan of this.

The EAP steering is nice, but I have no reference point. It wanders a little in the lane sometimes, and has occasional, surprising failures. I mostly like using it when I'm going to be fiddling with some setting buried in the touch screen, while on straight, well-marked roads.

Both worked well in the massive downpour of rain we had Monday on the 210, although the wet roads made me really nervous about trusting the stopping distance.
 
My daily commute is 40 miles round trip, about half on a 3-lane boulevard and half on a 3-lane interstate. I use EAP everywhere except for interchanges, on/off ramps, and (obviously) turns. I've been making this drive with Autopilot for about 5 months now, and I can say it performs flawlessly with regard to safety. It holds the lane perfectly, maintains the right speed, and has never made any mistakes. I lock into the center lane ASAP, engage AP, then sit and watch for 20 minutes until I come to my interchange. Then I switch to TACC until I'm back in the center lane and re-engage AP. It. is. WONDERFUL!!!! The first post is right on about the areas that need improvement--mostly related to polite driving. But I adjust as needed, slowing down using the scroll bar or temporarily taking over.
 
The Autopilot may learn how to handle the first situation. I have a similar situation I've driven a dozen or so times. The first time, I took over because I didn't think the Autopilot was handling the exit quickly enough. It also didn't slow down for the ramp curve. The second time I let the initial exit go a little further, and it did make the turn, albeit a bit aggressively. The entire experience -- exit and ramp -- gradually improved over time, and now it makes the exit cleanly at speed and slows down at an appropriate time for the ramp.

My only advice would be: if it 'looks' unusual, be ready to take over. For oddly designed interchanges, slow down a bit so you're comfortable letting it go a little further, and it should learn over time.
 
AP has been pretty good on clearly marked highways as a nice driver's aid to help reduce driving fatigue, but I can't trust it for much else.

From my perspective, the biggest issue with AP is whether it will ever be able to anticipate what is going to happen like human drivers do. As drivers we make decisions all the time, some even subconsciously. We anticipate certain traffic patterns, avoid other suspect drivers, stay out of blind spots, avoid being behind the upcoming bus stop, and avoid potholes and other suspect stuff in the road that may damage our vehicle. We hear sirens and determine where to place our vehicle to let the first responders pass; how the hell are autonomous vehicles going to deal with that situation? We slow down and create more distance between cars when it is raining or snowing/icy. We avoid standing water so we don't hydroplane. The list goes on and on. We do all these things because we learn from experience and can anticipate the pattern.

I am all for FSD but let’s be realistic, replicating the neural network of the spectacular brain God gave us is going to take a while.
 
FYI- The individual car doesn't actually learn anything.

The only time its behavior will change, given the same situations and inputs, is after a firmware update.

Tesla has the NN back at home base "learn" or train based on inputs and data from the fleet, and then when they're confident it's learned something useful they push that out to the whole fleet.

Doing otherwise would leave Tesla with a fleet of hundreds of thousands of cars all behaving differently from each other which would be a nightmare for them to manage or ever troubleshoot- not to mention it'd make the fleet data feeding back vastly less useful.
 
I only engage Autopilot on freeways, and I turn it off if I'm driving in the rightmost lane with a lot of merging traffic or in construction areas. I also tend to start dialing down the speed, or just disengage AP, if I approach standstill traffic and there are no cars in front of me. While you certainly need to stay attentive and ready to intervene at any second, it does reduce the workload a lot.

I like how it can see a car (or even a few) in front of the car you are following and start slowing down BEFORE the car in front of you slams on the brakes. The first time it happened I thought it was one of those phantom braking things, as I was following an SUV and it didn't have its brake lights on yet. A second later that SUV started braking hard, but my car didn't have to.

I have only TACC and Autosteer as part of the AP package, and don't have the FSD option (yet), so no NoA or lane change assist for me. I tried NoA during the test drive and wasn't too impressed, but lane change assist would be quite handy so I don't have to re-engage Autosteer and hear that chime every time I change lanes. Not sure if it's worth $6K for me at this moment...
 
Just curious: if the actual car doesn't learn anything, why does a new car need to "calibrate" for a while before Autopilot functions reasonably well?

Because the newly installed cameras, ultrasonic sensors, and radar must be real-world tested for things like alignment and coordination before they are "trusted" to be accurate. Nothing to do with net learning, and everything to do with... calibrating.
 
Yes, FSD won't happen this month, but... have you met your fellow drivers? They likely do have a "spectacular brain," but it's not always used. They text and fumble with their cell phones, daydream, and are often distracted while driving (note how much they swerve, nearly driving off the road or into oncoming traffic...).

The Autopilot pays attention all the time. Today it is a better routine driver than I am. I'm good for construction zones, badly marked roads and obstructions, potholes, pedestrians, and large animals (deer). Autopilot is good for boring driving, congestion, and almost all expressway driving. With FSD I await stop-sign recognition and right and left turns with GPS routing on regular roads. I hope that will be available soon without needing HW3, but I bought FSD and the HW3 upgrade just in case it IS needed.

AP and FSD as driver-assist systems today make for a safer, better driving experience than driving without them. Are they complete? No. Are they safer? Yes. Are they worth the money? Oh yeah, especially if you consider the accident and hospital costs avoided.

Admittedly, I’m biased since I paid for it. But I also think I’m safer for myself and others as I drive now. Amusingly, I’m a less emotional driver now, I drive within the speed limits more, I’m more observant of the conditions of the road and my attention is 50-200 feet ahead of my path. The Autopilot handles the local stuff while I look for upcoming issues.

The change in driving behavior as I get used to Autopilot is fascinating. But it does take experience to get there. And I still don't know what the screech-and-pink-steering-wheel error message meant, except that Autopilot gave up and I had to take control Right Now. A bit startling.
 
Agreed. I think AP and all the additional safety features are great driving aids and can help reduce accidents, as long as folks don't completely trust the system; otherwise AP can actually cause more accidents. I have had to step in and take over many times while using AP, and if I hadn't, there would have been a serious problem. That said, I am a fan, but I think AP is marketed incorrectly considering its current capabilities. It should really be referred to as a co-pilot in its current form.
 
What's funny is that, based on what those terms actually mean, your way is worse.

Actual aircraft autopilot can't generally replace a human pilot. It's just an aid to the human pilot. Just like in a Tesla.

An actual aircraft co-pilot can replace the human pilot. It's one of the reasons they're there.
 
I'm surprised, as those are situations mine handles fine every time, and it generally selects a conservative speed. Yours looked a bit fast. The lines in the 2nd one might be hard to discern; I wasn't sure. Nice manual recovery, though. Test further and see if it can handle them. If not, you might need to talk to the SC.
 
Touché.

I think once you get past the marketing hype you can accept the features for what they are, but I know a lot of new Tesla buyers are going to be misled by the expectations the term Autopilot sets. I have to explain it to everyone who asks about my car. Some are scared of it and say "don't turn it on" because of the negative media coverage they've heard. Others are interested to see it in action, and then they're surprised to find out it's just a fancy form of cruise control for now. Then I have to differentiate between AP and FSD, which is a whole other level of marketing and future-selling.
 
By the way, I see the original post in this thread is 1.5 years old. I just took a highway off-ramp with AP Autosteer and had to intervene as well. It wasn't even a very sharp bend, but I got about 1/3 of the way into the turn and then it alerted me to take over. Point being, it is 1.5 years later and I experienced the same thing with AP as the OP. I remain optimistic that this will improve, though.