I was driving southbound on 101 with FSD about a week ago. The navigation correctly announced I should get off at the Embarcadero exit, but much to my surprise FSD decided to get off at the University Ave exit (one exit too soon). I have never had it do anything like that before. I am running v11.4.7.3.

Anyone else had it do something like that?

I decided to see what it would do, since it was still possible to get to my destination from that exit, but when it started to turn left where it should have turned right, I stopped it.
 
Which product/feature are you referring to? NoA (Navigate on Autopilot) or the more advanced FSD beta (Navigate on City Streets)?

Regardless, this sort of thing still happens with both. That’s why Tesla doesn’t call either product FSD yet and warns drivers that FSDb “may do the wrong thing at the worst time,” and in multiple places repeatedly states that “the currently enabled features do not make the vehicle autonomous” and that the driver is responsible for intervening when necessary.
 
Sure. Same destination 5 days in a row, and it may plot 3 different routes and then fail to follow any of them. Turning early, check. Turning late, check. Turning late, realizing it has erred, whipping a half turn, slamming on the brakes in the traffic lane, and demanding I take over immediately, check. Further, FSDb frequently doesn't follow the NAV guidance voice and wanders off until it is so hopelessly lost it gives up and hands over control. I was driving along in Texas when suddenly the satellite view display switched to something completely different; the destination changed to somewhere in Connecticut as I recall, 1,332 miles away, and the arrival time changed to next Tuesday or something like that.

It's always an interesting adventure, and getting excited about anything it does would just be bad for your mental health. Best to take it with a grain of salt, deal with the errors as best you can, and enjoy the game.
 
I have this happen frequently with an exit that I take daily on a major interstate. FSD shows everything correctly on the display, but 50% of the time it will try to take one exit too soon. Even when FSD takes the right exit, it first moves into the exit lane for the earlier exit it sometimes takes by mistake. So it's always a game to see what is actually going to happen. Lots of disengagements. I have had several map updates and nothing changes.
 
To answer the question above: FSD beta (Navigate on City Streets), v11.4.7.3.
 
I've long held the belief that fully autonomous driving is still far off because so much of the driving task is (subconsciously) social. AGI (however one defines it) is needed to read and predict other human drivers' intent, let alone to get path planning to follow the navigation's route... and that will be a scary day for other (Matrix/Westworld/1984) reasons.
 
Got my 2023 MYLR with 2023.44.30.6 on December 23, 2023. In 1,000+ miles, I have experienced four instances on a multi-lane divided road where FSD turned one exit before the actual exit. In two of those, the turn should have been onto another divided highway, but instead it went to (1) a hotel and (2) a corporate office park. I assume this is Tesla keeping me honest and alert. Whether intentional or not, it is working rather well. As Jack Colton said to Joan Wilder, "What a ride."
 