
Autopilot - Tesla hugs OUTSIDE of curves---why?

I simply cannot use autopilot. The highways in Colorado are curvy, but not excessively so. My M3, in a curve, hugs the outside of the curve. All humans hug the INSIDE of curves. So I always come dangerously close to the cars on my right in a left curve, and vice versa. This has been consistent for a couple of years now.

It's yet another quirk of the Tesla autopilot software that goes against what a human would do (another example: a human edges away from a giant tractor trailer, but my M3 bears down the middle or, in some cases, drifts toward the truck).

So, Elon: why isn't it a design goal to put the human driver at ease by imitating what a human driver would do? And if not, why not?
 
It doesn't hug the outside of the curve. It drives exactly in the center of the lane (you can also confirm this by looking at the rendering).

But the result is the same: the car converges toward the car in the outside neighboring lane, and I end up overriding it often.

I agree it should also take distance to the neighboring cars into account as part of the centering algo...
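
Something like this toy sketch (Python, my own made-up weights, obviously not Tesla's code): pick the lateral offset that trades off "stay near center" against "keep clearance from neighbors":

# Toy sketch, my own made-up weights; obviously not Tesla's code.
def centering_target(lane_center, lane_half_width, neighbor_offsets,
                     w_center=1.0, w_neighbor=4.0):
    """Pick the lateral position minimizing a weighted cost:
    stay near lane center vs. keep clearance from neighboring cars."""
    candidates = [lane_center + d / 10.0 * lane_half_width
                  for d in range(-8, 9)]
    def cost(y):
        c = w_center * (y - lane_center) ** 2
        for n in neighbor_offsets:
            c += w_neighbor / max(abs(y - n), 0.1)  # blows up as the gap shrinks
        return c
    return min(candidates, key=cost)

# A car hanging at +2.0 m on the right pushes the target left of center:
print(round(centering_target(0.0, 1.8, [2.0]), 2))  # -0.36: nudged left, away from it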
 
I would normally agree; in normal driving it stays exactly centered, matching the rendering. But no, my car does not stay centered in a curve; it drifts to the outside before re-centering. It's a real phenomenon; most highways don't have curves like here in the mountains.
 
I find it very annoying and dangerous at times. On smaller roads with curves, it will hug the outside of a turn, which is very dangerous with oncoming traffic barreling the opposite way. I only use it on large thruways. That, plus the way it slams on the brakes, has me using it only in super-ideal situations.
 
It doesn't hug the outside of the curve. It drives exactly in the center of the lane (you can also confirm this by looking at the rendering).
For many people (myself included), it very often understeers and goes to the outside of the curve, especially at the beginning of the curve. The display may indicate that it is between the lines, but you can see a clear bias to the outside.

I have one freeway transition I take on an almost daily basis, and after 11 months of driving it, it still scares the crap out of me: it always looks like it is about to slam into the K-rail.

All humans hug the INSIDE of curves.
MANY do, perhaps even "most", but not all.

Could it be that you're driving a lot higher than the speed limit and the car has difficulty adjusting when entering a corner?
If one is using AP, then the car (for me at least) usually lowers the speed before going into the corner. In my example above, the TACC speed drops WAY down (much slower than I would have done if I were in control).
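
For what it's worth, the usual textbook way to cap corner speed is a lateral-acceleration limit, v = sqrt(a_lat * r). A quick back-of-the-envelope sketch (my numbers, not Tesla's) shows why a tight transition ramp forces such a big speed drop:

import math

def max_corner_speed_mph(radius_m, a_lat_max=2.0):
    """Comfortable corner speed from v**2 / r <= a_lat_max.
    a_lat_max of ~2 m/s^2 is an assumed comfort limit, not Tesla's value."""
    v_ms = math.sqrt(a_lat_max * radius_m)
    return v_ms * 2.23694  # m/s to mph

# An assumed 100 m radius transition ramp:
print(round(max_corner_speed_mph(100)))  # ~32 mph, way below freeway speed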
 
My car also hugs the outside of curves on autopilot to the degree that it often runs over the lane marker bumps.

It absolutely does not stay centered in curves, and it accurately displays its position in the lane on the screen, so it knows it's drifting to the outside.
 
The issue is not just that it tends to hug the outside of the curve, but that it doesn't always gradually decelerate on those turns, and that is very anxiety-inducing.

Another quirk that has been around for ages is the car wanting to center itself when there's an adjacent merging lane, when it should at least keep straight, if not move slightly to the left. I am not sure why it fails to see the angled lane next to the one it is driving in and detect that it is a merge lane. I'm sure there are programmatic complexities, but autosteer is unusable in these two situations.
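
My guess at the mechanism (purely illustrative, not anything Tesla has published): if the planner just centers on the midpoint of the lane lines it currently sees, the target jumps rightward the instant the merge lane's outer line starts counting as the right boundary:

# Illustrative only: naive midpoint centering vs. holding the left line.
def naive_center(left_line_m, right_line_m):
    return (left_line_m + right_line_m) / 2.0   # midpoint of detected lines

def left_anchored(left_line_m, half_lane_m=1.8):
    return left_line_m + half_lane_m            # fixed offset from left line

# Normal lane, lines at 0 and 3.6 m:
print(naive_center(0.0, 3.6), left_anchored(0.0))  # 1.8 1.8, both centered
# Merge zone, right line temporarily out at 7.2 m:
print(naive_center(0.0, 7.2), left_anchored(0.0))  # 3.6 vs 1.8: naive swerves right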
 
The issue is not just that it tends to hug the outside of the curve, but that it doesn't always gradually decelerate on those turns, and that is very anxiety-inducing.

Another quirk that has been around for ages is the car wanting to center itself when there's an adjacent merging lane, when it should at least keep straight, if not move slightly to the left. I am not sure why it fails to see the angled lane next to the one it is driving in and detect that it is a merge lane. I'm sure there are programmatic complexities, but autosteer is unusable in these two situations.

Agreed. The last two issues for me with autopilot are exactly what you said. If/when they fix these, the system will be perfect for my daily use.

FSD is a whole different ball of wax, but lane centering and lane merging are keeping it from being a perfect L2 setup.
 
Another quirk that has been around for ages is the car wanting to center itself when there's an adjacent merging lane, when it should at least keep straight, if not move slightly to the left. I am not sure why it fails to see the angled lane next to the one it is driving in and detect that it is a merge lane.
What, you don't like the random slalom run of the car not being able to decide where it wants to be? It's such an adventure!
 
The issue is not just that it tends to hug the outside of the curve, but that it doesn't always gradually decelerate on those turns, and that is very anxiety-inducing.

Another quirk that has been around for ages is the car wanting to center itself when there's an adjacent merging lane, when it should at least keep straight, if not move slightly to the left. I am not sure why it fails to see the angled lane next to the one it is driving in and detect that it is a merge lane. I'm sure there are programmatic complexities, but autosteer is unusable in these two situations.
My 3 does that, too. It upsets my wife, and a driver following me might assume I'm about to exit and speed up. This could lead to a rear-end collision when my 3 abruptly swerves back into the travel lane as the painted line appears between the travel lane and the exit lane. Elon should fix this issue promptly.
 
Yes, this is my second-biggest complaint. Tesla should hug the inside of the lane when a merge lane occurs; instead it veers to the right, then hopefully back to the left as the merge lane closes.

And okay, maybe not "all" humans hug the inside, but I'd argue it's the obvious thing for a human to do. Show me the strange human who hugs the outside.

Bottom line: Tesla cars should mimic acceptable smart/normal human behavior (until the road is full of autonomous cars that are aware of each other and can do inhuman maneuvers because they don't care what humans normally do).

With gym closures due to covid, we have a lot of walkers and joggers around the curvy thoroughfare in my town. The walkers stay on the sidewalk, and most runners hop onto the street to pass them. Autopilot recognizes these runners, yet will still try to stay dead center in the lane, getting uncomfortably close. This is exacerbated on the outer curves, where it looks like autopilot is intentionally trying to run these humans off the road! How about hugging the inside of the lane, maybe even crossing a little over the line and giving the runners some space? Isn't that what a normal driver would do?
 
With gym closures due to covid, we have a lot of walkers and joggers around the curvy thoroughfare in my town. The walkers stay on the sidewalk, and most runners hop onto the street to pass them. Autopilot recognizes these runners, yet will still try to stay dead center in the lane, getting uncomfortably close. This is exacerbated on the outer curves, where it looks like autopilot is intentionally trying to run these humans off the road! How about hugging the inside of the lane, maybe even crossing a little over the line and giving the runners some space? Isn't that what a normal driver would do?

YES that is what a normal human would do...
 
I hope Elon and his engineers are reading this!

Another issue I have is with cars parked on the side of the street. Autopilot treats them the same as those joggers, often passing by uncomfortably close. How about utilizing all 4 feet of that extra space on the left, instead of staying dead center and passing the parked cars with 2 feet of wiggle room on the right?
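
What I'm asking for is basically this (a toy calculation with assumed numbers, not a real implementation): split the free space instead of holding dead center:

def biased_offset(clear_left_m, clear_right_m, max_shift_m=0.5):
    """Shift the lane target toward the roomier side (positive = left).
    Half the clearance difference, clamped; all numbers are assumptions."""
    shift = (clear_left_m - clear_right_m) / 2.0
    return max(-max_shift_m, min(max_shift_m, shift))

# ~4 ft (1.2 m) free on the left, parked cars ~2 ft (0.6 m) on the right:
print(biased_offset(1.2, 0.6))  # 0.3 m left, equalizing clearance at 0.9 m each side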
 
Has anybody experienced the following situation? Happens to me often enough to notice and ask...

Sometimes I drift to one side of the lane or the other (AP off, but with lane departure intervention/help on), and as I get to the edge of the lane, it turns blue on the screen. As I realize I am drifting a bit far, I make a minor adjustment to get back into the center of the lane, but the car jerks back toward the blue side of the lane that I drifted to. It feels like a magnet pull: quick and sudden, but it releases right away. Not sure what that is or what is happening in that process.
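
One guess at what's happening (pure speculation about the mechanism, not Tesla's actual implementation): Lane Departure Avoidance may only kick in past a threshold near the line and drop out as soon as you're back inside, which would feel like a sudden pull that releases right away:

# Pure speculation, not Tesla's implementation: a threshold-triggered
# corrective nudge would feel exactly like a magnet pull that lets go.
def departure_nudge(lateral_pos_m, lane_half_width_m=1.8,
                    trigger=0.8, gain=2.0):
    """Corrective steering only once past `trigger` of the half-lane width;
    zero again as soon as the car is back inside (assumed constants)."""
    if abs(lateral_pos_m) / lane_half_width_m < trigger:
        return 0.0                                # nothing near the center
    return -gain * lateral_pos_m / lane_half_width_m  # sudden pull to center

for y in [0.5, 1.5, 1.7, 1.0]:                    # drift out, then correct
    print(y, round(departure_nudge(y), 2))
# 0.5 -> 0.0, 1.5 -> -1.67, 1.7 -> -1.89, 1.0 -> 0.0 (engages and releases abruptly)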
 
The drift toward oncoming traffic in a two-way situation is scary and dangerous. Likewise, when a curve comes and the outside of the curve is favored, danger alarms go off in my head, and I have to take control and move back to the inside of the curve to be safe, especially in traffic. The other day, a truck pulled out of a driveway on the SAME two-way street while the car was favoring the outside of the curve, and the truck came over the dividing line in the middle of the street, BUT the M3 did not react at all. VERY DANGEROUS, and I had to take control away again. It seems the NVIDIA CPU/GPU is not evaluating the situation well or fast enough. It NEEDS UPGRADES to really manage real-time situations!
 
So I have a theory I've been throwing out for a while. I suspect the machine learning/AI approach that I've read Tesla is using just can't optimize everything. It is compute- and time-intensive; they look at thousands of Monte Carlo-style sims across a large variety of situations, labeling certain behaviors as "good". They may be prioritizing other things, primarily safety, and NOT "human comfort level" with the car's driving behavior.

I also see some releases fix some behavior, but then the next round of training de-prioritizes that behavior and breaks it again. AI is not an exact science, and different releases fix and re-break different things.
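
If that's right, the flip-flopping is easy to demo in the abstract (toy weights I invented, nothing from Tesla): retune the safety-vs-comfort weights between releases and the winning behavior flips:

# Toy multi-objective scoring; the weights and scores are invented.
def score(traj, w_safety, w_comfort):
    return w_safety * traj["safety"] + w_comfort * traj["comfort"]

hug_center  = {"name": "hold dead center", "safety": 0.9, "comfort": 0.4}
bias_inside = {"name": "bias to inside",   "safety": 0.8, "comfort": 0.9}

for w_s, w_c in [(1.0, 0.1), (1.0, 0.5)]:   # two hypothetical releases
    best = max([hug_center, bias_inside], key=lambda t: score(t, w_s, w_c))
    print((w_s, w_c), "->", best["name"])
# (1.0, 0.1) -> hold dead center; (1.0, 0.5) -> bias to inside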

So... I am a bit pessimistic that Tesla can zero in on a driving model that is both optimally safe AND human-comfortable, which could explain why it's taking so long to get to FSD.

That's my theory...