Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
it would provide incorrect results if the perception is wrong.
This is what I mean when I say the perception has noise. It's not precisely determining the drivable area. And this will affect the path planner.

When what you have described has happened to me, the drivable area looks right on the screen. It's just that the planner wants to move me abruptly into the wrong lane. I don't know the cause of this. Perhaps incorrect map data. Do you see improperly displayed drivable areas that cause the path to be wrong?
 
It's not a static path. It's constantly projecting its path forward at every instant, and it has to respond to an ever-changing environment. Then they fit splines, where points that come later can affect the interpolation between earlier points.
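As a side note, that global coupling in spline fitting is easy to demonstrate. A minimal sketch, using SciPy's CubicSpline purely as a stand-in for whatever path representation Tesla actually fits:

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Waypoints along a planned path (x = distance ahead, y = lateral offset).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y1 = np.array([0.0, 0.1, 0.2, 0.3, 0.4])  # points on a straight line
y2 = y1.copy()
y2[-1] = 1.5  # move ONLY the last waypoint

s1 = CubicSpline(x, y1)
s2 = CubicSpline(x, y2)

# Evaluate between the FIRST two waypoints, far from the edited point.
# The early segment shifts even though only a later waypoint moved,
# because all spline coefficients come from one coupled linear system.
print(s1(0.5), s2(0.5))
```

So if perception nudges a far-away point on the drivable boundary from frame to frame, the near part of the fitted path can wiggle too.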
When the environment is not changing and the car is not moving, why is the steering wheel moving? It’s not rocket science.

It reminds me of when new devs would close the bug as “working as coded”.

Don’t explain to me how or why it might be happening, just fix it. As I said it does not happen when the car is already moving when making the turn. All your probabilistic world ideas are true in that case as well.
 
Do you see improperly displayed drivable areas that cause the path to be wrong?
Seems to understand the situation, though note the massive break in the solid double yellow (this is right after a massive wheel jerk to the left as you can see - this is not a reassuring steering angle, even if I am going only 17mph). I've reported this scenario to Tesla in detail a month ago, FWIW. This break in the solid double yellow comes and goes.
Incomprehensible.png

Nearly has it, stay the course!
HadIt.png


Nope. Plow into the prohibited area, with a swift, though small, jerk of the steering wheel to the left. Moving violation occurs shortly thereafter. Yes, the path shows the way forward into the promised land, but why? Is it the path planning, or is it the perception? In this case it looks like path planning to me, but why is it doing clearly illegal things? I would think that would be extremely high cost? I think it's somehow weighting the perception of breaks in the solid double yellow that occurred in multiple frames earlier in the video, in spite of being highly confident in there being no breaks in the lines in the latest image, and decided that there's possibly no double yellow there at all. This is not good; the computer vision has to be far better and far more certain to be successful here, shadows or no shadows.
PlowingForward.png
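To illustrate the kind of temporal weighting I'm hypothesizing (purely speculative; I have no idea how Tesla actually fuses frames), a simple exponential moving average shows how stale low-confidence frames can drag the fused estimate down even when the latest frame is confident:

```python
def fuse(confidences, alpha=0.3):
    """Exponentially weighted fusion of per-frame confidence that the
    double yellow is unbroken (1.0 = certainly solid, 0.0 = broken).
    Hypothetical smoothing rule, for illustration only."""
    est = confidences[0]
    for c in confidences[1:]:
        est = (1 - alpha) * est + alpha * c
    return est

# Several shadowed frames report breaks, then clear frames report solid:
frames = [0.2, 0.2, 0.3, 0.9, 0.95]
print(fuse(frames))  # well below 0.95, despite a confident latest frame
```

Any fusion with memory like this will let earlier "break" detections bleed into the current estimate, which is consistent with the on-again, off-again gap in the visualization.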

Here are a few examples of breaks in the double yellow:
Screen Shot 2021-11-09 at 10.53.32 PM.png
Screen Shot 2021-11-09 at 10.53.59 PM.png
Screen Shot 2021-11-09 at 10.54.22 PM.png


Finally, bonus, off topic, a very haunted wall. This is where the car thinks the bodies are buried. The perception is superhuman. Or perhaps supernatural. Look at all those faces. (I reported this.)
HauntedWall.png
 
As I said it does not happen when the car is already moving when making the turn

I already answered this one. The faster you are going the less you have to turn the wheel to get to the slightly offset path.

If you are going 1 mph and the control loop calls for moving to the left at 1 mph to get to the new path position, you have to change your direction by 90 degrees to the left. If you are going 20 mph, you only have to deviate by about 3 degrees.
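The geometry is just the arcsine of the lateral-to-total speed ratio; a quick sanity check with the numbers above (illustrative only):

```python
import math

def heading_deviation_deg(speed_mph: float, lateral_mph: float) -> float:
    """Heading change needed so that the lateral component of travel
    equals lateral_mph, given total speed speed_mph."""
    return math.degrees(math.asin(lateral_mph / speed_mph))

print(heading_deviation_deg(1, 1))   # → 90.0 (turn fully sideways)
print(heading_deviation_deg(20, 1))  # → ~2.87
```

Same lateral correction, thirty times the wheel angle at crawl speed, which is why the twitches show up when starting from a stop.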

When the environment is not changing and the car is not moving, why is the steering wheel moving?
It doesn't stop calculating new paths just because it's not moving. Something could move into the currently selected path.

When you are stopped the drivable area is still changing because there is uncertainty (probability) involved in determining the boundary. This affects the paths.
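A toy simulation of what I mean (made-up numbers, nothing to do with Tesla's actual pipeline): even with the scene frozen, per-frame noise on the boundary estimates makes the path target wander:

```python
import random

random.seed(0)  # deterministic for the example

def lane_center(left_true=-1.8, right_true=1.8, noise=0.15):
    """One perception 'frame': each lane boundary is measured with
    Gaussian noise, and the planned path aims at the midpoint."""
    left = random.gauss(left_true, noise)
    right = random.gauss(right_true, noise)
    return (left + right) / 2

# 50 consecutive frames of a completely static scene:
centers = [lane_center() for _ in range(50)]
spread = max(centers) - min(centers)
print(f"path target wanders over {spread:.2f} m while nothing moved")
```

Feed a wandering target like that into a control loop at near-zero speed and, per the angle math above, small lateral corrections demand large wheel angles.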
It appears FSD anticipates the direction it will move and aligns the wheels with that direction while stopped.
 
Seems to understand the situation, though note the massive break in the solid double yellow (this is right after a massive wheel jerk to the left as you can see - this is not a reassuring steering angle, even if I am going only 17mph). I've reported this scenario to Tesla in detail a month ago, FWIW. This break in the solid double yellow comes and goes.
View attachment 731317
Nearly has it, stay the course!
View attachment 731318

Nope. Plow into the prohibited area, with a swift, though small, jerk of the steering wheel to the left. Moving violation occurs shortly thereafter. Yes, the path shows the way forward into the promised land, but why? Is it the path planning, or is it the perception? In this case it looks like path planning to me, but why is it doing clearly illegal things? I would think that would be extremely high cost? I think it's somehow weighting the perception of breaks in the solid double yellow that occurred in multiple frames earlier in the video, in spite of being highly confident in there being no breaks in the lines in the latest image, and decided that there's possibly no double yellow there at all. This is not good; the computer vision has to be far better and far more certain to be successful here, shadows or no shadows.
View attachment 731319
Here are a few examples of breaks in the double yellow:
View attachment 731321View attachment 731322View attachment 731323

Finally, bonus, off topic, a very haunted wall. This is where the car thinks the bodies are buried. The perception is superhuman. Or perhaps supernatural. Look at all those faces. (I reported this.)
View attachment 731320
It's hard to see light yellow on a video-captured screen. I can't tell if the double yellow is completely missing or just low probability. It seems FSD did select a path around them, though.

FSD wanting to place you between the double yellows is the kind of thing that happens to me. The drivable area is correct, but for some unknown reason it wants to abruptly move me over a lane. I would say that if you are up to speed and the wheel starts turning quickly, abort. This is different from what the wheel does when you are starting a turn from a stop.

We do know that FSD takes double yellow lines as more of a guideline and may completely ignore the rule about pairs of double yellows separated by 2 feet. But this is a different issue.

It seems I've become the whipping boy for Tesla on this topic. I only wondered why the wheel was twitching on turns and came up with what I thought was a reasonable explanation which I so magnanimously shared. I've repeatedly expressed that this is something they need to eventually address.
 
It seems I've become the whipping boy for Tesla on this topic. I only wondered why the wheel was twitching on turns and came up with what I thought was a reasonable explanation which I so magnanimously shared. I've repeatedly expressed that this is something they need to eventually address.
I think it was mostly the comments about it not being a bug and the comments about how they can come back and fix this much later after the car can drive (paraphrasing on the second one - too lazy to go back and find your exact quote).

The overall reason for this behavior that you propose seems reasonable. But the idea that this is not a major issue is where we seem to differ.
 
I think it was mostly the comments about it not being a bug and the comments about how they can come back and fix this much later after the car can drive (paraphrasing on the second one - too lazy to go back and find your exact quote).

The overall reason for this behavior that you propose seems reasonable. But the idea that this is not a major issue is where we seem to differ.
I don't find the annoying twitches when starting a turn to be a "major" issue, and it's not a bug. It's working properly to follow the path.

Now, the fact that the planner/perception will suddenly decide to erroneously guide the car sharply to the side I agree is a major issue. This should be corrected before the correct but annoying tendencies of path following at slow speed.
 
I don't find the annoying twitches when starting a turn to be a "major" issue, and it's not a bug. It's working properly to follow the path.
That’s where we disagree. It’s a symptom of a system that has a high degree of uncertainty about the correct course of action, and vacillates between different options, in the absence of any changes in the environment.

I think it’s actually the same issue as the one where the path planner will erroneously guide the car sharply to the side….or at least, I don’t know how you can confidently separate the two issues. They may very well be related.

This should be corrected before the correct but annoying tendencies of path following at slow speed.
Again, I don’t know how you’re separating these. Why is the path all over the place at low speed? If you watch the visualization in certain static environments when stationary, it is moving/wavering all the time, along with the path. Why? Because it has too much uncertainty, I think.
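For what it's worth, a standard way to damp that kind of vacillation is hysteresis in the path selector: only switch away from the current choice when the alternative is better by some margin. A toy sketch of the general idea (not a claim about what Tesla does):

```python
def choose_path(costs, margin=0.0):
    """Each tick, `costs` gives (cost_a, cost_b) for two candidate paths.
    Pick the cheaper one, but only abandon the current choice if the
    alternative beats it by `margin`. Hypothetical selector sketch."""
    current = None
    picks = []
    for a, b in costs:
        if current is None:
            current = 0 if a <= b else 1
        else:
            other = 1 - current
            if (a, b)[other] + margin < (a, b)[current]:
                current = other
        picks.append(current)
    return picks

# Two near-equal options whose noisy costs keep crossing:
ticks = [(1.00, 1.02), (1.03, 1.01), (1.00, 1.02), (1.04, 1.00)]
print(choose_path(ticks))              # → [0, 1, 0, 1]  (flip-flops)
print(choose_path(ticks, margin=0.1))  # → [0, 0, 0, 0]  (commits)
```

Of course, hysteresis only papers over the underlying problem: if perception were confident, the two options wouldn't be trading places every frame in the first place.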

I just think this points to there being a huge amount of work left to do on stabilizing perception. As I’ve said elsewhere, a world which is constantly changing shape and size is not going to lead to a smooth driving experience, and it also negatively impacts achievable safety. So they’re going to have to fix this. And there’s probably work to do on the path planner as well.

Hopefully in 6-12 months we have much more sophisticated and stable visualizations, reflecting these improvements in how the car understands the world and constructs an accurate model of that world. I think if we do we’ll see more stable steering behavior, both while stopped and while moving.

It’s an extremely difficult problem though.
 
I am not an FSD Beta tester, but does Tesla not provide more details on these aspects of FSD development, like e.g. Comma.ai does when they release? For example

 
You don't have AP/FSDbeta engaged so why do you care?

I see it doing a bunch of stuff while stopped and waiting for a light. For example, if I'm at a light and have my left turn signal on then it should take the hint and plan to make a left and not plan to go straight ahead.


But it's so wrong. So. Wrong. Path planner. Planners of paths. Path planning daemon of whatever domain. It's just wrong.

Here I'm stopped. This is the predicted path :confused:

View attachment 731262

Or, you can look at the world and realize the path is ... a straight line forward.

View attachment 731263
 
From a non-technical steering point of view: I feel like I need a 5-point harness, and nothing movable that I care about on any seat. Passengers need to be warned, and I'm glad there are seatback pockets for airplane bags...

My 15-year-old is driving the 3 on a learner's permit. He laughs at the crazy turns and the creaks in the car's suspension and body when I drive in FSD Beta. For me it feels like the Mine Ride at Disney World.

Early on, the AP did Crazy Ivans in the middle of the road. I guess it just needs a little more work. I say 3 or 4 more revisions on the herky-jerky steering.
 
perception of breaks in the solid double yellow that occurred in multiple frames earlier in the video, in spite of being highly confident in there being no breaks in the lines in the latest image
I think this is "just" a case of late predictions and earlier low confidence that should be "easily" fixed with more training data to make a confident prediction from further away. I've noticed various road layouts where lines and road edges are still blurry in the visualization even when just several feet away with a clear view, so I'm especially careful when the light turns green as most likely the predicted path was jumping around too.

This should indicate the neural networks can predict it with the current architecture/stack, so the weights need to be biased appropriately.

My hunch is that we haven't really seen any road layout or moving object improvements in 10.x since public beta because they're busy training with autolabeling to those items "rewritten" to use the full stack (360° video with memory). It has taken longer than they expected, so that's why FSD Beta 11 keeps getting pushed out with more 10.x releases.
 
I don't find the annoying twitches when starting a turn to be a "major" issue, and it's not a bug. It's working properly to follow the path.
Again, this is a very narrow view of what a bug is. I expect a new dev to take this view - not a user.

As Elon would say - let's go back to basics.

The basic idea is - FSD should drive like humans, only better (remember all that talk about why vision is enough because obviously humans don't have lidar?). No half-decent human driver ever drives like that. Current behavior is probably in the bottom 5% of human drivers. How is that not a bug that needs fixing?

BTW, I've no idea why you are defending the crazy jerky behavior. I get that you don't care for the issue - it's not your priority issue. But for others it is.
 
I think this is "just" a case of late predictions and earlier low confidence that should be "easily" fixed with more training data to make a confident prediction from further away. I've noticed various road layouts where lines and road edges are still blurry in the visualization even when just several feet away with a clear view, so I'm especially careful when the light turns green as most likely the predicted path was jumping around too.

This should indicate the neural networks can predict it with the current architecture/stack, so the weights need to be biased appropriately.

My hunch is that we haven't really seen any road layout or moving object improvements in 10.x since public beta because they're busy training with autolabeling to those items "rewritten" to use the full stack (360° video with memory). It has taken longer than they expected, so that's why FSD Beta 11 keeps getting pushed out with more 10.x releases.
I hope you’re right about that.

Agreed that there are many cases where the perception is surprisingly blurry when it seems like it should have high confidence. And hopefully a “major rewrite” gets them on a firm footing where they have very stable perception. And hopefully they can merge the voxel data with the labeled environment.

Anyway, I guess it seems like at least some of us are in agreement that the steering issues are as much tied to perception as they are path planning errors.

Getting to really high confidence (when it is warranted) in all of this perception seems like a big challenge.
 
Again, this is a very narrow view of what a bug is. I expect a new dev to take this view - not a user.

As Elon would say - let's go back to basics.

The basic idea is - FSD should drive like humans, only better (remember all that talk about why vision is enough because obviously humans don't have lidar?). No half-decent human driver ever drives like that. Current behavior is probably in the bottom 5% of human drivers. How is that not a bug that needs fixing?

BTW, I've no idea why you are defending the crazy jerky behavior. I get that you don't care for the issue - it's not your priority issue. But for others it is.
I'm not defending the crazy jerky behavior. I have repeatedly said it needs to eventually be addressed. I'm just saying it's behaving as a properly functioning, early iteration, control loop to follow a path.

I would say you have an overly broad definition of a bug. You seem to consider any software that doesn't behave the way you think it should to be buggy. My view is narrower than this: a bug is when it's not behaving as it was designed to. You just don't like Tesla's current design, which is fine.

As for priorities, I would much rather have them work on not turning into oncoming traffic first.
 
At speed on the highway, the turns are smooth and graceful. Perhaps there are fewer variables in the turn calculations than when sitting still at an intersection and starting a 90º turn. It's making constant adjustments as it tries to decipher its surroundings inch by inch. It can be comical, unsettling, or downright scary. A human can take in the complete context of the situation and make a seamless steering maneuver. FSD cannot yet do that. It remains to be seen if it ever can.
 
I'm not defending the crazy jerky behavior. I have repeatedly said it needs to eventually be addressed. I'm just saying it's behaving as a properly functioning, early iteration, control loop to follow a path.

I would say you have an overly broad definition of a bug. You seem to consider any software that doesn't behave the way you think it should to be buggy. My view is narrower than this: a bug is when it's not behaving as it was designed to. You just don't like Tesla's current design, which is fine.

As for priorities, I would much rather have them work on not turning into oncoming traffic first.
I think it is fair to say that this is the type of bug which is normally referred to as a “feature.”
 
You seem to consider any software that doesn't behave the way you think it should to be buggy. My view is narrower than this; it's note behaving as it was designed to. You just don't like Tesla's current design, which is fine.
What would you characterize errors in analysis & design as? Not bugs?

This is not something new I came up with - I studied this in my master's 30 years back and have practiced it for 30 years in the industry - raised and fixed thousands of bugs in others' and my own analysis & design. You remember the famous cartoon...?

Anyway, my last post on this. I think we both know where we stand on this.

0*ipMPQxjqxX7nkHdK.png
 