
FSD Beta 10.69

Ah, serves me right for only half-reading through the new posts. The viz does not include enough detail to show what is most relevant to the car's decision making; you'd need the dev view to understand better. I suspect the most common issue is that it sees what it thinks is a valid lane but isn't (let alone other attributes like "is this a turn lane", etc.), and this screws up the decision tree further, even if the visualization itself looks OK.
In the particular instance this morning for me, I'm fairly sure it was the lead vehicle that caused the problem. As soon as the light went yellow, the truck started to slow, so the car (which had just changed into the lane behind the truck and started tailgating it) decided to go around. All sorts of issues with that.
 
Just checking: how would you classify this example, where the car switches out of a right-turn lane for a right turn just 200 feet away, starting the maneuver from about 500 feet out:
[Video attachment 852133]

(Here's a transcript 😜: No. It should not go… oh it's moving over for that jogger… interesting. Don't. NO! <disengage> Ugh! It's not supposed to go in that lane. <reengage> <disengage> Still wants to go over there though. <reengage> <disengage> Why? <reengage> <disengage> It wants to go around <reengage> those cars? I see the path. It's think… No! <disengage> No. Wow. This is SUPER annoying. <reengage> <disengage> I'm just gonna wait. I'm gonna wait until I start going forward to reengage. <reengage> <disengage> Wow… What is THAT? Is that a map thing? Is it just poor lane selection <reengage> going on here? What is it DOING? <disengage> Oh my gosh. I'm so annoyed. <reengage> So annoyed.)
Yep, lead vehicles again. It makes no sense; it obviously knows where it is going and where it is. Just bad planning. Very common. It's obviously more common with stationary vehicles, and maybe tougher then, but the vehicles definitely don't need to be stationary!
 
I hope I didn't make a typo. No, I am not saying it detects when it did something wrong. What I understood (perhaps wrongly) is that the car captures braking events and disengagements to report after a trip, even without manually clicking the report button. I'm not sure whether that is correct, so I don't want to be misleading before going out there on Friday and asking about it in detail.
Of course, which is why I said it's helpful to Tesla to press the accelerator when the vehicle is slowing erroneously; that data is also sent to help them train the system. @sleepydoc was saying that the seemingly erroneous slowing is due to the system "working" (i.e., processing) and that by pressing the accelerator you may be depriving Tesla of information on how it would have eventually decided to proceed. I questioned how one can distinguish between slowing or pausing due to working/processing time and slowing or pausing due to the system having already made an incorrect calculation. That question remains open.
 
It might be easy for Tesla engineers to add a slight delay to the NN handling speed-based lane changes, and base that delay on the assertiveness profile (Chill, Standard, Assertive). In Chill mode, it would sit happily in a lane and wait some number of seconds or minutes before deciding to go around. Standard would be 50% of Chill's value, and Assertive would be 50% of Standard's. Say Chill is set to 2 minutes: it would wait 2 minutes before attempting to go around, Standard would wait 1 minute, and Assertive would wait 30 seconds.

I think that could solve a decent number of problems people are experiencing with the car being "impatient" and moving around cars when it clearly needs to stay in its lane to make a turn.
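A minimal sketch of that halving scheme in Python (the names and the 2-minute base value are mine, purely illustrative, not anything Tesla exposes):

```python
from enum import Enum

class Profile(Enum):
    CHILL = 0
    STANDARD = 1
    ASSERTIVE = 2

# Hypothetical base patience; each more assertive profile halves it:
# Chill 120 s -> Standard 60 s -> Assertive 30 s.
CHILL_PATIENCE_S = 120.0

def patience_seconds(profile: Profile) -> float:
    """Seconds to sit behind a slow or stopped lead vehicle before
    the planner may consider going around."""
    return CHILL_PATIENCE_S / (2 ** profile.value)

def may_go_around(profile: Profile, seconds_waited: float,
                  turn_is_imminent: bool) -> bool:
    # Never go around if this lane is needed for an upcoming turn.
    if turn_is_imminent:
        return False
    return seconds_waited >= patience_seconds(profile)

for p in Profile:
    print(p.name, patience_seconds(p))  # 120.0, 60.0, 30.0
```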
 
I wager Highway AP is much better than 100 miles/disengagement. I suspect most of those disengagements are probably from lane changes (for normal AP people) and executing more assertive lane changes in moderate/heavy traffic (for NOA users). I suspect safety disengagement rate for AP could be in the thousands of miles/disengagement.
It may depend on the highway. I did a 1500-mile roadtrip recently on NoA (from LA to Oregon and back), and had at least 10-15 safety-critical disengagements. Some were when the car incorrectly swerved into turnout lanes or left-turn lanes and I had to yank it back; some were when it put itself on a collision course with safety cones/posts/guardrails; some when it got itself into a leftmost lane that was ending and refused to slow down to merge with traffic, then panicked when it ran out of room. Some were when it slowed down for construction zones and failed to speed up again afterward, thinking the speed limit was still 25mph when all the surrounding cars were going 70mph. (These are absolutely safety disengagements.)

The thing about highway driving is that it's much more predictable when NoA/AP will make mistakes, and when it won't. If you're on a perfectly ordinary wide-open stretch of highway, you can pretty much relax. (Though of course keep watching and supervising.) On city streets with FSD Beta, you have to watch it like a hawk; it's far less predictable when it will make mistakes, and that's why it's so mentally taxing to use it at present.
 
It may depend on the highway. I did a 1500-mile roadtrip recently on NoA (from LA to Oregon and back), and had at least 10-15 safety-critical disengagements. Some were when the car incorrectly swerved into turnout lanes or left-turn lanes and I had to yank it back; some were when it put itself on a collision course with safety cones/posts/guardrails; some when it got itself into a leftmost lane that was ending and refused to slow down to merge with traffic, then panicked when it ran out of room. Some were when it slowed down for construction zones and failed to speed up afterward, thinking the speed limit was still 25mph when all the surrounding cars were going 70mph. (These are absolutely safety disengagements.)

The thing about highway driving is that it's much more predictable when NoA/AP will make mistakes, and when it won't. On city streets with FSD Beta, you have to watch it like a hawk; it's far less predictable when it will make mistakes, and that's why it's so mentally taxing to use it at present.
I assume these weren't "limited access" highways? What you're saying makes sense. If there are turning lanes showing up in the middle of the highway, I absolutely expect AP could get confused. I was thinking more of multi-lane limited-access freeways, where there's nowhere to go if you just stay in your lane. Construction zones are another good example of where AP would very likely require a disengagement.
 
Thought it would be worth adding an article with a good clickbait title to this discussion of 10.69.2. It's relevant because it links to a couple of videos I haven't seen (as well as another example of the blind-hill-problem video being discussed in the other thread).

Overall the article seems very fair. I'm not scared of FSD Beta though, because I am responsible for accidents when I am driving.

 
I assume these weren't "limited access" highways? What you're saying makes sense. If there are turning lanes showing up in the middle of the highway, I absolutely expect AP could get confused. I was thinking more of multi-lane limited-access freeways, where there's nowhere to go if you just stay in your lane. Construction zones are another good example of where AP would very likely require a disengagement.
They were not all limited access per se, but they were still "highways" as far as Autopilot is concerned. (FSD Beta would fall back to NoA here.) Even limited-access highways can have construction and dedicated exit lanes and turnouts and such. It's cherry-picking to exclude the parts of highways one doesn't like from the highway statistics. Granted, I'm sure you could find individual stretches of highway on which the current disengagement rate is less than 1 per 1,000 miles, but that's not representative of highways in general.
 
They were not all limited access per se, but they were still "highways" as far as Autopilot is concerned. (FSD Beta would fall back to NoA here.) Even limited-access highways can have construction and dedicated exit lanes and turnouts and such. It's cherry-picking to exclude the parts of highways one doesn't like from the highway statistics. Granted, I'm sure you could find individual stretches of highway on which the current disengagement rate is less than 1 per 1,000 miles, but that's not representative of highways in general.
This is why I would like to see single-stack released soon. I drive high-speed two-lane rural highways that FSD Beta hands over to NoA. For the most part they are fine, but I worry about how NoA would handle cross traffic, as these roads have no on- or off-ramps and plenty of intersections outside the small city limits where NoA takes over.
 
Finally a Beta that doesn't "upset 🤢 the stomach" and is so much smoooooother. I'm thinking this will be the first version that passengers will be able to tolerate. All the unnecessary and unpredictable jerk is what bothers passengers the most and instills a complete lack of confidence. Has anyone tried 69ing (get your mind out of the gutter 😇) with someone who has been reluctant to let you use it in the past?
I'm just waiting for 10.69.2 to actually be smooooother than 10.12.2, but so far for me it's actually jerkier. It did seem a little better today, though, so I'm going to remain optimistic.
 
Yep, lead vehicles again. It makes no sense; it obviously knows where it is going and where it is. Just bad planning.
I agree the lead vehicles blocking the view of the road did complicate things, but I would think this resulted in bad perception causing the planner to act on bad information.

One set of mispredictions happened as the van entered this immediate intersection, temporarily blocking the view of the upcoming intersection. This resulted in 10.69.2 wanting to switch lanes to "follow route," perhaps believing it had gotten into a right-turn-only lane for the intersection the van had just come from:
[Screenshot: kim follow route.jpg]


Another set of mispredictions could have been identifying lead vehicles as parked, and/or the occupancy network indicating static objects in the road ahead, resulting in "path blockage." The latter could have been complicated by the fact that the road starts to curve left without a clear view of the curb:
[Screenshot: kim path blockage.jpg]


Interestingly, Tesla theoretically could take this "lead vehicle problem" and turn it into inputs for various neural networks (currently the moving-objects network seems separate from the lanes and occupancy networks), so that seeing multiple lead vehicles behave a certain way could imply that lane connectivity continues in the first example and that nothing is parked in or occupying the space ahead in the second.
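As a toy illustration of that idea (this is not Tesla's architecture; every name here is made up), one could treat space that several distinct tracked lead vehicles recently drove through as connected, unoccupied road, even while it is occluded:

```python
from dataclasses import dataclass, field

Cell = tuple  # (row, col) coarse grid cell ahead of the ego car

@dataclass
class DrivabilityPrior:
    """If several distinct lead-vehicle tracks passed through a cell
    recently, treat it as connected, drivable, unoccupied road."""
    min_tracks: int = 2
    seen: dict = field(default_factory=dict)  # Cell -> set of track ids

    def observe(self, track_id: int, cell: Cell) -> None:
        self.seen.setdefault(cell, set()).add(track_id)

    def likely_drivable(self, cell: Cell) -> bool:
        return len(self.seen.get(cell, ())) >= self.min_tracks

prior = DrivabilityPrior()
for tid, path in {7: [(0, 1), (0, 2)], 9: [(0, 1), (0, 2)]}.items():
    for cell in path:
        prior.observe(tid, cell)
print(prior.likely_drivable((0, 2)))  # True: two vehicles drove through it
```

In a learned system this would presumably be an extra input feature to the lanes/occupancy networks rather than a hand-written rule.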
 
but I would think this resulted in bad perception causing the planner to act on bad information.
I mean, sure, there might transiently be bad results in perception due to occlusion, but it seems easy enough to construct the scene. It's not like the scene is changing all the time.

And I agree with your points about refining their perception and how to handle stopped vehicles (that seems harder than handling vehicles that are moving quickly, which it also screws up).

As I said, there are probably examples with actual bad perception but there are tons and tons of just straight bad decisions (often, but not always, involving a lead vehicle). I agree that a lead vehicle might lead to different perception confidence about the scene directly ahead, and a resultant change in the path planner, but to me that doesn't seem like a problem with perception. It's fine if the vehicle can't see something, as long as it's seen it before, and knows that it needs to be in the rightmost lane (which it usually has an excellent handle on). The planner can keep track of that stuff in its memory. Who cares if the car can't see in front of the vehicle ahead? Planner should just stay in the right lane.

I'm assuming that FSD at SOME level is using maps and needs to know that it needs to be turning right, and for that it needs to be in the rightmost lane. So it's just bad planning to change lanes in most circumstances especially if there is not time to pass traffic and get back in the correct lane.

Anyway, in the above case it seems like it was fine to change lanes; it can get in front of all those vehicles and then cut in at the last minute. Ha. (I haven't gone to look at Google Maps to see if that's legal here.)


Here's an example from me: what perception explanation is there for this?

If I (and most Tesla owners) had been driving, I would have been going 50 mph in 3 seconds (after making sure the intersection was clear, with nobody running lights!), and I would have made the next light, to be clear. This was an awful performance. If I hadn't been deliberately letting FSD do its thing, I would have floored it. Its lack of assertiveness just got it into trouble here and made driving much more difficult. No one wants to be behind a pickup with a ladder, apparently including FSD. But it never would have been successful with this approach.

 
This is why I would like to see single-stack released soon. I drive high speed 2-lane rural highways that FSD beta hands over to NOA. For the most part they are fine, but I worry about how NOA would handle cross traffic as these roads have no on or off ramps and plenty of intersections outside of the small city limits where NOA takes over.
I was out running errands today and used a mixture of NoA and FSD. I had more disengagements with NoA than I did with FSD. Most of them were because NoA is so slow to change lanes that it kept missing exits that had short merging distances.
 
So we are now in the timeline of "FSD Beta saved my life"? In the same video where it puts you in dangerous situations that require safety disengagements?

Tesla clout chasers smh.
OP here for that video... Clout chaser? Maybe for some of the new guys just getting these releases, not me... Like Chuck Cook (whom I know), I've spent countless hours testing and posting videos when I can, and I have been beta testing since Oct 2020. My goal is to show the progress and improve safety with this technology.
 
The planner can keep track of that stuff in its memory. Who cares if the car can't see in front of the vehicle ahead? Planner should just stay in the right lane.
Hah. This made me think of the common "just work around it in software" approach (e.g., when hardware behaves incorrectly, or here, when a network mispredicts). Maybe Tesla will need to do something like that, but their approach so far has been to push more of the problem into neural networks to avoid these "quick fixes" (which often just spawn more quick fixes instead of addressing an underlying issue whose fix could resolve multiple classes of problems at once).

In the case where it was just 200 feet / 3 vehicles away from the turn, what if the 2nd vehicle ahead truly was parked? Now the planner workaround to just stay in the right lane needs exceptions with adjustable parameters for how long it can stay in the lane, with maybe extra caveats of not counting time when the light was actually red, or when the light was green but pedestrians were in the crosswalk. And what if the lead vehicle inched up a little bit; should all of these timers be reset, etc.?
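To make those combinatorics concrete, here's a sketch of what such a hand-tuned timer might look like (all parameters and inputs are hypothetical):

```python
def go_around_allowed(samples, patience_s=30.0, creep_reset_m=0.5):
    """Count time stuck behind a lead vehicle, but excuse red lights
    and occupied crosswalks, and reset whenever the lead vehicle
    inches forward. `samples` is a list of hypothetical per-tick
    tuples: (dt_s, light_is_red, crosswalk_occupied, lead_moved_m)."""
    waited = 0.0
    for dt_s, light_is_red, crosswalk_occupied, lead_moved_m in samples:
        if light_is_red or crosswalk_occupied:
            continue  # excused delay: not the lead vehicle's fault
        if lead_moved_m >= creep_reset_m:
            waited = 0.0  # lead inched up, so maybe not parked after all
            continue
        waited += dt_s
        if waited >= patience_s:
            return True  # planner may now consider going around
    return False

# 40 s stuck behind a motionless lead vehicle at a green light:
print(go_around_allowed([(1.0, False, False, 0.0)] * 40))  # True
```

Every branch above is another tunable and another edge case, which is presumably why Tesla would rather the networks just stop mispredicting.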

It does seem like various networks still lack the temporal memory for consistency even a year after the 2021 AI Day where it was presented, so maybe Tesla knows they'll have these mispredictions fixed "soon," making it not worthwhile to engineer these workarounds in the planner. But then again, maybe they won't be able to get the networks to make the correct predictions and will end up needing a workaround anyway. Or maybe they have a different approach that they'll show off at 2022 AI Day in less than 3 weeks.
 
Hah. This made me think of the common "just work around it in software" approach (e.g., when hardware behaves incorrectly, or here, when a network mispredicts). Maybe Tesla will need to do something like that, but their approach so far has been to push more of the problem into neural networks to avoid these "quick fixes" (which often just spawn more quick fixes instead of addressing an underlying issue whose fix could resolve multiple classes of problems at once).
I definitely don't mean to trivialize the task. I'm just saying it's easy for me to tell that those vehicles aren't parked. I have no idea how they're going to figure it out though. To me it seems like it would be difficult to code something up to determine what to do if you can't figure out which vehicles are parked.

However, in order not to get distracted by parked-vehicle problems, see my video above for a case that has nothing to do with vehicles being parked. Isn't this just trivial planning?
 
Of course, which is why I said it's helpful to Tesla to press the accelerator when the vehicle is slowing erroneously; that data is also sent to help them train the system. @sleepydoc was saying that the seemingly erroneous slowing is due to the system "working" (i.e., processing) and that by pressing the accelerator you may be depriving Tesla of information on how it would have eventually decided to proceed. I questioned how one can distinguish between slowing or pausing due to working/processing time and slowing or pausing due to the system having already made an incorrect calculation. That question remains open.
I think you misunderstood my post - I said that pressing the accelerator was essentially as bad as a disengagement, and in response to a follow-up post I said that it didn't matter whether it was because the system was slow or just unable to proceed. I also hypothesized that pressing the accelerator *may* send a report to Tesla.
 
If I had been driving, I would have been going 50 mph in 3 seconds (after making sure the intersection was clear, with nobody running lights!), and I would have made the next light, to be clear
This first issue is most likely caused by the planner. It knows it needs to get multiple lanes to the right. The visualization unhelpfully shows nothing when stopped, although navigation does indicate "Upcoming lane change," so the planner is choosing to go very slowly to let the adjacent vehicle get ahead so it can complete its lane change; it can control its own speed but not the speed of other vehicles. It's "easier" to go, say, 10 mph slower than the adjacent vehicle than to try to pass 5 mph faster, which also makes you travel more distance and makes you more likely to miss your turn if you can't complete the multiple lane changes. When the light turns green, it right away highlights the adjacent lane and vehicle in blue. Should the planner be allowed to go 0-50 mph in 3 seconds?
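Some back-of-the-envelope numbers (mine, purely illustrative) show the asymmetry: shifting one gap length relative to adjacent traffic takes twice as long at +5 mph as at -10 mph, and the car covers roughly three times as much road doing it:

```python
MPH_TO_MS = 0.44704

def time_and_distance(gap_m, traffic_mph, delta_mph):
    """Time to shift one gap length relative to adjacent traffic,
    and the road distance the ego car covers while doing it."""
    t = gap_m / (abs(delta_mph) * MPH_TO_MS)
    ego_speed = (traffic_mph + delta_mph) * MPH_TO_MS
    return t, ego_speed * t

# 15 m gap to clear, adjacent traffic at 45 mph:
print(time_and_distance(15, 45, -10))  # ~3.4 s, ~53 m: drop back behind
print(time_and_distance(15, 45, +5))   # ~6.7 s, ~150 m: pass ahead
```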

Then the planner "follows too closely" behind the truck because it decreases the follow-distance "budget" in order to achieve the higher-level goal of getting to the correct lane. Of course, FSD Beta will still only actually make the lane change if it believes it can stop in time, with a hard minimum follow-distance limit based on what it perceives as the lead vehicle's position, velocity, etc., and here it seemed to complete the lane change with about a 0.5-second follow distance, indeed shorter than what a "1" setting would normally allow.
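A crude sketch of that kind of gate (the 0.5 s floor and braking numbers are placeholders, not Tesla's actual values):

```python
def lane_change_gap_ok(gap_m, ego_mps, lead_mps,
                       min_time_gap_s=0.5, max_decel_mps2=6.0):
    """Allow the tighter-than-normal follow distance only if a hard
    minimum time gap holds AND the ego car could still stop behind
    the lead vehicle if it braked hard to a stop."""
    if gap_m < ego_mps * min_time_gap_s:
        return False
    ego_stop = ego_mps ** 2 / (2 * max_decel_mps2)
    lead_stop = lead_mps ** 2 / (2 * max_decel_mps2)
    return gap_m + lead_stop > ego_stop + 2.0  # 2 m standstill margin

# 8 m gap at 15 m/s (~34 mph) behind a lead also doing 15 m/s:
print(lane_change_gap_ok(8.0, 15.0, 15.0))  # True: ~0.53 s time gap
```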

The unnecessary left lane change was initiated just before Scripps Highland Drive, as the map data most likely indicated there was a right-turn-only lane (OSM: turn:lanes=reverse;left|left|||||right). However, if we believe FSD Beta currently has about a 15-frame delay between seeing something and acting on it, we need to go back about half a second from here, where the message shows "follow route":
[Screenshot: subie bad follow.jpg]


Only here does the solid white line become dashed for the upcoming right-turn-only lane, so half a second earlier it's even less clear that there will be a dedicated turn lane instead of a wide bike lane. But this is exactly what the "deep lane guidance" introduced with 10.69 is designed to solve: with more training data, your video snapshot from a few seconds later, with a clear view of the "future" lanes, should let later versions of FSD Beta predict "it's okay to stay in this lane and ignore the map data, as it looks like there's enough space on the right for a new forked lane that will actually be the right-turn-only lane."
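For reference, the OSM turn:lanes value quoted above uses a simple encoding: "|" separates lanes left to right, ";" separates multiple allowed movements within one lane, and an empty entry means no indicated turn. A minimal parser:

```python
def parse_turn_lanes(value: str) -> list:
    """Split an OSM turn:lanes tag into per-lane movement lists;
    empty entries become ["none"] (no indicated turn)."""
    return [lane.split(";") if lane else ["none"]
            for lane in value.split("|")]

lanes = parse_turn_lanes("reverse;left|left|||||right")
print(len(lanes), lanes[0], lanes[-1])  # 7 ['reverse', 'left'] ['right']
```

So the map says seven lanes with the rightmost being right-turn-only, which is consistent with the planner trusting map data before the dashed line is even visible.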

If future versions don't improve with more training, maybe there needs to be more beer… 🍺😜
 