FSD Beta 10.69

Yep. Either that or just have map data that is right.
FSD will never work without a better navigation system.
Apple Maps tells you exactly what lane to be in at all times when navigating. I don’t know why this is so hard. Seems much simpler than solving something crazy like Chuck’s turn.
It's interesting, I was thinking about this problem while using FSDb on my morning commute. What if this bit of common sense ("seems much simpler than solving something crazy like Chuck's UPL") is entirely wrong? What if having the car understand where it is on the road in relation to where it needs to be is incredibly difficult? Think about it: you need to be able to do image recognition not just of a stop sign or a traffic light or of a VRU or of a car (all of which are hard in and of themselves)...you need to be able to recognize EVERYTHING around the car and then infer where the car is based on what it can see. I'm starting to think that the reason why lane selection is so bad is that the car is doing a bunch of guessing, and maybe 75% of the time it guesses right (mostly because there are only a few even remotely correct answers), but when it fails 25% of the time, it does so in a big and very noticeable way. Add to this bad mapping data and you have a really, really hard problem to solve, much harder than Chuck's crazy turn.
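To put rough numbers on that guessing intuition (a minimal sketch; the 75% hit rate and the six-decision drive are assumptions, not measurements):

```python
# Illustrative only: treats each lane decision as an independent guess
# with a fixed hit rate. Real decisions are surely correlated.
lane_decisions_per_drive = 6   # assumed: roughly one per turn on a commute
p_correct = 0.75               # the guessed per-decision accuracy above

p_clean_drive = p_correct ** lane_decisions_per_drive
print(f"P(no lane mistakes on a drive) = {p_clean_drive:.0%}")    # ~18%

expected_mistakes = lane_decisions_per_drive * (1 - p_correct)
print(f"Expected lane mistakes per drive = {expected_mistakes}")  # 1.5
```

Even at 75% per decision, fewer than one drive in five would be mistake-free, which lines up with how constant and noticeable the failures feel.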
 
It's interesting, I was thinking about this problem while using FSDb on my morning commute. What if this bit of common sense ("seems much simpler than solving something crazy like Chuck's UPL") is entirely wrong? What if having the car understand where it is on the road in relation to where it needs to be is incredibly difficult? Think about it: you need to be able to do image recognition not just of a stop sign or a traffic light or of a VRU or of a car (all of which are hard in and of themselves)...you need to be able to recognize EVERYTHING around the car and then infer where the car is based on what it can see. I'm starting to think that the reason why lane selection is so bad is that the car is doing a bunch of guessing, and maybe 75% of the time it guesses right (mostly because there are only a few even remotely correct answers), but when it fails 25% of the time, it does so in a big and very noticeable way. Add to this bad mapping data and you have a really, really hard problem to solve, much harder than Chuck's crazy turn.
It’s just poorly coded. In many of these situations, the visualizations show the correct lane. It just does the wrong thing all the time, like changing to the next lane to the left immediately before a right turn it has to make.
There is no confusion on where the car is or where it is going. It just does nonsensical things with that input.

On my drive I have six turns, and on the right turn to the freeway, since there was a pickup in front going a normal speed, it decided to try to change lanes to the left before the freeway on-ramp on the right. I just disengaged and reported. Re-engaged; a few seconds later it failed to signal right as it went right, so I reported again.

3-4 disengagements this morning on 4-5 miles of surface streets. Probably about 10 interventions.

Are there cases with bad perception? Sure. But not the explanation of most behavior.
 
Has anybody else stopped to think that letting FSD Beta complete a drive with errors is not the right thing to do, even if the system ended up correcting them? From any other human driver those same moves would have counted as errors, or even “kind of” unsafe.

If you don’t disengage when FSD does something wrong but not deadly, you are telling the system/program that it did it right, meaning no changes are necessary for that drive.

Those errors and “close calls” are never reported as such if you don’t disengage and send a snapshot of what happened. It’s just kind of odd to read here and watch videos of people saying great job / no disengagements when a lot of those drives had obvious errors no human would have made.
Either people will eventually start reporting those minor issues once we get past the major issues (1), or we'll all collectively agree that those errors are within tolerance and don't need to be fixed (2).

(1) There will always be the "worst 5 things FSD does", except what those 5 things are will be less and less impactful over time. Today it runs red lights and eats your children; tomorrow it takes turns fast enough to make you spill your beer. Today's "great job" is tomorrow's "this is why FSD will never work with only 29 cameras." So these sorts of issues might get fixed some day anyway.

(2) Or maybe they won't, because we'll decide that sitting back and enjoying a mojito in the back seat is worth that jerky left turn, which would let Tesla get out of ever having to fix it.
 
Yep. Either that or just have map data that is right.
FSD will never work without a better navigation system.
Apple Maps tells you exactly what lane to be in at all times when navigating. I don’t know why this is so hard. Seems much simpler than solving something crazy like Chuck’s turn.
Apple uses LIDAR-generated HD maps. Tesla does not and will never use HD maps; they're a crutch.
 
It’s just poorly coded. In many of these situations, the visualizations show the correct lane. It just does the wrong thing all the time, like changing to the next lane to the left immediately before a right turn it has to make.
There is no confusion on where the car is or where it is going. It just does nonsensical things with that input.

On my drive I have six turns, and on the right turn to the freeway, since there was a pickup in front, it decided to try to change lanes to the left before the freeway on-ramp on the right. I just disengaged and reported.

3-4 disengagements this morning on 4-5 miles of surface streets.

Are there cases with bad perception? Sure. But not the explanation of most behavior.
You can even see what lane you are supposed to be in if you look at the navigation route planner. The car knows what lane it's supposed to be in. It just seems like:

1) The planner knows where it needs to be, but the other NNs that handle things like routing around parked cars or moving to faster lanes are weighted more heavily in the decision-making process.
2) The planner is not taking mapping data into consideration; what we see in the route info is not being relayed to the planner properly.
3) The planner has the mapping information, but is having problems visually identifying the lane markings and deciding which lane is correct (i.e., it can't interpret the arrows that are often painted in the center of lanes).

My gut, just from experience and not any formal education on this matter, is that #1 is the issue. My car routinely knows that a turn is coming up and usually starts making its lane changes at the 1-mile mark. When a right turn is coming up next and the car is in the left lane, as soon as the route info shows 1 mile to the turn, I start to see the little blue upcoming-lane-change notice. Without traffic, the car moves over nicely and prepares for the turn. However, every so often, if there is traffic, the car moves over to the right lane and then moves back over towards the left because there is a line of cars in the right lane waiting to turn. In this case the planner should be able to override the speed-based lane-change NN.
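Here's a toy sketch of hypothesis #1 with entirely invented names, weights, and numbers (nothing from Tesla's actual planner): if a "faster lane" signal can outweigh the "route lane" signal, a slow lead car is enough to pull the car out of the correct lane right before a turn.

```python
# Invented arbitration between two lane-scoring signals; all values made up.

def lane_score(lane, route_lane, lane_speeds, w_route=1.0, w_speed=2.0):
    """Higher is better. With w_speed > w_route, a slow lead car pulls
    the planner out of the lane the route actually needs."""
    route_bonus = w_route if lane == route_lane else 0.0
    speed_bonus = w_speed * lane_speeds[lane] / max(lane_speeds.values())
    return route_bonus + speed_bonus

lane_speeds = {"left": 45, "right": 20}  # pickup doing 20 in the right lane
route_lane = "right"                     # right turn coming up

best = max(lane_speeds, key=lambda ln: lane_score(ln, route_lane, lane_speeds))
print(best)  # "left" -> the wrong lane, because speed outweighs the route
```

Ramping the route weight up as the turn gets closer (or making the route lane a hard constraint inside some distance) is exactly the kind of override I'd expect the planner to apply.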

To be fair - it's a hard problem to calculate. I, a well experienced and smart human, have had this happen to me several times in my years. I'm on a freeway and my exit lane is coming up, but there is a long line for it. I wait patiently in line (I hate line cutters with a passion, but that's another topic), and traffic moves very slowly. Come to find out, there was an accident that I could not see, and people start slowly exiting the lane and working around the accident. In the end, I wasted several minutes sitting in a lane that wasn't going to move, but how was I to know? Had I cut out of the lane and found it was just heavy congestion, I would have had to cut back in towards the front of the line, pissing everyone else off. I think the Tesla is doing this to some extent - assuming traffic isn't moving in your lane very well, but moving nicely in the lane next to you, so it tries to route around what could be an accident or mechanical failure. Otherwise, it'll just sit there and not move until the accident is cleared, or the driver takes over and moves around manually.
 
Beta needs to be able to read the posted lane use signs as a human does.
Hopefully better than humans lol. I've had people almost run into me because they didn't understand the turn-lane guidance signs before (which are admittedly confusing since they don't cover all the lanes, at least not in Texas).
Yep. Either that or just have map data that is right.
FSD will never work without a better navigation system.
Apple Maps tells you exactly what lane to be in at all times when navigating. I don’t know why this is so hard. Seems much simpler than solving something crazy like Chuck’s turn.
I just learned about Waze's "Far Lanes" feature, and I super wish FSD had access to that kind of information. That way it'd know well ahead of time which lane it needs to be in for the best route.

There's one road where I live where the left lane continues straight and the right lane is a turn-only, and the line to go straight backs up a couple of miles. I'm constantly cancelling the lane change into the right lane because FSD is tired of waiting in line and would rather wait near the front for someone to let us in (which they won't).
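Lane-level data like that would let the planner fold the expected queue into the route cost up front, instead of discovering it at the merge point. A sketch with invented numbers (I have no idea what the actual "Far Lanes" data looks like):

```python
# Invented lane-aware cost comparison; all times and probabilities made up.

def expected_minutes(time_in_lane_min, merge_wait_min=0.0, p_let_in=1.0):
    """Expected travel time, inflating the merge wait by how unlikely
    it is that anyone lets you in."""
    return time_in_lane_min + merge_wait_min / max(p_let_in, 1e-6)

queue_early = expected_minutes(6.0)  # join the line a couple miles back
cut_in_late = expected_minutes(1.0, merge_wait_min=4.0, p_let_in=0.25)
print(queue_early, cut_in_late)      # 6.0 vs 17.0 -> queueing early wins
```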
 
They are a crutch, but perhaps a necessary one in a world where we have not solved generalized human-like AI.

This conflates two entirely different issues.

Lidar is a crutch for perception if you can't solve vision

All of which feeds into simply understanding what is around you (same with radar, FLIR, etc)

Generalized AI would then be needed to decide what to do about what is around you, both immediately and in the near future.

Most complaints I've seen with FSDBeta are behavioral problems, not perception ones (see all the "the visualization is absolutely correct the car just decides to do the wrong thing" comments of late)
 
If you don’t disengage when FSD does something wrong but not deadly, you are telling the system/program that it did it right, meaning no changes are necessary for that drive.
If you mean "system" as some automated loop, I don't think there's reinforcement training of the neural networks at the level of a whole maneuver as a lot of that code is still software 1.0 and lacks training data to bias towards this "wrong" behavior anyway. If "system" is just the feedback cycle to Tesla in its automated trip summary reports being skewed lower because some people push the limits for "no disengagement," this likely washes out to noise across the whole fleet.

Even without manually pushing the video snapshot button, Tesla can still have shadow mode triggers detecting patterns such as earlier mispredictions that were completely different from later predictions that might have resulted in "close calls." I would still highly recommend people push the snapshot button, but if you forget or choose not to, Tesla could still get the data even without disengagement.
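To be concrete about what such a trigger could look like, here's a guess (the function, threshold, and data layout are all my invention, not anything confirmed about Tesla's triggers):

```python
# Hypothetical shadow-mode-style trigger: flag moments where an early
# prediction about another agent diverged badly from what actually happened.

def misprediction_events(predicted_positions, actual_positions,
                         horizon=30, err_threshold_m=3.0):
    """predicted_positions[t]: where, at time t, the agent was expected to
    be at time t + horizon. actual_positions[t]: where it really was at t.
    Yields (t, error_m) for frames worth uploading as a possible close call."""
    for t, (px, py) in enumerate(predicted_positions):
        t_future = t + horizon
        if t_future >= len(actual_positions):
            break
        ax, ay = actual_positions[t_future]
        err = ((px - ax) ** 2 + (py - ay) ** 2) ** 0.5
        if err > err_threshold_m:
            yield t, err

# e.g. a car predicted to continue straight that actually turned across us
preds  = [(0.0, 10.0)] * 5                    # predicted: 10 m ahead, straight
actual = [(0.0, 0.0)] * 2 + [(6.0, 4.0)] * 5  # reality: it cut across our path
print(list(misprediction_events(preds, actual, horizon=2)))  # several flags
```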

Even if Tesla doesn't get the feedback signal "changes necessary" for a particular drive, there are a lot of signals from the whole fleet of similar-enough situations that result in general improvements to neural network predictions that might make a future version drive the original situation just fine.

This could also be another reason why Tesla might want to expand FSD Beta population with 10.69 as indeed existing testers could be biasing the feedback loop by not disengaging or not even activating FSD Beta in the first place for "known issues."
 
Most complaints I've seen with FSDBeta are behavioral problems, not perception ones (see all the "the visualization is absolutely correct the car just decides to do the wrong thing" comments of late)
You could argue that many of these are also caused by perception problems though. Any mistakes made in distance / velocity can and will translate to doing the wrong thing - if it thinks a pedestrian is moving when he/she isn’t, if it can’t deduce that a car is turning into your path instead of going straight, etc.
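To make that concrete with a toy example (invented numbers and a far cruder decision rule than anything in the real stack): a perception error in speed alone is enough to flip a yield/go decision even when the planning logic is fine.

```python
# Toy illustration: a misread speed flips a go/yield decision.

def should_yield(other_dist_m, other_speed_mps, my_gap_s=3.0):
    """Yield if the other agent reaches the conflict point within my gap."""
    if other_speed_mps <= 0:
        return False  # perceived as stopped, so the planner proceeds
    return other_dist_m / other_speed_mps < my_gap_s

print(should_yield(20.0, 8.0))  # true state: 2.5 s away -> True (yield)
print(should_yield(20.0, 0.0))  # misperceived as stopped -> False (go)
```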
 
You could argue that many of these are also caused by perception problems though. Any mistakes made in distance / velocity can and will translate to doing the wrong thing - if it thinks a pedestrian is moving when he/she isn’t, if it can’t deduce that a car is turning into your path instead of going straight, etc.
We were talking (in this case) about basic navigation decisions, in response to a strange post stating that the Tesla might not know where it is on the road.

But the perception nearly always shows the vehicle position correctly, shows the correct path at least for some period of time (often doing weird things as soon as the error starts to occur), and the navigation is providing the correct guidance on upcoming turns.

But it still screws it up. Just does completely the wrong thing for no apparent reason. It does seem to often occur when there is a lead car, but the behavior still makes absolutely zero sense.
 
Are there cases with bad perception? Sure. But not the explanation of most behavior.
Just checking: how would you classify this example of switching out of a right-turn lane for a right turn just 200 feet away, starting the switch from 500 feet away:
[Screenshot: Kim's 10.69.2 drive, switching out of the right-turn lane]


(Here's a transcript 😜: No. It should not go… oh it's moving over for that jogger… interesting. Don't. NO! <disengage> Ugh! It's not supposed to go in that lane. <reengage> <disengage> Still wants to go over there though. <reengage> <disengage> Why? <reengage> <disengage> It wants to go around <reengage> those cars? I see the path. It's think… No! <disengage> No. Wow. This is SUPER annoying. <reengage> <disengage> I'm just gonna wait. I'm gonna wait until I start going forward to reengage. <reengage> <disengage> Wow… What is THAT? Is that a map thing? Is it just poor lane selection <reengage> going on here? What is it DOING? <disengage> Oh my gosh. I'm so annoyed. <reengage> So annoyed.)
 
I think you might have made a typo somewhere in there...
The only feedback the user can provide is accelerator input, disengagements, and the report button. Or are you saying that the car can detect when it's doing something wrong without any user intervention?
I hope I didn't make a typo. No, I am not saying it detects when it did something wrong. From what I understood (and maybe wrongly), the car captures braking and disengagements to report after a trip even without manually clicking the report button. Not sure if that is correct or not, so I don't want to be misleading without going out there on Friday and asking again about that in detail.
 
Just checking: how would you classify this example of switching out of a right-turn lane for a right turn just 200 feet away, starting the switch from 500 feet away:

(Here's a transcript 😜: No. It should not go… oh it's moving over for that jogger… interesting. Don't. NO! <disengage> Ugh! It's not supposed to go in that lane. <reengage> <disengage> Still wants to go over there though. <reengage> <disengage> Why? <reengage> <disengage> It wants to go around <reengage> those cars? I see the path. It's think… No! <disengage> No. Wow. This is SUPER annoying. <reengage> <disengage> I'm just gonna wait. I'm gonna wait until I start going forward to reengage. <reengage> <disengage> Wow… What is THAT? Is that a map thing? Is it just poor lane selection <reengage> going on here? What is it DOING? <disengage> Oh my gosh. I'm so annoyed. <reengage> So annoyed.)
In looking at your screen... this seems "easy". Meaning... why didn't it / wouldn't it simply stay put... in... the right lane? This one CAN'T be that hard of an edge case, no?
 
So after my couple of trips to the pharmacy to pick up the hemorrhoid cream, I have found the following:

1) It still has issues with braking for traffic lights that are on curved roads; it either brakes late and hard, or brakes hard and then lets go and slows down to a stop. It needs better, smoother use of regen.
2) In some places, it'll still get into the right-hand turn lane, despite the GPS indicating to go straight.
3) The weird mapping issue with the 4-way intersection by my community is still there. The car puts on a turn signal but goes straight when it has to go straight through the intersection. 🤔
4) The delayed braking for crossing traffic is extremely annoying. When I'm driving and a vehicle up ahead makes a left-hand turn from a road or a business on the right, crossing through my lane of travel, the car hits the brakes even though the crossing car has already finished its turn and is out of the way.

If I remember anything else I’ll add. It’s now time to apply the cream.
The delayed braking for crossing traffic can be found on the public builds as well (at least on 2022.20.8 for me). When I had the beta (from 10.2 to 10.12.1) this was never an issue; it handled crossing traffic perfectly. I was actually surprised at how well it would handle this; my first impression was that it would do exactly what it's doing now on 2022.20.8 or, apparently, 10.69.2. But I was pleased to see how naturally it actually handled this.
 
This conflates two entirely different issues.

Lidar is a crutch for perception if you can't solve vision

All of which feeds into simply understanding what is around you (same with radar, FLIR, etc)

Generalized AI would then be needed to decide what to do about what is around you, both immediately and in the near future.

Most complaints I've seen with FSDBeta are behavioral problems, not perception ones (see all the "the visualization is absolutely correct the car just decides to do the wrong thing" comments of late)
I'm not conflating anything. He was talking about Lidar "generated" HD maps. We're talking about maps, not lidar. More detailed maps would improve planning, i.e., as you say, deciding what to DO. I think we're on the same page.

But I will give you a problem with perception I saw just today on 69.2 lol. FSDb stopped at a stop sign; there were pedestrians crossing on the far side, but after they crossed, the visualizer continued to show them walking along the edge of the road, not the sidewalk. So my car just sat frozen. But anyway, most (but not all) issues in normal clear conditions are planner-based, agreed.
 
We were talking (in this case) about basic navigation decisions, in response to a strange post stating that the Tesla might not know where it is on the road.

But the perception nearly always shows the vehicle position correctly, shows the correct path at least for some period of time (often doing weird things as soon as the error starts to occur), and the navigation is providing the correct guidance on upcoming turns.

But it still screws it up. Just does completely the wrong thing for no apparent reason. It does seem to often occur when there is a lead car, but the behavior still makes absolutely zero sense.
Ah, serves me right for only half-reading through the new posts. The viz does not include enough detail to see what is most relevant to the car's decision making; you'd need the dev view to better understand. I suspect the most common issue is that it sees what it thinks is a valid lane but isn't (let alone other attributes like "is this a turn lane", etc.), and this screws up the decision tree further, even if the visualization itself looks ok.
 
I'm not conflating anything. He was talking about Lidar "generated" HD maps. More detailed maps would improve planning, i.e., as you say, deciding what to DO. I think we're on the same page.

How do you think LIDAR would be helpful in lane planning in a way vision is not, specifically?

LIDAR can't "see" things, especially 2D things like lane lines or arrows, NEARLY as well as vision can.