Welcome to Tesla Motors Club

FSD Beta 10.69

I had several roundabouts in my drive today. I wanted to see whether FSD was having difficulty deciding how to negotiate them or was just being overly cautious. For two of them, I used slight accelerator pressure to keep the car moving, and it negotiated the roundabout perfectly. For the next two I let FSD Beta control the progress. It accomplished the task, but it was very awkward and slow. Conclusion: some of the poor performance is due to the car being overly cautious.
OTOH - it doesn’t always stop for vehicles coming from the left - which can be dangerous. Sometimes it also stops for vehicles on the right.

It's as if the devs have never actually used roundabouts and don't know the rules.
 
Are you “chill”?

In limited driving I found 69.3 to be better than 69.2.

My guess is the planner has some logic about which lanes to use, but it doesn't work well. They may use probabilistic logic that makes switching lanes more likely in response to small changes in circumstances.

Yes, I'm in Chill. I just finished a drive where FSD literally changed away from a turn. I took over, made the turn, then activated it again, and again, about 0.2-0.3 miles from another turn, it changed away.

These are all 4+ lane roads and mostly for right turns.

I'm in the bay area BTW.
 
If you let it change what does it do ?

It changes, drives, and then about 20-50m from the turn, it tries to change back to the correct lane for the turn. It's so awkward that I don't care to "test" it. It's clearly some big flaw in the planner that Tesla needs to sort out. We've seen it do this to some degree pre-3.1. I think 69.3.1 is the worst offender for me so far.
 
If you let it change what does it do ?
Like @powertoold, I drive in Chill and saw pretty much the same thing. I had a right exit and the car moved two lanes left a half mile before my exit. Then, a couple hundred feet before the exit, the car quickly moved across two lanes to take the exit (it was actually kinda smooth). The exit was also three lanes and I again needed the right lane. The car went directly into the leftmost lane and I took over at that point because I didn't think it would move right in time. Perhaps it would have repeated the whole process again by hurriedly moving right.

I've seen this on other roads and it was always FSD trying to manage a turn when dealing with three lanes. Both right and left turns.

This isn't new behavior. Again, as @powertoold says, it was doing this in earlier builds. Definitely in 2.3 and 2.4.
 
- a big one - it went through a red light! This was at night, and it briefly applied the brakes just before deciding to pass through.
Same thing happened to me today, too. I just happened to be looking at the screen as it started moving and the red lights were flickering, so the cameras weren’t rendering them properly. Scary though and luckily I was paying attention.

Also, would’ve been reassuring to be able to report this, even if the disengagement should handle that.
 
I also noticed 10.69.3.1 is very serious about driver monitoring. I spent a second too long looking down at the new Energy tab while on the highway, and I got a stern "Pay attention to the road" message and warning beep.
Do you remember about how long that was? I have no idea how long the nags take since I never get them. I try to not look away from the road for more than 2 seconds and to not let go of the steering wheel for more than 4 seconds. I’m not sure if I’m being overly cautious or not.
I don’t have a nag defeat device, but I also don’t see a reason why Tesla is wasting time on adding anti-defeat device measures. The people who are using them clearly don’t care about breaking the rules on purpose, and they’ll continue to do it, regardless of what Tesla comes up with. Just sounds like a never-ending cycle.
They want to avoid the press that comes with the FSD software being blamed for an accident or death. It doesn’t matter if the driver should have been paying more attention. It’s a bad look for Tesla either way.

Remember the smart summon/airplane incident? Clearly the owner’s fault, but the only thing reported was that a Tesla hit an airplane, not that the owner was stup—silly.
 
I don’t have a nag defeat device, but I also don’t see a reason why Tesla is wasting time on adding anti-defeat device measures. The people who are using them clearly don’t care about breaking the rules on purpose, and they’ll continue to do it, regardless of what Tesla comes up with. Just sounds like a never-ending cycle.
It's a necessary evil. Tesla has to show they're doing something to stop it to keep NHTSA off their back. When the Autopilot Buddy dropped, NHTSA was furious and immediately banned its sale.
 
Tesla has to try to circumvent known gadgets like these. But there's always a counter measure.

I also don’t see a reason why Tesla is wasting time on adding anti-defeat device measures.

There may always be countermeasures, but they do have to be increasingly sophisticated, and that's fine. The more sophisticated the defeat the less defensible it is and the more likely it is that any news story will discuss it (e.g. Tesla driver had Shake Weight suspended from steering wheel).

A human can tell from the cabin camera whether a user is paying attention in nearly all cases, so ultimately neural nets that perform as well as or better than a human should easily prevent abuse. No matter what people try, the superhuman AI will not be defeated - it would even detect someone holding up a static picture. Always watching, like the Eye of Sauron.

Keeping hands on the wheel is an essential requirement that falls outside the camera's view, and I think Tesla should similarly be able to use super-duper AI to figure out whether there is a human at the wheel in all cases, without fault - just as a skilled human could from a readout of torque application vs. time. One thing I'd like them to prohibit is occasional tugging at the wheel - they should enforce hands on the wheel at all times via torque application. That would help defeat some of the “SEXY button” methods of satisfying the nags (they may also need to hobble volume-control overrides, allowing them only occasionally, since that's another workaround SEXY buttons could use if Autopilot hands-on detection were rendered useless).

So there are a lot of driver monitoring improvements likely still in the pipe, before wide release. I'm sure Tesla has a bunch of defeats of the defeats queued up.

Kind of a catch-22 (sort of), though: if the AI were that good, you might not need monitoring. (Though even with human-equivalent-or-superior perception AI, there's much more to autonomous driving than perception...)

Be careful all - if you care at all about your wheels and tires!

I am going to laugh at myself so hard if I ever crunch a wheel using FSD Beta. It really is kind of funny to see this happening over and over again. Be careful out there, and laugh if you curb your wheel. Life is short. And remember there are plenty of curbs FSD can't see. Just have to trick it, like with child-like mannequins.

It's clearly some big flaw in the planner that Tesla needs to sort out.
Robotaxis soon? [EDIT: "soon" leaves a bit of latitude for interpretation, haha]
Remove what feature permanently? The steering wheel weight detection feature or some kind of feature tied to resetting strikes?
Feature = FSD. FSD access is removed after strike limit has been reached.

In the event FSD (Beta) goes to wide release, Tesla will more likely than not allow owners to restore access to their purchased feature via a more user-controllable and much faster means than a strike reset.

It’s likely one of the hurdles that we need to see before a wide release is done. That's what I was trying to say.

We've seen the report button go away, we've seen a widened narrow release, and enhanced driver monitoring. But still a few hurdles to leap.
 
Robotaxis soon?

Actually, yes, lol. I'm feeling good about this version.

There are obvious planner issues (e.g. 69.3.1 tries to blow past unprotected rights with islands, aggressive creeps, nonsensical lane decisions), but it's stuff that Tesla can fix with a point release.

There are a couple of turns where the car needs to drive into a hashed area / large bike lane, and this is the first version that has done it adequately. Prior, it would avoid the hash area and turn at the last minute, which is unnatural.

Progress is slower than I'd like, but things are moving towards refined drives.
 
Actually, yes, lol. I'm feeling good about this version.

There are obvious planner issues (e.g. 69.3.1 tries to blow past unprotected rights with islands, aggressive creeps, nonsensical lane decisions), but it's stuff that Tesla can fix with a point release.

There are a couple of turns where the car needs to drive into a hashed area / large bike lane, and this is the first version that has done it adequately. Prior, it would avoid the hash area and turn at the last minute, which is unnatural.

Progress is slower than I'd like, but things are moving towards refined drives.

@EVNow should start a plot showing miles per disengagement (disengagements per mile?) vs. time for all the releases and also for 10.69.x. We can extrapolate this line to the ultimate requirement for robotaxis. If it's best fit by an exponential we can fit an exponential. I'm curious what year it would end up in. 2500? Not sure. Hopefully the curve looks like an exponential; I think that kind of fit will be needed to bring in the date.
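The extrapolation above can be sketched with a toy log-linear fit. All the numbers below are made-up placeholders, not real FSD Beta disengagement data, and the 10,000 miles/disengagement "robotaxi" threshold is an arbitrary assumption:

```python
import math

# Hypothetical (months since first release, miles per disengagement) pairs.
# These are illustrative only -- not actual community-tracker data.
points = [(0, 2.0), (6, 3.0), (12, 4.5), (18, 7.0), (24, 10.0)]

# Fit log(y) = a + b*t by ordinary least squares,
# i.e. model y = exp(a) * exp(b*t) -- an exponential trend.
ts = [t for t, _ in points]
log_ys = [math.log(y) for _, y in points]
n = len(points)
t_mean = sum(ts) / n
ly_mean = sum(log_ys) / n
b = sum((t - t_mean) * (ly - ly_mean) for t, ly in zip(ts, log_ys)) \
    / sum((t - t_mean) ** 2 for t in ts)
a = ly_mean - b * t_mean

def predicted_miles(t_months):
    """Predicted miles per disengagement t_months after the first release."""
    return math.exp(a + b * t_months)

# Extrapolate: months until an arbitrary robotaxi-grade threshold is reached.
target = 10_000  # miles per disengagement (placeholder, not a real requirement)
months_to_target = (math.log(target) - a) / b
```

With this toy data the fit puts the threshold roughly a decade out; the point is just that an exponential fit on log-transformed data is a two-line regression, so the plot would be easy to maintain as new point releases land.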
 
@EVNow should start a plot showing miles per disengagement (disengagements per mile?) vs. time for all the releases and also for 10.69.x. We can extrapolate this line to the ultimate requirement for robotaxis. If it's best fit by an exponential we can fit an exponential. I'm curious what year it would end up in. 2500? Not sure. Hopefully the curve looks like an exponential; I think that kind of fit will be needed to bring in the date.

Someone mentioned it before, I think it was Mardak, but Tesla has been slowly incorporating their 4D video NNs once they've reached a certain performance threshold. Although it may seem like Tesla has been stagnant for a while, they've been progressing towards their ideal architecture, which will get them out of the local minima and allow them to achieve reliable human-like drives.

When Elon says 69.3.1 (or whatever version) is a big architectural update, you can expect some regressions, because Tesla has been working for many months to bring that architecture to parity with the current performance.

The reason 69.3.1's object perception seems like a step back in many ways (like dancing cars) is that video autolabeling takes a lot of processing, and accurately autolabeling hundreds of thousands of clips takes a lot of time given those compute limitations.

Autolabeling lane and road geometry is easier than autolabeling objects in 4D. That's why we got a great bird's-eye view first, and are now at the first stages of 4D object NNs.
 
It IS a white knuckle ride. It's a question of getting used to that. Those who can will use the system. Tesla still needs another level of improvement before your average person is going to accept that FSD can actually drive competently. Until then it will freak out a lot of people.
As others have noted, the current builds are FAR less "white knuckle" than the builds of (say) a year ago. Overall the car is now FAR smoother and more human-like, but, as you note, there is still a ways to go.
 
Just tried out 69.3.1 today, and I'm not that impressed. Right turns were still wide and felt a little out of control. Also, first time this has ever happened to me, but FSD stopped me right in the middle of a set of train tracks - dead-center, as if it was telling me, "Die human, die!" Don't know if this has happened to anyone else lately. I also don't have the "save the clip" icon, so how do I even report that? If reporting was discussed in the previous pages, sorry, tl;dr. 😂😂
 
Do you remember about how long that was? I have no idea how long the nags take since I never get them. I try to not look away from the road for more than 2 seconds and to not let go of the steering wheel for more than 4 seconds. I’m not sure if I’m being overly cautious or not.

Maybe about a solid 5-10 seconds. And I was specifically looking at the bottom edge of the screen where it gives recommendations on how to improve your efficiency, so my glance was probably very recognizably down.

I was in stop and go traffic on a highway on NOA, though, so I thought playing with the Energy tab for a few seconds couldn't hurt. I don't think it resulted in a strike, just a blue flash on the screen and a red-handed warning message.
 
Same thing happened to me today, too. I just happened to be looking at the screen as it started moving and the red lights were flickering, so the cameras weren’t rendering them properly. Scary though and luckily I was paying attention.

Also, would’ve been reassuring to be able to report this, even if the disengagement should handle that.
Twice on the same drive I had this exact behavior. The car was proceeding to run the red light, with a flickering red/green depiction on screen. I disengaged both times and re-engaged right before the crosswalk; the car still tried to run the red light.

For clarity, these were not yellow light approaches. Full red, clear view, first car at the light on approach.
 
I had two interesting "take over immediately" warnings today. Both were a result of me taking over after FSDb did something wrong, such that the car was straddling a lane control line. In the first incident, the car moved over the right lane line, partially onto the shoulder, to make a right turn at a stop sign with a vehicle in front of me going straight. I disengaged and immediately received a warning. The second was similar: FSDb partially crossed a solid lane marker on an entrance ramp. When I took over, I again immediately got a take-over warning. It was interesting that FSDb was happy doing what it did not want me to do.
 
Do you remember about how long that was? I have no idea how long the nags take since I never get them. I try to not look away from the road for more than 2 seconds and to not let go of the steering wheel for more than 4 seconds. I’m not sure if I’m being overly cautious or not.
Under ‘normal’ circumstances it’s about 10 seconds for the first ‘apply torque’ notice to show up at the bottom of the screen. I haven’t timed the intervals between the message showing up, the first blue flash, the first beep, and total disengagement, though.

On the AP/NOA stack it’s more like 20-30 seconds between nags.